WorldWideScience

Sample records for surveillance methods applied

  1. A Belief Network Decision Support Method Applied to Aerospace Surveillance and Battle Management Projects

    National Research Council Canada - National Science Library

    Staker, R

    2003-01-01

    This report demonstrates the application of a Bayesian Belief Network decision support method for Force Level Systems Engineering to a collection of projects related to Aerospace Surveillance and Battle Management...

  2. Applying GIS and Machine Learning Methods to Twitter Data for Multiscale Surveillance of Influenza.

    Directory of Open Access Journals (Sweden)

    Chris Allen

Full Text Available Traditional methods for monitoring influenza are haphazard and lack fine-grained details regarding the spatial and temporal dynamics of outbreaks. Twitter gives researchers and public health officials an opportunity to examine the spread of influenza in real-time and at multiple geographical scales. In this paper, we introduce an improved framework for monitoring influenza outbreaks using the social media platform Twitter. Relying upon techniques from geographic information science (GIS) and data mining, Twitter messages were collected, filtered, and analyzed for the thirty most populated cities in the United States during the 2013-2014 flu season. The results of this procedure are compared with national, regional, and local flu outbreak reports, revealing a statistically significant correlation between the two data sources. The main contribution of this paper is to introduce a comprehensive data mining process that enhances previous attempts to accurately identify tweets related to influenza. Additionally, geographical information systems allow us to target, filter, and normalize Twitter messages.
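
    The correlation check at the heart of this kind of study can be sketched in a few lines. The weekly values below are hypothetical, not the paper's data, and Pearson's r is computed with the standard library (Python 3.10+ for `statistics.correlation`):

    ```python
    # Hedged sketch: correlate weekly flu-related tweet counts with reported
    # influenza-like-illness (ILI) rates using Pearson's r.
    from statistics import correlation

    # Hypothetical weekly values for one city during a flu season
    tweet_counts = [120, 180, 260, 410, 530, 480, 350, 210]
    ili_rates = [1.1, 1.6, 2.4, 3.9, 5.2, 4.6, 3.2, 1.9]  # % of visits with ILI

    r = correlation(tweet_counts, ili_rates)
    print(f"Pearson r = {r:.3f}")
    ```

    A real pipeline would first filter and geocode the tweets; the correlation step itself stays this simple once both series are aggregated to the same weekly bins.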

  3. Conceptual evaluation of population health surveillance programs: method and example.

    Science.gov (United States)

    El Allaki, Farouk; Bigras-Poulin, Michel; Ravel, André

    2013-03-01

    Veterinary and public health surveillance programs can be evaluated to assess and improve the planning, implementation and effectiveness of these programs. Guidelines, protocols and methods have been developed for such evaluation. In general, they focus on a limited set of attributes (e.g., sensitivity and simplicity), that are assessed quantitatively whenever possible, otherwise qualitatively. Despite efforts at standardization, replication by different evaluators is difficult, making evaluation outcomes open to interpretation. This ultimately limits the usefulness of surveillance evaluations. At the same time, the growing demand to prove freedom from disease or pathogen, and the Sanitary and Phytosanitary Agreement and the International Health Regulations require stronger surveillance programs. We developed a method for evaluating veterinary and public health surveillance programs that is detailed, structured, transparent and based on surveillance concepts that are part of all types of surveillance programs. The proposed conceptual evaluation method comprises four steps: (1) text analysis, (2) extraction of the surveillance conceptual model, (3) comparison of the extracted surveillance conceptual model to a theoretical standard, and (4) validation interview with a surveillance program designer. This conceptual evaluation method was applied in 2005 to C-EnterNet, a new Canadian zoonotic disease surveillance program that encompasses laboratory based surveillance of enteric diseases in humans and active surveillance of the pathogens in food, water, and livestock. The theoretical standard used for evaluating C-EnterNet was a relevant existing structure called the "Population Health Surveillance Theory". Five out of 152 surveillance concepts were absent in the design of C-EnterNet. However, all of the surveillance concept relationships found in C-EnterNet were valid. The proposed method can be used to improve the design and documentation of surveillance programs. 

  4. Design Science Methodology Applied to a Chemical Surveillance Tool

    Energy Technology Data Exchange (ETDEWEB)

    Huang, Zhuanyi; Han, Kyungsik; Charles-Smith, Lauren E.; Henry, Michael J.

    2017-05-11

Public health surveillance systems gain significant benefits from integrating existing early incident detection systems, supported by closed data sources, with open source data. However, identifying potential alerting incidents relies on finding accurate, reliable sources and presenting the high volume of data in a way that increases analysts' work efficiency; this is a challenge for any system that leverages open source data. In this paper, we present the design concept and the applied design science research methodology of ChemVeillance, a chemical analyst surveillance system. Our work portrays a system design and approach that translates theoretical methodology into practice, creating a powerful surveillance system built for specific use cases. Researchers, designers, developers, and related professionals in the health surveillance community can build upon the principles and methodology described here to enhance and broaden current surveillance systems, leading to improved situational awareness based on a robust integrated early warning system.

  5. Methods for surveillance of noise signals from nuclear power plants using auto power spectra

    International Nuclear Information System (INIS)

    Streich, M.

    1988-01-01

A survey of methods for noise diagnostics applied in the nuclear power plant 'Bruno Leuschner' for surveillance of the primary circuit is given. Using a specific example, the concept of surveillance based on standard deviations is explained. (author)

  6. Quality assurance applied to an environmental surveillance program

    International Nuclear Information System (INIS)

    Oakes, T.W.; Shank, K.E.; Eldridge, J.S.

    1977-01-01

A discussion of a quality assurance program applied to environmental surveillance activities is presented. This includes the philosophy and concepts of quality assurance, along with a detailed assessment of the sources of uncertainty in a monitoring program. The role management must play for a successful program is also discussed, and the quality assurance program implemented at Oak Ridge National Laboratory is presented.

  7. Applied nonparametric statistical methods

    CERN Document Server

    Sprent, Peter

    2007-01-01

    While preserving the clear, accessible style of previous editions, Applied Nonparametric Statistical Methods, Fourth Edition reflects the latest developments in computer-intensive methods that deal with intractable analytical problems and unwieldy data sets. Reorganized and with additional material, this edition begins with a brief summary of some relevant general statistical concepts and an introduction to basic ideas of nonparametric or distribution-free methods. Designed experiments, including those with factorial treatment structures, are now the focus of an entire chapter. The text also e

  8. A Novel Surveillance System Applied in Civil Airport

    Directory of Open Access Journals (Sweden)

    Sun Hua Bo

    2016-01-01

Full Text Available Conventional security monitoring of civil airports usually uses fixed cameras to acquire images. This approach suffers from several performance problems, including difficulties in transmitting, storing, and analysing the image data. Insect compound eyes offer unique advantages for moving-target capture, and these have attracted the attention of many researchers in recent years. This paper contributes to this research by proposing a new surveillance system applied in a civil airport. We discuss the bionic structure of the system and the development of the bionic control circuit, and introduce the proposed mathematical model of bionic compound eyes for data acquisition and image mosaicking. Image matching for a large field of view is also illustrated under different conditions. This model and algorithm effectively achieve safety surveillance of the airport with a large field of view and high real-time processing.

  9. Continuous surveillance of transformers using artificial intelligence methods; Surveillance continue des transformateurs: application des methodes d'intelligence artificielle

    Energy Technology Data Exchange (ETDEWEB)

    Schenk, A.; Germond, A. [Ecole Polytechnique Federale de Lausanne, Lausanne (Switzerland); Boss, P.; Lorin, P. [ABB Secheron SA, Geneve (Switzerland)

    2000-07-01

    The article describes a new method for the continuous surveillance of power transformers based on the application of artificial intelligence (AI) techniques. An experimental pilot project on a specially equipped, strategically important power transformer is described. Traditional surveillance methods and the use of mathematical models for the prediction of faults are described. The article describes the monitoring equipment used in the pilot project and the AI principles such as self-organising maps that are applied. The results obtained from the pilot project and methods for their graphical representation are discussed.

  10. Applied Bayesian hierarchical methods

    National Research Council Canada - National Science Library

    Congdon, P

    2010-01-01

... 1.2 Posterior Inference from Bayes Formula; 1.3 Markov Chain Monte Carlo Sampling in Relation to Monte Carlo Methods: Obtaining Posterior...

  11. Methods of applied mathematics

    CERN Document Server

    Hildebrand, Francis B

    1992-01-01

    This invaluable book offers engineers and physicists working knowledge of a number of mathematical facts and techniques not commonly treated in courses in advanced calculus, but nevertheless extremely useful when applied to typical problems in many different fields. It deals principally with linear algebraic equations, quadratic and Hermitian forms, operations with vectors and matrices, the calculus of variations, and the formulations and theory of linear integral equations. Annotated problems and exercises accompany each chapter.

  12. HIV surveillance methods for the incarcerated population.

    Science.gov (United States)

    Dean, Hazel D; Lansky, Amy; Fleming, Patricia L

    2002-10-01

In the United States, monitoring the HIV/AIDS epidemic among the incarcerated population is done by (a) conducting a census of persons in prisons and jails reported to be infected with HIV or diagnosed with AIDS, (b) conducting seroprevalence surveys in selected correctional facilities, and (c) carrying out population-based HIV/AIDS case surveillance through state health departments. We describe methods for HIV/AIDS case surveillance in correctional settings and present data from the HIV/AIDS Reporting System (HARS) and the Supplement to HIV and AIDS Surveillance (SHAS) to describe the demographic, behavioral, and clinical characteristics of HIV-infected persons who were incarcerated at the time of diagnosis. HARS data showed a higher proportion of females and a lower proportion of injection drug users for incarcerated persons diagnosed with HIV (not AIDS) compared to those initially diagnosed with AIDS. The SHAS data showed a high prevalence of injection drug use, crack use, alcohol abuse, and exchanging sex for money or drugs. Together, HARS and SHAS collect fairly comprehensive information on risk behaviors from persons with HIV infection and AIDS. Advances in HIV prevention and care for the incarcerated community will require an accurate and timely description of the magnitude of the HIV epidemic in correctional settings. These data are needed to guide programmatic efforts to reduce HIV transmission in prisons and jails and in the general community upon release, and to ensure needed risk reduction and health care services for incarcerated persons.

  13. Biotest method in Rhine river surveillance

    International Nuclear Information System (INIS)

    Nolte, M.

    1994-01-01

Against the background of the 1986 Sandoz chemical accident, the national and international commissions for the protection of the Rhine river were prompted to set up continuous, supra-regional surveillance of the river. The aim is a biological warning system that complements the existing chemical-physical monitoring of the water. The biotest method was newly developed in a joint plan of eight separate projects. The bio-monitors are continuous or semi-continuous systems which make up for the time delay of chemical analyses. (BWI)

  14. Cumulative query method for influenza surveillance using search engine data.

    Science.gov (United States)

    Seo, Dong-Woo; Jo, Min-Woo; Sohn, Chang Hwan; Shin, Soo-Yong; Lee, JaeHo; Yu, Maengsoo; Kim, Won Young; Lim, Kyoung Soo; Lee, Sang-Il

    2014-12-16

Internet search queries have become an important data source for syndromic surveillance systems. However, there is currently no syndromic surveillance system using Internet search query data in South Korea. The objective of this study was to examine correlations between our cumulative query method and national influenza surveillance data. Our study was based on the local search engine Daum (approximately 25% market share) and influenza-like illness (ILI) data from the Korea Centers for Disease Control and Prevention. A quota sampling survey was conducted with 200 participants to obtain popular queries. We divided the study period into two sets: Set 1 (the 2009/10 epidemiological year for development set 1 and 2010/11 for validation set 1) and Set 2 (2010/11 for development set 2 and 2011/12 for validation set 2). Pearson's correlation coefficients were calculated between the Daum data and the ILI data for the development sets. We selected the combined queries for which the correlation coefficients were .7 or higher and listed them in descending order. Then, we created a cumulative query method, where n represents the number of cumulative combined queries taken in descending order of correlation coefficient. In validation set 1, 13 cumulative query methods were applied, and 8 had higher correlation coefficients (min=.916, max=.943) than that of the highest single combined query. Further, 11 of 13 cumulative query methods had an r value of ≥.7, whereas only 4 of 13 combined queries did. In validation set 2, 8 of 15 cumulative query methods showed higher correlation coefficients (min=.975, max=.987) than that of the highest single combined query. All 15 cumulative query methods had an r value of ≥.7, whereas only 6 of 15 combined queries did. The cumulative query method showed relatively higher correlation with national influenza surveillance data than single combined queries in both the development and validation sets.
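
    The cumulative query procedure described above can be sketched on toy data: rank candidate queries by correlation with ILI counts, then sum the volumes of the top-n queries and correlate the cumulative series with ILI again. All query names and weekly volumes here are hypothetical, not the study's actual Daum queries:

    ```python
    # Hedged sketch of the cumulative query method (Python 3.10+).
    from statistics import correlation

    ili = [5, 9, 14, 22, 30, 26, 17, 10]  # hypothetical weekly ILI counts
    queries = {                            # hypothetical weekly search volumes
        "flu":      [50, 90, 150, 230, 310, 270, 170, 100],
        "fever":    [40, 60, 100, 160, 200, 190, 120, 80],
        "flu shot": [80, 70, 60, 55, 50, 52, 60, 75],  # weakly/negatively related
    }

    # Rank queries by correlation with ILI, descending
    ranked = sorted(queries, key=lambda q: correlation(queries[q], ili), reverse=True)

    # Cumulative query method: sum the volumes of the top-n queries for each n
    for n in range(1, len(ranked) + 1):
        cumulative = [sum(queries[q][w] for q in ranked[:n]) for w in range(len(ili))]
        print(n, ranked[:n], round(correlation(cumulative, ili), 3))
    ```

    The study's selection rule (keep queries with r ≥ .7, then test each cumulative depth n on a held-out validation season) maps directly onto the loop above.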

  15. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1975-09-01

    Application of normal, log normal, and Weibull distributions to environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. Corresponding W test calculations were made to determine the probability of a particular data set falling within the distribution of interest. Conclusions are drawn as to the fit of any data group to the various distributions. The significance of fitting statistical distributions to the data is discussed

  16. Evaluation of two surveillance methods for surgical site infection

    Directory of Open Access Journals (Sweden)

    M. Haji Abdolbaghi

    2006-08-01

Full Text Available Background: Surgical wound infection surveillance is an important facet of hospital infection control processes. There are several surveillance methods for surgical site infections. The objective of this study is to evaluate the accuracy of two different surgical site infection surveillance methods. Methods: In this prospective cross-sectional study, 3020 patients undergoing surgery in the general surgical wards of Imam Khomeini hospital were included. The surveillance methods consisted of a review of medical records for postoperative fever and a review of daily nursing notes for antibiotics prescribed postoperatively and at discharge. Review of the patient's history and daily records, together with interviews with the patient's surgeon and the head nurse of the ward, was considered the gold standard for surveillance. Results: Postoperative antibiotic consumption, especially when its duration is considered, is a proper method for surgical wound infection surveillance. Conclusion: The results of this study showed that the postoperative antibiotic surveillance method, especially with consideration of the duration of antibiotic usage, is a proper method for surgical site infection surveillance in general surgery wards. A prospective study with post-discharge follow-up until 30 days after surgery is recommended.

  17. A surveillance sector review applied to infectious diseases at a country level

    Directory of Open Access Journals (Sweden)

    Easther Sally

    2010-06-01

Full Text Available Abstract Background The new International Health Regulations (IHR) require World Health Organization (WHO) member states to assess their core capacity for surveillance. Such reviews also have the potential to identify important surveillance gaps, improve the organisation of disparate surveillance systems and focus attention on upstream hazards, determinants and interventions. Methods We developed a surveillance sector review method for evaluating all of the surveillance systems and related activities across a sector, in this case those concerned with infectious diseases in New Zealand. The first stage was a systematic description of these surveillance systems using a newly developed framework and classification system. Key informant interviews were conducted to validate the available information on the systems identified. Results We identified 91 surveillance systems and related activities in the 12 coherent categories of infectious diseases examined. The majority (n = 40, or 44%) of these were disease surveillance systems. They covered all categories, particularly for more severe outcomes including those resulting in death or hospitalisation. Except for some notifiable diseases and influenza, surveillance of less severe but important infectious diseases occurring in the community was largely absent. There were 31 systems (34%) for surveillance of upstream infectious disease hazards, including risk and protective factors. This area tended to have many potential gaps and lack integration, partly because such systems were operated by a range of different agencies, often outside the health sector. There were fewer surveillance systems for determinants, including population size and characteristics (n = 9), and interventions (n = 11). Conclusions It was possible to create and populate a workable framework for describing all the infectious diseases surveillance systems and related activities in a single developed country and to identify potential

  18. Possibilities of Applying Video Surveillance and other ICT Tools and Services in the Production Process

    Directory of Open Access Journals (Sweden)

    Adis Rahmanović

    2018-02-01

Full Text Available The paper presents the possibilities of applying video surveillance and other ICT tools and services in the production process. The first part of the paper presents the video surveillance control system and the opportunity to apply video surveillance for the security of employees and assets. The second part gives an analysis of the production control system, followed by video surveillance of a work excavator. The next part of the paper presents the integration of video surveillance with the accompanying tools. At the end of the paper, suggestions are given for further work in the field of data protection and cryptography in the use of video surveillance.

  19. Use of pattern recognition methods for nuclear reactor surveillance

    International Nuclear Information System (INIS)

    Dubuisson, B.; Lavison, P.

    1979-01-01

The principles of pattern recognition are recalled. During the past several years, a team at the University of Compiegne has been working to adapt this method to the surveillance of technologically complex systems. In the second part of this article, the previous groundwork study on the surveillance problem of the Rapsodie reactor, done in collaboration with CEN-Cadarache, is described.

  20. Statistical distributions as applied to environmental surveillance data

    International Nuclear Information System (INIS)

    Speer, D.R.; Waite, D.A.

    1976-01-01

Application of normal, lognormal, and Weibull distributions to radiological environmental surveillance data was investigated for approximately 300 nuclide-medium-year-location combinations. The fit of data to distributions was compared through probability plotting (special graph paper provides a visual check) and W test calculations. Results show that 25% of the data fit the normal distribution, 50% fit the lognormal, and 90% fit the Weibull. Demonstration of how to plot each distribution shows that the normal and lognormal distributions are comparatively easy to use, while the Weibull distribution is complicated and difficult to use. Although current practice is to use normal distribution statistics, the normal distribution fit the fewest data groups considered in this study.
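
    As a stdlib-only stand-in for the W test mentioned above, a probability-plot correlation coefficient (PPCC) gives the same kind of goodness-of-fit check: values near 1 indicate a good fit. The sketch below compares the raw data (normal hypothesis) with log-transformed data (lognormal hypothesis); the measurements are hypothetical, not the study's data:

    ```python
    # PPCC check: correlation between sorted data and normal plotting-position
    # quantiles (Blom positions). Python 3.10+ for statistics.correlation.
    import math
    from statistics import NormalDist, correlation

    def ppcc_normal(sample):
        """PPCC of a sample against the normal distribution."""
        n = len(sample)
        probs = [(i - 0.375) / (n + 0.25) for i in range(1, n + 1)]  # Blom
        quantiles = [NormalDist().inv_cdf(p) for p in probs]
        return correlation(sorted(sample), quantiles)

    data = [0.8, 1.1, 1.3, 1.7, 2.2, 2.9, 3.8, 5.1, 6.9, 9.4]  # hypothetical

    print("normal fit:   ", round(ppcc_normal(data), 3))
    print("lognormal fit:", round(ppcc_normal([math.log(x) for x in data]), 3))
    ```

    For right-skewed environmental data like this, the log-transformed series typically scores higher, matching the study's finding that the lognormal fits more data groups than the normal.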

  1. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    Directory of Open Access Journals (Sweden)

    Alexandra Ziemann

Full Text Available Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed if the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. We suggest to consider these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.

  2. Success Factors of European Syndromic Surveillance Systems: A Worked Example of Applying Qualitative Comparative Analysis.

    Science.gov (United States)

    Ziemann, Alexandra; Fouillet, Anne; Brand, Helmut; Krafft, Thomas

    2016-01-01

    Syndromic surveillance aims at augmenting traditional public health surveillance with timely information. To gain a head start, it mainly analyses existing data such as from web searches or patient records. Despite the setup of many syndromic surveillance systems, there is still much doubt about the benefit of the approach. There are diverse interactions between performance indicators such as timeliness and various system characteristics. This makes the performance assessment of syndromic surveillance systems a complex endeavour. We assessed if the comparison of several syndromic surveillance systems through Qualitative Comparative Analysis helps to evaluate performance and identify key success factors. We compiled case-based, mixed data on performance and characteristics of 19 syndromic surveillance systems in Europe from scientific and grey literature and from site visits. We identified success factors by applying crisp-set Qualitative Comparative Analysis. We focused on two main areas of syndromic surveillance application: seasonal influenza surveillance and situational awareness during different types of potentially health threatening events. We found that syndromic surveillance systems might detect the onset or peak of seasonal influenza earlier if they analyse non-clinical data sources. Timely situational awareness during different types of events is supported by an automated syndromic surveillance system capable of analysing multiple syndromes. To our surprise, the analysis of multiple data sources was no key success factor for situational awareness. We suggest to consider these key success factors when designing or further developing syndromic surveillance systems. Qualitative Comparative Analysis helped interpreting complex, mixed data on small-N cases and resulted in concrete and practically relevant findings.
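
    The crisp-set Qualitative Comparative Analysis step can be illustrated with a toy truth table: code each case's conditions and outcome as 0/1, then compute the sufficiency consistency of a configuration, i.e. the share of cases with that configuration that also show the outcome. The cases and condition names below are invented for illustration, not the study's 19 systems:

    ```python
    # Toy crisp-set QCA consistency check.
    cases = [
        # (automated, multiple_syndromes, good_situational_awareness)
        (1, 1, 1),
        (1, 1, 1),
        (1, 0, 0),
        (0, 1, 0),
        (0, 0, 0),
        (1, 1, 1),
    ]

    def consistency(cases, condition):
        """Share of cases satisfying `condition` that also show the outcome."""
        selected = [outcome for (*conds, outcome) in cases if condition(conds)]
        return sum(selected) / len(selected) if selected else None

    both = lambda c: c[0] == 1 and c[1] == 1
    print("automated AND multi-syndrome:", consistency(cases, both))
    print("automated alone:             ", consistency(cases, lambda c: c[0] == 1))
    ```

    A configuration with consistency close to 1 across cases is a candidate "key success factor" in the sense used above; real csQCA additionally minimises the truth table to find the simplest sufficient combinations.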

  3. Surface-activated joining method for surveillance coupon reconstitution

    International Nuclear Information System (INIS)

    Kaihara, Shoichiro; Nakamura, Terumi

    1993-01-01

As nuclear power plants approach the end of their license periods and license renewal is contemplated, there is an increasing need to expand the data base of mechanical properties obtainable from archival surveillance specimens. A new joining method for reconstituting broken Charpy specimens is being developed, the objective being to retain the original properties of the material in the process. The new method is called surface-activated joining (SAJ). It is designed to obtain a good junction without applying extra heating and deformation. In particular, the purpose of SAJ is to minimize the width of the heat-affected zone (HAZ) and to decrease the maximum temperature experienced by the specimen during reconsolidation of the two pieces. Generally, machined metal surfaces are contaminated with films of oxide, adsorbed gas, oil, or other vapors that impede bonding of surfaces during joining. However, if surface contamination is removed and the two surfaces are mated as closely as possible, joining can be achieved at low temperatures and modest stress levels. In order to apply the SAJ method, the following requirements must be met: (1) inert atmosphere to protect the surfaces from atmospheric gases and oxidation; (2) removal of the existing contamination layers to activate the surfaces; and (3) method for bringing the two surfaces into very intimate contact prior to joining.

  4. A semi-automated magnetic capture probe based DNA extraction and real-time PCR method applied in the Swedish surveillance of Echinococcus multilocularis in red fox (Vulpes vulpes) faecal samples.

    Science.gov (United States)

    Isaksson, Mats; Hagström, Åsa; Armua-Fernandez, Maria Teresa; Wahlström, Helene; Ågren, Erik Olof; Miller, Andrea; Holmberg, Anders; Lukacs, Morten; Casulli, Adriano; Deplazes, Peter; Juremalm, Mikael

    2014-12-19

Following the first finding of Echinococcus multilocularis in Sweden in 2011, 2985 red foxes (Vulpes vulpes) were analysed by the segmental sedimentation and counting technique. This is a labour intensive method and requires handling of the whole carcass of the fox, resulting in a costly analysis. In an effort to reduce the cost of labour and sample handling, an alternative method has been developed. The method is sensitive and partially automated for detection of E. multilocularis in faecal samples. The method has been used in the Swedish E. multilocularis monitoring program for 2012-2013 on more than 2000 faecal samples. We describe a new semi-automated magnetic capture probe DNA extraction method and real-time hydrolysis-probe polymerase chain reaction assay (MC-PCR) for the detection of E. multilocularis DNA in faecal samples from red fox. The diagnostic sensitivity was determined by validating the new method against the sedimentation and counting technique in fox samples collected in Switzerland, where E. multilocularis is highly endemic. Of 177 foxes analysed by the sedimentation and counting technique, E. multilocularis was detected in 93 animals. Eighty-two (88%, 95% C.I. 79.8-93.9) of these were positive in the MC-PCR. In foxes with more than 100 worms, the MC-PCR was positive in 44 out of 46 (95.7%) cases. The two MC-PCR negative samples originated from foxes with only immature E. multilocularis worms. In foxes with 100 worms or less (n = 47), 38 (80.9%) were positive in the MC-PCR. The diagnostic specificity of the MC-PCR was evaluated using fox scats collected within the Swedish screening. Of 2158 samples analysed, two were positive. This implies that the specificity is at least 99.9% (C.I. = 99.7-100). The MC-PCR proved to have a high sensitivity and a very high specificity. The test is partially automated but also possible to perform manually if desired. The test is well suited for nationwide E. multilocularis surveillance programs where sampling
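
    The diagnostic-sensitivity arithmetic reported above (82 MC-PCR positives out of 93 sedimentation-positive foxes) can be reproduced in a few lines. The Wilson score interval is used here purely as an illustration; the abstract does not state which exact CI method the authors used, so the bounds differ slightly from the reported 79.8-93.9:

    ```python
    # Sensitivity of MC-PCR vs. the sedimentation gold standard, with a
    # 95% Wilson score confidence interval for the binomial proportion.
    import math

    def wilson_ci(successes, n, z=1.96):
        """95% Wilson score interval for a binomial proportion."""
        p = successes / n
        denom = 1 + z**2 / n
        centre = (p + z**2 / (2 * n)) / denom
        half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
        return centre - half, centre + half

    sens = 82 / 93
    lo, hi = wilson_ci(82, 93)
    print(f"sensitivity = {sens:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
    ```

    The same function applied to the specificity data (2156 true negatives out of 2158) reproduces the "at least 99.9%" figure quoted above.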

  5. Methods for injury surveillance in international cricket | Orchard ...

    African Journals Online (AJOL)

    Introduction. Varying methods of cricket injury surveillance projects have made direct comparison of published studies in this field impossible. Methods. A consensus regarding definitions and methods to calculate injury rates in cricket was sought between researchers in this field. This was arrived at through a variety of ...

  6. Applying participatory approaches in the evaluation of surveillance systems: A pilot study on African swine fever surveillance in Corsica.

    Science.gov (United States)

    Calba, Clémentine; Antoine-Moussiaux, Nicolas; Charrier, François; Hendrikx, Pascal; Saegerman, Claude; Peyre, Marisa; Goutard, Flavie L

    2015-12-01

The implementation of regular and relevant evaluations of surveillance systems is critical to improving their effectiveness and relevance whilst limiting their cost. The complex nature of these systems and the variable contexts in which they are implemented call for the development of flexible evaluation tools. Within this scope, participatory tools have been developed and implemented for the African swine fever (ASF) surveillance system in Corsica (France). The objectives of this pilot study were, firstly, to assess the applicability of participatory approaches within a developed environment involving various stakeholders and, secondly, to define and test methods developed to assess evaluation attributes. Two evaluation attributes were targeted: the acceptability of the surveillance system and its non-monetary benefits. Individual semi-structured interviews and focus groups were implemented with representatives from every level of the system. Diagramming and scoring tools were used to assess the different elements that compose the definition of acceptability. A contingent valuation method, associated with proportional piling, was used to assess the non-monetary benefits, i.e., the value of sanitary information. Sixteen stakeholders were involved in the process, through 3 focus groups and 8 individual semi-structured interviews. Stakeholders were selected according to their role in the system and to their availability. Results highlighted a moderate acceptability of the system for farmers and hunters and a high acceptability for other representatives (e.g., private veterinarians, local laboratories). Out of the 5 farmers involved in assessing the non-monetary benefits, 3 were interested in sanitary information on ASF. The data collected via participatory approaches enable relevant recommendations to be made, based on the Corsican context, to improve the current surveillance system. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights

  7. Surveillance

    DEFF Research Database (Denmark)

    Albrechtslund, Anders; Coeckelbergh, Mark; Matzner, Tobias

    Studying surveillance involves raising questions about the very nature of concepts such as information, technology, identity, space and power. Besides the maybe all too obvious ethical issues often discussed with regard to surveillance, there are several other angles and approaches that we should like to encourage. Therefore, our panel will focus on the philosophical, yet non-ethical issues of surveillance in order to stimulate an intense debate with the audience on the ethical implications of our enquiries. We also hope to provide a broader and deeper understanding of surveillance.

  8. Applying surveillance and screening to family psychosocial issues: implications for the medical home.

    Science.gov (United States)

    Garg, Arvin; Dworkin, Paul H

    2011-06-01

    Within the medical home, understanding the family and community context in which children live is critical to optimally promoting children's health and development. How to best identify psychosocial issues likely to have an impact on children's development is uncertain. Professional guidelines encourage pediatricians to incorporate family psychosocial screening within the context of primary care, yet few providers routinely screen for these issues. The authors propose applying the core principles of surveillance and screening, as applied to children's development and behavior, to also address family psychosocial issues during health supervision services. Integrating psychosocial surveillance and screening into the medical home requires changes in professional training, provider practice, and public policy. The potential of family psychosocial surveillance and screening to promote children's optimal development justifies such changes.

  9. A risk analysis approach applied to field surveillance in utility meters in legal metrology

    Science.gov (United States)

    Rodrigues Filho, B. A.; Nonato, N. S.; Carvalho, A. D.

    2018-03-01

    Field surveillance represents the level of control in metrological supervision responsible for checking the conformity of measuring instruments in service. Utility meters represent the majority of measuring instruments produced by notified bodies, due to self-verification in Brazil. They play a major role in the economy, since electricity, gas and water are the main inputs to industries in their production processes. Thus, to optimize the resources allocated to control these devices, the present study applied a risk analysis in order to identify, among the 11 manufacturers notified for self-verification, the instruments that demand field surveillance.
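    The abstract describes ranking notified manufacturers by risk to target field surveillance. A minimal sketch of such a ranking, assuming a simple probability-times-impact score; all names and figures below are hypothetical illustrations, not data from the study:

```python
def risk_score(prob_nonconformity, impact):
    """Simple probability x impact risk score for one manufacturer."""
    return prob_nonconformity * impact

# (probability of non-conformity, relative economic impact) -- hypothetical
manufacturers = {
    "A": (0.10, 8.0),
    "B": (0.02, 9.0),
    "C": (0.25, 3.0),
}

# Highest-risk manufacturers first: these are the candidates for field surveillance.
ranked = sorted(manufacturers, key=lambda m: risk_score(*manufacturers[m]), reverse=True)
```

    In practice, the probability term would come from historical verification results and the impact term from the economic weight of each meter type.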

  10. Evaluation of tracking methods for maritime surveillance

    Science.gov (United States)

    Fischer, Yvonne; Baum, Marcus; Flohr, Fabian; Hanebeck, Uwe D.; Beyerer, Jürgen

    2012-06-01

    In this article, we present an evaluation of several multi-target tracking methods based on simulated scenarios in the maritime domain. In particular, we consider variations of the Joint Integrated Probabilistic Data Association (JIPDA) algorithm, namely the Linear Multi-Target IPDA (LMIPDA), Linear Joint IPDA (LJIPDA), and Markov Chain Monte Carlo Data Association (MCMCDA). The algorithms are compared with respect to an extension of the Optimal Subpattern Assignment (OSPA) metric, the Hellinger distance and further performance measures. As no single algorithm is equally well suited to all tested scenarios, our results show which algorithm fits best for each specific scenario.
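    The OSPA metric mentioned above penalises both localisation error and cardinality mismatch between an estimated and a true target set. A brute-force sketch for small sets of 1-D points (the paper's extension of the metric is not reproduced here; the cutoff c and order p below are illustrative defaults):

```python
from itertools import permutations

def ospa(X, Y, c=10.0, p=1):
    """Brute-force OSPA distance between two small sets of 1-D points.

    Localisation errors are capped at the cutoff c, and each unmatched
    point in the larger set contributes the full penalty c.
    """
    if len(X) > len(Y):
        X, Y = Y, X                      # ensure |X| <= |Y|
    m, n = len(X), len(Y)
    if n == 0:
        return 0.0
    # Best assignment of the smaller set to the larger one.
    best = min(
        sum(min(abs(x - y), c) ** p for x, y in zip(X, assign))
        for assign in permutations(Y, m)
    )
    return ((best + (c ** p) * (n - m)) / n) ** (1.0 / p)
```

    Production implementations replace the permutation search with an optimal assignment solver (e.g. the Hungarian algorithm).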

  11. Applied Formal Methods for Elections

    DEFF Research Database (Denmark)

    Wang, Jian

    … development time, or second dynamically, i.e. monitoring while an implementation is used during an election, or after the election is over, for forensic analysis. This thesis contains two chapters on this subject: the chapter Analyzing Implementations of Election Technologies describes a technique … process. The chapter Measuring Voter Lines describes an automated data collection method for measuring voters' waiting time, and discusses statistical models designed to provide an understanding of the voter behavior in polling stations.

  12. Applied Formal Methods for Elections

    DEFF Research Database (Denmark)

    Wang, Jian

    Information technology is changing the way elections are organized. Technology renders the electoral process more efficient, but things could also go wrong: voting software is complex, it consists of over thousands of lines of code, which makes it error-prone. Technical problems may cause delays at polling stations, or even delay the announcement of the final result. This thesis describes a set of methods to be used, for example, by system developers, administrators, or decision makers to examine election technologies, social choice algorithms and voter experience. Verifiability refers … development time, or second dynamically, i.e. monitoring while an implementation is used during an election, or after the election is over, for forensic analysis. This thesis contains two chapters on this subject: the chapter Analyzing Implementations of Election Technologies describes a technique … process. The chapter Measuring Voter Lines describes an automated data collection method for measuring voters' waiting time, and discusses statistical models designed to provide an understanding of the voter behavior in polling stations.

  13. Bayesian methods applied to GWAS.

    Science.gov (United States)

    Fernando, Rohan L; Garrick, Dorian

    2013-01-01

    Bayesian multiple-regression methods are being successfully used for genomic prediction and selection. These regression models simultaneously fit many more markers than the number of observations available for the analysis. Thus, Bayes' theorem is used to combine prior beliefs of marker effects, which are expressed in terms of prior distributions, with information from data for inference. Often, the analyses are too complex for closed-form solutions and Markov chain Monte Carlo (MCMC) sampling is used to draw inferences from posterior distributions. This chapter describes how these Bayesian multiple-regression analyses can be used for GWAS. In most GWAS, false positives are controlled by limiting the genome-wise error rate, which is the probability of one or more false-positive results, to a small value. As the number of tests in GWAS is very large, this results in very low power. Here we show how in Bayesian GWAS false positives can be controlled by limiting the proportion of false-positive results among all positives to some small value. The advantage of this approach is that the power of detecting associations is not inversely related to the number of markers.
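    The control strategy described — limiting the proportion of false positives among all declared positives — can be sketched as follows, assuming the MCMC analysis has already produced a posterior probability of association for each marker. The selection rule below is a generic Bayesian false-discovery-rate sketch, not necessarily the authors' exact procedure:

```python
def select_markers(post_probs, alpha=0.05):
    """Declare markers associated while keeping the expected proportion of
    false positives among all declared positives at or below alpha.

    post_probs[i] is the posterior probability (e.g. estimated from MCMC
    samples) that marker i has a nonzero effect; 1 - post_probs[i] is its
    expected contribution to the false-positive count if declared.
    """
    order = sorted(range(len(post_probs)), key=lambda i: post_probs[i], reverse=True)
    chosen, expected_fp = [], 0.0
    for i in order:
        # Adding marker i must keep the expected false-positive fraction <= alpha.
        if (expected_fp + 1.0 - post_probs[i]) / (len(chosen) + 1) > alpha:
            break
        chosen.append(i)
        expected_fp += 1.0 - post_probs[i]
    return chosen
```

    Unlike a genome-wise error-rate bound, this threshold does not tighten as the number of markers grows, which is the power advantage the chapter highlights.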

  14. Novel surveillance methods for the control of Ebola virus disease.

    Science.gov (United States)

    Houlihan, C F; Youkee, D; Brown, C S

    2017-05-01

    The unprecedented scale of the 2013-2016 West African Ebola virus disease (EVD) outbreak was in large part due to failings in surveillance: contacts of confirmed cases were not systematically identified, monitored and diagnosed early, and new cases appearing in previously unaffected communities were similarly not rapidly identified, diagnosed and isolated. Over the course of this epidemic, traditional surveillance methods were strengthened and novel methods introduced. The wealth of experience gained, and the systems introduced in West Africa, should be used in future EVD outbreaks, as well as for other communicable diseases in the region and beyond. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  15. Sunglass detection method for automation of video surveillance system

    Science.gov (United States)

    Sikandar, Tasriva; Samsudin, Wan Nur Azhani W.; Hawari Ghazali, Kamarul; Mohd, Izzeldin I.; Fazle Rabbi, Mohammad

    2018-04-01

    Wearing sunglasses to hide the face from surveillance cameras is a common activity in criminal incidents. Therefore, sunglass detection from surveillance video has become a pressing issue in the automation of security systems. In this paper, we propose an image processing method to detect sunglasses in surveillance images. Specifically, a unique feature based on facial height and width is employed to identify the covered region of the face. The presence of an area covered by sunglasses is evaluated using the facial height-width ratio, and a threshold on the covered-area percentage is used to classify glass-wearing faces. Two different types of glasses are considered, i.e. eyeglasses and sunglasses. The results of this study demonstrate that the proposed method is able to detect sunglasses under two different illumination conditions: room illumination as well as in the presence of sunlight. In addition, due to the multi-level checking in the facial region, the method detected sunglasses with 100% accuracy. However, in an exceptional case where the fabric surrounding the face has a colour similar to skin, the correct detection rate for eyeglasses was found to be 93.33%.
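    The final classification step — thresholding the percentage of the upper facial region occluded by dark pixels — can be sketched as below. The two percentage thresholds are hypothetical placeholders, not values from the paper:

```python
# Illustrative decision thresholds (hypothetical, not from the paper).
EYEGLASS_MIN = 15.0   # % of the upper facial region occluded
SUNGLASS_MIN = 35.0

def classify_glasses(covered_percent):
    """Classify a face from the percentage of the upper facial region
    (located via the facial height-width ratio) covered by dark pixels."""
    if covered_percent >= SUNGLASS_MIN:
        return "sunglass"
    if covered_percent >= EYEGLASS_MIN:
        return "eyeglass"
    return "no glasses"
```

    The upstream steps (face localisation and computing the covered-area percentage) would be done on the image itself, e.g. with a face detector and dark-pixel counting in the estimated eye band.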

  16. Evaluation 2000 and regulation and method. Release monitoring and environmental surveillance around Cea centers

    International Nuclear Information System (INIS)

    2001-06-01

    This publication reports, for the year 2000, the evaluation of liquid and gaseous radioactive effluent releases and the radioactivity levels measured in the vicinity of CEA centres through the surveillance of air, water, vegetation and milk. An analysis of the results from 1996 to 2000 makes it possible to follow their evolution. A second booklet details the sampling and measurement methods applied to effluents and the environment, and presents the regulations governing effluent monitoring. (N.C.)

  17. Harnessing information from injury narratives in the 'big data' era: understanding and applying machine learning for injury surveillance.

    Science.gov (United States)

    Vallmuur, Kirsten; Marucci-Wellman, Helen R; Taylor, Jennifer A; Lehto, Mark; Corns, Helen L; Smith, Gordon S

    2016-04-01

    Vast amounts of injury narratives are collected daily, are available electronically in real time, and have great potential for use in injury surveillance and evaluation. Machine learning algorithms have been developed to assist in identifying cases and classifying mechanisms leading to injury in a much timelier manner than is possible when relying on manual coding of narratives. The aim of this paper is to describe the background, growth, value, challenges and future directions of machine learning as applied to injury surveillance. This paper reviews key aspects of machine learning using injury narratives, providing a case study to demonstrate an application of an established human-machine learning approach. The range of applications and the utility of narrative text have increased greatly with advancements in computing techniques over time. Practical and feasible methods exist for semiautomatic classification of injury narratives which are accurate, efficient and meaningful. The human-machine learning approach described in the case study achieved high sensitivity and PPV and reduced the need for human coding to less than a third of cases in one large occupational injury database. The last 20 years have seen a dramatic change in the potential for technological advancements in injury surveillance. Machine learning of 'big injury narrative data' opens up many possibilities for expanded sources of data which can provide more comprehensive, ongoing and timely surveillance to inform future injury prevention policy and practice. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/
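    The semiautomatic human-machine workflow described above — machine-code the confident cases, route the rest to human coders — can be sketched as below. The keyword scorer and the confidence threshold are toy stand-ins for the paper's trained classifier:

```python
# Toy keyword model: each injury mechanism code is voted for by cue words.
KEYWORDS = {
    "fall": ["fell", "slipped", "ladder"],
    "struck_by": ["struck", "hit by", "dropped on"],
}

def classify(narrative):
    """Return a (mechanism code, confidence) pair from crude keyword voting."""
    text = narrative.lower()
    scores = {code: sum(w in text for w in words) for code, words in KEYWORDS.items()}
    code = max(scores, key=scores.get)
    total = sum(scores.values())
    confidence = scores[code] / total if total else 0.0
    return code, confidence

def route(narratives, threshold=0.8):
    """Split narratives into machine-coded and human-review queues."""
    machine, human = [], []
    for n in narratives:
        code, conf = classify(n)
        (machine if conf >= threshold else human).append((n, code))
    return machine, human
```

    With a real classifier (e.g. naive Bayes over narrative tokens), the same routing rule yields the behaviour reported in the case study: high-confidence cases are auto-coded, and only the ambiguous minority goes to human coders.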

  18. Performance evaluation of optimization methods for super-resolution mosaicking on UAS surveillance videos

    Science.gov (United States)

    Camargo, Aldo; He, Qiang; Palaniappan, K.

    2012-06-01

    Unmanned Aircraft Systems (UAS) have been widely applied to military reconnaissance and surveillance by exploiting the information collected from the digital imaging payload. However, the data analysis of UAS videos is frequently limited by motion blur; the frame-to-frame movement induced by aircraft roll, wind gusts, and less than ideal atmospheric conditions; and the noise inherent within the image sensors. Therefore, super-resolution mosaicking of low-resolution UAS surveillance video frames becomes an important task for UAS video processing and is a pre-step for further effective image understanding. Here we develop a novel super-resolution framework which does not require the construction of sparse matrices. The method applies image operators in the spatial domain and adopts an iterated back-projection method to construct super-resolution mosaics from UAS surveillance video frames. The Steepest Descent method, Conjugate Gradient method and Levenberg-Marquardt algorithm are used to numerically solve the nonlinear optimization problem in the modeling of the super-resolution mosaic. A quantitative comparison of the computation time and visual performance of the three numerical methods is performed. The Levenberg-Marquardt algorithm provides a numerical solution to least squares curve fitting, which avoids the time-consuming computation of the inverse of the pseudo-Hessian matrix in regular singular value decomposition (SVD). The Levenberg-Marquardt method, interpolating between the Gauss-Newton algorithm (GNA) and the method of gradient descent, is efficient, robust, and easy to implement. The results obtained in our simulations show a great improvement in the resolution of the low-resolution mosaic, of up to 47.54 dB for synthetic images, and a considerable visual improvement in sharpness and detail for real UAS surveillance frames. Convergence is generally reached in no more than ten iterations.
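    As a toy illustration of the Levenberg-Marquardt idea the abstract relies on — a damped Gauss-Newton step that falls back toward gradient descent when a step fails to reduce the residual — here is a one-parameter sketch fitting y = exp(θx). The model, data and damping schedule are illustrative, not the paper's super-resolution formulation:

```python
import math

def lm_fit(xs, ys, theta0, lam=1e-3, iters=100):
    """Fit y = exp(theta * x) with a one-parameter Levenberg-Marquardt sketch."""
    def sse(th):
        return sum((math.exp(th * x) - y) ** 2 for x, y in zip(xs, ys))

    theta, cost = theta0, sse(theta0)
    for _ in range(iters):
        # Residuals and Jacobian of the model y = exp(theta * x).
        r = [math.exp(theta * x) - y for x, y in zip(xs, ys)]
        J = [x * math.exp(theta * x) for x in xs]
        JtJ = sum(j * j for j in J)
        Jtr = sum(j * ri for j, ri in zip(J, r))
        step = -Jtr / (JtJ * (1.0 + lam))   # damped normal-equation step
        new_theta = theta + step
        new_cost = sse(new_theta)
        if new_cost < cost:                  # accept: move toward Gauss-Newton
            theta, cost, lam = new_theta, new_cost, lam / 10.0
        else:                                # reject: move toward gradient descent
            lam *= 10.0
    return theta
```

    The accept/reject damping is what makes the method robust: a failed step only shrinks the next step, so the cost never increases.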

  19. Microbiological methods for surveillance of carrier status of multiresistant bacteria.

    Science.gov (United States)

    Oteo, Jesús; Bou, Germán; Chaves, Fernando; Oliver, Antonio

    2017-12-01

    The presence of colonised patients is one of the main routes for the spread of multiresistant bacteria, and its containment is a clinical and public health priority. Surveillance studies are essential for early detection of colonisation by these bacteria. This article discusses the different microbiological methods, both culture-based and molecular, for detection of carriers of multiresistant bacteria. Those species with a high clinical/epidemiological impact or generating therapeutic difficulties are included: methicillin-resistant Staphylococcus aureus, glycopeptide-resistant Enterococcus spp., Enterobacteriaceae producing extended-spectrum β-lactamases and plasmid-mediated AmpC, carbapenemase-producing Enterobacteriaceae, Acinetobacter baumannii and multiresistant Pseudomonas aeruginosa. The information in this document should be considered as a structure matrix to be tailored to the specific needs of each centre. Copyright © 2016 Elsevier España, S.L.U. and Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  20. A comparison of surveillance methods for small incidence rates

    Energy Technology Data Exchange (ETDEWEB)

    Sego, Landon H.; Woodall, William H.; Reynolds, Marion R.

    2008-05-15

    A number of methods have been proposed to detect an increasing shift in the incidence rate of a rare health event, such as a congenital malformation. Among these are the Sets method, two modifications of the Sets method, and the CUSUM method based on the Poisson distribution. We consider the situation where data are observed as a sequence of Bernoulli trials and propose the Bernoulli CUSUM chart as a desirable method for the surveillance of rare health events. We compare the performance of the Sets method and its modifications to the Bernoulli CUSUM chart under a wide variety of circumstances. Chart design parameters were chosen to satisfy a minimax criterion. We used the steady-state average run length to measure chart performance instead of the average run length, which was used in nearly all previous comparisons involving the Sets method or its modifications. Except in a very few instances, we found that the Bernoulli CUSUM chart has better steady-state average run length performance than the Sets method and its modifications for the extensive number of cases considered.
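    A minimal Bernoulli CUSUM chart of the kind compared above accumulates log-likelihood-ratio increments over Bernoulli trials and signals when the statistic crosses a decision limit h. The in-control rate p0, shifted rate p1 and limit below are illustrative, not the paper's design values:

```python
import math

def bernoulli_cusum(observations, p0, p1, h):
    """Bernoulli CUSUM chart (sketch): return the index of the first signal
    of an upward shift from in-control rate p0 to rate p1, or None."""
    llr1 = math.log(p1 / p0)                  # increment when the event occurs
    llr0 = math.log((1 - p1) / (1 - p0))      # (negative) increment otherwise
    s = 0.0
    for t, x in enumerate(observations):
        s = max(0.0, s + (llr1 if x else llr0))
        if s >= h:
            return t
    return None
```

    In practice h is tuned to trade off the in-control average run length against detection delay (the paper uses a minimax criterion and steady-state average run length for this).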

  1. H-methods in applied sciences

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar

    2008-01-01

    The author has developed a framework for mathematical modelling within applied sciences. It is characteristic for data from 'nature and industry' that they have reduced rank for inference. It means that full rank solutions normally do not give satisfactory solutions. The basic idea of H… with finding a balance between the estimation task and the prediction task. The name H-methods has been chosen because of close analogy with the Heisenberg uncertainty inequality. A similar situation is present in modelling data. The mathematical modelling stops when the prediction aspect of the model cannot be improved. H-methods have been applied to a wide range of fields within applied sciences. In each case, the H-methods provide superior solutions compared to the traditional ones. A background for the H-methods is presented. The H-principle of mathematical modelling is explained. It is shown how …

  2. [Montessori method applied to dementia - literature review].

    Science.gov (United States)

    Brandão, Daniela Filipa Soares; Martín, José Ignacio

    2012-06-01

    The Montessori method was initially applied to children, but now it has also been applied to people with dementia. The purpose of this study is to systematically review the research on the effectiveness of this method using the Medical Literature Analysis and Retrieval System Online (Medline) with the keywords dementia and Montessori method. We selected 10 studies, in which there were significant improvements in participation and constructive engagement, and reduction of negative affect and passive engagement. Nevertheless, systematic reviews about this non-pharmacological intervention in dementia rate this method as weak in terms of effectiveness. This apparent discrepancy can be explained because the Montessori method may have, in fact, a small influence on dimensions such as behavioral problems, or because there is no research about this method with high levels of control, such as the presence of several control groups or a double-blind study.

  3. Risk-based methods for fish and terrestrial animal disease surveillance.

    Science.gov (United States)

    Oidtmann, Birgit; Peeler, Edmund; Lyngstad, Trude; Brun, Edgar; Bang Jensen, Britt; Stärk, Katharina D C

    2013-10-01

    Over recent years there have been considerable methodological developments in the field of animal disease surveillance. The principles of risk analysis were conceptually applied to surveillance in order to further develop approaches and tools (scenario tree modelling) to design risk-based surveillance (RBS) programmes. In the terrestrial animal context, examples of risk-based surveillance have demonstrated the substantial potential for cost saving, and a similar benefit is expected also for aquatic animals. RBS approaches are currently largely absent for aquatic animal diseases. A major constraint in developing RBS designs in the aquatic context is the lack of published data to assist in the design of RBS: this applies to data on (i) the relative risk of farm sites becoming infected due to the presence or absence of a given risk factor; (ii) the sensitivity of diagnostic tests (specificity is often addressed by follow-up investigation and re-testing and therefore less of a concern); (iii) data on the variability of prevalence of infection for fish within a holding unit, between holding units and at farm level. Another constraint is that some of the most basic data for planning surveillance are missing, e.g. data on farm location and animal movements. In Europe, registration or authorisation of fish farms has only recently become a requirement under EU Directive 2006/88. Additionally, the definition of the epidemiological unit (at site or area level) in the context of aquaculture is a challenge due to the often high level of connectedness (mainly via water) of aquaculture facilities with the aquatic environment. This paper provides a review of the principles, methods and examples of RBS in terrestrial, farmed and wild animals. It discusses the special challenges associated with surveillance for aquatic animal diseases (e.g. 
accessibility of animals for inspection and sampling, complexity of rearing systems) and provides an overview of current developments relevant

  4. Surveillance of healthcare-associated infection in hospitalised South African children: Which method performs best?

    Directory of Open Access Journals (Sweden)

    A Dramowski

    2017-01-01

    Background. In 2012, the South African (SA) National Department of Health mandated surveillance of healthcare-associated infection (HAI), but made no recommendations of appropriate surveillance methods. Methods. Prospective clinical HAI surveillance (the reference method) was conducted at Tygerberg Children's Hospital, Cape Town, from 1 May to 31 October 2015. Performance of three surveillance methods (point prevalence surveys (PPSs), laboratory surveillance and tracking of antimicrobial prescriptions) was compared with the reference method using surveillance evaluation guidelines. Factors associated with failure to detect HAI were identified by logistic regression analysis. Results. The reference method detected 417 HAIs among 1 347 paediatric hospitalisations (HAI incidence of 31/1000 patient days; 95% confidence interval (CI) 28.2 - 34.2). Surveillance methods had variable sensitivity (S) and positive predictive value (PPV): PPS S = 24.9% (95% CI 21 - 29.3), PPV = 100%; laboratory surveillance S = 48.4% (95% CI 43.7 - 53.2), PPV = 55.2% (95% CI 50.1 - 60.2); and antimicrobial prescriptions S = 66.4% (95% CI 61.8 - 70.8), PPV = 88.5% (95% CI 84.5 - 91.6). Combined laboratory-antimicrobial surveillance achieved superior HAI detection (S = 84.7% (95% CI 80.9 - 87.8), PPV = 97% (95% CI 94.6 - 98.4)). Factors associated with failure to detect HAI included patient transfer (odds ratio (OR) 2.0), single HAI event (OR 2.8), age category 1 - 5 years (OR 2.1) and hospitalisation in a general ward (OR 2.3). Conclusions. Repeated PPSs, laboratory surveillance and/or antimicrobial prescription tracking are feasible HAI surveillance methods for low-resource settings. Combined laboratory-antimicrobial surveillance achieved the best sensitivity and PPV. SA paediatric healthcare facilities should individualise HAI surveillance, selecting a method suited to available resources and practice context.
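    The evaluation metrics used above reduce to simple counts against the reference method; a sketch, with hypothetical counts rather than the study's data:

```python
def sensitivity_ppv(tp, fp, fn):
    """Sensitivity and positive predictive value of a surveillance method,
    counted against HAI events detected by the reference method.

    tp: events detected by both the method and the reference
    fp: events flagged by the method but not confirmed by the reference
    fn: reference events the method missed
    """
    sensitivity = tp / (tp + fn)
    ppv = tp / (tp + fp)
    return sensitivity, ppv

# Hypothetical counts for one surveillance method:
s, p = sensitivity_ppv(tp=80, fp=20, fn=20)
```

    Combining two methods (counting an event as detected if either flags it) raises tp and lowers fn, which is why the combined laboratory-antimicrobial approach achieved the highest sensitivity.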

  5. Isotope correlations for safeguards surveillance and accountancy methods

    International Nuclear Information System (INIS)

    Persiani, P.J.; Kalimullah.

    1983-01-01

    Isotope correlations corroborated by experiments, coupled with measurement methods for nuclear material in the fuel cycle have the potential as a safeguards surveillance and accountancy system. The US/DOE/OSS Isotope Correlations for Surveillance and Accountancy Methods (ICSAM) program has been structured into three phases: (1) the analytical development of Isotope Correlation Technique (ICT) for actual power reactor fuel cycles; (2) the development of a dedicated portable ICT computer system for in-field implementation, and (3) the experimental program for measurement of U, Pu isotopics in representative spent fuel-rods of the initial 3 or 4 burnup cycles of the Commonwealth Edison Zion-1 and -2 PWR power plants. Since any particular correlation could generate different curves depending upon the type and positioning of the fuel assembly, a 3-D reactor model and 2-group cross section depletion calculation for the first cycle of Zion-2 was performed with each fuel assembly as a depletion block. It is found that for a given PWR all assemblies with a unique combination of enrichment zone and number of burnable poison rods (BPRs) generate one coincident curve. Some correlations are found to generate a single curve for assemblies of all enrichments and number of BPRs. The 8 axial segments of the 3-D calculation generate one coincident curve for each correlation. For some correlations the curve for the full assembly homogenized over core-height deviates from the curve for the 8 axial segments, and for other correlations coincides with the curve for the segments. The former behavior is primarily based on the transmutation lag between the end segment and the middle segments. The experimental implication is that the isotope correlations exhibiting this behavior can be determined by dissolving a full assembly but not by dissolving only an axial segment, or pellets.

  6. Isotope correlations for safeguards surveillance and accountancy methods

    International Nuclear Information System (INIS)

    Persiani, P.J.; Kalimullah.

    1982-01-01

    Isotope correlations corroborated by experiments, coupled with measurement methods for nuclear material in the fuel cycle have the potential as a safeguards surveillance and accountancy system. The ICT allows the verification of: fabricator's uranium and plutonium content specifications, shipper/receiver differences between fabricator output and reactor input, reactor plant inventory changes, reprocessing batch specifications and shipper/receiver differences between reactor output and reprocessing plant input. The investigation indicates that there exist predictable functional relationships (i.e. correlations) between isotopic concentrations over a range of burnup. Several cross-correlations serve to establish the initial fuel assembly-averaged compositions. The selection of the more effective correlations will depend not only on the level of reliability of ICT for verification, but also on the capability, accuracy and difficulty of developing measurement methods. The propagation of measurement errors through the correlations has been examined to identify the sensitivity of the isotope correlations to measurement errors, and to establish criteria for measurement accuracy in the development and selection of measurement methods. 6 figures, 3 tables

  7. Applied mathematical methods in nuclear thermal hydraulics

    International Nuclear Information System (INIS)

    Ransom, V.H.; Trapp, J.A.

    1983-01-01

    Applied mathematical methods are used extensively in modeling of nuclear reactor thermal-hydraulic behavior. This application has required significant extension to the state-of-the-art. The problems encountered in modeling of two-phase fluid transients and the development of associated numerical solution methods are reviewed and quantified using results from a numerical study of an analogous linear system of differential equations. In particular, some possible approaches for formulating a well-posed numerical problem for an ill-posed differential model are investigated and discussed. The need for closer attention to numerical fidelity is indicated.

  8. Entropy viscosity method applied to Euler equations

    International Nuclear Information System (INIS)

    Delchini, M. O.; Ragusa, J. C.; Berry, R. A.

    2013-01-01

    The entropy viscosity method [4] has been successfully applied to hyperbolic systems of equations such as Burgers' equation and the Euler equations. The method consists in adding dissipative terms to the governing equations, where a viscosity coefficient modulates the amount of dissipation. The entropy viscosity method has been applied to the 1-D Euler equations with variable area using a continuous finite element discretization in the MOOSE framework, and our results show that it has the ability to efficiently smooth out oscillations and accurately resolve shocks. Two equations of state are considered: ideal gas and stiffened gas. Results are provided for a second-order implicit time scheme (BDF2). Some typical Riemann problems are run with the entropy viscosity method to demonstrate some of its features. Then, a 1-D convergent-divergent nozzle is considered with open boundary conditions. The correct steady state is reached for the liquid and gas phases with a time-implicit scheme. The entropy viscosity method behaves correctly in every problem run. For each test problem, results are shown for both equations of state considered here. (authors)
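    For orientation, the entropy viscosity construction cited as [4] in the abstract modulates the added dissipation by the local entropy residual. A schematic form (with s an entropy function, u the velocity, c the sound speed, h the local mesh size, and c_E, c_max tunable constants; exact definitions vary between formulations):

```latex
R_E = \partial_t s + \nabla\cdot(u\,s), \qquad
\nu_E = c_E\, h^2\, \frac{\|R_E\|_\infty}{\|s - \bar{s}\|_\infty}, \qquad
\nu = \min\bigl(c_{\max}\, h\, (\|u\| + c),\ \nu_E\bigr)
```

    Away from shocks the entropy residual R_E is small, so the viscosity vanishes with h²; at shocks the residual is large and the first-order cap takes over, which is what lets the method smooth oscillations without smearing smooth regions.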

  9. Applying intelligent statistical methods on biometric systems

    OpenAIRE

    Betschart, Willie

    2005-01-01

    This master's thesis work was performed at Optimum Biometric Labs (OBL), located in Karlskrona, Sweden. Optimum Biometric Labs performs independent scenario evaluations for companies who develop biometric devices. The company has a product, Optimum preCon™, which is a surveillance and diagnosis tool for biometric systems. This thesis work's objective was to develop a conceptual model and implement it as an additional layer above the biometric layer, with intelligence about the biometric users. The l...

  10. Efficient super-resolution image reconstruction applied to surveillance video captured by small unmanned aircraft systems

    Science.gov (United States)

    He, Qiang; Schultz, Richard R.; Chu, Chee-Hung Henry

    2008-04-01

    The concept surrounding super-resolution image reconstruction is to recover a highly-resolved image from a series of low-resolution images via between-frame subpixel image registration. In this paper, we propose a novel and efficient super-resolution algorithm, and then apply it to the reconstruction of real video data captured by a small Unmanned Aircraft System (UAS). Small UAS aircraft generally have a wingspan of less than four meters, so that these vehicles and their payloads can be buffeted by even light winds, resulting in potentially unstable video. This algorithm is based on a coarse-to-fine strategy, in which a coarsely super-resolved image sequence is first built from the original video data by image registration and bi-cubic interpolation between a fixed reference frame and every additional frame. It is well known that the median filter is robust to outliers. If we calculate pixel-wise medians in the coarsely super-resolved image sequence, we can restore a refined super-resolved image. The primary advantage is that this is a noniterative algorithm, unlike traditional approaches based on highly-computational iterative algorithms. Experimental results show that our coarse-to-fine super-resolution algorithm is not only robust, but also very efficient. In comparison with five well-known super-resolution algorithms, namely the robust super-resolution algorithm, bi-cubic interpolation, projection onto convex sets (POCS), the Papoulis-Gerchberg algorithm, and the iterated back projection algorithm, our proposed algorithm gives both strong efficiency and robustness, as well as good visual performance. This is particularly useful for the application of super-resolution to UAS surveillance video, where real-time processing is highly desired.
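    The noniterative core of the algorithm — a pixel-wise median over the registered, bicubically upsampled frames — can be sketched as below, with frames represented as nested lists standing in for image arrays:

```python
from statistics import median

def median_fusion(frames):
    """Fuse registered, coarsely super-resolved frames into one image by
    taking the pixel-wise median, which is robust to outliers such as
    registration errors or sensor noise in individual frames."""
    rows, cols = len(frames[0]), len(frames[0][0])
    return [
        [median(f[r][c] for f in frames) for c in range(cols)]
        for r in range(rows)
    ]
```

    The robustness comes from the median itself: a single badly registered frame cannot pull a pixel's fused value far, unlike a mean, which is the property the paper exploits to avoid iterative refinement.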

  11. Evaluation 2000 and regulation and method. Release monitoring and environmental surveillance around Cea centers; Bilan 2000 et reglementation et methode. Controle des rejets et surveillance de l'environnement des centres CEA

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2001-06-01

    This publication reports, for the year 2000, the releases of liquid and gaseous radioactive effluents and the radioactivity levels measured in the vicinity of CEA centers through surveillance of air, water, vegetation and milk. An analysis of the results from 1996 to 2000 makes it possible to follow their evolution. A second booklet details the sampling and measurement methods applied to effluents in the environment. It also presents the regulations applied to effluent monitoring. (N.C.)

  12. Geostatistical methods applied to field model residuals

    DEFF Research Database (Denmark)

    Maule, Fox; Mosegaard, K.; Olsen, Nils

    The geomagnetic field varies on a variety of time- and length scales, which are only rudimentarily considered in most present field models. The part of the observed field that cannot be explained by a given model, the model residuals, is often considered as an estimate of the data uncertainty (which consists of measurement errors and unmodelled signal), and is typically assumed to be uncorrelated and Gaussian distributed. We have applied geostatistical methods to analyse the residuals of the Oersted(09d/04) field model [http://www.dsri.dk/Oersted/Field_models/IGRF_2005_candidates/], which is based on 5 years of Ørsted and CHAMP data, and includes secular variation and acceleration, as well as low-degree external (magnetospheric) and induced fields. The analysis is done in order to find the statistical behaviour of the space-time structure of the residuals, as a proxy for the data covariances...

  13. Computational methods applied to wind tunnel optimization

    Science.gov (United States)

    Lindsay, David

    methods, coordinate transformation theorems and techniques including the Method of Jacobians, and a derivation of the fluid flow fundamentals required for the model. It applies the methods to study the effect of cross-section and fillet variation, and to obtain a sample design of a high-uniformity nozzle.

  14. [Methodical approaches to studies of the efficiency and quality of the State Sanitary and Epidemiological Surveillance].

    Science.gov (United States)

    Kutsenko, G I; Manvel'ian, L V; Petruchuk, O E; Chigireva, E I; Berglezova, L N; Mosov, A V

    1999-01-01

    The current methodology and organization of evaluating the efficiency of specialists and subdivisions of sanitary-epidemiological institutions at the first level of management are presented. The authors propose a method for quantitative evaluation of efficiency based on the comparison of compatible and equivalent values. Definitions essential for evaluating the efficiency of the State Sanitary and Epidemiological Surveillance are formulated. A demonstration model of computer processing of the data for estimating the efficiency of the State Sanitary and Epidemiological Surveillance has been developed.

  15. Generalized reciprocal method applied in processing seismic ...

    African Journals Online (AJOL)

    A geophysical investigation was carried out at Shika, near Zaria, using seismic refraction method; with the aim of analyzing the data obtained using the generalized reciprocal method (GRM). The technique is for delineating undulating refractors at any depth from in-line seismic refraction data consisting of forward and ...

  16. Early detection of poor adherers to statins: applying individualized surveillance to pay for performance.

    Directory of Open Access Journals (Sweden)

    Andrew J Zimolzak

    Full Text Available Medication nonadherence costs $300 billion annually in the US. Medicare Advantage plans have a financial incentive to increase medication adherence among members because the Centers for Medicare and Medicaid Services (CMS) now awards substantive bonus payments to such plans, based in part on population adherence to chronic medications. We sought to build an individualized surveillance model that detects early which beneficiaries will fall below the CMS adherence threshold. This was a retrospective study of over 210,000 beneficiaries initiating statins, in a database of private insurance claims, from 2008-2011. A logistic regression model was constructed to use statin adherence from initiation to day 90 to predict beneficiaries who would not meet the CMS measure of a proportion of days covered of 0.8 or above, from day 91 to 365. The model controlled for 15 additional characteristics. In a sensitivity analysis, we varied the number of days of adherence data used for prediction. Lower adherence in the first 90 days was the strongest predictor of one-year nonadherence, with an odds ratio of 25.0 (95% confidence interval 23.7-26.5) for poor adherence at one year. The model had an area under the receiver operating characteristic curve of 0.80. Sensitivity analysis revealed that predictions of comparable accuracy could be made only 40 days after statin initiation. When members with 30-day supplies for their first statin fill had predictions made at 40 days, and members with 90-day supplies for their first fill had predictions made at 100 days, poor adherence could be predicted with 86% positive predictive value. To preserve their Medicare Star ratings, plan managers should identify or develop effective programs to improve adherence. An individualized surveillance approach can be used to target members who would most benefit, recognizing the tradeoff between improved model performance over time and the advantage of earlier detection.
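The CMS adherence measure used in this study, proportion of days covered (PDC), is straightforward to compute from pharmacy fill records. A minimal sketch follows; the function name and the fill-record format are our own illustrative choices, not the study's code.

```python
from datetime import date, timedelta

def proportion_days_covered(fills, start, end):
    """PDC: fraction of days in [start, end] covered by at least one fill.

    fills: list of (fill_date, days_supply) tuples
    """
    covered = set()
    for fill_date, days_supply in fills:
        for i in range(days_supply):
            d = fill_date + timedelta(days=i)
            if start <= d <= end:
                covered.add(d)   # overlapping fills count a day only once
    total_days = (end - start).days + 1
    return len(covered) / total_days
```

Under this measure, a member with a 30-day fill on January 1 and the next fill only on February 15 has a coverage gap that lowers the PDC, which is exactly the early signal the study exploits.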

  17. Applying scrum methods to ITS projects.

    Science.gov (United States)

    2017-08-01

    The introduction of new technology generally brings new challenges and new methods to help with deployments. Agile methodologies have been introduced in the information technology industry to potentially speed up development. The Federal Highway Admi...

  18. Statistical classification methods applied to seismic discrimination

    Energy Technology Data Exchange (ETDEWEB)

    Ryan, F.M. [ed.]; Anderson, D.N.; Anderson, K.K.; Hagedorn, D.N.; Higbee, K.T.; Miller, N.E.; Redgate, T.; Rohay, A.C.

    1996-06-11

    To verify compliance with a Comprehensive Test Ban Treaty (CTBT), low energy seismic activity must be detected and discriminated. Monitoring small-scale activity will require regional (within ~2000 km) monitoring capabilities. This report provides background information on various statistical classification methods and discusses the relevance of each method in the CTBT seismic discrimination setting. Criteria for classification method selection are explained and examples are given to illustrate several key issues. This report describes in more detail the issues and analyses that were initially outlined in a poster presentation at a recent American Geophysical Union (AGU) meeting. Section 2 of this report describes both the CTBT seismic discrimination setting and the general statistical classification approach to this setting. Seismic data examples illustrate the importance of synergistically using multivariate data as well as the difficulties due to missing observations. Classification method selection criteria are presented and discussed in Section 3. These criteria are grouped into the broad classes of simplicity, robustness, applicability, and performance. Section 4 follows with a description of several statistical classification methods: linear discriminant analysis, quadratic discriminant analysis, variably regularized discriminant analysis, flexible discriminant analysis, logistic discriminant analysis, K-th Nearest Neighbor discrimination, kernel discrimination, and classification and regression tree discrimination. The advantages and disadvantages of these methods are summarized in Section 5.
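As a toy illustration of comparing classifiers of the kind surveyed in this report, the sketch below trains a nearest-class-mean rule (a simplified stand-in for linear discriminant analysis under equal spherical covariances) and a k-nearest-neighbour rule on synthetic two-class data. The data, features, and function names are invented for the example; they are not the report's seismic features.

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic two-class "events" in a 2-D feature space (features invented)
a = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
b = rng.normal([3.0, 3.0], 1.0, size=(100, 2))
X = np.vstack([a, b])
y = np.array([0] * 100 + [1] * 100)

def nearest_mean(Xtr, ytr, Xte):
    """Nearest-class-mean rule: LDA with equal spherical covariances."""
    mus = np.array([Xtr[ytr == c].mean(axis=0) for c in (0, 1)])
    d = ((Xte[:, None, :] - mus[None, :, :]) ** 2).sum(axis=-1)
    return d.argmin(axis=1)

def knn(Xtr, ytr, Xte, k=5):
    """k-nearest-neighbour majority vote."""
    d = ((Xte[:, None, :] - Xtr[None, :, :]) ** 2).sum(axis=-1)
    idx = np.argsort(d, axis=1)[:, :k]
    return (ytr[idx].mean(axis=1) > 0.5).astype(int)

acc_nm = (nearest_mean(X, y, X) == y).mean()
acc_knn = (knn(X, y, X) == y).mean()
```

Comparing several such rules on the same features, as the report does, exposes the simplicity/robustness/performance trade-offs it discusses.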

  19. Applying Fuzzy Possibilistic Methods on Critical Objects

    DEFF Research Database (Denmark)

    Yazdani, Hossein; Ortiz-Arroyo, Daniel; Choros, Kazimierz

    2016-01-01

    Providing a flexible environment to process data objects is a desirable goal of machine learning algorithms. In fuzzy and possibilistic methods, the relevance of data objects is evaluated and a membership degree is assigned. However, some critical objects have the potential to affect the performance of clustering algorithms, depending on whether they remain in a specific cluster or are moved into another. In this paper we analyze and compare how critical objects affect the behaviour of fuzzy possibilistic methods in several data sets. The comparison is based on the accuracy and ability of learning...

  20. Tutte's barycenter method applied to isotopies

    NARCIS (Netherlands)

    de Verdiere, EC; Pocchiola, M; Vegter, G

    This paper is concerned with applications of Tutte's barycentric embedding theorem (Proc. London Math. Soc. 13 (1963) 743-768). It presents a method for building isotopies of triangulations in the plane, based on Tutte's theorem and the computation of equilibrium stresses of graphs by

  1. Spectral methods applied to Ising models

    International Nuclear Information System (INIS)

    DeFacio, B.; Hammer, C.L.; Shrauner, J.E.

    1980-01-01

    Several applications of Ising models are reviewed. A 2-d Ising model is studied, and the problem of describing an interface boundary in a 2-d Ising model is addressed. Spectral methods are used to formulate a soluble model for the surface tension of a many-Fermion system

  2. Applying Human Computation Methods to Information Science

    Science.gov (United States)

    Harris, Christopher Glenn

    2013-01-01

    Human Computation methods such as crowdsourcing and games with a purpose (GWAP) have each recently drawn considerable attention for their ability to synergize the strengths of people and technology to accomplish tasks that are challenging for either to do well alone. Despite this increased attention, much of this transformation has been focused on…

  3. [The diagnostic methods applied in mycology].

    Science.gov (United States)

    Kurnatowska, Alicja; Kurnatowski, Piotr

    2008-01-01

    Systemic fungal invasions are recognized with increasing frequency and constitute a primary cause of morbidity and mortality, especially in immunocompromised patients. Early diagnosis improves prognosis but remains a problem: on the one hand, sensitive tests to aid in the diagnosis of systemic mycoses are lacking; on the other, patients present only nonspecific signs and symptoms, delaying early diagnosis. The diagnosis depends upon a combination of clinical observation and laboratory investigation. Successful laboratory diagnosis of fungal infection depends in major part on the collection of appropriate clinical specimens and on the selection of appropriate microbiological test procedures. These topics (collection of specimens, direct techniques, staining methods, cultures on different media, and non-culture-based methods) are presented in this article.

  4. Proteomics methods applied to malaria: Plasmodium falciparum

    International Nuclear Information System (INIS)

    Cuesta Astroz, Yesid; Segura Latorre, Cesar

    2012-01-01

    Malaria is a parasitic disease that has a high impact on public health in developing countries. The sequencing of the Plasmodium falciparum genome and the development of proteomics have enabled a breakthrough in understanding the biology of the parasite. Proteomics has made it possible to characterize, qualitatively and quantitatively, the parasite's protein expression, and has provided information on protein expression under conditions of stress induced by antimalarials. Given the complexity of the parasite's life cycle, which takes place in the vertebrate host and the mosquito vector, it has proven difficult to characterize protein expression during each stage of the infection process in order to determine the proteome that mediates several metabolic, physiological and energetic processes. Two-dimensional electrophoresis, liquid chromatography and mass spectrometry have been useful to assess the effects of antimalarials on parasite protein expression and to characterize the proteomic profile of different P. falciparum stages and organelles. The purpose of this review is to present the state of the art of tools and advances in proteomics applied to the study of malaria, and to present the different experimental strategies used to study the parasite's proteome, in order to show the advantages and disadvantages of each one.

  5. METHOD OF APPLYING NICKEL COATINGS ON URANIUM

    Science.gov (United States)

    Gray, A.G.

    1959-07-14

    A method is presented for protectively coating uranium which comprises etching the uranium in an aqueous etching solution containing chloride ions, electroplating a coating of nickel on the etched uranium and heating the nickel plated uranium by immersion thereof in a molten bath composed of a material selected from the group consisting of sodium chloride, potassium chloride, lithium chloride, and mixtures thereof, maintained at a temperature of between 700 and 800 deg C, for a time sufficient to alloy the nickel and uranium and form an integral protective coating of corrosion-resistant uranium-nickel alloy.

  6. Versatile Formal Methods Applied to Quantum Information.

    Energy Technology Data Exchange (ETDEWEB)

    Witzel, Wayne [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Rudinger, Kenneth Michael [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Sarovar, Mohan [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    Using a novel formal methods approach, we have generated computer-verified proofs of major theorems pertinent to the quantum phase estimation algorithm. This was accomplished using our Prove-It software package in Python. While many formal methods tools are available, their practical utility is limited. Translating a problem of interest into these systems and working through the steps of a proof is an art form that requires much expertise. One must surrender to the preferences and restrictions of the tool regarding how mathematical notions are expressed and what deductions are allowed. Automation is a major driver that forces restrictions. Our focus, on the other hand, is to produce a tool that allows users the ability to confirm proofs that are essentially known already. This goal is valuable in itself. We demonstrate the viability of our approach, which allows the user great flexibility in expressing statements and composing derivations. There were no major obstacles in following a textbook proof of the quantum phase estimation algorithm. There were tedious details of algebraic manipulations that we needed to implement (and a few that we did not have time to enter into our system) and some basic components that we needed to rethink, but there were no serious roadblocks. In the process, we made a number of convenient additions to our Prove-It package that will make certain algebraic manipulations easier to perform in the future. In fact, our intent is for our system to build upon itself in this manner.

  7. Development of reconstitution method for surveillance specimens using surface activated joining

    Energy Technology Data Exchange (ETDEWEB)

    Nakamura, Terumi; Kaihara, Shoichiro; Yoshida, Kazuo; Sato, Akira [Ishikawajima-Harima Heavy Industries Co. Ltd., Tokyo (Japan); Onizawa, Kunio; Nishiyama, Yutaka; Fukaya, Kiyoshi; Suzuki, Masahide

    1996-03-01

    Evaluation of the embrittlement of reactor vessel steel due to irradiation requires surveillance tests. However, many surveillance specimens are necessary for nuclear plant life extension. Therefore, a specimen reconstitution technique has become important to provide the many specimens needed for continued surveillance. A surface activated joining (SAJ) method has been developed to join various materials together at low temperatures with little deformation, and is useful for bonding irradiated specimens. To assess the validity of this method, Charpy impact tests were carried out, and the changes caused by heating during joining were measured. The test results showed that the Charpy impact values were almost the same as for the base materials, and that surface activated joining reduced the heat-affected zone to less than 2 mm. (author).

  8. Development of in-house serological methods for diagnosis and surveillance of chikungunya

    Directory of Open Access Journals (Sweden)

    Saira Saborío Galo

    2017-08-01

    Full Text Available ABSTRACT Objective To develop and evaluate serological methods for chikungunya diagnosis and research in Nicaragua. Methods Two IgM ELISA capture systems (MAC-ELISA) for diagnosis of acute chikungunya virus (CHIKV) infections, and two Inhibition ELISA Methods (IEM) to measure total antibodies against CHIKV, were developed using monoclonal antibodies (mAbs) and hyperimmune serum at the National Virology Laboratory of Nicaragua in 2014–2015. The sensitivity, specificity, predictive values, and agreement of the MAC-ELISAs were obtained by comparing the results of 198 samples (116 positive; 82 negative) with the Centers for Disease Control and Prevention’s IgM ELISA (Atlanta, Georgia, United States; CDC-MAC-ELISA). For clinical evaluation of the four serological techniques, 260 paired acute and convalescent phase serum samples of suspected chikungunya cases were used. Results All four assays were standardized by determining the optimal concentrations of the different reagents. Processing times were substantially reduced compared to the CDC-MAC-ELISA. For the MAC-ELISA systems, a sensitivity of 96.6% and 97.4%, and a specificity of 98.8% and 91.5%, were obtained using mAb and hyperimmune serum, respectively, compared with the CDC method. Clinical evaluation of the four serological techniques versus the CDC real-time RT-PCR assay resulted in a sensitivity of 95.7% and a specificity of 88.8%–95.9%. Conclusion Two MAC-ELISA and two IEM systems were standardized, demonstrating very good quality for chikungunya diagnosis and research demands. This will achieve more efficient epidemiological surveillance in Nicaragua, the first country in Central America to produce its own reagents for serological diagnosis of CHIKV. The methods evaluated here can be applied in other countries and will contribute to sustainable diagnostic systems to combat the disease.
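The headline figures in this record follow directly from a two-by-two comparison against the reference test. The sketch below shows the standard computation; the exact true/false-positive split is inferred from the reported 116 positives, 82 negatives, 96.6% sensitivity and 98.8% specificity, so treat the counts as illustrative rather than the study's raw data.

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2 diagnostic-accuracy summary against a reference test."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    npv = tn / (tn + fn)           # negative predictive value
    return sensitivity, specificity, ppv, npv

# counts consistent with the mAb-based MAC-ELISA vs the CDC assay
sens, spec, ppv, npv = diagnostic_metrics(tp=112, fp=1, fn=4, tn=81)
```

With these counts, 112/116 ≈ 96.6% sensitivity and 81/82 ≈ 98.8% specificity, matching the figures reported for the mAb-based system.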

  9. Development of in-house serological methods for diagnosis and surveillance of chikungunya.

    Science.gov (United States)

    Galo, Saira Saborío; González, Karla; Téllez, Yolanda; García, Nadezna; Pérez, Leonel; Gresh, Lionel; Harris, Eva; Balmaseda, Ángel

    2017-08-21

    To develop and evaluate serological methods for chikungunya diagnosis and research in Nicaragua. Two IgM ELISA capture systems (MAC-ELISA) for diagnosis of acute chikungunya virus (CHIKV) infections, and two Inhibition ELISA Methods (IEM) to measure total antibodies against CHIKV were developed using monoclonal antibodies (mAbs) and hyperimmune serum at the National Virology Laboratory of Nicaragua in 2014-2015. The sensitivity, specificity, predictive values, and agreement of the MAC-ELISAs were obtained by comparing the results of 198 samples (116 positive; 82 negative) with the Centers for Disease Control and Prevention's IgM ELISA (Atlanta, Georgia, United States; CDC-MAC-ELISA). For clinical evaluation of the four serological techniques, 260 paired acute and convalescent phase serum samples of suspected chikungunya cases were used. All four assays were standardized by determining the optimal concentrations of the different reagents. Processing times were substantially reduced compared to the CDC-MAC-ELISA. For the MAC-ELISA systems, a sensitivity of 96.6% and 97.4%, and a specificity of 98.8% and 91.5% were obtained using mAb and hyperimmune serum, respectively, compared with the CDC method. Clinical evaluation of the four serological techniques versus the CDC real-time RT-PCR assay resulted in a sensitivity of 95.7% and a specificity of 88.8%-95.9%. Two MAC-ELISA and two IEM systems were standardized, demonstrating very good quality for chikungunya diagnosis and research demands. This will achieve more efficient epidemiological surveillance in Nicaragua, the first country in Central America to produce its own reagents for serological diagnosis of CHIKV. The methods evaluated here can be applied in other countries and will contribute to sustainable diagnostic systems to combat the disease.

  10. Reflections on Mixing Methods in Applied Linguistics Research

    Science.gov (United States)

    Hashemi, Mohammad R.

    2012-01-01

    This commentary advocates the use of mixed methods research--that is the integration of qualitative and quantitative methods in a single study--in applied linguistics. Based on preliminary findings from a research project in progress, some reflections on the current practice of mixing methods as a new trend in applied linguistics are put forward.…

  11. Antenatal fetal magnetocardiography: a new method for fetal surveillance?

    Science.gov (United States)

    Quinn, A; Weir, A; Shahani, U; Bain, R; Maas, P; Donaldson, G

    1994-10-01

    To establish the reliability of fetal magnetocardiography as a method of measuring the time intervals of the fetal heart during the antenatal period. A prospective study. Wellcome Biomagnetism Unit, Southern General Hospital. One hundred and six low risk pregnant women at 20 to 42 weeks gestation. Success in obtaining QRS complexes, P waves and T waves. Correlation of time intervals with fetal outcome. The technique was acceptable to pregnant women. A QRS complex was successfully demonstrated in 68 (67%) of the unaveraged traces. Using off-line averaging techniques on these 68 cases, P waves were obtained in 75% and T waves in 72%. Although good quality traces were obtained throughout the range of gestational ages, in general it was more difficult below 28 weeks. QRS duration (R2 = 7%, P = 0.02) demonstrated a positive linear correlation with increasing gestation. Of the 35 (51%) cases with umbilical vein pH analysis available, only one result was less than 7.2. No significant relation was found between measurements of the fetal waveforms and the pH results. The technique of fetal magnetocardiography provides a significant advance in the technological field for the demonstration of QRS complexes and the full PQRST waveforms in gestations from 20 weeks onwards. With further technical improvements the clinical impact of this technique can be assessed more fully.

  12. Dynamic optimization of ISR sensors using a risk-based reward function applied to ground and space surveillance scenarios

    Science.gov (United States)

    DeSena, J. T.; Martin, S. R.; Clarke, J. C.; Dutrow, D. A.; Newman, A. J.

    2012-06-01

    As the number and diversity of sensing assets available for intelligence, surveillance and reconnaissance (ISR) operations continues to expand, the limited ability of human operators to effectively manage, control and exploit the ISR ensemble is exceeded, leading to reduced operational effectiveness. Automated support both in the processing of voluminous sensor data and sensor asset control can relieve the burden of human operators to support operation of larger ISR ensembles. In dynamic environments it is essential to react quickly to current information to avoid stale, sub-optimal plans. Our approach is to apply the principles of feedback control to ISR operations, "closing the loop" from the sensor collections through automated processing to ISR asset control. Previous work by the authors demonstrated non-myopic multiple platform trajectory control using a receding horizon controller in a closed feedback loop with a multiple hypothesis tracker applied to multi-target search and track simulation scenarios in the ground and space domains. This paper presents extensions in both size and scope of the previous work, demonstrating closed-loop control, involving both platform routing and sensor pointing, of a multi-sensor, multi-platform ISR ensemble tasked with providing situational awareness and performing search, track and classification of multiple moving ground targets in irregular warfare scenarios. The closed-loop ISR system is fully realized using distributed, asynchronous components that communicate over a network. The closed-loop ISR system has been exercised via a networked simulation test bed against a scenario in the Afghanistan theater implemented using high-fidelity terrain and imagery data. In addition, the system has been applied to space surveillance scenarios requiring tracking of space objects where current deliberative, manually intensive processes for managing sensor assets are insufficiently responsive. Simulation experiment results are presented.

  13. Overview of molecular typing methods for outbreak detection and epidemiological surveillance.

    Science.gov (United States)

    Sabat, A J; Budimir, A; Nashev, D; Sá-Leão, R; van Dijl, J M; Laurent, F; Grundmann, H; Friedrich, A W

    2013-01-24

    Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However, more recent methods that examine the relatedness of isolates at a molecular level have revolutionised our ability to differentiate among bacterial types and subtypes. Importantly, the development of molecular methods has provided new tools for enhanced surveillance and outbreak detection. This has resulted in better implementation of rational infection control programmes and efficient allocation of resources across Europe. The emergence of benchtop sequencers using next generation sequencing technology makes bacterial whole genome sequencing (WGS) feasible even in small research and clinical laboratories. WGS has already been used for the characterisation of bacterial isolates in several large outbreaks in Europe and, in the near future, is likely to replace currently used typing methodologies due to its ultimate resolution. However, WGS is still too laborious and time-consuming to obtain useful data in routine surveillance. Also, a largely unresolved question is how genome sequences must be examined for epidemiological characterisation. In the coming years, the lessons learnt from currently used molecular methods will allow us to condense the WGS data into epidemiologically useful information. On this basis, we have reviewed current and new molecular typing methods for outbreak detection and epidemiological surveillance of bacterial pathogens in clinical practice, aiming to give an overview of their specific advantages and disadvantages.

  14. Using data linkage to improve surveillance methods for acute hepatitis E infections in England and Wales 2010-2016.

    Science.gov (United States)

    Oeser, C; Said, B; Warburton, F; Ijaz, S; Tedder, R; Morgan, D

    2017-10-01

    Indigenous, foodborne transmission of hepatitis E has been increasing across industrialised countries. Public Health England has conducted enhanced surveillance in England and Wales since 2003. This report gives an account of acute infections from 2010 to 2016 and describes modifications made to the methods of surveillance to account for changes in reporting behaviours and to improve ascertainment.

  15. A Low Energy Consumption Storage Method for Cloud Video Surveillance Data Based on SLA Classification

    OpenAIRE

    Yonghua Xiong; Chengda Lu; Min Wu; Keyuan Jiang; Dianhong Wang

    2016-01-01

    With the continuous expansion over time of the amount of data in mobile video applications such as cloud video surveillance (CVS), the increasing energy consumption in video data centers has drawn widespread attention for the past several years. To address the issue of reducing energy consumption, we propose a low energy consumption storage method specially designed for CVS systems based on the service level agreement (SLA) classification. A novel SLA with an extra parameter of access time per...

  16. Convergence of Iterative Methods applied to Boussinesq equation

    Directory of Open Access Journals (Sweden)

    Sh. S. Behzadi

    2013-11-01

    Full Text Available In this paper, a Boussinesq equation is solved by using the Adomian decomposition method, the modified Adomian decomposition method, the variational iteration method, the modified variational iteration method, the homotopy perturbation method, the modified homotopy perturbation method and the homotopy analysis method. The approximate solution of this equation is calculated in the form of a series whose components are computed by applying a recursive relation. The existence and uniqueness of the solution and the convergence of the proposed methods are proved. A numerical example is studied to demonstrate the accuracy of the presented methods.
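Each of these schemes builds the solution as a series whose components follow from a recursive relation. The idea can be shown on a much simpler problem than the Boussinesq equation: Picard (successive-approximation) iteration for u' = u, u(0) = 1, carried out directly on power-series coefficients, reproduces the Taylor series of e^t term by term. This illustrates only the recursive-series idea, not the paper's actual methods or equation.

```python
def picard_series(n_iter=10):
    """Picard iteration for u' = u, u(0) = 1, on power-series coefficients.

    Each step applies the recursive relation
        u_{k+1}(t) = 1 + integral_0^t u_k(s) ds,
    which on coefficient lists maps c_j -> c_j / (j + 1), shifted up one
    degree, plus the constant term 1 from the initial condition.
    """
    coeffs = [1.0]                      # u_0(t) = 1
    for _ in range(n_iter):
        integ = [0.0] + [c / (j + 1) for j, c in enumerate(coeffs)]
        integ[0] = 1.0                  # constant term from u(0) = 1
        coeffs = integ
    return coeffs                       # Taylor coefficients 1/j! of exp(t)
```

After n iterations the partial sums converge to e^t, mirroring how the decomposition and homotopy series above converge to the exact solution.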

  17. Parallel island genetic algorithm applied to a nuclear power plant auxiliary feedwater system surveillance tests policy optimization

    International Nuclear Information System (INIS)

    Pereira, Claudio M.N.A.; Lapa, Celso M.F.

    2003-01-01

    In this work, we focus on the application of an Island Genetic Algorithm (IGA), a coarse-grained parallel genetic algorithm (PGA) model, to a Nuclear Power Plant (NPP) Auxiliary Feedwater System (AFWS) surveillance test policy optimization. Here, the main objective is to outline, by means of comparisons, the advantages of the IGA over the simple (non-parallel) genetic algorithm (GA), which has been successfully applied to the solution of this kind of problem. The goal of the optimization is to maximize the system's average availability for a given period of time, considering realistic features such as: i) aging effects on standby components during the tests; ii) failures revealed by the tests imply corrective maintenance, increasing outage times; iii) components have distinct test parameters (outage time, aging factors, etc.); and iv) tests are not necessarily periodic. In our experiments, which were run on a cluster of eight 1-GHz personal computers, we could clearly observe gains not only in computational time, which decreased linearly with the number of computers, but also in the optimization outcome.
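A coarse-grained island GA of the kind compared in this work can be sketched compactly: several populations evolve in isolation and periodically exchange their best individuals around a ring. The sketch below is a generic illustration; the bitstring encoding, the parameters, and the toy fitness are our own choices, whereas the paper's fitness was the system's time-averaged availability under realistic test-policy constraints.

```python
import random

def evolve_islands(fitness, n_islands=4, pop_size=20, n_bits=16,
                   generations=50, migrate_every=10, seed=1):
    """Island GA sketch: isolated populations with periodic ring migration."""
    rng = random.Random(seed)
    islands = [[[rng.randint(0, 1) for _ in range(n_bits)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for gen in range(generations):
        for pop in islands:
            pop.sort(key=fitness, reverse=True)
            elite = pop[:pop_size // 2]         # elitist truncation selection
            children = []
            while len(elite) + len(children) < pop_size:
                p1, p2 = rng.sample(elite, 2)
                cut = rng.randrange(1, n_bits)   # one-point crossover
                child = p1[:cut] + p2[cut:]
                if rng.random() < 0.1:           # bit-flip mutation
                    i = rng.randrange(n_bits)
                    child[i] ^= 1
                children.append(child)
            pop[:] = elite + children
        if (gen + 1) % migrate_every == 0:       # ring migration of the best
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop[-1] = bests[(i - 1) % n_islands][:]
    return max((max(pop, key=fitness) for pop in islands), key=fitness)

# toy stand-in for "average availability": fraction of favourable test slots
best = evolve_islands(lambda s: sum(s) / len(s))
```

Because each island evolves independently between migrations, the per-generation work parallelizes across machines, which is the source of the near-linear speedup the paper reports.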

  18. Discrimination symbol applying method for sintered nuclear fuel product

    International Nuclear Information System (INIS)

    Ishizaki, Jin

    1998-01-01

    The present invention provides a method for applying discrimination information, such as the enrichment degree, to the end face of a sintered nuclear fuel product. Namely, discrimination symbols encoding powder information are applied with a sintering aid to the end face of a compact formed by pressing nuclear fuel powder, and the compact is then sintered. The sintering aid comprises aluminum oxide, a mixture of aluminum oxide and silicon dioxide, aluminum hydride, or aluminum stearate, alone or in admixture. To apply the sintering aid, the discrimination symbols are drawn with isostearic acid on the end face of the compact and the sintering aid is sprayed onto them; or the sintering aid is applied directly; or the sintering aid is suspended in isostearic acid and the suspension is applied with a brush. As a result, visible discrimination information can easily be applied to the sintered member. (N.H.)

  19. Method of applying a mirror reflecting layer to instrument parts

    Science.gov (United States)

    Alkhanov, L. G.; Danilova, I. A.; Delektorskiy, G. V.

    1974-01-01

    A method is presented for applying a mirror reflecting layer to the surfaces of parts, instruments, apparatus, and so on. A brief analysis of the existing methods for producing a mirror surface is given, and the advantages of the new method, which obtains the mirror surface by means of a polymer casting mold, are indicated.

  20. Research Methods in Healthcare Epidemiology and Antimicrobial Stewardship: Use of Administrative and Surveillance Databases.

    Science.gov (United States)

    Drees, Marci; Gerber, Jeffrey S; Morgan, Daniel J; Lee, Grace M

    2016-11-01

    Administrative and surveillance data are used frequently in healthcare epidemiology and antimicrobial stewardship (HE&AS) research because of their wide availability and efficiency. However, data quality issues exist, requiring careful consideration and potential validation of data. This methods paper presents key considerations for using administrative and surveillance data in HE&AS, including types of data available and potential use, data limitations, and the importance of validation. After discussing these issues, we review examples of HE&AS research using administrative data with a focus on scenarios in which their use may be advantageous. A checklist is provided to aid study development in HE&AS using administrative data. Infect Control Hosp Epidemiol 2016;1-10.

  1. Building "Applied Linguistic Historiography": Rationale, Scope, and Methods

    Science.gov (United States)

    Smith, Richard

    2016-01-01

    In this article I argue for the establishment of "Applied Linguistic Historiography" (ALH), that is, a new domain of enquiry within applied linguistics involving a rigorous, scholarly, and self-reflexive approach to historical research. Considering issues of rationale, scope, and methods in turn, I provide reasons why ALH is needed and…

  2. Release monitoring and environmental surveillance of Cea centers. Assessment and regulation and method 1999; Controle des rejets et surveillance de l'environnement des centres CEA. Bilan et reglementation et methode 1999

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-07-01

    The quality of the natural environment around the centers of the Commissariat a l'Energie Atomique (CEA) is an important element of its safety policy. Environmental protection is based on the control of risks arising from the research and development activities of its installations, and aims to reduce the impact of these activities on man and his environment to as low a level as possible. This publication describes the sampling and measurement methods applied to effluents and to the environment, according to the characteristics of the radionuclides present. It also presents the regulations that apply to effluent monitoring. The results of radioactive effluent releases (liquid and gaseous) and of the environmental surveillance around CEA centers are given in the 'Bilan 1999' publication. An analysis of these results over the 1995-1999 period allows their evolution to be followed. (N.C.)

  3. Studies of neutron methods for process control and criticality surveillance of fissile material processing facilities

    International Nuclear Information System (INIS)

    Zoltowski, T.

    1988-01-01

    The development of radiochemical processes for fissile material processing and spent fuel handling requires new control procedures that enable an improvement of plant throughput. This is closely related to the implementation of a continuous criticality control policy and to the development of reliable methods for monitoring the reactivity of radiochemical plant operations in the presence of process perturbations. Neutron methods seem to be applicable for fissile material control in some technological facilities. The measurement of epithermal neutron source multiplication, with heuristic evaluation of the measured data, enables surveillance of anomalous reactivity enhancement leading to unsafe states. 80 refs., 47 figs., 33 tabs. (author)

  4. Ejes de Vigilancia Tecnológica Aplicados en Universidades con Estudios a Distancia (Axes of Technological Surveillance Applied in Distance Learning Universities

    Directory of Open Access Journals (Sweden)

    Arlin J. Izarra Reverol

    2014-08-01

    The aim of this research is to determine the axes of technological surveillance applied in the universities of the Maracaibo Municipality that offer distance learning, following the theoretical contributions of López et al. (2007), Palop and Vicente (1999), Martinet and Ribault (1989), Escorsa and Valls (2005), the COTEC Foundation (1999), and Jakobiak (1995). The research is descriptive, with a non-experimental, cross-sectional field design. The population consisted of twelve directors and coordinators (a population census), who answered a questionnaire with closed questions and frequency options, validated and with high reliability (0.90). The data obtained were interpreted through the analysis and discussion of statistical information. The results indicate that the axes of technological surveillance are applied in the selected universities, emphasizing the following elements of study: competitive surveillance, commercial surveillance, technological surveillance proper, and surveillance of the environment.

  5. Quantitative EEG Applying the Statistical Recognition Pattern Method

    DEFF Research Database (Denmark)

    Engedal, Knut; Snaedal, Jon; Hoegh, Peter

    2015-01-01

    BACKGROUND/AIM: The aim of this study was to examine the discriminatory power of quantitative EEG (qEEG) applying the statistical pattern recognition (SPR) method to separate Alzheimer's disease (AD) patients from elderly individuals without dementia and from other dementia patients. METHODS...

  6. The harmonics detection method based on neural network applied ...

    African Journals Online (AJOL)

    The harmonics detection method based on neural network applied to harmonics compensation. R Dehini, A Bassou, B Ferdi. Abstract. Several different methods have been used to sense load currents and extract its harmonic component in order to produce a reference current in shunt active power filters (SAPF), and to ...

  7. Influenza surveillance in Europe: establishing epidemic thresholds by the Moving Epidemic Method

    Science.gov (United States)

    Vega, Tomás; Lozano, Jose Eugenio; Meerhoff, Tamara; Snacken, René; Mott, Joshua; Ortiz de Lejarazu, Raul; Nunes, Baltazar

    2012-01-01

    Please cite this paper as: Vega et al. (2012) Influenza surveillance in Europe: establishing epidemic thresholds by the moving epidemic method. Influenza and Other Respiratory Viruses 7(4), 546–558. Background  Timely influenza surveillance is important to monitor influenza epidemics. Objectives  (i) To calculate the epidemic threshold for influenza‐like illness (ILI) and acute respiratory infections (ARI) in 19 countries, as well as the thresholds for different levels of intensity. (ii) To evaluate the performance of these thresholds. Methods  The moving epidemic method (MEM) has been developed to determine the baseline influenza activity and an epidemic threshold. False alerts, detection lags and timeliness of the detection of epidemics were calculated. The performance was evaluated using a cross‐validation procedure. Results  The overall sensitivity of the MEM threshold was 71·8% and the specificity was 95·5%. The median of the timeliness was 1 week (range: 0–4·5). Conclusions  The method produced a robust and specific signal to detect influenza epidemics. The good balance between the sensitivity and specificity of the epidemic threshold to detect seasonal epidemics and avoid false alerts has advantages for public health purposes. This method may serve as a standard for defining the start of the annual influenza epidemic in countries in Europe. PMID:22897919
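
    The thresholding idea can be illustrated with a much-simplified sketch. This is not the full MEM algorithm, which also locates each season's epidemic period with a moving-window optimisation; it only shows the final step of turning historical pre-epidemic rates into an epidemic threshold. All rates below are hypothetical.

    ```python
    # Much-simplified epidemic-threshold sketch in the spirit of the MEM
    # (NOT the full algorithm). The threshold is an upper one-sided 95%
    # limit around the mean of historical pre-epidemic rates.
    # All rates are hypothetical weekly ILI rates per 100,000.
    import math
    import statistics

    def epidemic_threshold(pre_epidemic_rates, z=1.645):
        m = statistics.mean(pre_epidemic_rates)
        s = statistics.stdev(pre_epidemic_rates)
        return m + z * s / math.sqrt(len(pre_epidemic_rates))

    def alarm_week(weekly_rates, threshold):
        # declare the epidemic start at the first week above the threshold
        for week, rate in enumerate(weekly_rates, start=1):
            if rate > threshold:
                return week
        return None

    historical = [52.1, 47.8, 60.3, 55.0, 49.6]
    threshold = epidemic_threshold(historical)
    ```

    The sensitivity/specificity trade-off reported in the abstract corresponds to how high this threshold sits: a higher `z` gives fewer false alerts at the cost of later detection.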

  8. CARRS Surveillance study: design and methods to assess burdens from multiple perspectives

    Directory of Open Access Journals (Sweden)

    Nair Manisha

    2012-08-01

    Abstract Background Cardio-metabolic diseases (CMDs) are a growing public health problem, but data on incidence, trends, and costs in developing countries are scarce. Comprehensive and standardised surveillance for non-communicable diseases was recommended at the United Nations High-level Meeting in 2011. Aims: To develop a model surveillance system for CMDs and risk factors that could be adopted for continued assessment of burdens from multiple perspectives in South-Asian countries. Methods Design: Hybrid model with two cross-sectional serial surveys three years apart to monitor trends, with a three-year prospective follow-up of the first cohort. Sites: Three urban settings (Chennai and New Delhi in India; Karachi in Pakistan), with 4000 participants in each site stratified by gender and age. Sampling methodology: Multi-stage cluster random sampling, followed by within-household participant selection through a combination of the Health Information National Trends Study (HINTS) and Kish methods. Culturally appropriate and methodologically relevant data collection instruments were developed to gather information on CMDs and their risk factors; quality of life; and health-care utilisation and costs, along with objective measures of anthropometric, clinical and biochemical parameters. The cohort follow-up is designed as a pilot study to understand the feasibility of estimating incidence of risk factors, disease events, morbidity, and mortality. Results The overall participant response rate in the first cross-sectional survey was 94.1% (Chennai 92.4%, n = 4943; Delhi 95.7%, n = 4425; Karachi 94.3%, n = 4016). 51.8% of the participants were females, and 61.6% were aged under 60 years. Discussion This surveillance model will generate data on prevalence and trends; help study the complex life-course patterns of CMDs; and provide a platform for developing and testing interventions and tools for prevention and control of CMDs in South-Asia. It will also help understanding the

  9. Environmental public health tracking: piloting methods for surveillance of environmentally related diseases in England and Wales.

    Science.gov (United States)

    Saunders, Patrick; Mohammed, Mohammed A

    2009-04-01

    An effective environmental public health tracking system integrates data and intelligence on environmental hazards, exposures, and health outcomes to focus interventions on reducing the impact of environmental contamination on public health. Most work in this area in the UK has focused on assessing data on hazards that are relatively easy to obtain. However, most hazards will present no actual risk and information on exposure is required to make an effective risk assessment. Obtaining exposure data is technically challenging, expensive, and potentially raises ethical concerns. Consequently, the Health Protection Agency is exploring methods for targeting geographical zones for efficient detailed environmental assessment (including exposure assessment). This paper describes and assesses three methods (indirect standardization, statistical process control (SPC) and kernel density contouring) for the surveillance of potentially environmentally related diseases for this purpose. While the evaluation demonstrates the utility of the three methods, particularly SPC, the comparison was limited due to ethical approval issues.
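
    As a rough illustration of how SPC-style limits can target geographical zones for detailed exposure assessment, the sketch below combines indirectly standardised expected counts with approximate 3-sigma Poisson limits. The areas and counts are hypothetical, and this is not the Health Protection Agency's actual procedure.

    ```python
    # Hypothetical sketch: indirect standardisation gives each area an
    # expected case count; an SPC-style rule flags areas whose observed
    # counts fall outside approximate 3-sigma Poisson limits, marking
    # them for detailed environmental (exposure) assessment.
    import math

    # hypothetical (observed, expected) case counts per small area
    areas = {
        "A": (12, 10.2),
        "B": (31, 11.0),   # clearly in excess of expectation
        "C": (7, 9.5),
        "D": (14, 12.8),
    }

    def flagged(areas, sigmas=3.0):
        out = []
        for name, (obs, exp) in areas.items():
            sd = math.sqrt(exp)               # Poisson sd ~ sqrt(mean)
            if obs > exp + sigmas * sd or obs < max(0.0, exp - sigmas * sd):
                out.append(name)
        return out
    ```

    Restricting expensive exposure assessment to the flagged areas is the point of the screening step: most areas sit comfortably inside the limits and need no follow-up.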

  10. Applying the Taguchi method for optimized fabrication of bovine ...

    African Journals Online (AJOL)

    The objective of the present study was to optimize the fabrication of bovine serum albumin (BSA) nanoparticles by applying the Taguchi method, with characterization of the nanoparticle bioproducts. BSA nanoparticles have been extensively studied in our previous works as a suitable carrier for drug delivery, since they are ...

  11. Linear algebraic methods applied to intensity modulated radiation therapy.

    Science.gov (United States)

    Crooks, S M; Xing, L

    2001-10-01

    Methods of linear algebra are applied to the choice of beam weights for intensity modulated radiation therapy (IMRT). It is shown that the physical interpretation of the beam weights, target homogeneity and ratios of deposited energy can be given in terms of matrix equations and quadratic forms. The methodology of fitting using linear algebra as applied to IMRT is examined. Results are compared with IMRT plans that had been prepared using a commercially available IMRT treatment planning system and previously delivered to cancer patients.

  12. Methods of applied mathematics with a software overview

    CERN Document Server

    Davis, Jon H

    2016-01-01

    This textbook, now in its second edition, provides students with a firm grasp of the fundamental notions and techniques of applied mathematics as well as the software skills to implement them. The text emphasizes the computational aspects of problem solving as well as the limitations and implicit assumptions inherent in the formal methods. Readers are also given a sense of the wide variety of problems in which the presented techniques are useful. Broadly organized around the theme of applied Fourier analysis, the treatment covers classical applications in partial differential equations and boundary value problems, and a substantial number of topics associated with Laplace, Fourier, and discrete transform theories. Some advanced topics are explored in the final chapters such as short-time Fourier analysis and geometrically based transforms applicable to boundary value problems. The topics covered are useful in a variety of applied fields such as continuum mechanics, mathematical physics, control theory, and si...

  13. Informatics Tools and Methods to Enhance U.S. Cancer Surveillance Research, UG3/UH3 | Informatics Technology for Cancer Research (ITCR)

    Science.gov (United States)

    The goal of this Funding Opportunity Announcement (FOA) is to advance surveillance science by supporting the development of new and innovative tools and methods for more efficient, detailed, timely, and accurate data collection by cancer registries. Specifically, the FOA seeks applications for projects to develop, adapt, apply, scale up, and validate tools and methods that improve the collection and integration of cancer registry data and expand the data items collected. Eligible applicants are population-based central cancer registries (a partnership must involve at least two different registries).

  14. Using VPython to Apply Mathematics to Physics in Mathematical Methods

    Science.gov (United States)

    Demaree, Dedra; Eagan, J.; Finn, P.; Knight, B.; Singleton, J.; Therrien, A.

    2006-12-01

    At the College of the Holy Cross, the sophomore mathematical methods of physics students completed VPython programming projects. This is the first time VPython has been used in a physics course at this college. These projects were aimed at applying some methods learned to actual physical situations. Students first completed worksheets from North Carolina State University to learn the programming environment. They then used VPython to apply the mathematics of vectors and differential equations learned in class to solve physics situations which appear simple but are not easy to solve analytically. For most of these students it was their first programming experience. It was also one of the only chances we had to do actual physics applications during the semester due to the large amount of mathematical content covered. In addition to showcasing the students’ final programs, this poster will share their view of including VPython in this course.

  15. Optimizing provider recruitment for influenza surveillance networks.

    Directory of Open Access Journals (Sweden)

    Samuel V Scarpino

    The increasingly complex and rapid transmission dynamics of many infectious diseases necessitate the use of new, more advanced methods for surveillance, early detection, and decision-making. Here, we demonstrate that a new method for optimizing surveillance networks can improve the quality of epidemiological information produced by typical provider-based networks. Using past surveillance and Internet search data, it determines the precise locations where providers should be enrolled. When applied to redesigning the provider-based influenza-like-illness surveillance network (ILINet) for the state of Texas, the method identifies networks that are expected to significantly outperform the existing network with far fewer providers. This optimized network avoids informational redundancies and is thereby more effective than networks designed by conventional methods or by a recently published algorithm based on maximizing population coverage. We show further that Google Flu Trends data, when incorporated into a network as a virtual provider, can enhance but not replace traditional surveillance methods.
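
    The redundancy-avoidance idea can be illustrated with a simple greedy sketch. This is a stand-in, not the paper's information-based optimization: each hypothetical candidate provider covers some patient zones, and candidates are scored by the population they add beyond what is already covered, so fully redundant providers contribute nothing.

    ```python
    # Greedy provider-selection sketch (illustrative stand-in for the
    # paper's epidemiological-information objective). Each candidate
    # covers a set of patient zones; the marginal gain of a candidate is
    # the population in zones not yet covered, so redundant candidates
    # score zero. All data below are hypothetical.

    candidates = {
        "clinic_1": {"z1": 500, "z2": 300},
        "clinic_2": {"z2": 300, "z3": 200},   # overlaps clinic_1 on z2
        "clinic_3": {"z4": 450},
        "clinic_4": {"z1": 500},              # fully redundant with clinic_1
    }

    def select_providers(candidates, k):
        chosen, covered = [], set()
        for _ in range(k):
            def gain(name):
                return sum(pop for z, pop in candidates[name].items()
                           if z not in covered)
            best = max((c for c in candidates if c not in chosen), key=gain)
            if gain(best) == 0:
                break                 # only redundant candidates remain
            chosen.append(best)
            covered.update(candidates[best])
        return chosen
    ```

    Note how `clinic_4` is never selected: once `clinic_1` is in the network, it adds no new information, which is the kind of redundancy the optimized ILINet design avoids.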

  16. Synthetic data. A proposed method for applied risk management

    OpenAIRE

    Carbajal De Nova, Carolina

    2017-01-01

    The proposed method attempts to contribute to the econometric and simulation applied risk management literature. It consists of an algorithm for constructing synthetic data and risk simulation econometric models, supported by a set of behavioral assumptions. This algorithm has the advantage of replicating natural phenomena and uncertainty events in a short period of time. These features convey economically low costs besides computational efficiency. An application for wheat farmers is develo...

  17. Newton-Krylov methods applied to nonequilibrium radiation diffusion

    International Nuclear Information System (INIS)

    Knoll, D.A.; Rider, W.J.; Olsen, G.L.

    1998-01-01

    The authors present results of applying a matrix-free Newton-Krylov method to a nonequilibrium radiation diffusion problem. Here, there is no use of operator splitting, and Newton's method is used to converge the nonlinearities within a time step. Since the nonlinear residual is formed, it is used to monitor convergence. It is demonstrated that a simple Picard-based linearization produces a sufficient preconditioning matrix for the Krylov method, thus eliminating the need to form or store a Jacobian matrix for Newton's method. They discuss the possibility that the Newton-Krylov approach may allow larger time steps, without loss of accuracy, as compared to an operator-split approach where nonlinearities are not converged within a time step.
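
    The two ingredients named in the abstract, forming the nonlinear residual and using only Jacobian-vector products, can be sketched on a toy problem. This example is unpreconditioned (unlike the paper's Picard-preconditioned solver) and uses an invented small nonlinear diffusion-like system purely for illustration.

    ```python
    # Toy matrix-free Newton-Krylov solve. The problem is a small
    # nonlinear system F(u) = A u + u^3 - 1 = 0 with a 1D second-
    # difference stencil A; its Jacobian is symmetric positive definite,
    # so plain conjugate gradients serves as the Krylov solver.

    def F(u):
        n = len(u)
        out = []
        for i in range(n):
            left = u[i - 1] if i > 0 else 0.0
            right = u[i + 1] if i < n - 1 else 0.0
            out.append(2.0 * u[i] - left - right + u[i] ** 3 - 1.0)
        return out

    def jv(u, v, eps=1e-7):
        # matrix-free Jacobian-vector product: J v ~ (F(u+eps*v) - F(u))/eps
        fu = F(u)
        fp = F([ui + eps * vi for ui, vi in zip(u, v)])
        return [(a - b) / eps for a, b in zip(fp, fu)]

    def cg(u, rhs, iters=100, rtol=1e-8):
        # conjugate gradients on J x = rhs using only jv products
        x = [0.0] * len(rhs)
        r = rhs[:]
        p = r[:]
        rs = rs0 = sum(ri * ri for ri in r)
        for _ in range(iters):
            ap = jv(u, p)
            alpha = rs / sum(pi * ai for pi, ai in zip(p, ap))
            x = [xi + alpha * pi for xi, pi in zip(x, p)]
            r = [ri - alpha * ai for ri, ai in zip(r, ap)]
            rs_new = sum(ri * ri for ri in r)
            if rs_new <= rtol * rtol * rs0:
                break
            p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
            rs = rs_new
        return x

    def newton_krylov(n=8, max_newton=30, tol=1e-8):
        u = [0.0] * n
        for _ in range(max_newton):
            fu = F(u)
            if max(abs(fi) for fi in fu) < tol:  # nonlinear residual monitors convergence
                break
            du = cg(u, [-fi for fi in fu])       # Newton step: J du = -F(u)
            u = [ui + di for ui, di in zip(u, du)]
        return u
    ```

    The Jacobian is never formed or stored; only the action `jv` is needed, which is the memory advantage the abstract highlights.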

  18. Standard Test Method for Application and Analysis of Helium Accumulation Fluence Monitors for Reactor Vessel Surveillance, E706 (IIIC)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2007-01-01

    1.1 This test method describes the concept and use of helium accumulation for neutron fluence dosimetry for reactor vessel surveillance. Although this test method is directed toward applications in vessel surveillance, the concepts and techniques are equally applicable to the general field of neutron dosimetry. The various applications of this test method for reactor vessel surveillance are as follows: 1.1.1 Helium accumulation fluence monitor (HAFM) capsules, 1.1.2 Unencapsulated, or cadmium or gadolinium covered, radiometric monitors (RM) and HAFM wires for helium analysis, 1.1.3 Charpy test block samples for helium accumulation, and 1.1.4 Reactor vessel (RV) wall samples for helium accumulation. This standard does not purport to address all of the safety concerns, if any, associated with its use. It is the responsibility of the user of this standard to establish appropriate safety and health practices and determine the applicability of regulatory limitations prior to use.

  19. The New Product Watch: Successes and Challenges of Crowdsourcing as a Method of Surveillance.

    Science.gov (United States)

    Nyman, Amy L; Biener, Lois

    2016-01-01

    New smokeless tobacco (eg, snus and dissolvable tobacco products) and nontobacco nicotine products (eg, e-cigarettes) have emerged in recent years amid widespread speculation about locations of test marketing, toxic constituents, and consumer targeting. The New Product Watch was a pilot online monitoring system aimed at filling these information gaps by using a form of crowdsourcing: recruiting volunteers to visit local retailers and report their findings. With very little funding, the New Product Watch gathered county-specific data on new product availability in 19 states as well as trend data on product marketing and demand, and completed 2 rounds of product purchases and subsequent toxic constituent analyses. Data were collected over a 2-year period, between 2009 and 2011. Despite the successes, we found that this small-scale, volunteer effort was not a sustainable method for ensuring continuous, systematic surveillance of new product availability, marketing, and toxicity.

  20. SurveillanceMapper: a simple method for creating an overview of quality data.

    Science.gov (United States)

    Poulsen, Kjeld B

    2008-02-01

    The amount of quality data continues to increase. To help prioritise resources for quality improvement, managers need thorough reviews to help them decide which indicators are most important to improve. The reality is that data are presented in piles of reports and hundreds of tables and graphs that are very time-consuming to go through and rarely result in a simple comprehensive overview. This paper presents an empirically tested tool for creating a simple overview of complex quality data. Data come from a questionnaire-based patient satisfaction survey of 13,129 patients and 1,589 staff members at Ribe County Hospital. A method is described for using colour coding to present the results for 16 indicators, measured by patients and by three staff member groups, for 28 departments and 46 ambulatories, on one page. Mean satisfaction scores on all questions are shown for each department in a core map. Aggregated departmental mean satisfaction scores are then calculated, as are hospital mean scores for each question. The same is done for staff members' evaluations and for outpatient care. Few problems are universal, and most of the problematic scores are related to a minority of departments, calling for local activities to improve quality. Diversity seems to be the rule. The SurveillanceMapper tool proved effective for handling the complexity of quality measures. It is easy to translate hundreds of graphs and tables into the SurveillanceMapper format. The method makes it easy to spot areas for quality improvement and to evaluate the results of interventions.

  1. Methods for model selection in applied science and engineering.

    Energy Technology Data Exchange (ETDEWEB)

    Field, Richard V., Jr.

    2004-10-01

    Mathematical models are developed and used to study the properties of complex systems and/or modify these systems to satisfy some performance requirements in just about every area of applied science and engineering. A particular reason for developing a model, e.g., performance assessment or design, is referred to as the model use. Our objective is the development of a methodology for selecting a model that is sufficiently accurate for an intended use. Information on the system being modeled is, in general, incomplete, so that there may be two or more models consistent with the available information. The collection of these models is called the class of candidate models. Methods are developed for selecting the optimal member from a class of candidate models for the system. The optimal model depends on the available information, the selected class of candidate models, and the model use. Classical methods for model selection, including the method of maximum likelihood and Bayesian methods, as well as a method employing a decision-theoretic approach, are formulated to select the optimal model for numerous applications. There is no requirement that the candidate models be random. Classical methods for model selection ignore model use and require data to be available. Examples are used to show that these methods can be unreliable when data is limited. The decision-theoretic approach to model selection does not have these limitations, and model use is included through an appropriate utility function. This is especially important when modeling high risk systems, where the consequences of using an inappropriate model for the system can be disastrous. The decision-theoretic method for model selection is developed and applied for a series of complex and diverse applications. These include the selection of the: (1) optimal order of the polynomial chaos approximation for non-Gaussian random variables and stationary stochastic processes, (2) optimal pressure load model to be

  2. Analysis of concrete beams using applied element method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a displacement-based method of structural analysis. Some of its features are similar to those of the Finite Element Method (FEM). In AEM, as in FEM, the structure is analysed by dividing it into several elements. But in AEM, elements are connected by springs instead of nodes as in FEM. In this paper, the background to AEM is discussed and the necessary equations are derived. To illustrate its application, AEM has been used to analyse a plain concrete beam with fixed supports. The analysis is limited to 2-dimensional structures. It was found that the number of springs has little influence on the results. AEM could predict deflections and reactions with a reasonable degree of accuracy.

  3. Applying sample survey methods to clinical trials data.

    Science.gov (United States)

    LaVange, L M; Koch, G G; Schwartz, T A

    This paper outlines the utility of statistical methods for sample surveys in analysing clinical trials data. Sample survey statisticians face a variety of complex data analysis issues deriving from the use of multi-stage probability sampling from finite populations. One such issue is that of clustering of observations at the various stages of sampling. Survey data analysis approaches developed to accommodate clustering in the sample design have more general application to clinical studies in which repeated measures structures are encountered. Situations where these methods are of interest include multi-visit studies where responses are observed at two or more time points for each patient, multi-period cross-over studies, and epidemiological studies for repeated occurrences of adverse events or illnesses. We describe statistical procedures for fitting multiple regression models to sample survey data that are more effective for repeated measures studies with complicated data structures than the more traditional approaches of multivariate repeated measures analysis. In this setting, one can specify a primary sampling unit within which repeated measures have intraclass correlation. This intraclass correlation is taken into account by sample survey regression methods through robust estimates of the standard errors of the regression coefficients. Regression estimates are obtained from model fitting estimation equations which ignore the correlation structure of the data (that is, computing procedures which assume that all observational units are independent or are from simple random samples). The analytic approach is straightforward to apply with logistic models for dichotomous data, proportional odds models for ordinal data, and linear models for continuously scaled data, and results are interpretable in terms of population average parameters. Through the features summarized here, the sample survey regression methods have many similarities to the broader family of
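
    The cluster-robust variance idea described above (intraclass correlation within a primary sampling unit handled through robust standard errors rather than through the model) can be sketched for a single-predictor linear model. The data below are synthetic and the implementation is a minimal sandwich estimator, not a full survey-regression package.

    ```python
    # Cluster-robust ("sandwich") standard error for single-predictor OLS,
    # the kind of adjustment survey regression methods use when repeated
    # measures within a primary sampling unit are correlated.
    import math

    def ols_cluster_se(x, y, cluster):
        n = len(x)
        # X'X and X'y for design matrix [1, x]
        sx = sum(x)
        sxx = sum(xi * xi for xi in x)
        sy = sum(y)
        sxy = sum(xi * yi for xi, yi in zip(x, y))
        det = n * sxx - sx * sx
        b0 = (sxx * sy - sx * sxy) / det      # intercept
        b1 = (n * sxy - sx * sy) / det        # slope
        resid = [yi - b0 - b1 * xi for xi, yi in zip(x, y)]
        inv = [[sxx / det, -sx / det], [-sx / det, n / det]]   # (X'X)^-1
        # "meat": sum over clusters of (X_g' e_g)(X_g' e_g)'
        meat = [[0.0, 0.0], [0.0, 0.0]]
        for g in set(cluster):
            s0 = sum(e for e, c in zip(resid, cluster) if c == g)
            s1 = sum(e * xi for e, xi, c in zip(resid, x, cluster) if c == g)
            meat[0][0] += s0 * s0; meat[0][1] += s0 * s1
            meat[1][0] += s1 * s0; meat[1][1] += s1 * s1
        # slope variance: [ (X'X)^-1 meat (X'X)^-1 ] entry (2, 2)
        v = sum(inv[1][i] * meat[i][j] * inv[j][1]
                for i in range(2) for j in range(2))
        return b1, math.sqrt(v)

    # synthetic example: 3 clusters of 2 observations each
    x = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
    y = [2.1, 3.9, 6.2, 7.8, 10.1, 11.9]
    g = ["a", "a", "b", "b", "c", "c"]
    slope, robust_se = ols_cluster_se(x, y, g)
    ```

    The point estimate of the slope is the ordinary one; only the standard error changes, which is why the abstract describes the regression estimates as coming from estimation equations that ignore the correlation structure.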

  4. Classification of Specialized Farms Applying Multivariate Statistical Methods

    Directory of Open Access Journals (Sweden)

    Zuzana Hloušková

    2017-01-01

    The paper is aimed at the application of advanced multivariate statistical methods to classifying cattle-breeding farming enterprises by their economic size. An advantage of the model is its ability to use a few selected indicators, compared to the complex methodology of the current classification model, which requires knowledge of the detailed structure of the herd turnover and of the cultivated crops. The output of the paper is intended to be applied within farm structure research focused on the future development of Czech agriculture. As the data source, the farming enterprises database for 2014 from the FADN CZ system has been used. The predictive model proposed exploits knowledge of the actual size classes of the farms tested. Outcomes of the linear discriminant analysis multifactor classification method have supported the chance of classifying farming enterprises into the group of Small farms (98 % classified correctly) and of Large and Very Large enterprises (100 % classified correctly). The Medium-size farms have been correctly classified at only 58.11 %. Partial shortcomings of the process presented have been found when discriminating between Medium and Small farms.

  5. Enhanced Molecular Dynamics Methods Applied to Drug Design Projects.

    Science.gov (United States)

    Ziada, Sonia; Braka, Abdennour; Diharce, Julien; Aci-Sèche, Samia; Bonnet, Pascal

    2018-01-01

    Nobel Laureate Richard P. Feynman stated: "[…] everything that living things do can be understood in terms of jiggling and wiggling of atoms […]." The importance of computer simulations of macromolecules, which use classical mechanics principles to describe atom behavior, is widely acknowledged and nowadays, they are applied in many fields such as material sciences and drug discovery. With the increase of computing power, molecular dynamics simulations can be applied to understand biological mechanisms at realistic timescales. In this chapter, we share our computational experience providing a global view of two of the widely used enhanced molecular dynamics methods to study protein structure and dynamics through the description of their characteristics, limits and we provide some examples of their applications in drug design. We also discuss the appropriate choice of software and hardware. In a detailed practical procedure, we describe how to set up, run, and analyze two main molecular dynamics methods, the umbrella sampling (US) and the accelerated molecular dynamics (aMD) methods.

  6. A New Method for Estimating the Coverage of Mass Vaccination Campaigns Against Poliomyelitis From Surveillance Data.

    Science.gov (United States)

    O'Reilly, K M; Cori, A; Durry, E; Wadood, M Z; Bosan, A; Aylward, R B; Grassly, N C

    2015-12-01

    Mass vaccination campaigns with the oral poliovirus vaccine targeting children under 5 years of age are a central part of the poliomyelitis eradication effort. Monitoring the coverage of these campaigns is essential to allow corrective action, but current approaches are limited by their cross-sectional nature, nonrandom sampling, reporting biases, and accessibility issues. We describe a new Bayesian framework using data augmentation and Markov chain Monte Carlo methods to estimate variation in vaccination coverage from children's vaccination histories investigated during surveillance for acute flaccid paralysis. We tested the method using simulated data with at least 200 cases and were able to detect undervaccinated groups if they exceeded 10% of all children, and temporal changes in coverage of ±10%, with greater than 90% sensitivity. Application of the method to data from Pakistan for 2010-2011 identified undervaccinated groups within the Balochistan/Federally Administered Tribal Areas and Khyber Pakhtunkhwa regions, as well as temporal changes in coverage. The sizes of these groups are consistent with the multiple challenges faced by the program in these regions as a result of conflict and insecurity. Application of this new method to routinely collected data can be a useful tool for identifying poorly performing areas and assisting in eradication efforts. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health.
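
    The core Bayesian update behind coverage estimation can be illustrated with a conjugate Beta-Binomial sketch. This is far simpler than the paper's data-augmentation MCMC, which also handles recall error, group structure, and temporal change; the numbers below are hypothetical.

    ```python
    # Conjugate Beta-Binomial sketch of vaccination-coverage estimation
    # (illustrative only; the paper's actual method uses data augmentation
    # and MCMC). All numbers are hypothetical.
    import random

    def posterior_coverage(k, n, a=1.0, b=1.0):
        """Beta(a, b) prior; k of n children vaccinated -> Beta(a+k, b+n-k)."""
        a_post, b_post = a + k, b + n - k
        mean = a_post / (a_post + b_post)
        return a_post, b_post, mean

    # e.g. vaccination histories of 200 surveyed children, 160 vaccinated
    a_post, b_post, mean = posterior_coverage(160, 200)

    # Monte Carlo ~95% central credible interval from the Beta posterior
    random.seed(0)
    draws = sorted(random.betavariate(a_post, b_post) for _ in range(2000))
    lo, hi = draws[49], draws[1950]
    ```

    Comparing such posteriors across regions or time periods is the simplest version of how undervaccinated groups and coverage changes can be flagged from surveillance case histories.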

  7. Metrological evaluation of characterization methods applied to nuclear fuels

    International Nuclear Information System (INIS)

    Faeda, Kelly Cristina Martins; Lameiras, Fernando Soares; Camarano, Denise das Merces; Ferreira, Ricardo Alberto Neto; Migliorini, Fabricio Lima; Carneiro, Luciana Capanema Silva; Silva, Egonn Hendrigo Carvalho

    2010-01-01

    In manufacturing the nuclear fuel, characterizations are performed in order to assure the minimization of harmful effects. Uranium dioxide is the substance most used as nuclear reactor fuel because of its many advantages, such as: high stability even in contact with water at high temperatures, high melting point, and high capacity to retain fission products. Several methods are used for the characterization of nuclear fuels, such as thermogravimetric analysis for the O/U ratio, the penetration-immersion method, helium pycnometry and mercury porosimetry for density and porosity, the BET method for specific surface area, chemical analyses for relevant impurities, and the laser flash method for thermophysical properties. Specific tools are needed to control the diameter and sphericity of the microspheres and the properties of the coating layers (thickness, density, and degree of anisotropy). Other methods can also give information, such as scanning and transmission electron microscopy, X-ray diffraction, microanalysis, and secondary ion mass spectrometry for chemical analysis. The accuracy of measurement and the level of uncertainty of the resulting data are important. This work describes a general metrological characterization of some techniques applied to the characterization of nuclear fuel. Sources of measurement uncertainty were analyzed. The purpose is to summarize selected properties of UO2 that have been studied by CDTN in a program of fuel development for Pressurized Water Reactors (PWR). The selected properties are crucial for thermal-hydraulic codes used to study design basis accidents. The work focused on the thermal characterization (thermal diffusivity and thermal conductivity) and the penetration-immersion method (density and open porosity) for UO2 samples. The thermal characterization of UO2 samples was performed by the laser flash method between room temperature and 448 K. The adaptive Monte Carlo method was used to obtain the endpoints of the

  8. Nuclear and nuclear related analytical methods applied in environmental research

    International Nuclear Information System (INIS)

    Popescu, Ion V.; Gheboianu, Anca; Bancuta, Iulian; Cimpoca, G. V; Stihi, Claudia; Radulescu, Cristiana; Oros Calin; Frontasyeva, Marina; Petre, Marian; Dulama, Ioana; Vlaicu, G.

    2010-01-01

    Nuclear analytical methods can be used in research activities on environmental studies such as water quality assessment, pesticide residues, global climatic change (transboundary), pollution and remediation. Heavy metal pollution is a problem associated with areas of intensive industrial activity. In this work the moss biomonitoring technique was employed to study the atmospheric deposition in Dambovita County, Romania. Complementary nuclear and atomic analytical methods were also used: Neutron Activation Analysis (NAA), Atomic Absorption Spectrometry (AAS) and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). These high-sensitivity analysis methods were used to determine the chemical composition of samples of mosses placed in different areas with different industrial pollution sources. The concentrations of Cr, Fe, Mn, Ni and Zn were determined. The concentration of Fe in the same samples was determined using all these methods, and a very good agreement was obtained, within statistical limits, which demonstrates the capability of these analytical methods to be applied to a large spectrum of environmental samples with the same results. (authors)

  9. Analysis of Brick Masonry Wall using Applied Element Method

    Science.gov (United States)

    Lincy Christy, D.; Madhavan Pillai, T. M.; Nagarajan, Praveen

    2018-03-01

    The Applied Element Method (AEM) is a versatile tool for structural analysis. Analysis is done by discretising the structure, as in the case of the Finite Element Method (FEM). In AEM, elements are connected by a set of normal and shear springs instead of nodes. AEM is extensively used for the analysis of brittle materials. Brick masonry walls can be effectively analyzed within the framework of AEM. The composite nature of a masonry wall can be easily modelled using springs: the brick springs and mortar springs are assumed to be connected in series. The brick masonry wall is analyzed and the failure load is determined for different loading cases. The results were used to find the best aspect ratio of brick to strengthen the brick masonry wall.
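The series connection of brick and mortar springs described above can be sketched as follows. The material values in the example are hypothetical, not taken from the paper; a common AEM convention takes k = EA/L for each material's share of the connection:

```python
def series_stiffness(k_brick, k_mortar):
    # Two springs in series: 1/k_eq = 1/k_brick + 1/k_mortar
    return 1.0 / (1.0 / k_brick + 1.0 / k_mortar)

def interface_normal_stiffness(e_brick, e_mortar, d_brick, t_mortar, area):
    # Axial spring stiffness k = E*A/L for the brick portion and the mortar
    # portion of the connection, then combined in series across the joint.
    k_b = e_brick * area / d_brick
    k_m = e_mortar * area / t_mortar
    return series_stiffness(k_b, k_m)

# hypothetical values: E in N/mm^2, lengths in mm, tributary area in mm^2
k_eq = interface_normal_stiffness(11000.0, 1000.0, 55.0, 5.0, 100.0)  # ~10000 N/mm
```

Because the mortar joint is thin but much softer, the two terms can be comparable, which is why the series combination (rather than either material alone) governs the interface stiffness.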

  10. Applied systems ecology: models, data, and statistical methods

    Energy Technology Data Exchange (ETDEWEB)

    Eberhardt, L L

    1976-01-01

    In this report, systems ecology is largely equated to mathematical or computer simulation modelling. The need for models in ecology stems from the necessity to have an integrative device for the diversity of ecological data, much of which is observational, rather than experimental, as well as from the present lack of a theoretical structure for ecology. Different objectives in applied studies require specialized methods. The best predictive devices may be regression equations, often non-linear in form, extracted from much more detailed models. A variety of statistical aspects of modelling, including sampling, are discussed. Several aspects of population dynamics and food-chain kinetics are described, and it is suggested that the two presently separated approaches should be combined into a single theoretical framework. It is concluded that future efforts in systems ecology should emphasize actual data and statistical methods, as well as modelling.

  11. A Low Energy Consumption Storage Method for Cloud Video Surveillance Data Based on SLA Classification

    Directory of Open Access Journals (Sweden)

    Yonghua Xiong

    2016-01-01

    Full Text Available With the continuous expansion of data volumes over time in mobile video applications such as cloud video surveillance (CVS), the increasing energy consumption in video data centers has drawn widespread attention over the past several years. Addressing the issue of reducing energy consumption, we propose a low energy consumption storage method specially designed for CVS systems based on the service level agreement (SLA) classification. A novel SLA with an extra parameter, the access time period, is proposed and then used as a criterion for dividing virtual machines (VMs) and data storage nodes into different classifications. Tasks can be scheduled in real time to run on the matching VMs and data storage nodes according to their access time periods. Any node whose access time period does not encompass the current time is placed into the energy saving state, while the others remain in the normal state with the capability of undertaking tasks. As a result, overall electric energy consumption in data centers is reduced while the SLA is fulfilled. To evaluate the performance, we compare the method with two related approaches using the Hadoop Distributed File System (HDFS). The results show the superiority and effectiveness of our method.
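The scheduling rule described above — nodes outside their SLA access time period go into an energy saving state — can be sketched as follows; node names and hours are invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class StorageNode:
    name: str
    access_start: int  # hour of day, inclusive
    access_end: int    # hour of day, exclusive

    def in_access_period(self, hour):
        # SLA access time periods may wrap past midnight.
        if self.access_start <= self.access_end:
            return self.access_start <= hour < self.access_end
        return hour >= self.access_start or hour < self.access_end

def schedule_states(nodes, hour):
    # Nodes whose access period covers the current hour stay "normal";
    # all others are switched to the "energy_saving" state.
    return {node.name: "normal" if node.in_access_period(hour) else "energy_saving"
            for node in nodes}

nodes = [StorageNode("daytime-cams", 8, 20), StorageNode("night-cams", 20, 8)]
states = schedule_states(nodes, hour=22)  # only "night-cams" stays normal
```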

  12. [Evaluation on Hepatitis B surveillance models at surveillance pilot points in China, 2013-2015].

    Science.gov (United States)

    Miao, N; Wang, F Z; Zhang, L J; Zheng, H; Sun, X J; Wang, F; Zhang, G M

    2017-12-10

    Objective: To evaluate the effects of the Hepatitis B surveillance models used at the surveillance pilot points in China. Methods: Hepatitis B related records kept at the surveillance pilot points were downloaded from NNDRS. Data concerning the proportion of unclassified Hepatitis B cases, the consistency of additional records and the accuracy of reported acute Hepatitis B cases were evaluated. Results: The proportion of unclassified Hepatitis B cases decreased year by year (P<0.05). […] This model of surveillance could be applied elsewhere in the nation to improve the quality of the Hepatitis B reporting system.

  13. A new method for improving the reliability of fracture toughness surveillance of nuclear pressure vessel by neutron irradiated embrittlement

    International Nuclear Information System (INIS)

    Zhang Xinping; Shi Yaowu

    1992-01-01

    In order to obtain more information from neutron-irradiated surveillance specimens and to raise the reliability of fracture toughness surveillance tests, it is of great value to reuse the broken Charpy-size specimens that have already been tested in surveillance. In this work, through a new design and reuse of Charpy-size specimens, 9 fracture toughness data points can be obtained from one pre-cracked, side-grooved Charpy-size specimen, whereas at present only 1 to 3 fracture toughness data points can usually be obtained from one Charpy-size specimen. The new method is thus found to markedly improve the reliability of fracture toughness surveillance testing and evaluation. Factors affecting the rational design of pre-cracked, deep side-grooved Charpy-size compound specimens are discussed.

  14. An introduction to analytical methods for the postmarketing surveillance of veterinary vaccines.

    Science.gov (United States)

    Siev, D

    1999-01-01

    Any analysis of spontaneous AER data must consider the many biases inherent in the observation and reporting of vaccine adverse events. The absence of a clear probability structure requires statistical procedures to be used in a spirit of exploratory description rather than definitive confirmation. The extent of such descriptions should be temperate, without the implication that they extend to parent populations. It is important to recognize the presence of overdispersion in selecting methods and constructing models. Important stochastic or systematic features of the data may always be unknown. Our attempts to delineate what constitutes an AER have not eliminated all the fuzziness in its definition. Some count every event in a report as a separate AER. Besides confusing the role of event and report, this introduces a complex correlational structure, since multiple event descriptions received in a single report can hardly be considered independent. The many events described by one reporter would then become inordinately weighted. The alternative is to record an AER once, regardless of how many event descriptions it includes. As a practical compromise, many regard the simultaneous submission of several report forms by one reporter as a single AER, and the next submission by that reporter as another AER. This method is reasonable when reporters submit AERs very infrequently. When individual reporters make frequent reports, it becomes difficult to justify the inconsistency of counting multiple events as a single AER when they are submitted together, but as separate AERs when they are reported at different times. While either choice is imperfect, the latter approach is currently used by the USDA and its licensed manufacturers in developing a mandatory postmarketing surveillance system for veterinary immunobiologicals in the United States. Under the proposed system, summaries of an estimated 10,000 AERs received annually by the manufacturers would be submitted to the
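The compromise counting rule described above — all forms submitted together by one reporter form a single AER, while a later submission by the same reporter is a new AER — can be sketched as follows, with entirely hypothetical report data:

```python
def count_aers(reports):
    # Each report form is (reporter, submission_date, event_description).
    # Forms submitted by one reporter on the same date collapse into one AER;
    # a submission by that reporter on a later date counts as a new AER.
    return len({(reporter, date) for reporter, date, _event in reports})

reports = [
    ("vet-A", "2021-03-01", "lethargy"),
    ("vet-A", "2021-03-01", "injection-site swelling"),  # same submission -> same AER
    ("vet-A", "2021-04-10", "fever"),                    # later submission -> new AER
    ("vet-B", "2021-03-01", "vomiting"),
]
n_aers = count_aers(reports)  # 3
```

This makes concrete the inconsistency the abstract notes: the two events in the first submission are weighted as one AER, while the same two events reported a month apart would count as two.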

  15. Accuracy of Diagnostic Methods and Surveillance Sensitivity for Human Enterovirus, South Korea, 1999–2011

    Science.gov (United States)

    Hyeon, Ji-Yeon; Hwang, Seoyeon; Kim, Hyejin; Song, Jaehyoung; Ahn, Jeongbae; Kang, Byunghak; Kim, Kisoon; Choi, Wooyoung; Chung, Jae Keun; Kim, Cheon-Hyun; Cho, Kyungsoon; Jee, Youngmee; Kim, Jonghyun; Kim, Kisang; Kim, Sun-Hee; Kim, Min-Ji

    2013-01-01

    The epidemiology of enteroviral infection in South Korea during 1999–2011 chronicles nationwide outbreaks and changing detection and subtyping methods used over the 13-year period. Of 14,657 patients whose samples were tested, 4,762 (32.5%) samples were positive for human enterovirus (human EV); as diagnostic methods improved, the rate of positive results increased. A seasonal trend of outbreaks was documented. Genotypes enterovirus 71, echovirus 30, coxsackievirus B5, enterovirus 6, and coxsackievirus B2 were the most common genotypes identified. Accurate test results correlated clinical syndromes to enterovirus genotypes: aseptic meningitis to echovirus 30, enterovirus 6, and coxsackievirus B5; hand, foot and mouth disease to coxsackievirus A16; and hand, foot and mouth disease with neurologic complications to enterovirus 71. There are currently no treatments specific to human EV infections; surveillance of enterovirus infections such as this study provides may assist with evaluating the need to research and develop treatments for infections caused by virulent human EV genotypes. PMID:23876671

  16. Analytical methods applied to diverse types of Brazilian propolis

    Directory of Open Access Journals (Sweden)

    Marcucci Maria

    2011-06-01

    Full Text Available Abstract Propolis is a bee product, composed mainly of plant resins and beeswax; its chemical composition therefore varies with the geographic and plant origins of these resins, as well as with the species of bee. Brazil is an important supplier of propolis on the world market and, although the green-colored propolis from the southeast is the most known and studied, several other types of propolis from Apis mellifera and native stingless bees (also called cerumen) can be found. Propolis is usually consumed as an extract, so the type of solvent and the extraction procedures employed further affect its composition. Methods used for extraction; analysis of the percentages of resin, wax and insoluble material in crude propolis; and determination of phenolic, flavonoid, amino acid and heavy metal contents are reviewed herein. Different chromatographic methods applied to the separation, identification and quantification of Brazilian propolis components, and their relative strengths, are discussed, as well as direct insertion mass spectrometry fingerprinting. Propolis has been used as a popular remedy for several centuries for a wide array of ailments. Its antimicrobial properties, present in propolis from different origins, have been extensively studied. More recently, the anti-parasitic, anti-viral/immune stimulating, healing, anti-tumor, anti-inflammatory, antioxidant and analgesic activities of diverse types of Brazilian propolis have been evaluated. The most common methods employed and overviews of their relative results are presented.

  17. Method of applying a coating onto a steel plate

    International Nuclear Information System (INIS)

    Masuda, Hiromasa; Murakami, Shozo; Chihara, Yoshihi.

    1970-01-01

    A method of applying a protective coating onto a steel plate to protect it from corrosion is given, using an irradiation process and a vehicle consisting of a radically polymerizable high molecular compound, a radically polymerizable less-volatile monomer and/or a functional intermediate agent, and a volatile solvent. The radiation may be electron beams at an energy level ranging from 100 to 1,000 keV. An advantage of this invention is that the ratio of the prepolymer to the monomer can be kept constant without difficulty during the irradiation operation, so that the variation in thickness is very small. Another advantage is that the addition of a monomer is not necessary for viscosity reduction, so that the optimum cross-linking density can be obtained. The molecular weight is so high that application by spraying is possible. The solvent remaining after the irradiation operation has substantially no influence on the polymerization hardening and gel content. In one example, 62 parts of prepolymer produced by reacting the epoxy resin Epikote No.1001 with an equal equivalent of acrylic acid were mixed with 17 parts of hydroxyl ethyl acrylate, 77.5 parts of methyl ethyl ketone and 5.5 parts of isopropyl alcohol to produce a vehicle composition. This composition was applied onto the surface of a glass plate to a thickness of 20 microns. The monomer remaining in the mixture showed a very small change over an elapsed period of time. (Iwakiri, K.)

  18. Data elements and validation methods used for electronic surveillance of health care-associated infections: a systematic review.

    Science.gov (United States)

    Cato, Kenrick D; Cohen, Bevin; Larson, Elaine

    2015-06-01

    We describe the primary data sources, data elements, and validation methods currently used in electronic surveillance systems (ESS) for the identification and surveillance of health care-associated infections (HAIs), and compare these data elements and validation methods with recommended standards. Using Preferred Reporting Items for Systematic Reviews and Meta-Analyses guidelines, a PubMed and manual search was conducted to identify research articles describing ESS for identification and surveillance of HAIs published January 1, 2009-August 31, 2014. Selected articles were evaluated to determine which data elements and validation methods were included. Among the 509 articles identified in the original literature search, 30 met the inclusion criteria. Whereas the majority of studies (83%) used recommended data sources and validated the numerator (80%), only 10% of studies performed both external and internal validation. In addition, there was variation in the ESS data formats used. Our findings suggest that the majority of ESS for HAI surveillance use standard definitions, but the lack of widespread internal data validation, denominator validation, and external validation in these systems reduces the reliability of their findings. Additionally, advanced programming skills are required to create, implement, and maintain these systems and to reduce the variability in data formats. Copyright © 2015 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.

  19. On interval methods applied to robot reliability quantification

    International Nuclear Information System (INIS)

    Carreras, C.; Walker, I.D.

    2000-01-01

    Interval methods have recently been successfully applied to obtain significantly improved robot reliability estimates via fault trees for the case of uncertain and time-varying input reliability data. These initial studies generated output distributions of failure probabilities by extending standard interval arithmetic with new abstractions called interval grids which can be parameterized to control the complexity and accuracy of the estimation process. In this paper different parameterization strategies are evaluated in order to gain a more complete understanding of the potential benefits of the approach. A canonical example of a robot manipulator system is used to show that an appropriate selection of parameters is a key issue for the successful application of such novel interval-based methodologies
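The interval-grid machinery of the paper is not reproduced here, but the basic idea of propagating interval-valued failure probabilities through a simple system model can be sketched with plain interval arithmetic. The series-system structure and the numbers below are illustrative assumptions, not the paper's fault-tree model:

```python
def system_failure_interval(component_intervals):
    # Series system: failure probability = 1 - prod(1 - p_i). This expression
    # is monotone increasing in every p_i, so exact bounds follow from
    # evaluating it at the interval endpoints of each component.
    survival_best, survival_worst = 1.0, 1.0
    for p_lo, p_hi in component_intervals:
        survival_best *= 1.0 - p_lo    # optimistic case: lowest failure probs
        survival_worst *= 1.0 - p_hi   # pessimistic case: highest failure probs
    return 1.0 - survival_best, 1.0 - survival_worst

# two components with uncertain failure probabilities (illustrative numbers)
lo, hi = system_failure_interval([(0.01, 0.02), (0.03, 0.05)])  # ~(0.0397, 0.0690)
```

For non-monotone expressions (as arise in general fault trees), endpoint evaluation is no longer exact, which is one motivation for the richer interval-grid abstractions the paper studies.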

  20. Parallel fast multipole boundary element method applied to computational homogenization

    Science.gov (United States)

    Ptaszny, Jacek

    2018-01-01

    In the present work, a fast multipole boundary element method (FMBEM) and a parallel computer code for the 3D elasticity problem are developed and applied to the computational homogenization of a solid containing spherical voids. The system of equations is solved using the GMRES iterative solver. The boundary of the body is discretized using quadrilateral serendipity elements with an adaptive numerical integration. Operations related to a single GMRES iteration, performed by traversing the corresponding tree structure upwards and downwards, are parallelized using the OpenMP standard. The assignment of tasks to threads is based on the assumption that the tree nodes at which the moment transformations are initialized can be partitioned into disjoint sets of equal or approximately equal size and assigned to the threads. The achieved speedup as a function of the number of threads is examined.

  1. Influenza surveillance in Europe. Comparing intensity levels calculated using the Moving Epidemic Method.

    LENUS (Irish Health Repository)

    Vega, Tomás

    2015-05-30

    Although influenza-like illnesses (ILI) and acute respiratory illnesses (ARI) surveillance are well established in Europe, the comparability of intensity among countries and seasons remains an unresolved challenge.

  2. The virtual fields method applied to spalling tests on concrete

    Directory of Open Access Journals (Sweden)

    Forquin P.

    2012-08-01

    Full Text Available For one decade, spalling techniques based on the use of a metallic Hopkinson bar put in contact with a concrete sample have been widely employed to characterize the dynamic tensile strength of concrete at strain rates ranging from a few tens to two hundred s−1. However, the processing method, mainly based on the use of the velocity profile measured on the rear free surface of the sample (Novikov formula), remains quite basic, and identification of the whole softening behaviour of the concrete is out of reach. In the present paper a new processing method is proposed based on the use of the Virtual Fields Method (VFM). First, a digital high speed camera is used to record pictures of a grid glued on the specimen. Next, full-field measurements are used to obtain the axial displacement field at the surface of the specimen. Finally, a specific virtual field has been defined in the VFM equation to use the acceleration map as an alternative ‘load cell’. Applied to three spalling tests, this method allowed the identification of Young’s modulus during the test. It was shown that this modulus is constant during the initial compressive part of the test and decreases in the tensile part when micro-damage exists. It was also shown that in such a simple inertial test, it is possible to reconstruct average axial stress profiles using only the acceleration data. It was then possible to construct local stress-strain curves and derive a tensile strength value.

  3. Use of antibiotics in nursing homes--surveillance with different methods.

    Science.gov (United States)

    Eriksen, Hanne-Merete; Sæther, Anja Ramberg; Viktil, Kirsten K; Andberg, Lene; Munkerud, Marianne Winther; Willoch, Karin; Blix, Hege Salvesen

    2013-10-15

    Residents in nursing homes have a heightened risk of developing infections that should be treated with antibiotics. Inappropriate use of antibiotics may generate drug-related problems and increase resistance. In this study, we describe the use of antibiotics in nursing homes on the basis of prevalence surveys and drug sales statistics. Five nursing homes in Oslo participated in two one-day surveys in 2009. All use of systemic antibiotics was registered. The data collection was undertaken according to a protocol developed by the European Surveillance of Antimicrobial Consumption (ESAC) Network and was part of a European study. The nursing homes' drug sales statistics for systemic antibiotics during 2009, distributed by the number of bed days for each nursing home, were estimated. Information on indications for each antibiotic from the prevalence surveys was collated with sales data to achieve an estimate of how the purchased antibiotics were used. The prevalence surveys showed that more than 8% of the residents received antibiotics. Prophylactic treatment accounted for 33% of the prescriptions. A prevalence of antibiotic use of 10% was estimated from the drug sales statistics. Urinary tract infection was the most frequently registered indication. Pivmecillinam and methenamine were most frequently prescribed and most frequently purchased. Most courses of treatment were prescribed in accordance with the national guidelines for antibiotic use. The results from the drug sales statistics concurred well with the prevalence surveys, and the methods can thus be relevant for purposes of monitoring the use of antibiotics.

  4. Potassium fertilizer applied by different methods in the zucchini crop

    Directory of Open Access Journals (Sweden)

    Carlos N. V. Fernandes

    Full Text Available ABSTRACT Aiming to evaluate the effect of potassium (K) doses applied by the conventional method and by fertigation in zucchini (Cucurbita pepo L.), a field experiment was conducted in Fortaleza, CE, Brazil. The statistical design was a randomized block, with four replicates, in a 4 x 2 factorial scheme, corresponding to four doses of K (0, 75, 150 and 300 kg K2O ha-1) and two fertilization methods (conventional and fertigation). The analyzed variables were: fruit mass (FM), number of fruits (NF), fruit length (FL), fruit diameter (FD), pulp thickness (PT), soluble solids (SS), yield (Y), water use efficiency (WUE) and potassium use efficiency (KUE), besides an economic analysis using the net present value (NPV), internal rate of return (IRR) and payback period (PP). K doses influenced FM, FD, PT and Y, which increased linearly, with the highest value estimated at 36,828 kg ha-1 for the highest K dose (300 kg K2O ha-1). This dose was also responsible for the largest WUE, 92 kg ha-1 mm-1. KUE showed quadratic behavior, and the dose of 174 kg K2O ha-1 led to its maximum value of 87.41 kg ha-1 (kg K2O ha-1)-1. All treatments were economically viable, and the most profitable months were May, April, December and November.

  5. Digital dashboard design using multiple data streams for disease surveillance with influenza surveillance as an example.

    Science.gov (United States)

    Cheng, Calvin K Y; Ip, Dennis K M; Cowling, Benjamin J; Ho, Lai Ming; Leung, Gabriel M; Lau, Eric H Y

    2011-10-14

    Great strides have been made in exploring and exploiting new and different sources of disease surveillance data and in developing robust statistical methods for analyzing the collected data. However, there has been less research in the area of dissemination. Proper dissemination of surveillance data can facilitate end users' taking appropriate actions, thus maximizing the utility of the effort invested upstream of the surveillance-to-action loop. The aims of the study were to develop a generic framework for a digital dashboard incorporating features of efficient dashboard design and to demonstrate this framework by specific application to influenza surveillance in Hong Kong. Based on the merits of the national websites and principles of efficient dashboard design, we designed an automated influenza surveillance digital dashboard as a demonstration of efficient dissemination of surveillance data. We developed the system to synthesize and display multiple sources of influenza surveillance data streams in the dashboard. Different algorithms can be implemented in the dashboard to incorporate all surveillance data streams into a description of overall influenza activity. We designed and implemented an influenza surveillance dashboard that used self-explanatory figures to display multiple surveillance data streams in panels. Indicators for individual data streams, as well as for overall influenza activity, were summarized on the main page, which can be read at a glance. A data retrieval function was also incorporated to allow data sharing in a standard format. The influenza surveillance dashboard serves as a template to illustrate the efficient synthesis and dissemination of multiple-source surveillance data, and it may also be applied to other diseases. Surveillance data from multiple sources can be disseminated efficiently using a dashboard design that facilitates the translation of surveillance information into public health actions.
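One simple combining algorithm of the kind the abstract alludes to — standardizing each surveillance stream against its own historical baseline and averaging into an overall activity indicator — might look like this; the stream names and values are invented, not from the Hong Kong system:

```python
import statistics

def stream_zscore(history, current):
    # Standardize the current value of one surveillance stream against
    # its own historical baseline (mean and sample standard deviation).
    return (current - statistics.mean(history)) / statistics.stdev(history)

def overall_activity(streams):
    # One simple combining algorithm: the mean z-score across all streams.
    scores = [stream_zscore(history, current) for history, current in streams.values()]
    return sum(scores) / len(scores)

streams = {
    "GP ILI consultation rate": ([2.0, 3.0, 4.0, 3.0], 6.0),
    "lab influenza positive %": ([10.0, 12.0, 14.0, 12.0], 18.0),
}
activity = overall_activity(streams)  # both streams sit well above baseline
```

Standardizing first keeps a stream with large raw numbers (consultation counts) from drowning out one with small raw numbers (positivity percentages).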

  6. Flood Hazard Mapping by Applying Fuzzy TOPSIS Method

    Science.gov (United States)

    Han, K. Y.; Lee, J. Y.; Keum, H.; Kim, B. J.; Kim, T. H.

    2017-12-01

    Many technical methods exist for integrating various factors in flood hazard mapping. The purpose of this study is to suggest a methodology for integrated flood hazard mapping using MCDM (Multi Criteria Decision Making). MCDM problems involve a set of alternatives that are evaluated on the basis of conflicting and incommensurate criteria. In this study, to apply MCDM to assessing flood risk, maximum flood depth, maximum velocity, and maximum travel time are considered as criteria, and each element unit is considered as an alternative. A scheme that finds the efficient alternative closest to an ideal value is an appropriate way to assess the flood risk of many element units (alternatives) based on various flood indices. Therefore TOPSIS, the most commonly used MCDM scheme, is adopted to create the flood hazard map. The indices for flood hazard mapping (maximum flood depth, maximum velocity, and maximum travel time) carry uncertainty in the simulation results, taking various values according to the flood scenario and topographical conditions. This kind of ambiguity in the indices can cause uncertainty in the flood hazard map. To consider the ambiguity and uncertainty of the criteria, fuzzy logic, which is able to handle ambiguous expressions, is introduced. In this paper, we produced a flood hazard map for levee-breach overflow using the fuzzy TOPSIS technique. We identified the areas where the highest grade of hazard was recorded in the resulting integrated flood hazard map, and the produced flood hazard map can then be compared with those indicated in the existing flood risk maps. We also expect that if the flood hazard map methodology suggested in this paper is applied even to producing the current flood risk maps, it will be possible to make a new flood hazard map that also considers the priorities of hazard areas, including more varied and important information than ever before. Keywords: Flood hazard map; levee break analysis; 2D analysis; MCDM; Fuzzy TOPSIS
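The fuzzy extension is beyond a short sketch, but the underlying crisp TOPSIS ranking that the study builds on can be illustrated as follows; the weights, criterion values, and grid cells are hypothetical:

```python
import math

def topsis(matrix, weights, benefit):
    # matrix[i][j]: value of criterion j for alternative i (e.g. grid cell i).
    # benefit[j] is True when a larger value means greater hazard.
    n_alt, n_crit = len(matrix), len(matrix[0])
    # vector normalization per criterion, then weighting
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    cols = list(zip(*v))
    ideal = [max(c) if benefit[j] else min(c) for j, c in enumerate(cols)]
    anti = [min(c) if benefit[j] else max(c) for j, c in enumerate(cols)]
    # relative closeness to the ideal solution in [0, 1]; higher = more hazardous
    return [math.dist(row, anti) / (math.dist(row, ideal) + math.dist(row, anti))
            for row in v]

# three grid cells scored on (max depth, max velocity), hypothetical values
scores = topsis([[2.0, 1.5], [0.5, 0.2], [1.0, 1.0]],
                weights=[0.6, 0.4], benefit=[True, True])
```

The fuzzy variant replaces the crisp criterion values with fuzzy numbers (e.g. triangular ranges over the flood scenarios) and uses fuzzy distances, but the normalize/weight/ideal-distance structure is the same.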

  7. Grate monitoring systems - New methods for surveillance and control; Rostoevervakning - Nya metoder foer reglering och oevervakning av foerbraenningsroster

    Energy Technology Data Exchange (ETDEWEB)

    Rodin, Aasa; Jacoby, Juergen; Blom, Elisabet

    2003-08-01

    The objective of this project has been to practically test and evaluate new measurement methods for the surveillance of grate combustion systems, as well as to investigate the feasibility and applicability of different instruments with respect to repetitive and continuous measurement signals. Finding adequate measurement techniques will enable better control of the combustion grate, which will result in an even combustion. This in turn will reduce overall emissions and increase the average load on the entire system. This aim can, however, only be reached if the momentary conditions on the grate are known. Three different laser systems and one ultrasonic system were tested during the course of the project. None of these instruments, however, was suitable for continuous measurements in such an environment. It is expected that the very high concentration of dust and particulates inside the incinerator caused too intense a dispersion of the measurement signals. All units that were tested are commercially available and are not specifically designed for measurements in a waste incinerator. The exact signal processing within each system was not known, and its effect on the measurement results could thus not be estimated. Due to the high concentration of particulates and dust, a measurement system should have a measurement signal of higher intensity than for low-dust applications. However, commercial instruments have been developed in the opposite direction, i.e. towards lower intensities, in order to improve the safety of the working environment. Radar-based systems have been considered as a possible measurement technique. However, the fuel needs to be conductive in order to act as a radar reflector. This is not the case in a waste incinerator, hence radar was excluded as a suitable technique. Gamma-radiation measurement systems are commonly applied for level surveillance applications. Usually the measurement direction is horizontal. Placing such a system

  8. Post-discharge surveillance (PDS) for surgical site infections: a good method is more important than a long duration.

    Science.gov (United States)

    Koek, M B; Wille, J C; Isken, M R; Voss, A; van Benthem, B H

    2015-02-26

    Post-discharge surveillance (PDS) for surgical site infections (SSIs) normally lasts 30 days, or one year after implant surgery, causing delayed feedback to healthcare professionals. We investigated the effect of shortened PDS durations on SSI incidence to determine whether shorter durations are justified. We also studied the impact on SSI incidence of two national PDS methods (those mandatory since 2009 (‘mandatory’) and other methods acceptable before 2009 (‘other’)). From Dutch surveillance (PREZIES) data (1999-2008), four implant-free surgeries (breast amputation, Caesarean section, laparoscopic cholecystectomy and colectomy) and two implant surgeries (knee replacement and total hip replacement) were selected. We studied the impact of PDS duration and method on SSI incidence by survival and Cox regression analyses. We included 105,607 operations. Shortening the PDS duration for implant surgery from one year to 90 days resulted in 6–14% of all SSIs being missed. For implant-free procedures, reducing PDS from 30 to 21 days led to similar proportions of missed SSIs. In contrast, up to 62% of SSIs (for cholecystectomy) were missed if other rather than mandatory PDS methods were used. Inferior methods of PDS, rather than shortened PDS durations, may lead to greater underestimation of SSI incidence. Our data validate international recommendations to limit the maximum PDS duration (for implant surgeries) to 90 days for surveillance purposes, as this provides robust insight into trends.
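The trade-off the abstract quantifies, how many SSIs a shortened surveillance window would miss, can be sketched as a simple cumulative-detection calculation. The detection days below are hypothetical, not the PREZIES data:

```python
def fraction_missed(detection_days, window):
    """Fraction of infections first detected after `window` days,
    i.e. the SSIs that a PDS of that duration would miss entirely."""
    return sum(1 for d in detection_days if d > window) / len(detection_days)

# Hypothetical detection days observed under a full one-year PDS
# after implant surgery.
days = [12, 25, 31, 44, 60, 75, 88, 120, 200, 340]
print(f"missed by a 90-day PDS: {fraction_missed(days, 90):.0%}")  # prints 30%
```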

  9. Applying multi-resolution numerical methods to geodynamics

    Science.gov (United States)

    Davies, David Rhodri

    Computational models yield inaccurate results if the underlying numerical grid fails to provide the necessary resolution to capture a simulation's important features. For the large-scale problems regularly encountered in geodynamics, inadequate grid resolution is a major concern. The majority of models involve multi-scale dynamics, being characterized by fine-scale upwelling and downwelling activity in a more passive, large-scale background flow. Such configurations, when coupled to the complex geometries involved, present a serious challenge for computational methods. Current techniques are unable to resolve localized features and, hence, such models cannot be solved efficiently. This thesis demonstrates, through a series of papers and closely-coupled appendices, how multi-resolution finite-element methods from the forefront of computational engineering can provide a means to address these issues. The problems examined achieve multi-resolution through one of two methods. In two-dimensions (2-D), automatic, unstructured mesh refinement procedures are utilized. Such methods improve the solution quality of convection dominated problems by adapting the grid automatically around regions of high solution gradient, yielding enhanced resolution of the associated flow features. Thermal and thermo-chemical validation tests illustrate that the technique is robust and highly successful, improving solution accuracy whilst increasing computational efficiency. These points are reinforced when the technique is applied to geophysical simulations of mid-ocean ridge and subduction zone magmatism. To date, successful goal-orientated/error-guided grid adaptation techniques have not been utilized within the field of geodynamics. The work included herein is therefore the first geodynamical application of such methods. In view of the existing three-dimensional (3-D) spherical mantle dynamics codes, which are built upon a quasi-uniform discretization of the sphere and closely coupled

  10. Analytic methods in applied probability in memory of Fridrikh Karpelevich

    CERN Document Server

    Suhov, Yu M

    2002-01-01

    This volume is dedicated to F. I. Karpelevich, an outstanding Russian mathematician who made important contributions to applied probability theory. The book contains original papers focusing on several areas of applied probability and its uses in modern industrial processes, telecommunications, computing, mathematical economics, and finance. It opens with a review of Karpelevich's contributions to applied probability theory and includes a bibliography of his works. Other articles discuss queueing network theory, in particular, in heavy traffic approximation (fluid models). The book is suitable

  11. Evaluation of the antibacterial residue surveillance programme in Danish pigs using Bayesian methods

    DEFF Research Database (Denmark)

    Freitas de Matos Baptista, Filipa; Alban, L.; Olsen, A. M.

    2012-01-01

    ...the impact of a potential risk-based sampling approach to the residue surveillance programme in Danish slaughter pigs. Danish surveillance data from 2005 to 2009 and limited knowledge about true prevalence and test sensitivity and specificity were included in the model. According to the model, the true antibacterial residue prevalence in Danish pigs is very low in both sows (~0.20%) and slaughter pigs (~0.01%). Despite data constraints, the results suggest that the current screening test used in Denmark presents high sensitivity (85-99%) and very high specificity (>99%) for the most relevant antibacterial classes used in Danish pigs. If high-risk slaughter pigs could be identified by taking into account antibacterial use or meat inspection risk factors, a potential risk-based sampling approach to antibacterial residue surveillance in slaughter pigs would allow reducing the sample size substantially, while...
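The core adjustment in this kind of study, recovering a true prevalence from an apparent (test-positive) prevalence given imperfect test sensitivity and specificity, is often summarized by the classical Rogan-Gladen estimator. This frequentist formula is a sketch of the idea only; the paper itself uses a full Bayesian model:

```python
def rogan_gladen(apparent_prev, sensitivity, specificity):
    """Adjust an apparent (test-positive) prevalence for imperfect test
    sensitivity and specificity; the estimate is truncated to [0, 1]."""
    est = (apparent_prev + specificity - 1) / (sensitivity + specificity - 1)
    return min(max(est, 0.0), 1.0)

# Illustrative values only (Se = 0.92, Sp = 0.995 are assumptions, not
# the paper's posterior estimates).
print(rogan_gladen(0.007, 0.92, 0.995))
```

Note that when the apparent prevalence falls below the false-positive rate (1 - Sp), the estimate is truncated to zero, which is one reason very-low-prevalence settings like this one favour a Bayesian treatment.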

  12. Review on finite element method | Erhunmwun | Journal of Applied ...

    African Journals Online (AJOL)

    Journal of Applied Sciences and Environmental Management, Vol 21, No 5 (2017).

  13. Acti-Glide: a simple method of applying compression hosiery.

    Science.gov (United States)

    Hampton, Sylvie

    2005-05-01

    Compression hosiery is often worn to help prevent aching legs and swollen ankles, to prevent ulceration, to treat venous ulceration or to treat varicose veins. However, patients and nurses may experience problems applying hosiery and this can lead to non-concordance in patients and possibly reluctance from nurses to use compression hosiery. A simple solution to applying firm hosiery is Acti-Glide from Activa Healthcare.

  14. Dose rate reduction method for NMCA applied BWR plants

    International Nuclear Information System (INIS)

    Nagase, Makoto; Aizawa, Motohiro; Ito, Tsuyoshi; Hosokawa, Hideyuki; Varela, Juan; Caine, Thomas

    2012-09-01

    BRAC (BWR Radiation Assessment and Control) dose rate is used as an indicator of the incorporation of activated corrosion products into BWR recirculation piping, which is known to be a significant contributor to the dose rate received by workers during refueling outages. In order to reduce the radiation exposure of workers during the outage, it is desirable to keep BRAC dose rates as low as possible. After HWC was adopted to reduce IGSCC, a BRAC dose rate increase was observed in many plants. As a countermeasure to these rapid dose rate increases under HWC conditions, Zn injection was widely adopted in the United States and Europe, resulting in a reduction of BRAC dose rates. However, BRAC dose rates in several plants remain high, prompting the industry to continue to investigate methods to achieve further reductions. In recent years a large portion of the BWR fleet has adopted NMCA (NobleChem™) to enhance the hydrogen injection effect that suppresses SCC. After NMCA, and especially OLNC (On-Line NobleChem™), BRAC dose rates were observed to decrease. In some OLNC-applied BWR plants this reduction was observed year after year, reaching a new, lower equilibrium level. These dose rate reduction trends suggest that further dose reductions might be obtained by combining Pt and Zn injection. Laboratory experiments and in-plant tests were therefore carried out to evaluate the effect of Pt and Zn on Co-60 deposition behaviour. First, laboratory experiments were conducted to study the effect of noble metal deposition on Co deposition on stainless steel surfaces. Polished type 316 stainless steel coupons were prepared, and some of them were OLNC-treated in the test loop before the Co deposition test. Water chemistry conditions to simulate HWC were as follows: dissolved oxygen, hydrogen and hydrogen peroxide were below 5 ppb, 100 ppb and 0 ppb (no addition), respectively. Zn was injected to target a concentration of 5 ppb. The test was conducted up to 1500 hours at 553 K. Test

  15. Using Routinely Collected Hospital Data for Child Maltreatment Surveillance: Issues, Methods and Patterns

    Directory of Open Access Journals (Sweden)

    Scott Debbie A

    2011-01-01

    Background: International data on child maltreatment are largely derived from child protection agencies, and predominantly report only substantiated cases of child maltreatment. This approach underestimates the incidence of maltreatment and makes inter-jurisdictional comparisons difficult. There has been a growing recognition of the importance of health professionals in identifying, documenting and reporting suspected child maltreatment. This study aimed to describe the issues around case identification using coded morbidity data, outline methods for selecting and grouping relevant codes, and illustrate patterns of maltreatment identified. Methods: A comprehensive review of the ICD-10-AM classification system was undertaken, including a review of index terms, a free-text search of the tabular volumes, and a review of coding standards pertaining to child maltreatment coding. Identified codes were further categorised into maltreatment types including physical abuse, sexual abuse, emotional or psychological abuse, and neglect. Using these code groupings, one year of Australian hospitalisation data for children under 18 years of age was examined to quantify the proportion of patients identified and to explore the characteristics of cases assigned maltreatment-related codes. Results: Less than 0.5% of children hospitalised in Australia between 2005 and 2006 had a maltreatment code assigned; almost 4% of children with a principal diagnosis of a mental and behavioural disorder, and over 1% of children with an injury or poisoning as the principal diagnosis, had a maltreatment code assigned. The patterns of children assigned definitive T74 codes varied by sex and age group. For males selected as having a maltreatment-related presentation, physical abuse was most commonly coded (62.6% of maltreatment cases), while for females sexual abuse was the most commonly assigned form of maltreatment (52.9% of
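The code-grouping step the study describes, mapping definitive maltreatment codes to maltreatment types and screening each record against the map, can be sketched as follows. The T74 subcode-to-type mapping shown is a simplified illustration, not the paper's full validated code set:

```python
# Illustrative grouping of ICD-10-AM definitive maltreatment (T74) codes
# into maltreatment types; simplified, not the study's complete list.
MALTREATMENT_GROUPS = {
    "T74.0": "neglect",
    "T74.1": "physical abuse",
    "T74.2": "sexual abuse",
    "T74.3": "psychological abuse",
}

def maltreatment_types(diagnosis_codes):
    """Maltreatment types assigned to one hospital separation record."""
    return {MALTREATMENT_GROUPS[c] for c in diagnosis_codes
            if c in MALTREATMENT_GROUPS}

# A record with an injury principal diagnosis plus a physical-abuse code.
print(maltreatment_types(["S06.0", "T74.1"]))  # prints {'physical abuse'}
```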

  16. Applied Research of Decision Tree Method on Football Training

    Directory of Open Access Journals (Sweden)

    Liu Jinhui

    2015-01-01

    This paper first analyses the decision tree and then examines the CLS algorithm on which it is based. CLS embodies the most fundamental and primitive decision-making idea and thus provides the basis for decision tree construction; because it leaves certain details unspecified, the ID3 decision tree algorithm is introduced to fill them in. ID3 applies information gain as its attribute-selection metric, providing a reference for finding the optimal split point. Finally, the ID3 algorithm is applied to football training, where it is verified to be effective and reasonable.
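ID3's attribute-selection metric mentioned above, information gain, is the reduction in label entropy achieved by splitting on an attribute. A minimal sketch (the toy football-drill data is invented for illustration):

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a class-label sequence, in bits."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr):
    """ID3's attribute-selection metric: the reduction in label entropy
    obtained by splitting the rows on attribute `attr`."""
    n = len(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr], []).append(label)
    remainder = sum(len(p) / n * entropy(p) for p in partitions.values())
    return entropy(labels) - remainder

# Toy training data: does sprint performance predict passing a drill?
rows = [{"fast": 1}, {"fast": 1}, {"fast": 0}, {"fast": 0}]
labels = ["pass", "pass", "fail", "fail"]
print(information_gain(rows, labels, "fast"))  # perfect split, prints 1.0
```

ID3 greedily picks the attribute with the largest gain at each node and recurses on the partitions.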

  17. Muon radiography method for fundamental and applied research

    Science.gov (United States)

    Alexandrov, A. B.; Vladymyrov, M. S.; Galkin, V. I.; Goncharova, L. A.; Grachev, V. M.; Vasina, S. G.; Konovalova, N. S.; Malovichko, A. A.; Managadze, A. K.; Okat'eva, N. M.; Polukhina, N. G.; Roganova, T. M.; Starkov, N. I.; Tioukov, V. E.; Chernyavsky, M. M.; Shchedrina, T. V.

    2017-12-01

    This paper focuses on the basic principles of the muon radiography method, reviews the major muon radiography experiments, and presents the first results in Russia obtained by the authors using this method based on emulsion track detectors.

  18. POSSIBILITIES OF IMPROVING THE METHODS AND TECHNIQUES USED IN THE SURVEILLANCE OF CREDIT RISK MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Balogh Peter

    2010-12-01

    Through their daily activities, credit institutions are subject to various risks which could affect both the bank and the banking system as a whole, nationally and transnationally. The banks' field of activity, marked by volatility and by the internationalization and liberalization of financial markets, is continuously changing. The contagion effect, as demonstrated by the spread of the financial crisis, leads surveillance authorities to pay increased attention to financial risks and, implicitly, to systemic risk. This study first presents some aspects of the banking rating systems used by surveillance authorities, and then some ways of improving the models for managing credit risk in banks. Finally, it is shown that the risk profile of the banking institution plays a determining role in the management of the credit portfolio.

  19. Application of advanced irradiation analysis methods to light water reactor pressure vessel test and surveillance programs

    International Nuclear Information System (INIS)

    Odette, R.; Dudey, N.; McElroy, W.; Wullaert, R.; Fabry, A.

    1977-01-01

    Inaccurate characterization and inappropriate application of neutron irradiation exposure variables contribute a substantial amount of uncertainty to embrittlement analysis of light water reactor pressure vessels. Damage analysis involves characterization of the irradiation environment (dosimetry), correlation of test and surveillance metallurgical and dosimetry data, and projection of such data to service conditions. Errors in available test and surveillance dosimetry data are estimated to contribute a factor of approximately 2 to the data scatter. Non-physical (empirical) correlation procedures and the need to extrapolate to the vessel may add further error. Substantial reductions in these uncertainties in future programs can be obtained from a more complete application of available damage analysis tools which have been developed for the fast reactor program. An approach to reducing embrittlement analysis errors is described, and specific examples of potential applications are given. The approach is based on damage analysis techniques validated and calibrated in benchmark environments

  20. Surveillance in a Telemedicine Setting: Application of Epidemiologic Methods at NASA Johnson Space Center

    Science.gov (United States)

    Babiak-Vazquez, Adriana; Ruffaner, Lanie; Wear, Mary; Crucian, Brian; Sams, Clarence; Lee, Lesley R.; Van Baalen, Mary

    2016-01-01

    Space medicine presents unique challenges and opportunities for epidemiologists, such as the use of telemedicine during spaceflight. Medical capabilities aboard the International Space Station (ISS) are limited due to severe restrictions on power, volume, and mass. Consequently, inflight health information is based heavily on crewmember (CM) self-report of signs and symptoms, rather than formal diagnoses. While CMs are in flight, the primary source of crew health information is verbal communication between physicians and crewmembers. In 2010 NASA implemented the Lifetime Surveillance of Astronaut Health, an occupational surveillance program for the U.S. astronaut corps. This has shifted the epidemiological paradigm from tracking diagnoses based on traditional terrestrial clinical practice to one that incorporates symptomatology and may yield a more population-based understanding of early disease processes.

  1. Development and refinement of new statistical methods for enhanced syndromic surveillance during the 2012 Olympic and Paralympic Games.

    Science.gov (United States)

    Morbey, Roger A; Elliot, Alex J; Charlett, Andre; Andrews, Nick; Verlander, Neville Q; Ibbotson, Sue; Smith, Gillian E

    2015-06-01

    Prior to the 2012 London Olympic and Paralympic Games, new statistical methods had to be developed for the enhanced syndromic surveillance during the Games. Different methods were developed depending on whether or not historical data were available. Practical solutions were needed to cope with the required daily reporting and data quality issues. During the Games, nearly 4800 signals were tested on average each day, generating statistical alarms that were assessed to provide information on areas of potential public health concern and reassurance that no major adverse incident had occurred. © The Author(s) 2013.
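The kind of signal testing described above can be illustrated with a minimal exceedance detector: flag today's syndromic count if it sits far above its historical baseline. This simple z-score rule is an illustration only, not one of the Games-specific algorithms the paper developed:

```python
from statistics import mean, stdev

def alarm(history, today, threshold=2.0):
    """Flag today's count if it exceeds the historical mean by more than
    `threshold` sample standard deviations."""
    mu, sigma = mean(history), stdev(history)
    return sigma > 0 and (today - mu) / sigma > threshold

baseline = [10, 12, 11, 9, 10, 11, 10]  # hypothetical daily counts
print(alarm(baseline, 25))  # prints True
print(alarm(baseline, 11))  # prints False
```

In practice such alarms need adjustments for day-of-week effects, reporting delays, and low or missing baselines, which is what made bespoke methods necessary for the Games.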

  2. New Method for Tuning Robust Controllers Applied to Robot Manipulators

    Directory of Open Access Journals (Sweden)

    Gerardo Romero

    2012-11-01

    This paper presents a methodology for selecting the parameters of a nonlinear controller using Linear Matrix Inequalities (LMIs). The controller is applied to a robotic manipulator to improve its robustness. This type of dynamic system is a suitable target for a robust control law because such a law depends largely on the mathematical model of the system, which in most cases cannot be completely precise. The discrepancy between the dynamic behaviour of the robot and its mathematical model is taken into account by including a nonlinear term that represents the model's uncertainty. The controller's parameters are selected with two purposes: to guarantee the asymptotic stability of the closed-loop system in the presence of this uncertainty, and to increase its robustness margin. The results are validated with numerical simulations for a particular case study and compared with previously published results to demonstrate better controller performance.

  3. Waste classification and methods applied to specific disposal sites

    International Nuclear Information System (INIS)

    Rogers, V.C.

    1979-01-01

    An adequate definition of the classes of radioactive wastes is necessary for regulating their disposal. A classification system is proposed in which wastes are classified according to characteristics relating to their disposal. Several specific sites are analyzed with the methodology in order to gain insights into the classification of radioactive wastes. Also presented is an analysis of ocean dumping as it applies to waste classification. 5 refs

  4. The World Health Organization STEPwise Approach to Noncommunicable Disease Risk-Factor Surveillance: Methods, Challenges, and Opportunities.

    Science.gov (United States)

    Riley, Leanne; Guthold, Regina; Cowan, Melanie; Savin, Stefan; Bhatti, Lubna; Armstrong, Timothy; Bonita, Ruth

    2016-01-01

    We sought to outline the framework and methods used by the World Health Organization (WHO) STEPwise approach to noncommunicable disease (NCD) surveillance (STEPS), describe the development and current status, and discuss strengths, limitations, and future directions of STEPS surveillance. STEPS is a WHO-developed, standardized but flexible framework for countries to monitor the main NCD risk factors through questionnaire assessment and physical and biochemical measurements. It is coordinated by national authorities of the implementing country. The STEPS surveys are generally household-based and interviewer-administered, with scientifically selected samples of around 5000 participants. To date, 122 countries across all 6 WHO regions have completed data collection for STEPS or STEPS-aligned surveys. STEPS data are being used to inform NCD policies and track risk-factor trends. Future priorities include strengthening these linkages from data to action on NCDs at the country level, and continuing to develop STEPS' capacities to enable a regular and continuous cycle of risk-factor surveillance worldwide.

  5. Review on finite element method | Erhunmwun | Journal of Applied ...

    African Journals Online (AJOL)

    ... finite elements, so that it is possible to systematically construct the approximation functions needed in a variational or weighted-residual approximation of the solution of a problem over each element. Keywords: Weak Formulation, Discretisation, Numerical methods, Finite element method, Global equations, Nodal solution ...

  6. The flow curvature method applied to canard explosion

    Energy Technology Data Exchange (ETDEWEB)

    Ginoux, Jean-Marc [Laboratoire Protee, IUT de Toulon, Universite du Sud, BP 20132, F-83957 La Garde cedex (France)], E-mail: ginoux@univ-tln.fr; Llibre, Jaume [Departament de Matematiques, Universitat Autonoma de Barcelona, 08193 Bellaterra, Barcelona (Spain)], E-mail: jllibre@mat.uab.cat

    2011-11-18

    The aim of this work is to establish that the bifurcation parameter value leading to a canard explosion in dimension 2, obtained by the so-called geometric singular perturbation method, can also be found by the flow curvature method. This result is then exemplified with the classical Van der Pol oscillator. (paper)

  7. The flow curvature method applied to canard explosion

    Science.gov (United States)

    Ginoux, Jean-Marc; Llibre, Jaume

    2011-11-01

    The aim of this work is to establish that the bifurcation parameter value leading to a canard explosion in dimension 2, obtained by the so-called geometric singular perturbation method, can also be found by the flow curvature method. This result is then exemplified with the classical Van der Pol oscillator.

  8. Using short-message-service notification as a method to improve acute flaccid paralysis surveillance in Papua New Guinea

    Directory of Open Access Journals (Sweden)

    Siddhartha Sankar Datta

    2016-05-01

    Background: High-quality acute flaccid paralysis (AFP) surveillance is required to maintain the polio-free status of a country. Papua New Guinea (PNG) is considered one of the highest-risk countries for polio re-importation and circulation in the Western Pacific Region (WPRO) of the World Health Organization, due to poor healthcare infrastructure and inadequate performance in AFP surveillance. The Government of PNG, in collaboration with WHO, piloted the introduction of short-message-service (SMS) reminders to sensitize pediatricians and provincial disease control officers to AFP and to receive notification of possible AFP cases, in order to improve surveillance quality in PNG. Methods: Ninety-six health care professionals were registered to receive SMS reminders to report any case of acute flaccid paralysis. Fourteen SMS messages were sent to each participant from September 2012 to November 2013. The number of reported AFP cases was compared before and after the introduction of SMS. Results: Two hundred and fifty-three unique responses were received, an overall response rate of 21%. More than 80% of responses arrived within 3 days of sending the SMS. The number of reported AFP cases increased from 10 cases per year in 2009-2012 to 25 cases per year during the study period and correlated with provincial participation of the health care professionals. Conclusions: Combined with improved sensitization of health care professionals on AFP reporting criteria and sample collection, SMS messaging provides an effective means to increase timely reporting and improve the availability of epidemiologic information on polio surveillance in PNG.

  9. Does the Method of Radiologic Surveillance Impact Survival Following Resection of Stage I Non-Small Cell Lung Cancer?

    Science.gov (United States)

    Crabtree, Traves D.; Puri, Varun; Chen, Simon B.; Gierada, David S.; Bell, Jennifer M.; Broderick, Stephen; Krupnick, A. Sasha; Kreisel, Daniel; Patterson, G. Alexander; Meyers, Bryan F.

    2014-01-01

    Objective: Controversy persists regarding appropriate radiographic surveillance strategies following lung cancer resection. We compared the impact of surveillance CT scan (CT) vs. chest radiograph (CXR) in patients who underwent resection for stage I lung cancer. Methods: A retrospective analysis was performed of all patients undergoing resection for pathologic stage I lung cancer from January 2000 to April 2013. After resection, follow-up included routine history and physical exam in conjunction with CXR or CT at the discretion of the treating physician. Identification of successive lung malignancy (i.e. recurrence at any new site or a new primary) and survival were recorded. Results: There were 554 evaluable patients, with 232 undergoing routine postoperative CT and 322 receiving routine CXR. Postoperative five-year survival was 67.8% in the CT group vs. 74.8% in the CXR group (p = 0.603). Successive lung malignancy was found in 27% (63/232) of patients undergoing CT vs. 22% (72/322) receiving CXR (p = 0.19). The mean time from surgery to diagnosis of successive malignancy was 1.93 years for CT vs. 2.56 years for CXR (p = 0.046). In the CT group, 41% (26/63) of successive malignancies were treated with curative intent vs. 40% (29/72) in the CXR group (p = 0.639). Cox proportional hazards analysis indicated that imaging modality (CT vs. CXR) was not associated with survival (p = 0.958). Conclusion: Surveillance CT may result in earlier diagnosis of successive malignancy than CXR in stage I lung cancer, although no difference in survival was demonstrated. A randomized trial would help determine the impact of postoperative surveillance strategies on survival. PMID:25218540

  10. Literature Review of Applying Visual Method to Understand Mathematics

    Directory of Open Access Journals (Sweden)

    Yu Xiaojuan

    2015-01-01

    As a new method of understanding mathematics, visualization offers a new way of grasping mathematical principles and phenomena via image thinking and geometric explanation. It aims to deepen the understanding of the nature of concepts or phenomena and to enhance the cognitive ability of learners. This paper collates and summarizes the application of this visual method to the understanding of mathematics. It also reviews the existing research, including a visual demonstration of Euler's formula, introduces the application of the method to solving relevant mathematical problems, and points out the differences and similarities between the visualization method and the numerical-graphic combination method, as well as matters needing attention in its application.

  11. Methodical Aspects of Applying Strategy Map in an Organization

    Directory of Open Access Journals (Sweden)

    Piotr Markiewicz

    2013-06-01

    One of the important aspects of strategic management is the instrumental aspect, embodied in a rich set of methods and techniques used at particular stages of the strategic management process. The object of interest in this study is the development of views on, and the implementation of, strategy as an element of strategic management, together with its instruments in the form of methods and techniques. A commonly used method for strategy implementation and progress measurement is the Balanced Scorecard (BSC). The method was created as a result of the project "Measuring performance in the organization of the future" of 1990, completed by a team under the supervision of David Norton (Kaplan, Norton 2002). The method was used first of all to evaluate performance by decomposing a strategy into four perspectives and identifying measures of achievement. In the mid-1990s the method was improved by enriching it, above all, with a strategy map, which reflects the process by which intangible assets are transformed into tangible financial effects (Kaplan, Norton 2001). A strategy map illustrates the cause-and-effect relationships between processes in all four perspectives and performance indicators at the level of the organization. The purpose of this study is to present the methodological conditions for using strategy maps in the strategy implementation process in organizations of various kinds.

  12. Diagrammatic Monte Carlo method as applied to the polaron problem

    International Nuclear Information System (INIS)

    Mishchenko, A.S.

    2005-01-01

    Exact numerical solution methods are presented for the problem of a few particles interacting with one another and with several bosonic excitation modes. The diagrammatic Monte Carlo method allows the exact calculation of the Green function, and the stochastic optimization technique provides an analytic continuation. Results unobtainable by conventional methods are discussed, including the properties of excited states in the self-trapping phenomenon, the optical spectra of polarons in all coupling regimes, the validity analysis of the exciton models, and the photoemission spectra of a phonon-coupled hole.

  13. Applying a life cycle approach to project management methods

    OpenAIRE

    Biggins, David; Trollsund, F.; Høiby, A.L.

    2016-01-01

    Project management is increasingly important to organisations because projects are the method by which organisations respond to their environment. A key element within project management is the standards and methods that are used to control and conduct projects, collectively known as project management methods (PMMs) and exemplified by PRINCE2, the Project Management Institute's and the Association for Project Management's Bodies of Knowledge (PMBOK and APMBOK). The purpose of t...

  14. Method for curing alkyd resin compositions by applying ionizing radiation

    International Nuclear Information System (INIS)

    Watanabe, T.; Murata, K.; Maruyama, T.

    1975-01-01

    An alkyd resin composition is prepared by dissolving a polymerizable alkyd resin having an oil length of 10 to 50 percent in a vinyl monomer. The polymerizable alkyd resin is obtained by a half-esterification reaction between an acid anhydride having a polymerizable unsaturated group and an alkyd resin modified with conjugated unsaturated oil having at least one reactive hydroxyl group per molecule. The alkyd resin composition thus obtained is coated on an article, and ionizing radiation is applied to cure the coated film. (U.S.)

  15. Spectral methods applied to fluidized bed combustors. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, R.C.; Christofides, N.J.; Junk, K.W.; Raines, T.S.; Thiede, T.D.

    1996-08-01

    The objective of this project was to develop methods for characterizing fuels and sorbents from time-series data obtained during transient operation of fluidized bed boilers. These methods aim to determine time constants for devolatilization and char burnout from carbon dioxide (CO{sub 2}) profiles, and time constants for the calcination and sulfation processes from CO{sub 2} and sulfur dioxide (SO{sub 2}) profiles.
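Extracting a time constant from a transient gas-concentration profile can be sketched, under the simplifying assumption of first-order (single-exponential) behaviour, as a line fit to the log-transformed signal. This is an illustration only, not the report's estimation procedure:

```python
import math

def time_constant(times, concentrations):
    """Estimate a first-order time constant tau from a transient profile
    c(t) = c0 * exp(-t / tau), via a least-squares line through (t, ln c)."""
    ys = [math.log(c) for c in concentrations]
    n = len(times)
    tbar = sum(times) / n
    ybar = sum(ys) / n
    slope = (sum((t - tbar) * (y - ybar) for t, y in zip(times, ys))
             / sum((t - tbar) ** 2 for t in times))
    return -1.0 / slope

# Synthetic decay data with tau = 50 s; the fit recovers tau.
ts = [0.0, 10.0, 20.0, 30.0, 40.0]
cs = [math.exp(-t / 50.0) for t in ts]
print(round(time_constant(ts, cs), 6))
```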

  16. Reliability demonstration of imaging surveillance systems

    International Nuclear Information System (INIS)

    Sheridan, T.F.; Henderson, J.T.; MacDiarmid, P.R.

    1979-01-01

    Security surveillance systems which employ closed circuit television are being deployed with increasing frequency for the protection of property and other valuable assets. A need exists to demonstrate the reliability of such systems before installation, to assure that the deployed systems will operate when needed with only the scheduled amount of maintenance and support costs. An approach to the reliability demonstration of imaging surveillance systems employing closed circuit television is described. Failure definitions based on industry television standards, and imaging alarm assessment criteria for surveillance systems, are discussed. Test methods are presented which allow 24-hour-a-day operation without the need for numerous test scenarios, test personnel and elaborate test facilities. Existing reliability demonstration standards are shown to apply, obviating the need for elaborate statistical tests. The demonstration methods employed are shown to have applications in other types of imaging surveillance systems besides closed circuit television.
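A flavour of how reliability demonstration standards avoid elaborate statistics is the classical zero-failure (success-run) plan: from reliability**n <= 1 - confidence, one reads off how many failure-free trials demonstrate a stated reliability. This textbook formula is offered as a sketch and is not taken from the standards cited in the record:

```python
import math

def success_run_trials(reliability, confidence):
    """Number of independent trials that must all succeed to demonstrate
    `reliability` at the given `confidence` level (zero-failure plan)."""
    return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

print(success_run_trials(0.90, 0.90))  # prints 22 (failure-free trials)
```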

  17. Community-based sampling methods for surveillance of the Chagas disease vector, Triatoma dimidiata (Hemiptera: Reduviidae: Triatominae).

    Science.gov (United States)

    Weeks, E N I; Davies, C; Rosales, C Cordón; Yeo, M; Gezan, S A; Parra-Henao, G; Cameron, M M

    2014-09-01

    In Guatemala, the most widespread vector of Trypanosoma cruzi (Kinetoplastida: Trypanosomatidae), the causative agent of Chagas disease, is Triatoma dimidiata (Latreille) (Hemiptera: Reduviidae: Triatominae). T. dimidiata is native to Guatemala and is present in both domestic and sylvatic habitats. Consequently, control of T. dimidiata is difficult because, after successful elimination from homes, individual insects can recolonize homes from the surrounding environment. Therefore, intensive long-term surveillance of this species is essential to ensure adequate control is achieved. Manual inspection for signs of infestation, the current method used to monitor Triatominae throughout Central and South America, is labor- and time-consuming, so cost-effective alternatives are needed. The current study compared the effectiveness of the current method of surveillance of T. dimidiata with the community-based techniques of Gómez-Nuñez sensor boxes, collection and observation of bugs by householders, and presence of triatomine-like feces on walls. Although manual inspection was the most sensitive method when used alone, collection by householders was also sensitive and specific and involved less effort. Sensor boxes were not sensitive indicators of T. dimidiata infestation when used alone. Two recorded variables, visual inspection for feces and the sighting of bugs by householders, were sensitive and specific indicators of infestation, and in combination with collection by householders and sensor boxes these methods were significantly more likely to detect infestations than manual inspection alone. A surveillance program that combines multiple community-based techniques should have low cost, involve minimal effort from the government, and at the same time promote sustainable community involvement in disease prevention.

  18. Methodology implementation in order to evaluate the biological risks in the Centre for Research and Rehabilitation of Hereditary Ataxias of Cuba: a biosecurity surveillance method

    Directory of Open Access Journals (Sweden)

    Dailín Cobos Valdes

    2014-12-01

    Full Text Available Introduction: The Center for Research and Rehabilitation of Hereditary Ataxias faces biological risks; nevertheless, a biosafety system had not yet been implemented. Objective: To apply a methodology in order to evaluate these risks. Materials and Methods: Interviews with the researchers of the center and use of the methodology for evaluating biological risks designed by Cobos (2009). Results: Fifty-three biological risks were identified and evaluated: 32 as moderate, 18 as tolerable and 3 as trivial. Such classification is crucial to establish management priorities and represents a form of surveillance in the biosafety field. Conclusion: The results of this research represent an essential factor for the development of biosafety documentation adapted to the Center and in accordance with the legal basis for biological safety in Cuba.

  19. Apply of torque method at rationalization of work

    Directory of Open Access Journals (Sweden)

    Bandurová Miriam

    2001-03-01

    Full Text Available The aim of the study was to analyse the time consumption of the cylinder-grinder profession by the torque method. The torque observation method is used to detect the kinds and magnitudes of time losses, the share of individual kinds of time consumption, and the causes of time losses. In this way it is possible to determine the coefficient of employment and recovery of workers in an organizational unit. The advantages of a torque survey are the low cost of acquiring information and the low demands placed on the worker and on the observer, who is easily trained; it is also a mentally acceptable method for the subjects of the survey. The torque surveys detected reserves in the activity of the cylinder grinders: time losses amount to up to 8% of working time. With a 5-shift service and an average shift occupancy of 4.4 grinders (from the statistical records of the service), the losses at cylinder grinding amount to 1.48 workers for the whole centre. Based on this information it was recommended to cancel one job position and reduce the staff by one grinder. A further position cannot be cancelled, because the cylinder grindery must adapt to the grinding line in the number of polished cylinders per shift, and the stock of semi-finished polished cylinders cannot be high owing to frequent changes in the grinding area and assortment. This contribution confirms the suitability of the torque method as one of the methods used in job rationalization.

  20. Thermoluminescence as a dating method applied to the Morocco Neolithic

    International Nuclear Information System (INIS)

    Ousmoi, M.

    1989-09-01

    Thermoluminescence is an absolute dating method well adapted to the study of burnt clays and hence of prehistoric ceramics belonging to the Neolithic period. The purpose of this study is to establish a first absolute chronology of the northern Moroccan Neolithic between 3000 and 7000 years before present, together with some improvements of TL dating. The first part of the thesis presents some hypotheses about the Moroccan Neolithic and some problems to be solved. We then study the TL dating method along with new procedures to improve the quality of the results, such as the shift of quartz TL peaks or the crushing of samples. The methods employed, using 24 samples belonging to various civilisations, were the quartz inclusion method and the fine grain technique. For the dosimetry, several methods were used: determination of the K{sub 2}O content, alpha counting, site dosimetry using TL dosimeters, and a scintillation counter. The results bring some interesting answers to the archaeological question and improve the chronological scheme of the northern Moroccan Neolithic: development of the old cardial Neolithic in the north, and perhaps in the centre of Morocco (the region of Rabat), between 5500 and 7000 before present; development of the recent middle Neolithic around 4000-5000 before present, with a protocampaniform (Skhirat) slightly older than the campaniform recognized in the south of Spain; and development of the Bronze Age around 2000-4000 before present [fr

  1. Evaluation of Controller Tuning Methods Applied to Distillation Column Control

    DEFF Research Database (Denmark)

    Nielsen, Kim; W. Andersen, Henrik; Kümmel, Professor Mogens

    A frequency domain approach is used to compare the nominal performance and robustness of dual composition distillation column control tuned according to Ziegler-Nichols (ZN) and Biggest Log Modulus Tuning (BLT) for three binary distillation columns, WOBE, LUVI and TOFA. The scope of this is to examine whether ZN and BLT design yield satisfactory control of distillation columns. Further, PI controllers are tuned according to a proposed multivariable frequency domain method. A major conclusion is that the ZN tuned controllers yield undesired overshoot and oscillation and poor stability robustness properties. BLT tuning removes the overshoot and oscillation, however, at the expense of a more sluggish response. We conclude that if a simple control design is to be used, the BLT method should be preferred to the ZN method. The frequency domain design approach presented yields a more proper trade...

  2. Modal method for crack identification applied to reactor recirculation pump

    International Nuclear Information System (INIS)

    Miller, W.H.; Brook, R.

    1991-01-01

    Nuclear reactors have been operating and producing useful electricity for many years. Within the last few years, several plants have found cracks in the reactor coolant pump shaft near the thermal barrier. The modal method and results described herein show the analytical results of using a modal analysis test method to determine the presence, size, and location of a shaft crack. The authors have previously demonstrated that the test method can analytically and experimentally identify shaft cracks as small as five percent (5%) of the shaft diameter. Due to small differences in material property distribution, the attempt to identify cracks smaller than 3% of the shaft diameter has been shown to be impractical. The rotor dynamics model includes a detailed motor rotor, external weights and inertias, and realistic total support stiffness. Results of the rotor dynamics model have been verified through a comparison with on-site vibration test data.

  3. Boron autoradiography method applied to the study of steels

    International Nuclear Information System (INIS)

    Gugelmeier, R.; Barcelo, G.N.; Boado, J.H.; Fernandez, C.

    1986-01-01

    The boron state contained in the steel microstructure is determined. Neutron autoradiography is used, permitting boron distribution images to be obtained and providing additional information which is difficult to acquire by other methods. The application of the method is described, based on the neutron irradiation of a polished steel sample, over which a cellulose nitrate sheet or other appropriate material is fixed to constitute the detector. The particles generated by the neutron-boron interaction affect the detector sheet, which is subsequently revealed with a chemical treatment and can be observed under the optical microscope. In the case of materials used for the construction of nuclear reactors, special attention must be given to the presence of boron, since, owing to its exceptionally high capacity for neutron absorption, even the smallest quantities of boron acquire importance. The adaptation of the method to metallurgical problems allows a correlation to be obtained between the boron distribution images and the material's microstructure. (M.E.L.) [es

  4. Diagrammatic Monte Carlo method as applied to the polaron problems

    International Nuclear Information System (INIS)

    Mishchenko, Andrei S

    2005-01-01

    Numerical methods are presented whereby exact solutions can be obtained to the problem of a few particles interacting with one another and with several bosonic excitation branches. The diagrammatic Monte Carlo method allows the exact calculation of the Matsubara Green function, and the stochastic optimization technique provides an approximation-free analytic continuation. In this review, results unobtainable by conventional methods are discussed, including the properties of excited states in the self-trapping phenomenon, the optical spectra of polarons in all coupling regimes, the validity range analysis of the Frenkel and Wannier approximations relevant to the exciton, and the peculiarities of photoemission spectra of a lattice-coupled hole in a Mott insulator. (reviews of topical problems)

  5. DAKOTA reliability methods applied to RAVEN/RELAP-7.

    Energy Technology Data Exchange (ETDEWEB)

    Swiler, Laura Painton; Mandelli, Diego; Rabiti, Cristian; Alfonsi, Andrea

    2013-09-01

    This report summarizes the result of a NEAMS project focused on the use of reliability methods within the RAVEN and RELAP-7 software framework for assessing failure probabilities as part of probabilistic risk assessment for nuclear power plants. RAVEN is a software tool under development at the Idaho National Laboratory that acts as the control logic driver and post-processing tool for the newly developed thermal-hydraulic code RELAP-7. Dakota is a software tool developed at Sandia National Laboratories containing optimization, sensitivity analysis, and uncertainty quantification algorithms. Reliability methods are algorithms which transform the uncertainty problem into an optimization problem to solve for the failure probability, given uncertainty on problem inputs and a failure threshold on an output response. The goal of this work is to demonstrate the use of reliability methods in Dakota with RAVEN/RELAP-7. These capabilities are demonstrated on a Station Blackout analysis of a simplified Pressurized Water Reactor (PWR).
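    The underlying idea of estimating a failure probability under input uncertainty can be sketched with plain Monte Carlo sampling checked against a closed form; the response model, parameters and threshold below are toy assumptions, not RAVEN/RELAP-7 quantities:

```python
import math
import random

random.seed(3)
# Toy reliability problem: response R is the sum of two independent normal
# inputs; "failure" occurs when R exceeds a threshold. Values are invented.
mu1, s1, mu2, s2, threshold = 1.0, 0.5, 2.0, 0.5, 4.5

N = 200_000
fails = sum(random.gauss(mu1, s1) + random.gauss(mu2, s2) > threshold
            for _ in range(N))
p_mc = fails / N

# Exact answer for comparison: R ~ Normal(mu1+mu2, sqrt(s1^2+s2^2)),
# so P(R > threshold) is a normal tail probability.
z = (threshold - (mu1 + mu2)) / math.sqrt(s1**2 + s2**2)
p_exact = 0.5 * math.erfc(z / math.sqrt(2.0))
print(p_mc, p_exact)
```

Reliability methods such as FORM/SORM replace this brute-force sampling with an optimization search for the most probable failure point, which is what makes small failure probabilities tractable.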

  6. Nonstandard Finite Difference Method Applied to a Linear Pharmacokinetics Model

    Directory of Open Access Journals (Sweden)

    Oluwaseun Egbelowo

    2017-05-01

    Full Text Available We extend the nonstandard finite difference method of solution to the study of pharmacokinetic–pharmacodynamic models. Pharmacokinetic (PK) models are commonly used to predict drug concentrations that drive controlled intravenous (I.V.) transfers (or infusions) and oral transfers, while pharmacokinetic and pharmacodynamic (PD) interaction models are used to provide predictions of drug concentrations affecting the response to these clinical drugs. We structure a nonstandard finite difference (NSFD) scheme for the relevant system of equations which models this pharmacokinetic process. We compare the results obtained to standard methods. The scheme is dynamically consistent and reliable in replicating complex dynamic properties of the relevant continuous models for varying step sizes. This study provides assistance in understanding the long-term behavior of the drug in the system, and validation of the efficiency of the nonstandard finite difference scheme as the method of choice.
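    For the linear one-compartment elimination model dC/dt = -kC, an NSFD scheme replaces the raw step size h by a denominator function; the sketch below (with illustrative values, not the paper's model) uses phi = (1 - exp(-k h))/k, which makes the scheme exact for this linear case:

```python
import math

# NSFD step for one-compartment elimination dC/dt = -k*C. The denominator
# function phi replaces h, so C_{n+1} = C_n - k*phi*C_n = C_n * exp(-k*h),
# reproducing the exact solution regardless of step size.
def nsfd_decay(c0, k, h, steps):
    phi = (1.0 - math.exp(-k * h)) / k
    c = c0
    for _ in range(steps):
        c = c - k * phi * c
    return c

c0, k, h, steps = 100.0, 0.3, 0.5, 20     # illustrative parameters
numeric = nsfd_decay(c0, k, h, steps)
exact = c0 * math.exp(-k * h * steps)
print(abs(numeric - exact) < 1e-9)        # → True
```

For nonlinear PK/PD systems the scheme is no longer exact, but a well-chosen denominator function preserves positivity and the qualitative dynamics, which is the "dynamical consistency" the abstract refers to.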

  7. Variance reduction methods applied to deep-penetration problems

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    All deep-penetration Monte Carlo calculations require variance reduction methods. Before beginning with a detailed approach to these methods, several general comments concerning deep-penetration calculations by Monte Carlo, the associated variance reduction, and the similarities and differences of these with regard to non-deep-penetration problems will be addressed. The experienced practitioner of Monte Carlo methods will easily find exceptions to any of these generalities, but it is felt that these comments will aid the novice in understanding some of the basic ideas and nomenclature. Also, from a practical point of view, the discussions and developments presented are oriented toward use of the computer codes which are presented in segments of this Monte Carlo course.
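    A minimal sketch of one variance reduction device in a deep-penetration setting is importance sampling: draw path lengths from a stretched distribution and carry a likelihood-ratio weight. The rates and tally below are illustrative assumptions, not code from the course:

```python
import math
import random

random.seed(1)
N = 20_000
target = math.exp(-10.0)   # true probability of penetrating 10 mean free paths

# Analog sampling: almost no histories reach the tally region, so the
# estimate is dominated by statistical noise.
analog = sum(random.expovariate(1.0) > 10.0 for _ in range(N)) / N

# Importance sampling: draw path lengths from a stretched exponential
# (rate 0.1) and score the likelihood-ratio weight of each history that
# penetrates; the estimator stays unbiased while many more histories score.
rate = 0.1
total = 0.0
for _ in range(N):
    x = random.expovariate(rate)
    if x > 10.0:
        total += math.exp(-x) / (rate * math.exp(-rate * x))
weighted = total / N

print(analog, weighted, target)
```

The weighted estimate lands within a few percent of exp(-10) with this sample size, while the analog tally typically scores only a handful of histories, which is the essential motivation for variance reduction in deep-penetration problems.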

  8. Robustness of Modal Parameter Estimation Methods Applied to Lightweight Structures

    DEFF Research Database (Denmark)

    Dickow, Kristoffer Ahrens; Kirkegaard, Poul Henning; Andersen, Lars Vabbersgaard

    2013-01-01

    of two parameter estimation methods built into the commercial modal testing software B&K Pulse Reflex Advanced Modal Analysis. The investigations are done by means of frequency response functions generated from a finite-element model and subjected to artificial noise before being analyzed with Pulse Reflex. The ability to handle closely spaced modes and broad frequency ranges is investigated for a numerical model of a lightweight junction under different signal-to-noise ratios. The selection of both excitation points and response points is discussed. It is found that both the Rational Fraction Polynomial-Z method...

  9. Efficient electronic structure methods applied to metal nanoparticles

    DEFF Research Database (Denmark)

    Larsen, Ask Hjorth

    of efficient approaches to density functional theory and the application of these methods to metal nanoparticles. We describe the formalism and implementation of localized atom-centered basis sets within the projector augmented wave method. Basis sets allow for a dramatic increase in performance compared...... and jumps in Fermi level near magic numbers can lead to alkali-like or halogen-like behaviour when main-group atoms adsorb onto gold clusters. A non-self-consistent Newns-Anderson model is used to more closely study the chemisorption of main-group atoms on magic-number Au clusters. The behaviour at magic...

  10. Applying the Priority Distribution Method for Employee Motivation

    Directory of Open Access Journals (Sweden)

    Jonas Žaptorius

    2013-09-01

    Full Text Available In an age of increasing healthcare expenditure, the efficiency of healthcare services is a burning issue. This paper deals with the creation of a performance-related remuneration system, which would meet requirements for efficiency and sustainable quality. In real world scenarios, it is difficult to create an objective and transparent employee performance evaluation model dealing with both qualitative and quantitative criteria. To achieve these goals, the use of decision support methods is suggested and analysed. The systematic approach of practical application of the Priority Distribution Method to healthcare provider organisations is created and described.

  11. Non-perturbative methods applied to multiphoton ionization

    International Nuclear Information System (INIS)

    Brandi, H.S.; Davidovich, L.; Zagury, N.

    1982-09-01

    The use of non-perturbative methods in the treatment of atomic ionization is discussed. Particular attention is given to schemes of the type proposed by Keldysh, where multiphoton ionization and tunnel auto-ionization occur for high intensity fields. These methods are shown to correspond to a certain type of expansion of the T-matrix in the intra-atomic potential; in this manner a criterion concerning the range of application of these non-perturbative schemes is suggested. A brief comparison between the ionization rate of atoms in the presence of linearly and circularly polarized light is presented. (Author) [pt

  12. Tutte’s barycenter method applied to isotopies

    NARCIS (Netherlands)

    Colin de Verdière, Éric; Pocchiola, Michel; Vegter, Gert

    2003-01-01

    This paper is concerned with applications of Tutte’s barycentric embedding theorem. It presents a method for building isotopies of triangulations in the plane, based on Tutte’s theorem and the computation of equilibrium stresses of graphs by Maxwell–Cremona’s theorem; it also provides a

  13. Inversion method applied to the rotation curves of galaxies

    Science.gov (United States)

    Márquez-Caicedo, L. A.; Lora-Clavijo, F. D.; Sanabria-Gómez, J. D.

    2017-07-01

    We used simulated annealing, Monte Carlo and genetic algorithm methods for matching both numerical data of density and velocity profiles in some low surface brightness galaxies with the theoretical Boehmer-Harko, Navarro-Frenk-White and Pseudo Isothermal Profile models for galaxies with dark matter halos. We found that the Navarro-Frenk-White model does not fit at all, in contrast with the other two models, which fit very well. Inversion methods have been widely used in various branches of science including astrophysics (Charbonneau 1995, ApJS, 101, 309). In this work we have used three different parametric inversion methods (Monte Carlo, Genetic Algorithm and Simulated Annealing) in order to determine the best fit of the observed data of the density and velocity profiles of a set of low surface brightness galaxies (De Block et al. 2001, ApJ, 122, 2396) with three models of galaxies containing dark matter. The parameters adjusted by the inversion methods were the central density and a characteristic distance in the Boehmer-Harko BH (Boehmer & Harko 2007, JCAP, 6, 25), Navarro-Frenk-White NFW (Navarro et al. 2007, ApJ, 490, 493) and Pseudo Isothermal Profile PI (Robles & Matos 2012, MNRAS, 422, 282) models. The results obtained showed that the BH and PI Profile dark matter galaxies fit very well for both the density and the velocity profiles; in contrast, the NFW model did not make good adjustments to the profiles in any analyzed galaxy.
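    The parametric inversion loop described here can be sketched as a bare-bones simulated annealing fit of a pseudo-isothermal rotation curve; the parametrization, synthetic data and tuning constants below are illustrative assumptions, not the paper's actual profiles:

```python
import math
import random

random.seed(0)

def v_model(r, v0, rc):
    # Pseudo-isothermal halo rotation curve (illustrative parametrization):
    # v(r) = v0 * sqrt(1 - (rc/r) * arctan(r/rc))
    return v0 * math.sqrt(1.0 - (rc / r) * math.atan(r / rc))

radii = [1.0 + 0.5 * i for i in range(30)]         # kpc, synthetic
data = [v_model(r, 120.0, 2.0) for r in radii]     # synthetic "observed" curve

def cost(params):
    v0, rc = params
    return sum((v_model(r, v0, rc) - d) ** 2 for r, d in zip(radii, data))

# Bare-bones simulated annealing: perturb the parameters, always accept
# downhill moves, accept uphill moves with Boltzmann probability, and cool.
params = [80.0, 5.0]                               # deliberately poor start
temp = 100.0
for _ in range(20_000):
    trial = [max(0.1, p + random.gauss(0.0, 0.5)) for p in params]
    dE = cost(trial) - cost(params)
    if dE < 0 or random.random() < math.exp(-dE / temp):
        params = trial
    temp *= 0.9997
print([round(p, 1) for p in params])
```

A genetic algorithm or plain Monte Carlo search plugs into the same cost function, which is what makes the three inversion methods directly comparable.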

  14. E-LEARNING METHOD APPLIED TO TECHNICAL GRAPHICS SUBJECTS

    Directory of Open Access Journals (Sweden)

    GOANTA Adrian Mihai

    2011-11-01

    Full Text Available The paper presents some of the author's endeavors in creating video courses related to technical graphics subjects for the students of the Faculty of Engineering in Braila. The steps taken in completing the method are also mentioned, along with how feedback was obtained on the rate of access to these types of courses by the students.

  15. Some methods of computational geometry applied to computer graphics

    NARCIS (Netherlands)

    Overmars, M.H.; Edelsbrunner, H.; Seidel, R.

    1984-01-01

    Windowing a two-dimensional picture means determining those line segments of the picture that are visible through an axis-parallel window. A study of some algorithmic problems involved in windowing a picture is offered. Some methods from computational geometry are exploited to store the
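    The windowing primitive discussed here can be sketched with Cohen-Sutherland outcodes, which give a fast trivial-reject/accept test for a segment against an axis-parallel window (an illustrative sketch, not the paper's storage structure):

```python
# Cohen-Sutherland outcodes: each bit records one half-plane the point
# lies outside of, relative to the axis-parallel window.
LEFT, RIGHT, BOTTOM, TOP = 1, 2, 4, 8

def outcode(x, y, xmin, ymin, xmax, ymax):
    code = 0
    if x < xmin: code |= LEFT
    elif x > xmax: code |= RIGHT
    if y < ymin: code |= BOTTOM
    elif y > ymax: code |= TOP
    return code

def may_be_visible(p, q, window):
    xmin, ymin, xmax, ymax = window
    c1 = outcode(*p, xmin, ymin, xmax, ymax)
    c2 = outcode(*q, xmin, ymin, xmax, ymax)
    if c1 & c2:          # both endpoints outside the same half-plane
        return False     # trivially rejected: segment cannot be visible
    return True          # trivially accepted, or needs actual clipping

window = (0.0, 0.0, 10.0, 10.0)
print(may_be_visible((-5, 20), (3, 20), window))   # both above: False
print(may_be_visible((-5, 5), (15, 5), window))    # crosses window: True
```

The computational-geometry contribution of the paper is in organizing many segments so that such tests need not be run against every segment per query.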

  16. [Synchrotron-based characterization methods applied to ancient materials (I)].

    Science.gov (United States)

    Anheim, Étienne; Thoury, Mathieu; Bertrand, Loïc

    2015-12-01

    This article aims at presenting the first results of a transdisciplinary research programme in heritage sciences. Based on the growing use and on the potentialities of micro- and nano-characterization synchrotron-based methods to study ancient materials (archaeology, palaeontology, cultural heritage, past environments), this contribution will identify and test conceptual and methodological elements of convergence between physicochemical and historical sciences.

  17. About the Finite Element Method Applied to Thick Plates

    Directory of Open Access Journals (Sweden)

    Mihaela Ibănescu

    2006-01-01

    Full Text Available The present paper approaches the analysis of plates subjected to transverse loads, when the shear force and the actual boundary conditions are considered, by using the Finite Element Method. The isoparametric finite elements offer real facilities in formulating the problems and great possibilities in creating adequate computer programs.

  18. The harmonics detection method based on neural network applied ...

    African Journals Online (AJOL)

    with MATLAB Simulink Power System Toolbox. The simulation study results of this novel technique, compared to other similar methods, are found quite satisfactory, assuring good filtering characteristics and high system stability. Keywords: Artificial Neural Networks (ANN), p-q theory, (SAPF), Harmonics, Total Harmonic ...

  19. Data Mining Methods Applied to Flight Operations Quality Assurance Data: A Comparison to Standard Statistical Methods

    Science.gov (United States)

    Stolzer, Alan J.; Halford, Carl

    2007-01-01

    In a previous study, multiple regression techniques were applied to Flight Operations Quality Assurance-derived data to develop parsimonious model(s) for fuel consumption on the Boeing 757 airplane. The present study examined several data mining algorithms, including neural networks, on the fuel consumption problem and compared them to the multiple regression results obtained earlier. Using regression methods, parsimonious models were obtained that explained approximately 85% of the variation in fuel flow. In general, data mining methods were more effective in predicting fuel consumption. Classification and Regression Tree methods reported correlation coefficients of .91 to .92, and General Linear Models and Multilayer Perceptron neural networks reported correlation coefficients of about .99. These data mining models show great promise for use in further examining large FOQA databases for operational and safety improvements.
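    The regression baseline of such a comparison can be sketched with synthetic stand-ins for FOQA-style predictors; the data, coefficients and noise level below are illustrative assumptions, not the study's actual variables:

```python
import numpy as np

rng = np.random.default_rng(42)
# Synthetic stand-ins for three flight-state predictors of fuel flow
# (e.g. altitude, speed, weight) plus measurement noise; all invented.
n = 200
X = rng.normal(size=(n, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + 0.5 * X[:, 2] + rng.normal(scale=0.5, size=n)

# Multiple regression via ordinary least squares with an intercept column;
# R^2 measures the fraction of variation in y explained by the model.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
pred = A @ coef
r2 = 1.0 - np.sum((y - pred) ** 2) / np.sum((y - np.mean(y)) ** 2)
print(round(r2, 2))
```

A tree-based or neural-network model would be fitted on the same X and y and compared on the identical R^2 (or correlation) metric, which is the shape of the comparison the abstract reports.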

  20. Theoretical and applied aerodynamics and related numerical methods

    CERN Document Server

    Chattot, J J

    2015-01-01

    This book covers classical and modern aerodynamics, theories and related numerical methods, for senior and first-year graduate engineering students, including: -The classical potential (incompressible) flow theories for low speed aerodynamics of thin airfoils and high and low aspect ratio wings. - The linearized theories for compressible subsonic and supersonic aerodynamics. - The nonlinear transonic small disturbance potential flow theory, including supercritical wing sections, the extended transonic area rule with lift effect, transonic lifting line and swept or oblique wings to minimize wave drag. Unsteady flow is also briefly discussed. Numerical simulations based on relaxation mixed-finite difference methods are presented and explained. - Boundary layer theory for all Mach number regimes and viscous/inviscid interaction procedures used in practical aerodynamics calculations. There are also four chapters covering special topics, including wind turbines and propellers, airplane design, flow analogies and h...

  1. Generic Methods for Formalising Sequent Calculi Applied to Provability Logic

    Science.gov (United States)

    Dawson, Jeremy E.; Goré, Rajeev

    We describe generic methods for reasoning about multiset-based sequent calculi which allow us to combine shallow and deep embeddings as desired. Our methods are modular, permit explicit structural rules, and are widely applicable to many sequent systems, even to other styles of calculi like natural deduction and term rewriting systems. We describe new axiomatic type classes which enable simplification of multiset or sequent expressions using existing algebraic manipulation facilities. We demonstrate the benefits of our combined approach by formalising in Isabelle/HOL a variant of a recent, non-trivial, pen-and-paper proof of cut-admissibility for the provability logic GL, where we abstract a large part of the proof in a way which is immediately applicable to other calculi. Our work also provides a machine-checked proof to settle the controversy surrounding the proof of cut-admissibility for GL.

  2. Applying probabilistic methods for assessments and calculations for accident prevention

    International Nuclear Information System (INIS)

    Anon.

    1984-01-01

    The guidelines for the prevention of accidents require plant design-specific and radioecological calculations to be made in order to show that maximum acceptable exposure values will not be exceeded in case of an accident. For this purpose, the main parameters affecting the accident scenario have to be determined by probabilistic methods. This offers the advantage that parameters can be quantified on the basis of unambiguous and realistic criteria, and final results can be defined in terms of conservativity. (DG) [de

  3. The transfer matrix method applied to steel sheet pile walls

    Science.gov (United States)

    Kort, D. A.

    2003-05-01

    This paper proposes two subgrade reaction models for the analysis of steel sheet pile walls based on the transfer matrix method. In the first model a plastic hinge is generated when the maximum moment in the retaining structure is exceeded. The second model deals with a beam with an asymmetrical cross-section that can bend in two directions. In the first part of this paper the transfer matrix method is explained on the basis of a simple example. Further, the development of two computer models is described: Plaswall and Skewwall. The second part of this paper deals with an application of both models. In the application of Plaswall, the effect of four current earth pressure theories on the subgrade reaction method is compared to a finite element calculation. It is shown that the earth pressure theory is of major importance to the calculation result of a sheet pile wall both with and without a plastic hinge. In the application of Skewwall, the effectiveness of structural measures to reduce oblique bending is investigated. The results are compared to a 3D finite element calculation. It is shown that with simple structural measures the loss of structural resistance due to oblique bending can be reduced.
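    The transfer matrix idea can be sketched for a uniform Euler-Bernoulli beam segment: a field matrix propagates the state vector [w, theta, M, V] along the member, and segment matrices compose by multiplication. The sign conventions and the cantilever check below are illustrative assumptions, not the paper's sheet-pile formulation:

```python
import numpy as np

def field_matrix(L, EI):
    # Transfer (field) matrix of a uniform Euler-Bernoulli beam segment,
    # acting on the state vector [deflection w, slope theta, moment M, shear V].
    return np.array([
        [1.0, L, L**2 / (2 * EI), L**3 / (6 * EI)],
        [0.0, 1.0, L / EI, L**2 / (2 * EI)],
        [0.0, 0.0, 1.0, L],
        [0.0, 0.0, 0.0, 1.0],
    ])

EI, L, P = 1.0, 2.0, 1.0

# Composition property: two half-segments chain into one full segment,
# which is what lets the method march station by station along a wall.
assert np.allclose(field_matrix(L / 2, EI) @ field_matrix(L / 2, EI),
                   field_matrix(L, EI))

# Cantilever with tip load P (signs consistent with the matrix above):
# at the fixed end w = theta = 0, M0 = P*L, V0 = -P.
state0 = np.array([0.0, 0.0, P * L, -P])
w_tip = (field_matrix(L, EI) @ state0)[0]
print(w_tip)   # P*L**3 / (3*EI) = 8/3
```

Supports, springs and plastic hinges enter as point matrices interleaved between such field matrices, which is how the paper's two models extend the basic scheme.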

  4. Applying Hierarchical Task Analysis Method to Discovery Layer Evaluation

    Directory of Open Access Journals (Sweden)

    Marlen Promann

    2015-03-01

    Full Text Available Libraries are implementing discovery layers to offer better user experiences. While usability tests have been helpful in evaluating the success or failure of implementing discovery layers in the library context, the focus has remained on their relative interface benefits over the traditional federated search. The informal site- and context-specific usability tests have offered little to test the rigor of the discovery layers against the user goals, motivations and workflows they have been designed to support. This study proposes hierarchical task analysis (HTA) as an important complementary evaluation method to usability testing of discovery layers. Relevant literature is reviewed for the discovery layers and the HTA method. As no previous application of HTA to the evaluation of discovery layers was found, this paper presents the application of HTA as an expert-based and workflow-centered (e.g. retrieving a relevant book or a journal article) method of evaluating discovery layers. Purdue University's Primo by Ex Libris was used to map eleven use cases as HTA charts. Nielsen's Goal Composition theory was used as an analytical framework to evaluate the goal charts from two perspectives: (a) users' physical interactions (i.e. clicks), and (b) users' cognitive steps (i.e. decision points for what to do next). A brief comparison of HTA and usability test findings is offered as a way of conclusion.
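    A hypothetical HTA chart can be encoded as a nested task tree whose leaves are tagged with the two analysis lenses used here, physical clicks and cognitive decisions; all task names below are invented for illustration and are not from the study's eleven use cases:

```python
# Hypothetical HTA chart for "retrieve a known book": inner nodes are
# subtasks, leaves are tagged as a physical interaction ("click") or a
# cognitive step ("decision").
hta = ("retrieve known book", [
    ("enter query", [("type title", "click"), ("submit", "click")]),
    ("scan results", [("judge relevance", "decision"),
                      ("choose a facet?", "decision"),
                      ("open record", "click")]),
    ("get item", [("check availability", "decision"), ("request", "click")]),
])

def count(node, kind):
    # Recursively count leaves of the given kind in the task tree.
    _, children = node
    total = 0
    for child in children:
        if isinstance(child[1], list):
            total += count(child, kind)
        elif child[1] == kind:
            total += 1
    return total

print(count(hta, "click"), count(hta, "decision"))   # → 4 3
```

Tallying the two leaf types per use case gives exactly the kind of interaction-versus-decision comparison the Goal Composition analysis draws from the goal charts.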

  5. System and method of applying energetic ions for sterilization

    Science.gov (United States)

    Schmidt, John A.

    2003-12-23

    A method of sterilization of a container is provided whereby a cold plasma is caused to be disposed near a surface to be sterilized, and the cold plasma is then subjected to a pulsed voltage differential for producing energized ions in the plasma. Those energized ions then operate to achieve spore destruction on the surface to be sterilized. Further, a system for sterilization of a container which includes a conductive or non-conductive container, a cold plasma in proximity to the container, and a high voltage source for delivering a pulsed voltage differential between an electrode and the container and across the cold plasma, is provided.

  6. Interesting Developments in Testing Methods Applied to Foundation Piles

    Science.gov (United States)

    Sobala, Dariusz; Tkaczyński, Grzegorz

    2017-10-01

    Both piling technologies and pile testing methods are subjects of ongoing development. New technologies, providing larger diameters or using in-situ materials, are very demanding in terms of proper quality of execution of works. That concerns the material quality and continuity, which define the integral strength of the pile. On the other side is the capacity of the ground around the pile and its ability to carry the loads transferred by the shaft and pile base. The inhomogeneous nature of soils and the relatively small number of tested piles demand a very good understanding of a small number of results. In some special cases the capacity test itself forms an important cost in the piling contract. This work presents a brief description of selected testing methods and the author's remarks based on cooperation with universities constantly developing new ideas. The paper presents some experience-based remarks on integrity testing by means of low-energy impact (low strain) and introduces selected (Polish) developments in the field of closed-end pipe pile testing based on bi-directional loading, similar to Osterberg's idea, but without a sacrificial hydraulic jack. Such a test is suitable especially when steel piles are used for temporary support in rivers, where constructing a conventional testing appliance with anchor piles or kentledge meets technical problems. According to the author's experience, such tests have not yet been used on a building site, but they bring real potential, especially when the displacement control can be provided from the river bank using surveying techniques.

  7. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    Science.gov (United States)

    Lynnes, Chris; Little, Mike; Huang, Thomas; Jacob, Joseph; Yang, Phil; Kuo, Kwo-Sen

    2016-01-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based file systems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.
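The rechunking ("sharding") idea above can be sketched in a few lines. This is a hypothetical illustration (the array shape, tile size and the per-pixel temporal mean are all invented), not the benchmark's actual pipeline:

```python
import numpy as np

# A 3-D (time, lat, lon) dataset stored as "one time step per file" is
# awkward for time-series analysis, so it is rechunked into per-tile
# shards that each hold the FULL time axis for a small spatial window.
time_steps, n_lat, n_lon = 365, 8, 8
data = np.random.rand(time_steps, n_lat, n_lon)  # stand-in for a year of observations

tile = 4  # spatial tile size; each shard is (365, 4, 4)
shards = {}
for i in range(0, n_lat, tile):
    for j in range(0, n_lon, tile):
        shards[(i, j)] = data[:, i:i + tile, j:j + tile]

# Each shard can now go to a separate worker: a long-time-series operation
# (here, a per-pixel temporal mean) touches only one shard.
means = {key: shard.mean(axis=0) for key, shard in shards.items()}

print(len(shards))          # 4
print(means[(0, 0)].shape)  # (4, 4)
```

In a real deployment the shards would live in a cloud-enabled store (e.g. a distributed filesystem or an array database) rather than in one process's memory.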

  8. Artificial Intelligence Methods Applied to Parameter Detection of Atrial Fibrillation

    Science.gov (United States)

    Arotaritei, D.; Rotariu, C.

    2015-09-01

In this paper we present a novel method to detect atrial fibrillation (AF) based on statistical descriptors and a hybrid neuro-fuzzy and crisp system. The inference system produces if-then-else rules that are extracted to construct a binary decision system: normal or atrial fibrillation. We use TPR (Turning Point Ratio), SE (Shannon Entropy) and RMSSD (Root Mean Square of Successive Differences) along with a new descriptor, the Teager-Kaiser energy, in order to improve the accuracy of detection. The descriptors are calculated over a sliding window, which produces a very large number of vectors (a massive dataset) used by the classifier. The window length is a crisp descriptor, while the remaining descriptors are interval-valued. The parameters of the hybrid system are adapted using a Genetic Algorithm (GA) with a single-objective fitness target: the highest values of sensitivity and specificity. The extracted rules form part of the decision system. The proposed method was tested using the Physionet MIT-BIH Atrial Fibrillation Database, and the experimental results revealed a good accuracy of AF detection in terms of sensitivity and specificity (above 90%).
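The three standard descriptors named in this abstract (TPR, Shannon entropy, RMSSD) are well defined in the literature and can be sketched over a sliding window as follows. The window length, bin count and synthetic RR series are illustrative choices, not the paper's values:

```python
import math
import random

def rmssd(rr):
    """Root mean square of successive differences of RR intervals."""
    diffs = [b - a for a, b in zip(rr, rr[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

def shannon_entropy(rr, bins=16):
    """Shannon entropy of a histogram of the RR intervals."""
    lo, hi = min(rr), max(rr)
    width = (hi - lo) / bins or 1.0
    counts = [0] * bins
    for v in rr:
        counts[min(int((v - lo) / width), bins - 1)] += 1
    total = len(rr)
    return -sum(c / total * math.log2(c / total) for c in counts if c)

def turning_point_ratio(rr):
    """Observed turning points over the expectation 2(n-2)/3 for a random series."""
    tp = sum(1 for a, b, c in zip(rr, rr[1:], rr[2:])
             if (b > a and b > c) or (b < a and b < c))
    return tp / (2 * (len(rr) - 2) / 3)

random.seed(0)
rr = [0.8 + random.gauss(0, 0.05) for _ in range(512)]  # synthetic RR series (s)

window = 128  # non-overlapping sliding windows for brevity
features = [(rmssd(w), shannon_entropy(w), turning_point_ratio(w))
            for w in (rr[i:i + window] for i in range(0, len(rr) - window + 1, window))]
print(len(features))  # 4
```

A classifier (the paper's hybrid neuro-fuzzy system, or any other) would then consume one feature vector per window.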

  9. Modern analytic methods applied to the art and archaeology

    International Nuclear Information System (INIS)

    Tenorio C, M. D.; Longoria G, L. C.

    2010-01-01

The interaction of diverse areas such as analytical chemistry, art history and archaeology has allowed the development of a variety of techniques used in archaeology, conservation and restoration. These methods have been used to date objects, to determine the origin of old materials, to reconstruct their use and to identify the degradation processes that affect the integrity of works of art. The objective of this chapter is to offer a general overview of the research carried out at the Instituto Nacional de Investigaciones Nucleares (ININ) in the field of cultural heritage. A series of studies carried out in collaboration with national and foreign researchers is briefly described, with substantial support from undergraduate and master's students in archaeology from the National School of Anthropology and History, since one of our goals is to spread knowledge of these techniques among young archaeologists, so that they have a wider vision of what they could use in the immediate future and can test hypotheses with scientific methods. (Author)

  10. Benchmark Comparison of Cloud Analytics Methods Applied to Earth Observations

    Science.gov (United States)

    Lynnes, C.; Little, M. M.; Huang, T.; Jacob, J. C.; Yang, C. P.; Kuo, K. S.

    2016-12-01

    Cloud computing has the potential to bring high performance computing capabilities to the average science researcher. However, in order to take full advantage of cloud capabilities, the science data used in the analysis must often be reorganized. This typically involves sharding the data across multiple nodes to enable relatively fine-grained parallelism. This can be either via cloud-based filesystems or cloud-enabled databases such as Cassandra, Rasdaman or SciDB. Since storing an extra copy of data leads to increased cost and data management complexity, NASA is interested in determining the benefits and costs of various cloud analytics methods for real Earth Observation cases. Accordingly, NASA's Earth Science Technology Office and Earth Science Data and Information Systems project have teamed with cloud analytics practitioners to run a benchmark comparison on cloud analytics methods using the same input data and analysis algorithms. We have particularly looked at analysis algorithms that work over long time series, because these are particularly intractable for many Earth Observation datasets which typically store data with one or just a few time steps per file. This post will present side-by-side cost and performance results for several common Earth observation analysis operations.

  11. The Movable Type Method Applied to Protein-Ligand Binding.

    Science.gov (United States)

    Zheng, Zheng; Ucisik, Melek N; Merz, Kenneth M

    2013-12-10

Accurately computing the free energy for biological processes like protein folding or protein-ligand association remains a challenging problem. Both describing the complex intermolecular forces involved and sampling the requisite configuration space make understanding these processes innately difficult. Herein, we address the sampling problem using a novel methodology we term "movable type". Conceptually it can be understood by analogy with the evolution of printing and, hence, the name movable type. For example, a common approach to the study of protein-ligand complexation involves taking a database of intact drug-like molecules and exhaustively docking them into a binding pocket. This is reminiscent of early woodblock printing where each page had to be laboriously created prior to printing a book. However, printing evolved to an approach where a database of symbols (letters, numerals, etc.) was created and then assembled using a movable type system, which allowed for the creation of all possible combinations of symbols on a given page, thereby revolutionizing the dissemination of knowledge. Our movable type (MT) method involves the identification of all atom pairs seen in protein-ligand complexes and the creation of two databases: one with their associated pairwise distance-dependent energies and another with the probability of how these pairs can combine in terms of bonds, angles, dihedrals and non-bonded interactions. Combining these two databases, coupled with the principles of statistical mechanics, allows us to accurately estimate binding free energies as well as the pose of a ligand in a receptor. This method, by its mathematical construction, samples all of the configuration space of a selected region (here, the protein active site) in one shot without resorting to brute-force sampling schemes involving Monte Carlo, genetic algorithms or molecular dynamics simulations, making the methodology extremely efficient. Importantly, this method explores the free
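The statistical-mechanics step described above can be illustrated with a deliberately tiny toy: given a table of pairwise distance-bin energies and the probability of observing each bin, a free energy follows from a probability-weighted sum of Boltzmann factors. All numbers here are invented; the real MT method builds these tables from databases of protein-ligand complexes:

```python
import math

# Toy free-energy estimate: dG = -kT * ln Z, where Z is a probability-
# weighted sum of Boltzmann factors over tabulated pairwise energy bins.
kT = 0.593  # kcal/mol at ~298 K

energies      = [-1.2, -0.4, 0.3, 1.5]    # energy per distance bin (invented)
probabilities = [0.40, 0.30, 0.20, 0.10]  # occurrence probability of each bin

Z = sum(p * math.exp(-E / kT) for p, E in zip(probabilities, energies))
dG = -kT * math.log(Z)
print(f"Z = {Z:.3f}, dG = {dG:.3f} kcal/mol")
```

The point of the MT construction is that Z is assembled combinatorially from such tables rather than by sampling poses one at a time.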

  12. Applied statistical methods in agriculture, health and life sciences

    CERN Document Server

    Lawal, Bayo

    2014-01-01

    This textbook teaches crucial statistical methods to answer research questions using a unique range of statistical software programs, including MINITAB and R. This textbook is developed for undergraduate students in agriculture, nursing, biology and biomedical research. Graduate students will also find it to be a useful way to refresh their statistics skills and to reference software options. The unique combination of examples is approached using MINITAB and R for their individual strengths. Subjects covered include among others data description, probability distributions, experimental design, regression analysis, randomized design and biological assay. Unlike other biostatistics textbooks, this text also includes outliers, influential observations in regression and an introduction to survival analysis. Material is taken from the author's extensive teaching and research in Africa, USA and the UK. Sample problems, references and electronic supplementary material accompany each chapter.

  13. Applying Simulation Method in Formulation of Gluten-Free Cookies

    Directory of Open Access Journals (Sweden)

    Nikitina Marina

    2017-01-01

Full Text Available A priority direction in the development of new food products is currently the development of technologies for special-purpose products. Among these are gluten-free confectionery products intended for people with celiac disease. Gluten-free products are in demand among consumers, and both the assortment and the quality indicators need improvement. This article presents the results of studies on the development of pastry products based on amaranth flour, which does not contain gluten. The study is based on a simulation method for formulating gluten-free confectionery products with a functional orientation, used to optimize their chemical composition. The resulting products make it possible to diversify the diet and supply the necessary nutrients for people with gluten intolerance, as well as for those who follow a gluten-free diet.
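The recipe-simulation idea can be sketched as a small search over blend fractions that meets a nutrient target at minimum cost. The ingredient data, protein target and cost figures below are invented for illustration and are not the study's values:

```python
# Hypothetical blend optimization: choose percentages of three gluten-free
# ingredients so the mix reaches a protein target at minimum cost.
ingredients = {            # name: (protein g/100 g, cost per 100 g) -- invented
    "amaranth flour": (14.0, 0.60),
    "rice flour":     (6.0,  0.25),
    "corn starch":    (0.3,  0.15),
}

names = list(ingredients)
best = None
step = 5  # percent granularity of the grid search
for a in range(0, 101, step):
    for b in range(0, 101 - a, step):
        c = 100 - a - b
        fractions = (a, b, c)
        protein = sum(f * ingredients[n][0] for f, n in zip(fractions, names)) / 100
        cost = sum(f * ingredients[n][1] for f, n in zip(fractions, names)) / 100
        if protein >= 9.0 and (best is None or cost < best[0]):
            best = (cost, fractions)

cost, (a, b, c) = best
print(f"amaranth {a}%, rice {b}%, starch {c}% -> cost {cost:.3f}")
# amaranth 40%, rice 60%, starch 0% -> cost 0.390
```

A real formulation model would optimize several nutrients at once (and could use linear programming instead of a grid), but the structure is the same.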

  14. Applying Human-Centered Design Methods to Scientific Communication Products

    Science.gov (United States)

    Burkett, E. R.; Jayanty, N. K.; DeGroot, R. M.

    2016-12-01

    Knowing your users is a critical part of developing anything to be used or experienced by a human being. User interviews, journey maps, and personas are all techniques commonly employed in human-centered design practices because they have proven effective for informing the design of products and services that meet the needs of users. Many non-designers are unaware of the usefulness of personas and journey maps. Scientists who are interested in developing more effective products and communication can adopt and employ user-centered design approaches to better reach intended audiences. Journey mapping is a qualitative data-collection method that captures the story of a user's experience over time as related to the situation or product that requires development or improvement. Journey maps help define user expectations, where they are coming from, what they want to achieve, what questions they have, their challenges, and the gaps and opportunities that can be addressed by designing for them. A persona is a tool used to describe the goals and behavioral patterns of a subset of potential users or customers. The persona is a qualitative data model that takes the form of a character profile, built upon data about the behaviors and needs of multiple users. Gathering data directly from users avoids the risk of basing models on assumptions, which are often limited by misconceptions or gaps in understanding. Journey maps and user interviews together provide the data necessary to build the composite character that is the persona. Because a persona models the behaviors and needs of the target audience, it can then be used to make informed product design decisions. We share the methods and advantages of developing and using personas and journey maps to create more effective science communication products.

  15. Simplified Methods Applied to Nonlinear Motion of Spar Platforms

    Energy Technology Data Exchange (ETDEWEB)

    Haslum, Herbjoern Alf

    2000-07-01

    Simplified methods for prediction of motion response of spar platforms are presented. The methods are based on first and second order potential theory. Nonlinear drag loads and the effect of the pumping motion in a moon-pool are also considered. Large amplitude pitch motions coupled to extreme amplitude heave motions may arise when spar platforms are exposed to long period swell. The phenomenon is investigated theoretically and explained as a Mathieu instability. It is caused by nonlinear coupling effects between heave, surge, and pitch. It is shown that for a critical wave period, the envelope of the heave motion makes the pitch motion unstable. For the same wave period, a higher order pitch/heave coupling excites resonant heave response. This mutual interaction largely amplifies both the pitch and the heave response. As a result, the pitch/heave instability revealed in this work is more critical than the previously well known Mathieu's instability in pitch which occurs if the wave period (or the natural heave period) is half the natural pitch period. The Mathieu instability is demonstrated both by numerical simulations with a newly developed calculation tool and in model experiments. In order to learn more about the conditions for this instability to occur and also how it may be controlled, different damping configurations (heave damping disks and pitch/surge damping fins) are evaluated both in model experiments and by numerical simulations. With increased drag damping, larger wave amplitudes and more time are needed to trigger the instability. The pitch/heave instability is a low probability of occurrence phenomenon. Extreme wave periods are needed for the instability to be triggered, about 20 seconds for a typical 200m draft spar. However, it may be important to consider the phenomenon in design since the pitch/heave instability is very critical. It is also seen that when classical spar platforms (constant cylindrical cross section and about 200m draft
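The parametric (Mathieu-type) instability described above can be illustrated by integrating a linear oscillator whose stiffness is modulated in time: excitation at twice the natural frequency grows, while a detuned excitation stays bounded. The equation form and all parameters are generic, not the thesis model:

```python
import math

# Integrate x'' + w0^2 * (1 + h*cos(we*t)) * x = 0 with semi-implicit Euler
# and report the peak amplitude, comparing parametric resonance (we = 2*w0)
# against a detuned excitation. Parameters are illustrative only.
def simulate(w0, we, h, t_end=400.0, dt=0.005):
    x, v = 0.01, 0.0          # small initial heave offset, zero velocity
    peak = abs(x)
    for n in range(int(t_end / dt)):
        t = n * dt
        a = -(w0 ** 2) * (1.0 + h * math.cos(we * t)) * x
        v += a * dt           # semi-implicit (symplectic) Euler step
        x += v * dt
        peak = max(peak, abs(x))
    return peak

w0 = 2 * math.pi / 20.0                 # ~20 s natural period, as in the abstract
resonant = simulate(w0, 2 * w0, h=0.2)  # parametric resonance condition
detuned  = simulate(w0, 1.3 * w0, h=0.2)
print(resonant > 10 * detuned)          # resonant case grows far larger
```

Adding a damping term to the model delays or suppresses the growth, which is the effect the damping disks and fins in the thesis are evaluated for.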

  16. Perturbation Method of Analysis Applied to Substitution Measurements of Buckling

    Energy Technology Data Exchange (ETDEWEB)

    Persson, Rolf

    1966-11-15

Calculations with two-group perturbation theory on substitution experiments with homogenized regions show that a condensation of the results into a one-group formula is possible, provided that a transition region is introduced in a proper way. In heterogeneous cores the transition region comes in as a consequence of a new cell concept. By making use of progressive substitutions the properties of the transition region can be regarded as fitting parameters in the evaluation procedure. The thickness of the region is approximately equal to the sum of 1/(1/τ + 1/L²)^(1/2) for the test and reference regions. Consequently a region where L² >> τ, e.g. D₂O, contributes √τ to the thickness. In cores where τ >> L², e.g. H₂O assemblies, the thickness of the transition region is determined by L. Experiments on rod lattices in D₂O and on test regions of D₂O alone (where B² = -1/L²) are analysed. The lattice measurements, where the pitches differed by a factor of √2, gave excellent results, whereas the determination of the diffusion length in D₂O by this method was not quite successful. Even regions containing only one test element can be used in a meaningful way in the analysis.
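The thickness rule quoted above is easy to evaluate numerically. The sketch below checks the two limiting cases mentioned in the abstract; the values of τ and L² are generic orders of magnitude chosen for illustration, not the report's measured data:

```python
import math

# Per-region contribution to the transition-region thickness:
#   1 / sqrt(1/tau + 1/L^2)
def region_term(tau_cm2, L2_cm2):
    return 1.0 / math.sqrt(1.0 / tau_cm2 + 1.0 / L2_cm2)

# D2O-like region: L^2 >> tau, so the term approaches sqrt(tau)
d2o = region_term(tau_cm2=125.0, L2_cm2=1.0e4)
print(round(d2o, 2), round(math.sqrt(125.0), 2))  # 11.11 vs sqrt(tau) = 11.18

# H2O-like region: tau >> L^2, so the term approaches L
h2o = region_term(tau_cm2=27.0, L2_cm2=8.1)
print(round(h2o, 2), round(math.sqrt(8.1), 2))    # 2.5 vs L = 2.85
```

The total transition-region thickness is the sum of the terms for the test and reference regions.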

  17. Variational methods applied to problems of diffusion and reaction

    CERN Document Server

    Strieder, William

    1973-01-01

    This monograph is an account of some problems involving diffusion or diffusion with simultaneous reaction that can be illuminated by the use of variational principles. It was written during a period that included sabbatical leaves of one of us (W. S. ) at the University of Minnesota and the other (R. A. ) at the University of Cambridge and we are grateful to the Petroleum Research Fund for helping to support the former and the Guggenheim Foundation for making possible the latter. We would also like to thank Stephen Prager for getting us together in the first place and for showing how interesting and useful these methods can be. We have also benefitted from correspondence with Dr. A. M. Arthurs of the University of York and from the counsel of Dr. B. D. Coleman the general editor of this series. Table of Contents Chapter 1. Introduction and Preliminaries . 1. 1. General Survey 1 1. 2. Phenomenological Descriptions of Diffusion and Reaction 2 1. 3. Correlation Functions for Random Suspensions 4 1. 4. Mean Free ...

  18. Nondestructive methods of analysis applied to oriental swords

    Directory of Open Access Journals (Sweden)

    Edge, David

    2015-12-01

Full Text Available Various neutron techniques were employed at the Budapest Nuclear Centre in an attempt to find the most useful method for analysing the high-carbon steels found in Oriental arms and armour, such as those in the Wallace Collection, London. Neutron diffraction was found to be the most useful in terms of identifying such steels and also indicating the presence of hidden patterns.

  19. Adaptive Array Antenna Control Methods with Delay Tolerant Networking for the Winter Road Surveillance System

    Directory of Open Access Journals (Sweden)

    Noriki Uchida

    2017-02-01

Full Text Available Road conditions in winter are a significant safety issue for driving by tourists and residents. However, vehicle-to-vehicle (V2V) networks face many difficulties, such as the limited transmission range of wireless networks and noise from automobile bodies. This paper therefore introduces Adaptive Array Antenna (AAA) controls for V2V networks based on Delay Tolerant Networking (DTN) in a road surveillance system. In the proposed system, vehicles are equipped with AAA control systems using IEEE 802.11a/b/g-based DTN, and the antenna directions are controlled by visual recognition with a Kalman filter algorithm to provide longer and more stable wireless connections and so improve DTN efficiency. A prototype system is introduced in this paper, and the results are discussed for future studies.
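A minimal one-dimensional Kalman filter of the kind the abstract describes for tracking the other vehicle's bearing can be sketched as follows. The process/measurement noise parameters and the bearing readings are illustrative assumptions, not values from the paper:

```python
# 1-D Kalman filter over a noisy bearing (degrees) from visual recognition,
# so the array antenna can be steered with a smoothed estimate.
def kalman_track(measurements, q=0.01, r=4.0):
    """q: process noise variance, r: measurement noise variance (assumed)."""
    x, p = measurements[0], 1.0      # state estimate and its variance
    estimates = [x]
    for z in measurements[1:]:
        p = p + q                    # predict (constant-bearing model)
        k = p / (p + r)              # Kalman gain
        x = x + k * (z - x)          # update with measurement z
        p = (1 - k) * p
        estimates.append(x)
    return estimates

noisy = [30.0, 33.1, 28.9, 31.5, 29.7, 30.6, 30.2, 29.4]  # invented readings
smoothed = kalman_track(noisy)
print(round(smoothed[-1], 1))        # steadier than the raw readings
```

A production tracker would use a state vector with bearing rate as well, but the predict/update cycle is the same.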

  20. Complexity methods applied to turbulence in plasma astrophysics

    Science.gov (United States)

    Vlahos, L.; Isliker, H.

    2016-09-01

In this review many of the well-known tools for the analysis of complex systems are used in order to study the global coupling of the turbulent convection zone with the solar atmosphere, where the magnetic energy is dissipated explosively. Several well-documented observations are not easy to interpret with the use of Magnetohydrodynamic (MHD) and/or kinetic numerical codes. Such observations are: (1) the size distribution of Active Regions (AR) on the solar surface, (2) the fractal and multifractal characteristics of the observed magnetograms, (3) the self-organized characteristics of the explosive magnetic energy release and (4) the very efficient acceleration of particles during the flaring periods in the solar corona. We briefly review the work published over the last twenty-five years on the above issues and propose solutions by using methods borrowed from the analysis of complex systems. The scenario which emerged is as follows: (a) The fully developed turbulence in the convection zone generates and transports magnetic flux tubes to the solar surface. Using probabilistic percolation models we were able to reproduce the size distribution and the fractal properties of the emerged and randomly moving magnetic flux tubes. (b) Using a Non Linear Force Free (NLFF) magnetic extrapolation numerical code we can explore how the emerged magnetic flux tubes interact nonlinearly and form thin and Unstable Current Sheets (UCS) inside the coronal part of the AR. (c) The fragmentation of the UCS and the redistribution of the magnetic field locally, when the local current exceeds a critical threshold, is a key process which drives avalanches and forms coherent structures. This local reorganization of the magnetic field enhances the energy dissipation and influences the global evolution of the complex magnetic topology. Using a Cellular Automaton and following the simple rules of Self Organized Criticality (SOC), we were able to reproduce the statistical characteristics of the
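The Cellular Automaton with SOC rules mentioned above can be illustrated with a toy sandpile model: random sites are slowly loaded, and any site exceeding a critical threshold redistributes to its neighbours, producing avalanches of many sizes. The grid size and threshold below are generic sandpile choices, not the authors' solar model:

```python
import random

N, THRESHOLD = 16, 4  # grid size and critical threshold (generic choices)

def relax(grid):
    """Topple super-critical sites until stable; return the avalanche size."""
    size = 0
    unstable = [(i, j) for i in range(N) for j in range(N)
                if grid[i][j] >= THRESHOLD]
    while unstable:
        i, j = unstable.pop()
        if grid[i][j] < THRESHOLD:
            continue
        grid[i][j] -= 4
        size += 1
        for ni, nj in ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1)):
            if 0 <= ni < N and 0 <= nj < N:  # grains fall off the open boundary
                grid[ni][nj] += 1
                if grid[ni][nj] >= THRESHOLD:
                    unstable.append((ni, nj))
    return size

random.seed(1)
grid = [[0] * N for _ in range(N)]
avalanches = []
for _ in range(5000):                        # slow driving: one grain at a time
    i, j = random.randrange(N), random.randrange(N)
    grid[i][j] += 1
    avalanches.append(relax(grid))

big = [a for a in avalanches if a > 0]
print(len(big), max(big))                    # avalanches of many sizes appear
```

In the solar-physics version, "grains" are increments of magnetic stress and a toppling corresponds to local reconnection in an unstable current sheet.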

  1. Computational methods and implementation of the 3-D PWR core dynamics SIMTRAN code for online surveillance and prediction

    International Nuclear Information System (INIS)

    Aragones, J.M.; Ahnert, C.

    1995-01-01

    New computational methods have been developed in our 3-D PWR core dynamics SIMTRAN code for online surveillance and prediction. They improve the accuracy and efficiency of the coupled neutronic-thermalhydraulic solution and extend its scope to provide, mainly, the calculation of: the fission reaction rates at the incore mini-detectors; the responses at the excore detectors (power range); the temperatures at the thermocouple locations; and the in-vessel distribution of the loop cold-leg inlet coolant conditions in the reflector and core channels, and to the hot-leg outlets per loop. The functional capabilities implemented in the extended SIMTRAN code for online utilization include: online surveillance, incore-excore calibration, evaluation of peak power factors and thermal margins, nominal update and cycle follow, prediction of maneuvers and diagnosis of fast transients and oscillations. The new code has been installed at the Vandellos-II PWR unit in Spain, since the startup of its cycle 7 in mid-June, 1994. The computational implementation has been performed on HP-700 workstations under the HP-UX Unix system, including the machine-man interfaces for online acquisition of measured data and interactive graphical utilization, in C and X11. The agreement of the simulated results with the measured data, during the startup tests and first months of actual operation, is well within the accuracy requirements. The performance and usefulness shown during the testing and demo phase, to be extended along this cycle, has proved that SIMTRAN and the man-machine graphic user interface have the qualities for a fast, accurate, user friendly, reliable, detailed and comprehensive online core surveillance and prediction

  2. Tetanus Surveillance

    Science.gov (United States)

Reported tetanus ... date on their 10-year booster shots. National surveillance for tetanus is monitored by the National Notifiable ...

  3. Applying the Maternal Near Miss Approach for the Evaluation of Quality of Obstetric Care: A Worked Example from a Multicenter Surveillance Study

    Directory of Open Access Journals (Sweden)

    Samira Maerrawi Haddad

    2014-01-01

Full Text Available Objective. To assess quality of care of women with severe maternal morbidity and to identify associated factors. Method. This is a national multicenter cross-sectional study performing surveillance for severe maternal morbidity, using the World Health Organization criteria. The expected number of maternal deaths was calculated with the maternal severity index (MSI) based on the severity of complication, and the standardized mortality ratio (SMR) for each center was estimated. Analyses on the adequacy of care were performed. Results. 17 hospitals were classified as providing adequate and 10 as nonadequate care. Besides an almost twofold increase in maternal mortality ratio, the main factors associated with nonadequate performance were geographic difficulty in accessing health services (P < 0.001), delays related to quality of medical care (P = 0.012), absence of blood derivatives (P = 0.013), difficulties of communication between health services (P = 0.004), and any delay during the whole process (P = 0.039). Conclusions. This is an example of how evaluation of the performance of health services is possible, using a benchmarking tool specific to Obstetrics. In this study the MSI was a useful tool for identifying differences in maternal mortality ratios and factors associated with nonadequate performance of care.
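The MSI/SMR benchmarking logic can be sketched in a few lines: each case's severity index gives an expected probability of death, expected deaths per centre are summed, and the SMR is observed over expected. The counts and probabilities below are invented:

```python
# Toy standardized-mortality-ratio computation per centre.
centres = {
    # centre: (observed deaths, per-case MSI-style death probabilities) -- invented
    "A": (2, [0.05, 0.10, 0.40, 0.80, 0.15]),
    "B": (4, [0.05, 0.10, 0.40, 0.80, 0.15]),
}

results = {}
for name, (observed, probs) in centres.items():
    expected = sum(probs)          # expected deaths given case severity
    results[name] = observed / expected
    print(f"centre {name}: expected {expected:.2f}, SMR {observed / expected:.2f}")
# centre A: expected 1.50, SMR 1.33
# centre B: expected 1.50, SMR 2.67
```

An SMR well above 1 flags a centre whose mortality exceeds what its case mix predicts, which is the basis of the adequate/nonadequate classification.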

  4. Applying the Maternal Near Miss Approach for the Evaluation of Quality of Obstetric Care: A Worked Example from a Multicenter Surveillance Study

    Science.gov (United States)

    Haddad, Samira Maerrawi; Souza, Joao Paulo; Sousa, Maria Helena; Parpinelli, Mary Angela; Costa, Maria Laura; Pacagnella, Rodolfo C.; Brum, Ione R.; Moraes Filho, Olímpio B.; Feitosa, Francisco E.; Menezes, Carlos A.; Guanabara, Everardo M.; Moreira, Joaquim L.; Peret, Frederico A.; Schmaltz, Luiza E.; Katz, Leila; Lima, Antonio C. Barbosa; Amorim, Melania M.; Martins, Marilia G.; Nascimento, Denis J.; Paiva, Cláudio S.; Rohloff, Roger D.; Costa, Sergio M.; Luz, Adriana G.; Lobato, Gustavo; Cordioli, Eduardo; Peraçoli, Jose C.; Maia Filho, Nelson L.; Quintana, Silvana M.; Lotufo, Fátima A.; Aquino, Márcia M.; Mattar, Rosiane

    2014-01-01

    Objective. To assess quality of care of women with severe maternal morbidity and to identify associated factors. Method. This is a national multicenter cross-sectional study performing surveillance for severe maternal morbidity, using the World Health Organization criteria. The expected number of maternal deaths was calculated with the maternal severity index (MSI) based on the severity of complication, and the standardized mortality ratio (SMR) for each center was estimated. Analyses on the adequacy of care were performed. Results. 17 hospitals were classified as providing adequate and 10 as nonadequate care. Besides almost twofold increase in maternal mortality ratio, the main factors associated with nonadequate performance were geographic difficulty in accessing health services (P < 0.001), delays related to quality of medical care (P = 0.012), absence of blood derivatives (P = 0.013), difficulties of communication between health services (P = 0.004), and any delay during the whole process (P = 0.039). Conclusions. This is an example of how evaluation of the performance of health services is possible, using a benchmarking tool specific to Obstetrics. In this study the MSI was a useful tool for identifying differences in maternal mortality ratios and factors associated with nonadequate performance of care. PMID:25147830

  5. Methods for computational disease surveillance in infection prevention and control: Statistical process control versus Twitter's anomaly and breakout detection algorithms.

    Science.gov (United States)

    Wiemken, Timothy L; Furmanek, Stephen P; Mattingly, William A; Wright, Marc-Oliver; Persaud, Annuradha K; Guinn, Brian E; Carrico, Ruth M; Arnold, Forest W; Ramirez, Julio A

    2018-02-01

    Although not all health care-associated infections (HAIs) are preventable, reducing HAIs through targeted intervention is key to a successful infection prevention program. To identify areas in need of targeted intervention, robust statistical methods must be used when analyzing surveillance data. The objective of this study was to compare and contrast statistical process control (SPC) charts with Twitter's anomaly and breakout detection algorithms. SPC and anomaly/breakout detection (ABD) charts were created for vancomycin-resistant Enterococcus, Acinetobacter baumannii, catheter-associated urinary tract infection, and central line-associated bloodstream infection data. Both SPC and ABD charts detected similar data points as anomalous/out of control on most charts. The vancomycin-resistant Enterococcus ABD chart detected an extra anomalous point that appeared to be higher than the same time period in prior years. Using a small subset of the central line-associated bloodstream infection data, the ABD chart was able to detect anomalies where the SPC chart was not. SPC charts and ABD charts both performed well, although ABD charts appeared to work better in the context of seasonal variation and autocorrelation. Because they account for common statistical issues in HAI data, ABD charts may be useful for practitioners for analysis of HAI surveillance data. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
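The statistical-process-control side of this comparison can be sketched with a u-chart for infection counts per device-days, flagging months whose rate leaves the 3-sigma control limits. The monthly counts and denominators below are invented:

```python
from math import sqrt

# u-chart: u = c/n per month, centre line u_bar, limits u_bar +/- 3*sqrt(u_bar/n).
infections  = [4, 3, 5, 2, 4, 3, 12, 4, 3, 5, 4, 3]                      # HAIs/month
device_days = [900, 880, 950, 870, 910, 900, 920, 890, 940, 930, 910, 900]

u_bar = sum(infections) / sum(device_days)   # centre line
flags = []
for month, (c, n) in enumerate(zip(infections, device_days), start=1):
    u = c / n
    sigma = sqrt(u_bar / n)
    ucl, lcl = u_bar + 3 * sigma, max(0.0, u_bar - 3 * sigma)
    if not (lcl <= u <= ucl):
        flags.append(month)

print(flags)  # [7] -- the month with 12 infections is out of control
```

The anomaly/breakout detection algorithms compared in the study additionally model seasonality and autocorrelation, which fixed-limit charts like this one do not.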

  6. Comparison of Molecular Typing Methods Useful for Detecting Clusters of Campylobacter jejuni and C. coli Isolates through Routine Surveillance

    Science.gov (United States)

    Taboada, Eduardo; Grant, Christopher C. R.; Blakeston, Connie; Pollari, Frank; Marshall, Barbara; Rahn, Kris; MacKinnon, Joanne; Daignault, Danielle; Pillai, Dylan; Ng, Lai-King

    2012-01-01

    Campylobacter spp. may be responsible for unreported outbreaks of food-borne disease. The detection of these outbreaks is made more difficult by the fact that appropriate methods for detecting clusters of Campylobacter have not been well defined. We have compared the characteristics of five molecular typing methods on Campylobacter jejuni and C. coli isolates obtained from human and nonhuman sources during sentinel site surveillance during a 3-year period. Comparative genomic fingerprinting (CGF) appears to be one of the optimal methods for the detection of clusters of cases, and it could be supplemented by the sequencing of the flaA gene short variable region (flaA SVR sequence typing), with or without subsequent multilocus sequence typing (MLST). Different methods may be optimal for uncovering different aspects of source attribution. Finally, the use of several different molecular typing or analysis methods for comparing individuals within a population reveals much more about that population than a single method. Similarly, comparing several different typing methods reveals a great deal about differences in how the methods group individuals within the population. PMID:22162562

  7. Standard Test Method for Application and Analysis of Solid State Track Recorder (SSTR) Monitors for Reactor Surveillance, E706(IIIB)

    CERN Document Server

    American Society for Testing and Materials. Philadelphia

    2003-01-01

    1.1 This test method describes the use of solid-state track recorders (SSTRs) for neutron dosimetry in light-water reactor (LWR) applications. These applications extend from low neutron fluence to high neutron fluence, including high power pressure vessel surveillance and test reactor irradiations as well as low power benchmark field measurement. (1) This test method replaces Method E 418. This test method is more detailed and special attention is given to the use of state-of-the-art manual and automated track counting methods to attain high absolute accuracies. In-situ dosimetry in actual high fluence-high temperature LWR applications is emphasized. 1.2 This test method includes SSTR analysis by both manual and automated methods. To attain a desired accuracy, the track scanning method selected places limits on the allowable track density. Typically good results are obtained in the range of 5 to 800 000 tracks/cm2 and accurate results at higher track densities have been demonstrated for some cases. (2) Trac...

  8. Redefining syndromic surveillance

    Directory of Open Access Journals (Sweden)

    Rebecca Katz

    2011-12-01

    Full Text Available With growing concerns about international spread of disease and expanding use of early disease detection surveillance methods, the field of syndromic surveillance has received increased attention over the last decade. The purpose of this article is to clarify the various meanings that have been assigned to the term syndromic surveillance and to propose a refined categorization of the characteristics of these systems. Existing literature and conference proceedings were examined on syndromic surveillance from 1998 to 2010, focusing on low- and middle-income settings. Based on the 36 unique definitions of syndromic surveillance found in the literature, five commonly accepted principles of syndromic surveillance systems were identified, as well as two fundamental categories: specific and non-specific disease detection. Ultimately, the proposed categorization of syndromic surveillance distinguishes between systems that focus on detecting defined syndromes or outcomes of interest and those that aim to uncover non-specific trends that suggest an outbreak may be occurring. By providing an accurate and comprehensive picture of this field’s capabilities, and differentiating among system types, a unified understanding of the syndromic surveillance field can be developed, encouraging the adoption, investment in, and implementation of these systems in settings that need bolstered surveillance capacity, particularly low- and middle-income countries.

  9. Further Insight and Additional Inference Methods for Polynomial Regression Applied to the Analysis of Congruence

    Science.gov (United States)

    Cohen, Ayala; Nahum-Shani, Inbal; Doveh, Etti

    2010-01-01

    In their seminal paper, Edwards and Parry (1993) presented the polynomial regression as a better alternative to applying difference score in the study of congruence. Although this method is increasingly applied in congruence research, its complexity relative to other methods for assessing congruence (e.g., difference score methods) was one of the…

  10. Child pornography offenders detected by surveillance of the Internet and by other methods.

    Science.gov (United States)

    Nielssen, Olav; O'Dea, Jeremy; Sullivan, Danny; Rodriguez, Marcelo; Bourget, Dominique; Large, Matthew

    2011-07-01

    Availability of child pornography on the Internet has created new opportunities for offending. It has been noted that many people charged with offences relating to this had not previously been identified as sexual offenders against children. Our aim was to compare the characteristics of people charged with child pornography offences as a result of police monitoring of the Internet with those detected by other means. We hypothesised that those apprehended via the Internet would be more likely to be older and less likely to have severe psychiatric disorder or to have been previously charged with a sexual offence involving contact with a child than those identified by other means. Data were extracted from the findings of clinical examinations by the authors either in the course of preparing reports for court, or in the course of providing treatment. There were 52 men detected by police Internet surveillance and 53 men detected by other means, the latter including 16 men who had not been charged with an offence at the time of referral. Those detected via the Internet were more likely to be in possession of very large quantities of child pornography. Those detected by other means were more likely to have major psychiatric and substance abuse disorders and to report childhood sexual abuse. A subgroup analysis of the 89 people who were facing charges at the time of the assessment found that the only significant differences were in the amount of material and the history of sexual abuse. The men recruited to this study, conducted over a period of nearly 10 years, reflect the changing nature of the technology used to commit this type of offence in that time. The characteristics of the subjects did not confirm the stereotype of an Internet child pornography offender who was high functioning and otherwise well adjusted and carried a low risk of other types of offences. Copyright © 2011 John Wiley & Sons, Ltd.

  11. [Monitoring medication errors in personalised dispensing using the Sentinel Surveillance System method].

    Science.gov (United States)

    Pérez-Cebrián, M; Font-Noguera, I; Doménech-Moral, L; Bosó-Ribelles, V; Romero-Boyero, P; Poveda-Andrés, J L

    2011-01-01

    To assess the efficacy of a new quality control strategy based on daily randomised sampling and monitoring of a Sentinel Surveillance System (SSS) medication cart, in order to identify medication errors and their origin at different levels of the process. Prospective quality control study with one-year follow-up. An SSS medication cart was randomly selected once a week and double-checked before dispensing medication. Medication errors were recorded before it was taken to the relevant hospital ward. Information concerning complaints after receiving medication and 24-hour monitoring was also noted. Type and origin of error data were assessed by a Unit Dose Quality Control Group, which proposed relevant improvement measures. Thirty-four SSS carts were assessed, including 5130 medication lines and 9952 dispensed doses, corresponding to 753 patients. Ninety erroneous lines (1.8%) and 142 mistaken doses (1.4%) were identified at the Pharmacy Department. The most frequent error was dose duplication (38%), and its main cause was inappropriate management and forgetfulness (69%). Fifty medication complaints (6.6% of patients) were mainly due to new treatment at admission (52%), and 41 (0.8% of all medication lines) did not completely match the prescription (0.6% lines) as recorded by the Pharmacy Department. Thirty-seven (4.9% of patients) medication complaints due to changes at admission and 32 matching errors (0.6% medication lines) were recorded. The main cause was also inappropriate management and forgetfulness (24%). The simultaneous recording of incidences due to complaints and new medication coincided in 33.3% of cases. In addition, 433 (4.3%) of dispensed doses were returned to the Pharmacy Department. After the Unit Dose Quality Control Group conducted their feedback analysis, 64 improvement measures were introduced for Pharmacy Department nurses, 37 for pharmacists, and 24 for the hospital ward. The SSS programme has proven to be useful as a quality control strategy to identify Unit

  12. Application of low bitrate image coding to surveillance of electric power facilities. Part 1. Proposal of low bitrate coding for surveillance of electric power facilities and examination of facilities region extraction method; Denryoku setsubi kanshi eno tei rate fugoka hoshiki no tekiyo. 1. Setsubi kanshiyo fugoka hoshiki no teian to setsubi ryoiki chushutsuho no kento

    Energy Technology Data Exchange (ETDEWEB)

    Murata, H.; Ishino, R. [Central Research Institute of Electric Power Industry, Tokyo (Japan)

    1996-03-01

    The current status of low bitrate image coding has been investigated, and a low bitrate coding scheme suitable for the surveillance of electric power facilities has been proposed, along with the problems to be solved. Conventional image coding uses waveform coding, in which images are processed as signals; for MPEG-4, a coding method that takes image content into account has been proposed. With these coding methods, however, image detail is lost when the bitrate is lowered, so they cannot be applied when details in the images are important, as in the surveillance of facilities. A coding method is therefore proposed that extends partially detailed coding by separating out the constituent images of facilities, such as power cables and steel towers, designated by operators. The special feature of this method is that it easily accommodates low bitrates while preserving detailed information: the designated partial image is processed by structure extraction coding, while the remainder is processed by conventional low bitrate waveform coding. 29 refs., 17 figs., 1 tab.
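The region-designated idea described above can be illustrated with a toy sketch: operator-marked facility pixels keep full precision while the background is coarsely quantized, standing in for low-bitrate waveform coding. This is an illustration of the general region-of-interest principle, not the paper's structure extraction coder; the frame, mask, and quantizer are hypothetical.

```python
import numpy as np

def roi_compress(frame, roi_mask, background_levels=8):
    """Illustrative region-of-interest coding: pixels inside the
    operator-designated facility region keep full 8-bit precision, while
    the background is coarsely quantized to `background_levels` grey
    levels (a stand-in for low-bitrate waveform coding)."""
    step = 256 // background_levels
    # Mid-rise quantizer: snap each background pixel to its bin centre.
    background = (frame // step) * step + step // 2
    out = np.where(roi_mask, frame, background)
    return out.astype(np.uint8)

# A hypothetical 4x4 "frame" with a 2x2 region of interest.
frame = np.arange(16, dtype=np.uint8).reshape(4, 4) * 16
mask = np.zeros((4, 4), dtype=bool)
mask[1:3, 1:3] = True
coded = roi_compress(frame, mask)
```

The region of interest survives bit-exact, while the background collapses to at most eight grey levels, which is where the bitrate saving would come from.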

  13. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    Science.gov (United States)

    2016-06-28

    …potential computational bottleneck is solely in the computation on the FVS itself, where brute-force solution yields computations that are cubic in the… …both from other academic researchers, including those involved in animal behavior analysis, and from those involved in applied programs for DoD and other… …Hangzhou, China, in October 2013. 6) Prof. Win gave an Invited University Lecture at the 100th Anniversary Event of the Chulalongkorn University in…

  14. Application of a new containment and surveillance portal monitor data analysis method

    International Nuclear Information System (INIS)

    Pratt, J.C.; Henry, C.N.; Hastings, R.D.

    1980-01-01

    A new method for processing the information available from an ordinary doorway monitor was described in an earlier report. Additional tests of this concept have been made on doorway monitors at facilities of the Los Alamos Scientific Laboratory. The lessons learned and estimates of the sensitivity of this method for detecting a trickle of special nuclear material through the portal are presented.

  15. Applying Hotspot Detection Methods in Forestry: A Case Study of Chestnut Oak Regeneration

    Directory of Open Access Journals (Sweden)

    Songlin Fei

    2010-01-01

    Full Text Available Hotspot detection has been widely adopted in the health sciences for disease surveillance, but rarely in natural resource disciplines. In this paper, two spatial scan statistics (SaTScan and ClusterSeer) and a nonspatial classification and regression trees method were evaluated as techniques for identifying chestnut oak (Quercus montana) regeneration hotspots among 50 mixed-oak stands in the central Appalachian region of the eastern United States. Hotspots defined by the three methods had a moderate level of conformity and revealed similar chestnut oak regeneration site affinity. Chestnut oak regeneration hotspots were positively associated with the abundance of chestnut oak trees in the overstory and a moderate cover of heather species (Vaccinium and Gaylussacia spp.) but were negatively associated with the abundance of hayscented fern (Dennstaedtia punctilobula) and mountain laurel (Kalmia latifolia). In general, hotspot detection is a viable tool for assisting natural resource managers with identifying areas possessing significantly high or low tree regeneration.

  16. Applying Hotspot Detection Methods in Forestry: A Case Study of Chestnut Oak Regeneration

    International Nuclear Information System (INIS)

    Fei, S.

    2010-01-01

    Hotspot detection has been widely adopted in the health sciences for disease surveillance, but rarely in natural resource disciplines. In this paper, two spatial scan statistics (SaTScan and ClusterSeer) and a nonspatial classification and regression trees method were evaluated as techniques for identifying chestnut oak (Quercus montana) regeneration hotspots among 50 mixed-oak stands in the central Appalachian region of the eastern United States. Hotspots defined by the three methods had a moderate level of conformity and revealed similar chestnut oak regeneration site affinity. Chestnut oak regeneration hotspots were positively associated with the abundance of chestnut oak trees in the overstory and a moderate cover of heather species (Vaccinium and Gaylussacia spp.) but were negatively associated with the abundance of hayscented fern (Dennstaedtia punctilobula) and mountain laurel (Kalmia latifolia). In general, hotspot detection is a viable tool for assisting natural resource managers with identifying areas possessing significantly high or low tree regeneration.
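The spatial scan statistics named in the records above (SaTScan, ClusterSeer) are built around a likelihood-ratio scan over candidate windows. A minimal Kulldorff-style Poisson scan can be sketched as follows; this is a generic illustration of the technique, not either product's implementation, and the stand coordinates and counts are hypothetical.

```python
import math

def poisson_llr(c, e, C):
    """Log-likelihood ratio for a candidate cluster with c observed and
    e expected cases out of C total (Kulldorff's Poisson statistic).
    Only windows with more cases than expected count as hotspots."""
    if c <= e or e <= 0:
        return 0.0
    if c >= C:
        return c * math.log(c / e)
    return c * math.log(c / e) + (C - c) * math.log((C - c) / (C - e))

def scan_hotspot(stands):
    """Brute-force circular scan over stand centroids: each candidate
    window is a centroid plus its nearest stands (up to half the region),
    and the window with the highest likelihood ratio wins.
    `stands` is a list of (x, y, cases, population) tuples."""
    C = sum(s[2] for s in stands)
    P = sum(s[3] for s in stands)
    best = (0.0, None)
    for cx, cy, _, _ in stands:
        order = sorted(stands, key=lambda s: (s[0] - cx) ** 2 + (s[1] - cy) ** 2)
        c = e = 0.0
        for s in order[: len(stands) // 2]:
            c += s[2]
            e += C * s[3] / P          # expected count under uniform risk
            llr = poisson_llr(c, e, C)
            if llr > best[0]:
                best = (llr, (cx, cy))
    return best

# Hypothetical stands: a dense regeneration cluster near the origin.
stands = [(0.0, 0.0, 30, 100), (0.0, 1.0, 25, 100),
          (5.0, 5.0, 5, 100), (5.0, 6.0, 4, 100), (6.0, 5.0, 6, 100)]
llr, center = scan_hotspot(stands)
```

In practice the significance of the best window is then assessed by Monte Carlo replication under the null, which the sketch omits.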

  17. Introduction to surveillance studies

    CERN Document Server

    Petersen, JK

    2012-01-01

    Introduction & Overview; Introduction; Brief History of Surveillance Technologies & Techniques; Optical Surveillance; Aerial Surveillance; Audio Surveillance; Radio-Wave Surveillance; Global Positioning Systems; Sensors; Computers & the Internet; Data Cards; Biochemical Surveillance; Animal Surveillance; Biometrics; Genetics; Practical Considerations; Prevalence of Surveillance; Effectiveness of Surveillance; Freedom & Privacy Issues; Constitutional Freedoms; Privacy Safeguards & Intrusions; Resources; References; Glossary; Index

  18. Surveillance of extreme hyperbilirubinaemia in Denmark. A method to identify the newborn infants

    DEFF Research Database (Denmark)

    Bjerre, J.V.; Petersen, Jes Reinholdt; Ebbesen, F.

    2008-01-01

    …bilirubin encephalopathy; one infant had advanced-phase symptoms. Four infants received an exchange transfusion. ABO blood group incompatibility was present in 52 infants. Thirty-seven infants were of non-Caucasian descent. CONCLUSION: A method to obtain the national epidemiological data is presented. The observed incidence of extreme hyperbilirubinaemia is higher than previously reported in Denmark. This is mainly due to a very sensitive method of identifying the study group.

  19. Accurate Simulation of MPPT Methods Performance When Applied to Commercial Photovoltaic Panels

    OpenAIRE

    Cubas, Javier; Pindado, Santiago; Sanz-Andrés, Ángel

    2015-01-01

    A new, simple, and quick-calculation methodology to obtain a solar panel model, based on the manufacturers' datasheet, to perform MPPT simulations, is described. The method takes into account variations on the ambient conditions (sun irradiation and solar cells temperature) and allows fast MPPT methods comparison or their performance prediction when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commer...

  20. FLDS: A Comprehensive dsRNA Sequencing Method for Intracellular RNA Virus Surveillance.

    Science.gov (United States)

    Urayama, Syun-Ichi; Takaki, Yoshihiro; Nunoura, Takuro

    2016-01-01

    Knowledge of the distribution and diversity of RNA viruses is still limited in spite of their possible environmental and epidemiological impacts because RNA virus-specific metagenomic methods have not yet been developed. We herein constructed an effective metagenomic method for RNA viruses by targeting long double-stranded (ds)RNA in cellular organisms, which is a hallmark of infection, or the replication of dsRNA and single-stranded (ss)RNA viruses, except for retroviruses. This novel dsRNA targeting metagenomic method is characterized by an extremely high recovery rate of viral RNA sequences, the retrieval of terminal sequences, and uniform read coverage, which has not previously been reported in other metagenomic methods targeting RNA viruses. This method revealed a previously unidentified viral RNA diversity of more than 20 complete RNA viral genomes including dsRNA and ssRNA viruses associated with an environmental diatom colony. Our approach will be a powerful tool for cataloging RNA viruses associated with organisms of interest.

  1. Molecular methods for the detection of human papillomavirus infection: new insights into their role in diagnostics and epidemiological surveillance

    Directory of Open Access Journals (Sweden)

    Andrea Piana

    2009-06-01

    Full Text Available Human papillomaviruses (HPVs) comprise more than 180 genotypes. HPV infection is mainly diagnosed by molecular methods. The aim of our study was to review the main molecular methods used to diagnose HPV infection, underscoring their characteristics. Several methods have been developed for molecular diagnosis of Papilloma infection, such as those based on the PCR technique. Another commercial non-PCR based diagnostic method is the Hybrid Capture test; it is the only commercially available HPV DNA detection test approved by the FDA. Several authors have suggested that viral load and E6/E7 transcripts could be used as surrogate markers of persistent HPV infection, being more specific predictors of progressive disease than the simple presence of HPV DNA. Validating the clinical sensitivity and specificity of each technique and improving the interpretation of the results are essential; consequently, there is a clear need for well characterized international quality control panels to compare the various diagnostic methods. HPV DNA testing could be useful both as a primary screening test, alone or in combination with a Pap smear, for the early detection of cervical cancer precursors, and as a triage test to select women with minor cytological abnormalities who will need further follow-up and to predict possible treatment failure in women with diagnosed high-grade intraepithelial lesions who have undergone excisional therapy. In the near future, surveillance for HPV infections based on these molecular methods could represent an important step toward the development of primary and secondary prophylactic interventions, such as new vaccines targeted at genotypes that might replace those previously prevalent.

  2. Surveillance Pleasures

    DEFF Research Database (Denmark)

    Albrechtslund, Anders

    and leisure have not been studied with the same intensity as e.g. policing, civil liberties and social sorting. This paper offers a study of trends in surveillance pleasures, i.e. watching and eavesdropping in popular culture. My focus is the existential aspects and ethical dilemmas of surveillance...

  3. Surveillance Culture

    DEFF Research Database (Denmark)

    2017-01-01

    What does it mean to live in a world full of surveillance? In this documentary film, we take a look at everyday life in Denmark and how surveillance technologies and practices influence our norms and social behaviour. Researched and directed by Btihaj Ajana and Anders Albrechtslund.

  4. Overview of molecular typing methods for outbreak detection and epidemiological surveillance

    NARCIS (Netherlands)

    Sabat, A. J.; Budimir, A.; Nashev, D.; Sa-Leao, R.; van Dijl, J. M.; Laurent, F.; Grundmann, H.; Friedrich, A. W.

    2013-01-01

    Typing methods for discriminating different bacterial isolates of the same species are essential epidemiological tools in infection prevention and control. Traditional typing systems based on phenotypes, such as serotype, biotype, phage-type, or antibiogram, have been used for many years. However,

  5. Feasible method for routine surveillance culturing of stools from neutropenic patients.

    OpenAIRE

    Smith, J A; Sherlock, C H; Burdge, D R

    1984-01-01

    This study was undertaken to develop an accurate, yet inexpensive, method for determining whether the bowel of a neutropenic patient is colonized with bacteria resistant to the antimicrobial agents used in empiric therapy. Selective agar media were prepared in which Mueller-Hinton agar or MacConkey agar were supplemented with one of the following antimicrobial agents: carbenicillin (16 micrograms/ml), gentamicin (4 micrograms/ml), or tobramycin (4 micrograms/ml). Moxalactam was incorporated i...

  6. A Super-resolution Reconstruction Algorithm for Surveillance Video

    Directory of Open Access Journals (Sweden)

    Jian Shao

    2017-01-01

    Full Text Available Recent technological developments have made surveillance video a primary method of preserving public security. Many city crimes are observed in surveillance video, and the most abundant evidence collected by the police is also acquired through surveillance video sources. Because surveillance video footage offers very strong support for solving criminal cases, creating effective policy and applying useful methods to the retrieval of additional evidence are becoming increasingly important. However, surveillance video has its failings, namely footage captured in low resolution (LR) and with bad visual quality. In this paper, we discuss the characteristics of surveillance video and describe a manual feature registration – maximum a posteriori – projection onto convex sets super-resolution reconstruction method, which improves the quality of surveillance video. With this method, we not only make optimal use of the information contained in the LR video image but also clearly control the image edges as well as the convergence of the algorithm. Finally, we make a suggestion on how to adjust the algorithm's adaptability by analyzing the prior information of the target image.
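The projection-onto-convex-sets (POCS) component mentioned above can be illustrated with a toy 1-D super-resolution sketch: each shifted low-resolution observation defines one convex (here affine) consistency set, and the high-resolution estimate is alternately projected onto each set. This is a generic POCS illustration under simplified assumptions (1-D signal, 2:1 box downsampling, hypothetical values), not the authors' full registration-MAP-POCS pipeline.

```python
import numpy as np

def project(x, y, shift):
    """Project the HR estimate x onto the set of signals whose 2:1
    box-downsampling at the given shift equals the LR observation y
    (one convex constraint set per observation, as in POCS)."""
    x = x.copy()
    for i, yi in enumerate(y):
        j = 2 * i + shift
        if j + 1 >= len(x):
            break
        r = yi - 0.5 * (x[j] + x[j + 1])
        # Minimal (orthogonal) correction that restores consistency.
        x[j] += r
        x[j + 1] += r
    return x

# Hypothetical ground truth and two shifted LR observations of it.
hr_true = np.array([1., 3., 2., 8., 5., 4., 6., 7.])
lr0 = 0.5 * (hr_true[0::2] + hr_true[1::2])        # shift 0
lr1 = 0.5 * (hr_true[1:-1:2] + hr_true[2::2])      # shift 1

x = np.repeat(lr0, 2)        # crude initial upsampling
for _ in range(2000):        # alternate projections until consistent
    x = project(x, lr0, 0)
    x = project(x, lr1, 1)
```

After enough sweeps the estimate is consistent with every observation simultaneously, which is exactly the fixed point POCS converges to; real systems add a MAP prior to pick a well-regularized point from that consistent set.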

  7. Methods for Discovery and Surveillance of Pathogens in Hotspots of Emerging Infectious Diseases

    DEFF Research Database (Denmark)

    Jensen, Randi Holm

    Viruses are everywhere, and can infect all living things. They are constantly evolving, and new diseases are emerging as a result. Consequently, they have always been of interest to scientists and people in general. Several outbreaks of emerging infectious diseases transmitting from animals...... to virion enrichment compared to samples with no enrichment. We have used these methods to perform pathogen discovery in faecal samples collected from small mammals in Sierra Leone, to describe the presence of pathogenic viruses and bacteria in this area. From these data we were furthermore able to acquire...

  8. Applying the Mixed Methods Instrument Development and Construct Validation Process: the Transformative Experience Questionnaire

    Science.gov (United States)

    Koskey, Kristin L. K.; Sondergeld, Toni A.; Stewart, Victoria C.; Pugh, Kevin J.

    2018-01-01

    Onwuegbuzie and colleagues proposed the Instrument Development and Construct Validation (IDCV) process as a mixed methods framework for creating and validating measures. Examples applying IDCV are lacking. We provide an illustrative case integrating the Rasch model and cognitive interviews applied to the development of the Transformative…

  9. An Aural Learning Project: Assimilating Jazz Education Methods for Traditional Applied Pedagogy

    Science.gov (United States)

    Gamso, Nancy M.

    2011-01-01

    The Aural Learning Project (ALP) was developed to incorporate jazz method components into the author's classical practice and her applied woodwind lesson curriculum. The primary objective was to place a more focused pedagogical emphasis on listening and hearing than is traditionally used in the classical applied curriculum. The components of the…

  10. Surveillance of extreme hyperbilirubinaemia in Denmark. A method to identify the newborn infants

    DEFF Research Database (Denmark)

    Bjerre, J.V.; Petersen, Jes Reinholdt; Ebbesen, F.

    2008-01-01

    AIM: To describe the incidence of infants born at term or near-term with extreme hyperbilirubinaemia. METHODS: The study period was between 1 January 2002 and 31 December 2005, and included all infants born alive at term or near-term in Denmark. Medical reports on all newborn infants with a total serum bilirubin concentration (TSB) ≥ 450 micromol/L were obtained by linking laboratory data to the unique Danish personal identification number. RESULTS: In total, 113 infants were included, that is, an incidence of 45/100,000 live births. Thirty-seven infants presented in hospital, 2 after home birth and the others after having been discharged. The maximum TSB was 485 (450-734) micromol/L (median [range]) and appeared latest amongst those infants admitted from home, but was not different from the maximum TSB of the nondischarged infants. Forty-three infants had symptoms of early-phase acute…

  11. A comparison between brand-specific and traditional alcohol surveillance methods to assess underage drinkers' reported alcohol use.

    Science.gov (United States)

    Roberts, Sarah P; Siegel, Michael B; DeJong, William; Jernigan, David H

    2014-11-01

    Adolescent alcohol consumption remains common and is associated with many negative health outcomes. Unfortunately, common alcohol surveillance methods often underestimate consumption, and improved alcohol use measures are needed to characterize the landscape of youth drinking. We aimed to compare a standard quantity-frequency measure of youth alcohol consumption with a novel brand-specific measure. We recruited a sample of 1031 respondents across the United States to complete an online survey; analyses included 833 male and female underage drinkers ages 13-20. Respondents reported on how many of the past 30 days they consumed alcohol, and the number of drinks consumed on an average drinking day. Using our brand-specific measure, respondents identified which brands they consumed, how many days they consumed each brand, and how many drinks per brand they usually had. Youth reported consuming significantly more alcohol (on average, 11 drinks more per month) when responding to the brand-specific versus the standard measure, as well as a greater number of brands consumed, suggesting that the brand-specific measure better captures underage drinkers' brand preferences and consumption.
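The arithmetic behind the two measures compared above is simple to make concrete: the quantity-frequency estimate multiplies overall drinking days by average drinks per day, while the brand-specific estimate sums the same product over each brand. The respondent and all numbers below are hypothetical, chosen only to show how per-brand recall can yield a higher total.

```python
# Hypothetical respondent. Standard quantity-frequency (QF) measure:
# overall drinking days in the past 30 days x average drinks per day.
qf_days, qf_drinks_per_day = 6, 3
qf_monthly = qf_days * qf_drinks_per_day            # 18 drinks/month

# Brand-specific measure: the same two questions asked per brand.
brands = {                # brand -> (days consumed, usual drinks that day)
    "brand A": (4, 3),
    "brand B": (3, 4),
    "brand C": (2, 2),
}
brand_monthly = sum(days * drinks for days, drinks in brands.values())  # 28
```

Because days drinking different brands can overlap and heavy-consumption brands are recalled separately, the brand-wise sum often exceeds the single global estimate, which is the direction of the 11-drink gap the study reports.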

  12. National Collegiate Athletic Association Injury Surveillance System: Review of Methods for 2004–2005 Through 2013–2014 Data Collection

    Science.gov (United States)

    Kerr, Zachary Y.; Dompier, Thomas P.; Snook, Erin M.; Marshall, Stephen W.; Klossner, David; Hainline, Brian; Corlette, Jill

    2014-01-01

    Background: Since 1982, the National Collegiate Athletic Association has used the Injury Surveillance System (ISS) to collect injury and athlete-exposure data from a representative sample of collegiate institutions and sports. At the start of the 2004–2005 academic year, a Web-based ISS replaced the paper-based platform previously used for reporting injuries and exposures. Objective: To describe the methods of the Web-based National Collegiate Athletic Association ISS for data collection as implemented from the 2004–2005 to 2013–2014 academic years. Description: The Web-based ISS monitored National Collegiate Athletic Association–sanctioned practices and competitions, the number of participating student–athletes, and time-loss injuries during the preseason, regular season, and postseason in 25 collegiate sports. Starting in the 2009–2010 academic year, non–time-loss injuries were also tracked. Efforts were made to better integrate ISS data collection into the workflow of collegiate athletic trainers. Data for the 2004–2005 to 2013–2014 academic years are available to researchers through a standardized application process available at the Datalys Center Web site. Conclusions: As of February 2014, more than 1 dozen data sets have been provided to researchers. The Datalys Center encourages applications for access to the data. PMID:24870292

  13. National collegiate athletic association injury surveillance system: review of methods for 2004-2005 through 2013-2014 data collection.

    Science.gov (United States)

    Kerr, Zachary Y; Dompier, Thomas P; Snook, Erin M; Marshall, Stephen W; Klossner, David; Hainline, Brian; Corlette, Jill

    2014-01-01

    Since 1982, the National Collegiate Athletic Association has used the Injury Surveillance System (ISS) to collect injury and athlete-exposure data from a representative sample of collegiate institutions and sports. At the start of the 2004-2005 academic year, a Web-based ISS replaced the paper-based platform previously used for reporting injuries and exposures. To describe the methods of the Web-based National Collegiate Athletic Association ISS for data collection as implemented from the 2004-2005 to 2013-2014 academic years. The Web-based ISS monitored National Collegiate Athletic Association-sanctioned practices and competitions, the number of participating student-athletes, and time-loss injuries during the preseason, regular season, and postseason in 25 collegiate sports. Starting in the 2009-2010 academic year, non-time-loss injuries were also tracked. Efforts were made to better integrate ISS data collection into the workflow of collegiate athletic trainers. Data for the 2004-2005 to 2013-2014 academic years are available to researchers through a standardized application process available at the Datalys Center Web site. As of February 2014, more than 1 dozen data sets have been provided to researchers. The Datalys Center encourages applications for access to the data.

  14. Wielandt method applied to the diffusion equations discretized by finite element nodal methods

    International Nuclear Information System (INIS)

    Mugica R, A.; Valle G, E. del

    2003-01-01

    Numerical solution of the diffusion equation requires extensive algorithms and computer programs, with a large number of routines and calculations to carry out; this bears directly on execution times, so results are obtained in relatively long times. This work shows the application of a method that accelerates the convergence of the classic power method, notably reducing the number of iterations needed to obtain reliable results and thereby greatly reducing computation times. This method, known in the literature as the Wielandt method, has been incorporated into a computer program based on the discretization of the neutron diffusion equations in slab geometry and steady state by polynomial nodal methods. In this work the neutron diffusion equations are described for several energy groups, together with their discretization by means of the so-called physical nodal methods, the quadratic case being illustrated in particular. A model problem widely described in the literature is solved for physical nodal schemes of degree 1, 2, 3 and 4 in three different ways: (a) with the classic power method, (b) with the power method under Wielandt acceleration, and (c) with the power method under a modified Wielandt acceleration. Results are reported for the model problem as well as for two additional problems known as benchmark problems. This acceleration method can also be implemented for geometries other than the one proposed in this work, and its application can be extended to problems in two or three dimensions. (Author)
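The acceleration described above can be sketched in a few lines: the Wielandt idea is to run the power method not on the operator itself but on its shifted inverse, whose dominant eigenvalue is much better separated. This is a generic dense-matrix sketch, not the authors' nodal diffusion code; the small symmetric matrix and the shift value are hypothetical stand-ins for the discretized diffusion operator.

```python
import numpy as np

def power_iteration(A, tol=1e-12, max_iter=10000):
    """Classic power method; converges at a rate set by |lambda2/lambda1|."""
    x = np.arange(1.0, A.shape[0] + 1.0)
    x /= np.linalg.norm(x)
    lam = 0.0
    for n in range(1, max_iter + 1):
        y = A @ x
        x = y / np.linalg.norm(y)
        lam_new = x @ A @ x                 # Rayleigh quotient estimate
        if abs(lam_new - lam) < tol:
            return lam_new, x, n
        lam = lam_new
    return lam, x, max_iter

def wielandt_iteration(A, shift, tol=1e-12, max_iter=10000):
    """Wielandt acceleration: apply the power method to (A - shift*I)^-1.
    Its dominant eigenvalue 1/(lambda1 - shift) is far better separated
    when shift is close to lambda1, so fewer iterations are needed.
    (A production code would factorize A - shift*I once rather than
    form the explicit inverse.)"""
    M = np.linalg.inv(A - shift * np.eye(A.shape[0]))
    mu, x, n = power_iteration(M, tol, max_iter)
    return shift + 1.0 / mu, x, n

# Small symmetric stand-in for a discretized diffusion operator;
# its dominant eigenvalue is 2 + sqrt(2) ~ 3.414.
A = np.array([[2., -1., 0.], [-1., 2., -1.], [0., -1., 2.]])
lam_p, _, n_p = power_iteration(A)
lam_w, _, n_w = wielandt_iteration(A, shift=3.0)   # shift chosen near lambda1
```

Both runs reach the same eigenvalue, but the shifted iteration needs noticeably fewer steps; in a diffusion code the shift is an estimate of the multiplication factor updated as the calculation proceeds.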

  15. What is the method in applying formal methods to PLC applications?

    NARCIS (Netherlands)

    Mader, Angelika H.; Engel, S.; Wupper, Hanno; Kowalewski, S.; Zaytoon, J.

    2000-01-01

    The question we investigate is how to obtain PLC applications with confidence in their proper functioning. Especially, we are interested in the contribution that formal methods can provide for their development. Our maxim is that the place of a particular formal method in the total picture of system

  16. Formal methods applied to industrial complex systems implementation of the B method

    CERN Document Server

    Boulanger, Jean-Louis

    2014-01-01

    This book presents real-world examples of formal techniques in an industrial context. It covers formal methods such as SCADE and/or the B Method, in various fields such as railways, aeronautics, and the automotive industry. The purpose of this book is to present a summary of experience on the use of "formal methods" (based on formal techniques such as proof, abstract interpretation and model-checking) in industrial examples of complex systems, based on the experience of people currently involved in the creation and assessment of safety critical system software. The involvement of people from

  17. Autonomous surveillance for biosecurity.

    Science.gov (United States)

    Jurdak, Raja; Elfes, Alberto; Kusy, Branislav; Tews, Ashley; Hu, Wen; Hernandez, Emili; Kottege, Navinda; Sikka, Pavan

    2015-04-01

    The global movement of people and goods has increased the risk of biosecurity threats and their potential to incur large economic, social, and environmental costs. Conventional manual biosecurity surveillance methods are limited by their scalability in space and time. This article focuses on autonomous surveillance systems, comprising sensor networks, robots, and intelligent algorithms, and their applicability to biosecurity threats. We discuss the spatial and temporal attributes of autonomous surveillance technologies and map them to three broad categories of biosecurity threat: (i) vector-borne diseases; (ii) plant pests; and (iii) aquatic pests. Our discussion reveals a broad range of opportunities to serve biosecurity needs through autonomous surveillance. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.

  18. The Wigner method applied to the photodissociation of CH3I

    DEFF Research Database (Denmark)

    Henriksen, Niels Engholm

    1985-01-01

    The Wigner method is applied to the Shapiro-Bersohn model of the photodissociation of CH3I. The partial cross sections obtained by this semiclassical method are in very good agreement with results of exact quantum calculations. It is also shown that a harmonic approximation to the vibrational...

  19. A new clamp method for firing bricks | Obeng | Journal of Applied ...

    African Journals Online (AJOL)

    A new clamp method for firing bricks. ... Journal of Applied Science and Technology ... To overcome these operational deficiencies, a new method of firing bricks uses a brick clamp technique that incorporates a clamp wall of 60 cm thickness, a six-tier approach of sealing the top of the clamp (by a combination of green bricks) ...

  20. Determination methods for plutonium as applied in the field of reprocessing

    International Nuclear Information System (INIS)

    1983-07-01

    The papers presented report on Pu-determination methods which are routinely applied in process control, and also on new developments which could supersede current methods, either because they are more accurate or because they are simpler and faster. (orig./DG) [de

  1. A method to evaluate performance reliability of individual subjects in laboratory research applied to work settings.

    Science.gov (United States)

    1978-10-01

    This report presents a method that may be used to evaluate the reliability of performance of individual subjects, particularly in applied laboratory research. The method is based on analysis of variance of a tasks-by-subjects data matrix, with all sc...

  2. Water Permeability of Pervious Concrete Is Dependent on the Applied Pressure and Testing Methods

    Directory of Open Access Journals (Sweden)

    Yinghong Qin

    2015-01-01

    Full Text Available The falling head method (FHM) and the constant head method (CHM) are used to test the water permeability of pervious concrete, applying different water heads to the testing samples. The results indicate that the apparent permeability of pervious concrete decreases with the applied water head. The results also demonstrate that the permeability measured with the FHM is lower than that from the CHM. The fundamental difference between the CHM and FHM is examined from the theory of fluid flow through porous media. The testing results suggest that the water permeability of pervious concrete should be reported together with the applied pressure and the associated testing method.
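
    The two test configurations reduce to the standard Darcy-law permeameter formulas; the following minimal sketch uses the textbook formulas and symbols (standpipe area a, sample length L and area A, heads h), which are not given in this abstract itself:

```python
import math

def k_falling_head(a, L, A, t, h1, h2):
    """Falling head method: permeability from the head dropping
    from h1 to h2 in time t through a standpipe of area a."""
    return (a * L) / (A * t) * math.log(h1 / h2)

def k_constant_head(V, L, A, h, t):
    """Constant head method: permeability from a volume V collected
    in time t under a constant head h."""
    return (V * L) / (A * h * t)
```

    For the same sample, the two formulas generally give different values because the FHM averages over a decreasing head, which is the discrepancy the abstract reports.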

  3. Accurate Simulation of MPPT Methods Performance When Applied to Commercial Photovoltaic Panels

    Directory of Open Access Journals (Sweden)

    Javier Cubas

    2015-01-01

    Full Text Available A new, simple, and quick-calculation methodology to obtain a solar panel model, based on the manufacturers’ datasheet, to perform MPPT simulations, is described. The method takes into account variations on the ambient conditions (sun irradiation and solar cells temperature) and allows fast MPPT methods comparison or their performance prediction when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day, and under realistic ambient conditions.

  4. Accurate simulation of MPPT methods performance when applied to commercial photovoltaic panels.

    Science.gov (United States)

    Cubas, Javier; Pindado, Santiago; Sanz-Andrés, Ángel

    2015-01-01

    A new, simple, and quick-calculation methodology to obtain a solar panel model, based on the manufacturers' datasheet, to perform MPPT simulations, is described. The method takes into account variations on the ambient conditions (sun irradiation and solar cells temperature) and allows fast MPPT methods comparison or their performance prediction when applied to a particular solar panel. The feasibility of the described methodology is checked with four different MPPT methods applied to a commercial solar panel, within a day, and under realistic ambient conditions.

  5. Efficient Integration of Highly Eccentric Orbits by Scaling Methods Applied to Kustaanheimo-Stiefel Regularization

    Science.gov (United States)

    Fukushima, Toshio

    2004-12-01

    We apply our single scaling method to the numerical integration of perturbed two-body problems regularized by the Kustaanheimo-Stiefel (K-S) transformation. The scaling is done by multiplying a single scaling factor with the four-dimensional position and velocity vectors of an associated harmonic oscillator in order to maintain the Kepler energy relation in terms of the K-S variables. As with the so-called energy rectification of Aarseth, the extra cost for the scaling is negligible, since the integration of the Kepler energy itself is already incorporated in the original K-S formulation. On the other hand, the single scaling method can be applied at every integration step without facing numerical instabilities. For unperturbed cases, the single scaling applied at every step gives a better result than either the original K-S formulation, the energy rectification applied at every apocenter, or the single scaling method applied at every apocenter. For the perturbed cases, however, the single scaling method applied at every apocenter provides the best performance for all perturbation types, whether the main source of error is truncation or round-off.
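
    As an illustration of the single scaling step: assuming the standard K-S energy relation 2|u'|² − E|u|² = μ (with u, u' the four-dimensional position and velocity in fictitious time, E the Kepler energy, and μ the gravitational parameter — notation reconstructed from the K-S literature, not from this abstract), one factor s applied to both vectors restores the relation:

```python
import numpy as np

def ks_single_scale(u, up, mu, E):
    """Rescale the K-S position u and velocity up (4-vectors) by a single
    factor s so that the Kepler energy relation 2|u'|^2 - E|u|^2 = mu
    holds again after accumulated integration error."""
    s = np.sqrt(mu / (2.0 * np.dot(up, up) - E * np.dot(u, u)))
    return s * u, s * up
```

    Because the same factor multiplies position and velocity, the correction is cheap (no extra force evaluations), which matches the abstract's point that the scaling cost is negligible.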

  6. Active Problem Solving and Applied Research Methods in a Graduate Course on Numerical Methods

    Science.gov (United States)

    Maase, Eric L.; High, Karen A.

    2008-01-01

    "Chemical Engineering Modeling" is a first-semester graduate course traditionally taught in a lecture format at Oklahoma State University. The course as taught by the author for the past seven years focuses on numerical and mathematical methods as necessary skills for incoming graduate students. Recent changes to the course have included Visual…

  7. Viewpoint: An Alternative Teaching Method. The WFTU Applies Active Methods to Educate Workers.

    Science.gov (United States)

    Courbe, Jean-Francois

    1989-01-01

    Develops a set of ideas and practices acquired from experience in organizing trade union education sessions. The method is based on observations that lecturing has not proved highly efficient, although traditional approaches--lecture, reading, discussion--are not totally rejected. (JOW)

  8. Proposal and Evaluation of Management Method for College Mechatronics Education Applying the Project Management

    Science.gov (United States)

    Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto

    In this research, we proposed and evaluated a management method of college mechatronics education. We applied the project management to college mechatronics education. We practiced our management method to the seminar “Microcomputer Seminar” for 3rd grade students who belong to Department of Electrical Engineering, Shibaura Institute of Technology. We succeeded in management of Microcomputer Seminar in 2006. We obtained the good evaluation for our management method by means of questionnaire.

  9. Forensic chemistry: perspective of new analytical methods applied to documentoscopy, ballistic and drugs of abuse

    OpenAIRE

    Romão, Wanderson; Schwab, Nicolas V; Bueno, Maria Izabel M. S; Sparrapan, Regina; Eberlin, Marcos N; Martiny, Andrea; Sabino, Bruno D; Maldaner, Adriano O

    2011-01-01

    In this review recent methods developed and applied to solve criminal occurences related to documentoscopy, ballistic and drugs of abuse are discussed. In documentoscopy, aging of ink writings, the sequence of line crossings and counterfeiting of documents are aspects to be solved with reproducible, fast and non-destructive methods. In ballistic, the industries are currently producing ''lead-free'' or ''nontoxic'' handgun ammunitions, so new methods of gunshot residues characterization are be...

  10. Apparatus and method for applying an end plug to a fuel rod tube end

    International Nuclear Information System (INIS)

    Rieben, S.L.; Wylie, M.E.

    1987-01-01

    An apparatus is described for applying an end plug to a hollow end of a nuclear fuel rod tube, comprising: support means mounted for reciprocal movement between remote and adjacent positions relative to a nuclear fuel rod tube end to which an end plug is to be applied; guide means supported on the support means for movement; and drive means coupled to the support means and being actuatable for movement between retracted and extended positions for reciprocally moving the support means between its respective remote and adjacent positions. A method for applying an end plug to a hollow end of a nuclear fuel rod tube is also described

  11. Method of levelized discounted costs applied in economic evaluation of nuclear power plant project

    International Nuclear Information System (INIS)

    Tian Li; Wang Yongqing; Liu Jingquan; Guo Jilin; Liu Wei

    2000-01-01

    The main methods of economic evaluation of bids in common use are introduced. The characteristics of the levelized discounted cost method and its application are presented. The method of levelized discounted cost is applied to the cost calculation in the economic evaluation of a 200 MW nuclear heating reactor. The results indicate that the method of levelized discounted costs is simple and feasible, and is considered most suitable for the economic evaluation of various cases. The method is suggested for use in national economic evaluation.

  12. Method to detect substances in a body and device to apply the method

    International Nuclear Information System (INIS)

    Voigt, H.

    1978-01-01

    The method and the measuring arrangement serve to localize pellets doped with Gd₂O₃ lying between UO₂ pellets within a reactor fuel rod. The fuel rod penetrates a homogeneous magnetic field generated between two pole shoes. The magnetic stray field caused by the doping substance is then measured by means of Hall probes (e.g. InAs) for quantitative discrimination from UO₂. The position of the Gd₂O₃-doped pellets is determined by moving the fuel rod through the magnetic field in a direction perpendicular to the homogeneous field. The measuring signal is caused by the different susceptibility of Gd₂O₃ with respect to UO₂. (DG) [de

  13. Applying a mixed-methods evaluation to Healthy Kids, Healthy Communities.

    Science.gov (United States)

    Brownson, Ross C; Kemner, Allison L; Brennan, Laura K

    2015-01-01

    From 2008 to 2014, the Healthy Kids, Healthy Communities (HKHC) national program funded 49 communities across the United States and Puerto Rico to implement healthy eating and active living policy, system, and environmental changes to support healthier communities for children and families, with special emphasis on reaching children at highest risk for obesity on the basis of race, ethnicity, income, or geographic location. Evaluators designed a mixed-methods evaluation to capture the complexity of the HKHC projects, understand implementation, and document perceived and actual impacts of these efforts. Eight complementary evaluation methods addressed 4 primary aims seeking to (1) coordinate data collection for the evaluation through the web-based project management system (HKHC Community Dashboard) and provide training and technical assistance for use of this system; (2) guide data collection and analysis through use of the Assessment and Evaluation Toolkit; (3) conduct a quantitative cross-site impact evaluation among a subset of community partnership sites; and (4) conduct a qualitative cross-site process and impact evaluation among all 49 community partnership sites. Evaluators identified successes and challenges in relation to the following methods: an online performance-monitoring HKHC Community Dashboard system, environmental audits, direct observations, individual and group interviews, partnership and community capacity surveys, group model building, photographs and videos, and secondary data sources (surveillance data and record review). Several themes emerged, including the value of systems approaches, the need for capacity building for evaluation, the value of focusing on upstream and downstream outcomes, and the importance of practical approaches for dissemination. The mixed-methods evaluation of HKHC advances evaluation science related to community-based efforts for addressing childhood obesity in complex community settings. The findings are likely to

  14. Using short-message-service notification as a method to improve acute flaccid paralysis surveillance in Papua New Guinea.

    Science.gov (United States)

    Datta, Siddhartha Sankar; Ropa, Berry; Sui, Gerard Pai; Khattar, Ramzi; Krishnan, Ravi Shankar Santhana Gopala; Okayasu, Hiromasa

    2016-05-17

    High-quality acute flaccid paralysis (AFP) surveillance is required to maintain the polio-free status of a country. Papua New Guinea (PNG) is considered one of the highest-risk countries for polio re-importation and circulation in the Western Pacific Region (WPRO) of the World Health Organization, due to poor healthcare infrastructure and inadequate performance in AFP surveillance. The Government of PNG, in collaboration with WHO, piloted the introduction of short-message-service (SMS) notifications to sensitize pediatricians and provincial disease control officers to AFP and to receive notification of possible AFP cases, in order to improve surveillance quality in PNG. Ninety-six health care professionals were registered to receive SMS reminders to report any case of acute flaccid paralysis. Fourteen SMS messages were sent to each participant from September 2012 to November 2013. The number of reported AFP cases was compared before and after the introduction of SMS. Two hundred fifty-three unique responses were received, an overall response rate of 21 %. More than 80 % of responses were reported within 3 days of sending the SMS. The number of reported AFP cases increased from 10 cases per year in 2009-2012 to 25 cases per year during the study period and correlated with provincial participation of the health care professionals. Combined with improved sensitization of health care professionals on AFP reporting criteria and sample collection, SMS messaging provides an effective means to increase timely reporting and improve the availability of epidemiologic information on polio surveillance in PNG.

  15. Several methods applied to measuring residual stress in a known specimen

    International Nuclear Information System (INIS)

    Prime, M.B.; Rangaswamy, P.; Daymond, M.R.; Abelin, T.G.

    1998-01-01

    In this study, a beam with a precisely known residual stress distribution provided a unique experimental opportunity. A plastically bent beam was carefully prepared in order to provide a specimen with a known residual stress profile. 21Cr-6Ni-9Mn austenitic stainless steel was obtained as 43 mm square forged stock. Several methods were used to determine the residual stresses, and the results were compared to the known values. Some subtleties of applying the various methods were exposed

  16. Method of applying single higher order polynomial basis function over multiple domains

    CSIR Research Space (South Africa)

    Lysko, AA

    2010-03-01

    Full Text Available The method of moments (MoM) with higher-order polynomial basis functions is applied to a surface form of the electric field integral equation, under the thin-wire approximation. The main advantage of the proposed method is that it permits a reduction in the required number of unknowns when...

  17. Method of applying single higher order polynomial basis function over multiple domains

    CSIR Research Space (South Africa)

    Lysko, AA

    2010-03-01

    Full Text Available A novel method has been devised where one set of higher-order polynomial-based basis functions can be applied over several wire segments, thus permitting the number of unknowns to be decoupled from the number of segments, and so from the geometrical...

  18. 21 CFR 111.320 - What requirements apply to laboratory methods for testing and examination?

    Science.gov (United States)

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false What requirements apply to laboratory methods for testing and examination? 111.320 Section 111.320 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION CURRENT GOOD MANUFACTURING...

  19. Critical path method applied to research project planning: Fire Economics Evaluation System (FEES)

    Science.gov (United States)

    Earl B. Anderson; R. Stanton Hales

    1986-01-01

    The critical path method (CPM) of network analysis (a) depicts precedence among the many activities in a project by a network diagram; (b) identifies critical activities by calculating their starting, finishing, and float times; and (c) displays possible schedules by constructing time charts. CPM was applied to the development of the Forest Service's Fire...
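
    The three CPM steps named above (forward pass for earliest starts, backward pass for latest finishes, float as the difference) can be sketched as follows; the activity network is a made-up example, not the FEES project data:

```python
def cpm(activities):
    """Critical path analysis.
    activities: ordered {name: (duration, [predecessor names])},
    with every predecessor listed before its successors.
    Returns ({name: (earliest_start, latest_start, total_float)}, length)."""
    es = {}
    # forward pass: earliest start = max over predecessors' finishes
    for name, (dur, preds) in activities.items():
        es[name] = max((es[p] + activities[p][0] for p in preds), default=0)
    length = max(es[n] + d for n, (d, _) in activities.items())
    # backward pass: latest finish, propagated from project end
    lf = {n: length for n in activities}
    for name in reversed(list(activities)):
        dur, preds = activities[name]
        for p in preds:
            lf[p] = min(lf[p], lf[name] - dur)
    return ({n: (es[n], lf[n] - d, lf[n] - d - es[n])
             for n, (d, _) in activities.items()}, length)
```

    Activities with zero total float are the critical ones; for a toy network A→{B,C}→D with durations 3, 2, 4, 1, the path A-C-D comes out critical.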

  20. Applying terminological methods and description logic for creating and implementing and ontology on inhibition

    DEFF Research Database (Denmark)

    Zambach, Sine; Madsen, Bodil Nistrup

    2009-01-01

    By applying formal terminological methods to model an ontology within the domain of enzyme inhibition, we aim to clarify concepts and to obtain consistency. Additionally, we propose a procedure for implementing this ontology in OWL with the aim of obtaining a strict structure which can form...

  1. [Alternative medicine methods applied in patients before surgical treatment of lumbar discopathy].

    Science.gov (United States)

    Rutkowska, E; Kamiński, S; Kucharczyk, A

    2001-01-01

    Case records of 200 patients operated on in 1998/99 for herniated lumbar disc in the Neurosurgery Dept. showed that 95 patients (47.5%) had previously been treated with 148 alternative medical or non-medical procedures. The authors discuss the problem of non-conventional treatment methods applied for herniated lumbar disc by professionals or non-professionals. The procedures are often dangerous.

  2. The Effect Of The Applied Performance Methods On The Objective Of The Managers

    Directory of Open Access Journals (Sweden)

    Derya Kara

    2009-09-01

    Full Text Available Within the changing management concept, employees and employers constantly feel the need to keep up with the changing environment. In this regard, performance evaluation activities are regarded as an indispensable element. Data obtained from performance evaluation activities shed light on the development of employees and enable enterprises to survive in a fiercely competitive environment. This study sets out to determine the effect of the applied performance methods on the objectives of the managers. The population of the study comprises 182 five-star hotel enterprises operating in Antalya, İzmir and Muğla, with 2184 managers; the sample comprised 578 managers. The results of the study suggest that the applied performance methods have a significant effect on the objectives of the managers. The objective of managers applying the 360-degree performance evaluation method was found to be “finding out the training and development needs”, while the objective of managers applying conventional performance evaluation methods was “enhancing the existing performance”.

  3. Applying Activity Based Costing (ABC) Method to Calculate Cost Price in Hospital and Remedy Services.

    Science.gov (United States)

    Rajabi, A; Dabiri, A

    2012-01-01

    Activity Based Costing (ABC) is one of the new methods that began appearing as a costing methodology in the 1990s. It calculates cost price by determining the usage of resources. In this study, the ABC method was used for calculating the cost price of remedial services in hospitals. To apply the ABC method, Shahid Faghihi Hospital was selected. First, hospital units were divided into three main departments: administrative, diagnostic, and hospitalization. Second, activity centers were defined by the activity analysis method. Third, costs of administrative activity centers were allocated to the diagnostic and operational departments based on cost drivers. Finally, with regard to the usage of cost objectives from services of activity centers, the cost price of medical services was calculated. The cost price from the ABC method differs significantly from the tariff method. In addition, the high level of indirect costs in the hospital indicates that resource capacities are not used properly. The cost price of remedial services is not properly calculated with the tariff method when compared with the ABC method. ABC calculates cost price by applying suitable mechanisms, but the tariff method is based on a fixed price. In addition, ABC provides useful information about the amount and composition of the cost price of services.
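
    The core ABC step, allocating each activity center's cost to cost objects in proportion to cost-driver usage, can be sketched as follows; the activity names and figures are invented for illustration, not taken from the hospital study:

```python
def abc_cost(resource_costs, driver_usage):
    """Activity-based costing allocation.
    resource_costs: {activity_center: total cost}.
    driver_usage: {service: {activity_center: driver units consumed}}.
    Each center's cost is split among services in proportion to the
    driver units they consume."""
    totals = {a: sum(driver_usage[s].get(a, 0) for s in driver_usage)
              for a in resource_costs}
    return {s: sum(resource_costs[a] * driver_usage[s].get(a, 0) / totals[a]
                   for a in resource_costs if totals[a])
            for s in driver_usage}
```

    By construction the allocation is exhaustive: the service costs sum back to the total cost of all activity centers that were actually used.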

  4. Aiding surveillance

    International Development Research Centre (IDRC) Digital Library (Canada)

    arashid

    generator in itself. Yet surveillance unconstrained by legal frameworks, human rights protections, and the rule of law has the ... Analysis of the potential adverse implications of using personal information is often completely ... acknowledge the potential of new information technologies to strengthen electoral processes.

  5. An applied study using systems engineering methods to prioritize green systems options

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Sonya M [Los Alamos National Laboratory; Macdonald, John M [Los Alamos National Laboratory

    2009-01-01

    For many years, there have been questions about the effectiveness of applying different green solutions. If you're building a home and wish to use green technologies, where do you start? While all technologies sound promising, which will perform the best over time? All this has to be considered within the cost and schedule of the project. The amount of information available on the topic can be overwhelming. We seek to examine whether Systems Engineering methods can be used to help people choose and prioritize technologies that fit within their project and budget. Several methods are used to gain perspective into how to select green technologies, such as the Analytic Hierarchy Process (AHP) and Kepner-Tregoe. In our study, subjects applied these methods to analyze cost, schedule, and trade-offs. Results will document whether the experimental approach is applicable to defining system priorities for green technologies.

  6. Economic consequences assessment for scenarios and actual accidents do the same methods apply

    International Nuclear Information System (INIS)

    Brenot, J.

    1991-01-01

    Methods for estimating the economic consequences of major technological accidents, and their corresponding computer codes, are briefly presented, with emphasis on the basic choices. When applied to hypothetical scenarios, these methods give results that are of interest to risk managers from a decision-aiding perspective. Simultaneously, the various costs and the procedures for their estimation are reviewed for some actual accidents (Three Mile Island, Chernobyl, ...). These costs are used in a perspective of litigation and compensation. The comparison of the methods used and the cost estimates obtained for scenarios and actual accidents shows the points of convergence and discrepancy, which are discussed.

  7. Police surveillance and driving speed.

    NARCIS (Netherlands)

    2008-01-01

    Although speed plays a large part in the occurrence of crashes, drivers often exceed the speed limit. The police use various methods when carrying out their speed surveillance. In the Netherlands positive effects have been found of speed surveillance with radar cars (without stopping). It is to be

  8. Applying the Support Vector Machine Method to Matching IRAS and SDSS Catalogues

    Directory of Open Access Journals (Sweden)

    Chen Cao

    2007-10-01

    Full Text Available This paper presents results of applying a machine learning technique, the Support Vector Machine (SVM), to the astronomical problem of matching the Infra-Red Astronomical Satellite (IRAS) and Sloan Digital Sky Survey (SDSS) object catalogues. In this study, the IRAS catalogue has much larger positional uncertainties than those of the SDSS. A model was constructed by applying the supervised learning algorithm (SVM) to a set of training data. Validation of the model shows a good identification performance (∼90% correct), better than that derived from classical cross-matching algorithms, such as the likelihood-ratio method used in previous studies.
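
    A minimal sketch of this kind of supervised cross-match classifier, using scikit-learn's SVC; the two features (positional offset, magnitude difference) and the synthetic training data are illustrative assumptions, not the paper's actual feature set:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
n = 200
# Hypothetical features for candidate IRAS-SDSS pairs:
# [positional offset in arcsec, optical-IR magnitude difference].
# True matches cluster at small offsets; spurious pairs are spread out.
matches = np.column_stack([np.abs(rng.normal(0.0, 5.0, n)),
                           rng.normal(0.0, 0.5, n)])
spurious = np.column_stack([rng.uniform(20.0, 60.0, n),
                            rng.normal(0.0, 2.0, n)])
X = np.vstack([matches, spurious])
y = np.r_[np.ones(n), np.zeros(n)]        # 1 = match, 0 = spurious

clf = SVC(kernel="rbf", gamma="scale").fit(X, y)  # supervised training
accuracy = clf.score(X, y)
```

    In practice the labels would come from a verified training sample, and validation would use a held-out set rather than the training accuracy computed here.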

  9. A stochastic root finding approach: the homotopy analysis method applied to Dyson-Schwinger equations

    Science.gov (United States)

    Pfeffer, Tobias; Pollet, Lode

    2017-04-01

    We present the construction and stochastic summation of rooted-tree diagrams, based on the expansion of a root finding algorithm applied to the Dyson-Schwinger equations. The mathematical formulation shows superior convergence properties compared to the bold diagrammatic Monte Carlo approach and the developed algorithm allows one to tackle generic high-dimensional integral equations, to avoid the curse of dealing explicitly with high-dimensional objects and to access non-perturbative regimes. The sign problem remains the limiting factor, but it is not found to be worse than in other approaches. We illustrate the method for φ⁴ theory but note that it applies in principle to any model.

  10. Health surveillance under adverse ergonomics conditions--validity of a screening method adapted for the occupational health service.

    Science.gov (United States)

    Jonker, Dirk; Gustafsson, Ewa; Rolander, Bo; Arvidsson, Inger; Nordander, Catarina

    2015-01-01

    A new health surveillance protocol for work-related upper-extremity musculoskeletal disorders has been validated by comparing the results with a reference protocol. The studied protocol, Health Surveillance in Adverse Ergonomics Conditions (HECO), is a new version of the reference protocol modified for application in the Occupational Health Service (OHS). The HECO protocol contains both a screening part and a diagnosing part. Sixty-three employees were examined. The screening in HECO did not miss any diagnosis found when using the reference protocol, but in comparison to the reference protocol considerable time savings could be achieved. Fair to good agreement between the protocols was obtained for one or more diagnoses in neck/shoulders (86%, k = 0.62) and elbow/hands (84%, k = 0.49). Therefore, the results obtained using the HECO protocol can be compared with a reference material collected with the reference protocol, and thus provide information of the magnitude of disorders in an examined work group. Practitioner Summary: The HECO protocol is a relatively simple physical examination protocol for identification of musculoskeletal disorders in the neck and upper extremities. The protocol is a reliable and cost-effective tool for the OHS to use for occupational health surveillance in order to detect workplaces at high risk for developing musculoskeletal disorders.
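
    The agreement figures quoted above (86%, k = 0.62) are raw percent agreement and Cohen's kappa; both statistics for two raters' labels can be sketched as follows (the label lists in the usage example are hypothetical):

```python
def agreement(a, b):
    """Raw percent agreement between two raters' label lists."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohen_kappa(a, b):
    """Cohen's kappa: agreement corrected for chance, computed from
    the raters' marginal label frequencies."""
    n = len(a)
    po = agreement(a, b)                        # observed agreement
    pe = sum((a.count(lbl) / n) * (b.count(lbl) / n)
             for lbl in set(a) | set(b))        # expected by chance
    return (po - pe) / (1.0 - pe)
```

    Kappa is 1 for perfect agreement and 0 when observed agreement equals what the raters' marginal rates would produce by chance, which is why the protocol comparison reports both numbers.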

  11. Health surveillance under adverse ergonomics conditions – validity of a screening method adapted for the occupational health service

    Science.gov (United States)

    Jonker, Dirk; Gustafsson, Ewa; Rolander, Bo; Arvidsson, Inger; Nordander, Catarina

    2015-01-01

    A new health surveillance protocol for work-related upper-extremity musculoskeletal disorders has been validated by comparing the results with a reference protocol. The studied protocol, Health Surveillance in Adverse Ergonomics Conditions (HECO), is a new version of the reference protocol modified for application in the Occupational Health Service (OHS). The HECO protocol contains both a screening part and a diagnosing part. Sixty-three employees were examined. The screening in HECO did not miss any diagnosis found when using the reference protocol, but in comparison to the reference protocol considerable time savings could be achieved. Fair to good agreement between the protocols was obtained for one or more diagnoses in neck/shoulders (86%, k = 0.62) and elbow/hands (84%, k = 0.49). Therefore, the results obtained using the HECO protocol can be compared with a reference material collected with the reference protocol, and thus provide information of the magnitude of disorders in an examined work group. Practitioner Summary: The HECO protocol is a relatively simple physical examination protocol for identification of musculoskeletal disorders in the neck and upper extremities. The protocol is a reliable and cost-effective tool for the OHS to use for occupational health surveillance in order to detect workplaces at high risk for developing musculoskeletal disorders. PMID:25761380

  12. Control Method for Electromagnetic Unmanned Robot Applied to Automotive Test Based on Improved Smith Predictor Compensator

    Directory of Open Access Journals (Sweden)

    Gang Chen

    2015-07-01

    Full Text Available A new control method for an electromagnetic unmanned robot applied to automotive testing (URAT), based on an improved Smith predictor compensator and considering a time delay, is proposed. The mechanical system structure and the control system structure are presented. The electromagnetic URAT adopts pulse width modulation (PWM) control, with displacement and current forming a double closed-loop control strategy. A coordinated control method for the multiple manipulators of the electromagnetic URAT, emulating a skilled human driver with intelligent decision-making ability, is provided, and an improved Smith predictor compensator controller for the electromagnetic URAT considering a time delay is designed. Experiments are conducted using a Ford FOCUS automobile. Comparisons between the PID control method and the proposed method are conducted. Experimental results show that the proposed method can achieve accurate tracking of the target vehicle speed and reduce the mileage deviation of autonomous driving, which meets the requirements of national test standards.
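
    A minimal discrete-time sketch of a Smith predictor loop: a plain PI controller acts on the delay-free internal model, while the difference between the plant output and the delayed model output corrects for mismatch. All dynamics and gains here are invented first-order illustrations, not the URAT controller itself:

```python
def simulate_smith(steps=2000, dt=0.01, tau=1.0, gain=1.0, delay=50,
                   kp=2.0, ki=2.0, r=1.0):
    """PI control of a first-order plant with an input time delay,
    using a Smith predictor built from a (here, perfect) internal model.
    Returns the plant output at the end of the simulation."""
    y = ym = 0.0                # plant and delay-free model outputs
    ubuf = [0.0] * delay        # input delay line seen by the plant
    mbuf = [0.0] * delay        # delayed copy of the model output
    integ = 0.0
    for _ in range(steps):
        # Smith feedback: delay-free model + (plant - delayed model);
        # with a perfect model this reduces to the undelayed prediction.
        feedback = ym + (y - mbuf[0])
        e = r - feedback
        integ += e * dt
        u = kp * e + ki * integ
        ym += dt * (gain * u - ym) / tau            # delay-free model
        mbuf.append(ym)
        mbuf.pop(0)
        ubuf.append(u)
        y += dt * (gain * ubuf.pop(0) - y) / tau    # plant, delayed input
    return y
```

    Because the controller effectively sees the undelayed model, the PI gains can be tuned as if the delay were absent, which is the point of the compensator.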

  13. Optimization methods of the net emission computation applied to cylindrical sodium vapor plasma

    International Nuclear Information System (INIS)

    Hadj Salah, S.; Hajji, S.; Ben Hamida, M. B.; Charrada, K.

    2015-01-01

    An optimization method based on a physical analysis of the temperature profile and of the different terms in the radiative transfer equation is developed to reduce the computation time of the net emission. This method has been applied to a cylindrical discharge in sodium vapor. Numerical results show a relative error in spectral flux density values lower than 5% with respect to an exact solution, whereas the computation time is about 10 orders of magnitude less. This method is followed by a spectral method based on the rearrangement of the line profiles. Results are shown for a Lorentzian profile; they demonstrate a relative error lower than 10% with respect to the reference method and a gain in computation time of about 20 orders of magnitude.

  14. Multigrid method applied to the solution of an elliptic, generalized eigenvalue problem

    Energy Technology Data Exchange (ETDEWEB)

    Alchalabi, R.M. [BOC Group, Murray Hill, NJ (United States); Turinsky, P.J. [North Carolina State Univ., Raleigh, NC (United States)

    1996-12-31

    The work presented in this paper is concerned with the development of an efficient MG algorithm for the solution of an elliptic, generalized eigenvalue problem. The application is specifically the multigroup neutron diffusion equation, which is discretized by utilizing the Nodal Expansion Method (NEM). The underlying relaxation method is the Power Method, also known as the Outer-Inner Method. The inner iterations are completed using Multi-color Line SOR, and the outer iterations are accelerated using the Chebyshev Semi-iterative Method. Furthermore, the MG algorithm utilizes the consistent homogenization concept to construct the restriction operator, and a form function as a prolongation operator. The MG algorithm was integrated into the reactor neutronic analysis code NESTLE, and numerical results were obtained from solving production-type benchmark problems.
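
    The outer (power) iteration for this kind of generalized eigenproblem can be sketched as follows; here the inner solve is a direct dense solve rather than Multi-color Line SOR, and no Chebyshev acceleration or multigrid is applied, so this shows only the outer-inner structure:

```python
import numpy as np

def power_iteration(M, F, tol=1e-12, maxit=1000):
    """Outer (power) iteration for M*phi = (1/k)*F*phi, the form used in
    multigroup neutron diffusion (M: loss operator, F: fission source).
    Each outer step solves a fixed-source problem and updates the
    eigenvalue k from the ratio of successive fission sources."""
    phi = np.ones(M.shape[0])
    k = 1.0
    for _ in range(maxit):
        phi_new = np.linalg.solve(M, F @ phi / k)   # "inner" solve
        k_new = k * (F @ phi_new).sum() / (F @ phi).sum()
        phi, converged = phi_new, abs(k_new - k) < tol * abs(k_new)
        k = k_new
        if converged:
            break
    return k, phi / np.linalg.norm(phi)
```

    In a production code the inner linear solves are only approximate (relaxation sweeps) and the outer iteration is accelerated, exactly the combination the abstract describes.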

  15. Least Square NUFFT Methods Applied to 2D and 3D Radially Encoded MR Image Reconstruction

    Science.gov (United States)

    Song, Jiayu; Liu, Qing H.; Gewalt, Sally L.; Cofer, Gary; Johnson, G. Allan

    2009-01-01

    Radially encoded MR imaging (MRI) has gained increasing attention in applications such as hyperpolarized gas imaging, contrast-enhanced MR angiography, and dynamic imaging, due to its motion insensitivity and improved artifact properties. However, since the technique collects k-space samples nonuniformly, multidimensional (especially 3D) radially sampled MRI image reconstruction is challenging. The balance between reconstruction accuracy and speed becomes critical when a large data set is processed. Kaiser-Bessel gridding reconstruction has been widely used for non-Cartesian reconstruction. The objective of this work is to provide an alternative reconstruction option in high dimensions with on-the-fly kernels calculation. The work develops general multi-dimensional least square nonuniform fast Fourier transform (LS-NUFFT) algorithms and incorporates them into a k-space simulation and image reconstruction framework. The method is then applied to reconstruct the radially encoded k-space, although the method addresses general nonuniformity and is applicable to any non-Cartesian patterns. Performance assessments are made by comparing the LS-NUFFT based method with the conventional Kaiser-Bessel gridding method for 2D and 3D radially encoded computer simulated phantoms and physically scanned phantoms. The results show that the LS-NUFFT reconstruction method has better accuracy-speed efficiency than the Kaiser-Bessel gridding method when the kernel weights are calculated on the fly. The accuracy of the LS-NUFFT method depends on the choice of scaling factor, and it is found that for a particular conventional kernel function, using its corresponding deapodization function as scaling factor and utilizing it into the LS-NUFFT framework has the potential to improve accuracy. When a cosine scaling factor is used, in particular, the LS-NUFFT method is faster than Kaiser-Bessel gridding method because of a quasi closed-form solution. 
The method is successfully applied to 2D and 3D radially encoded MR data.
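    The conventional Kaiser-Bessel gridding step that the abstract uses as its baseline can be sketched in 1D. This is a minimal illustration, not the paper's implementation: the kernel width (4 grid cells) and shape parameter (beta = 8) are illustrative choices, and density compensation and deapodization are omitted.

```python
import numpy as np

def kb_kernel(u, width=4.0, beta=8.0):
    # Kaiser-Bessel interpolation kernel, nonzero for |u| < width/2
    x = 2.0 * u / width
    out = np.zeros_like(u)
    m = np.abs(x) < 1.0
    out[m] = np.i0(beta * np.sqrt(1.0 - x[m] ** 2)) / np.i0(beta)
    return out

def grid_1d(k, data, n=64, width=4.0, beta=8.0):
    # Spread nonuniform k-space samples (k in grid units) onto a periodic Cartesian grid
    grid = np.zeros(n, dtype=complex)
    half = width / 2.0
    for kj, dj in zip(k, data):
        lo, hi = int(np.ceil(kj - half)), int(np.floor(kj + half))
        for g in range(lo, hi + 1):
            grid[g % n] += dj * kb_kernel(np.array([g - kj]), width, beta)[0]
    return grid
```

    After gridding, a standard FFT and division by the kernel's Fourier transform (deapodization) complete the reconstruction; the LS-NUFFT of the abstract replaces the fixed kernel with least-squares-optimal interpolation weights.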

  16. Agglomeration multigrid methods with implicit Runge-Kutta smoothers applied to aerodynamic simulations on unstructured grids

    Science.gov (United States)

    Langer, Stefan

    2014-11-01

    For unstructured finite volume methods an agglomeration multigrid with an implicit multistage Runge-Kutta method as a smoother is developed for solving the compressible Reynolds averaged Navier-Stokes (RANS) equations. The implicit Runge-Kutta method is interpreted as a preconditioned explicit Runge-Kutta method. The construction of the preconditioner is based on an approximate derivative. The linear systems are solved approximately with a symmetric Gauss-Seidel method. To significantly improve this solution method grid anisotropy is treated within the Gauss-Seidel iteration in such a way that the strong couplings in the linear system are resolved by tridiagonal systems constructed along these directions of strong coupling. The agglomeration strategy is adapted to this procedure by taking into account exactly these anisotropies in such a way that a directional coarsening is applied along these directions of strong coupling. Turbulence effects are included by a Spalart-Allmaras model, and the additional transport-type equation is approximately solved in a loosely coupled manner with the same method. For two-dimensional and three-dimensional numerical examples and a variety of differently generated meshes we show the wide range of applicability of the solution method. Finally, we exploit the GMRES method to determine approximate spectral information of the linearized RANS equations. This approximate spectral information is used to discuss and compare characteristics of multistage Runge-Kutta methods.
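    The full agglomeration multigrid of this paper is far beyond a sketch, but the role of its inner smoother can be illustrated on a model problem. Below, a symmetric (forward then backward) Gauss-Seidel sweep, as used to approximately solve the paper's linear systems, is applied to a 1D Poisson matrix; the tridiagonal line treatment of anisotropy is omitted, and the problem setup is an assumption for illustration only.

```python
import numpy as np

def sgs_sweep(A, b, x):
    # One forward + one backward Gauss-Seidel sweep (symmetric Gauss-Seidel)
    n = len(b)
    for i in range(n):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    for i in reversed(range(n)):
        x[i] = (b[i] - A[i, :i] @ x[:i] - A[i, i + 1:] @ x[i + 1:]) / A[i, i]
    return x

# 1D Poisson model problem; the smoother's job is to damp oscillatory error,
# leaving the smooth error components for the coarse (agglomerated) levels
n = 32
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.zeros(n)
x0 = np.sin(0.9 * np.pi * np.arange(n))   # high-frequency initial error
x = x0.copy()
for _ in range(5):
    x = sgs_sweep(A, b, x)
```

    A few sweeps reduce the oscillatory error by orders of magnitude, which is exactly the property a multigrid smoother needs.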

  17. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

    Science.gov (United States)

    Said, Nadia; Engelhart, Michael; Kirches, Christian; Körkel, Stefan; Holt, Daniel V

    2016-01-01

    Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.

  18. The 2D Spectral Intrinsic Decomposition Method Applied to Image Analysis

    Directory of Open Access Journals (Sweden)

    Samba Sidibe

    2017-01-01

    Full Text Available We propose a new method for autoadaptive image decomposition and recomposition based on the two-dimensional version of the Spectral Intrinsic Decomposition (SID. We introduce a faster diffusivity function for the computation of the mean envelope operator which provides the components of the SID algorithm for any signal. The 2D version of SID algorithm is implemented and applied to some very known images test. We extracted relevant components and obtained promising results in images analysis applications.

  19. Accuracy of the Adomian decomposition method applied to the Lorenz system

    International Nuclear Information System (INIS)

    Hashim, I.; Noorani, M.S.M.; Ahmad, R.; Bakar, S.A.; Ismail, E.S.; Zakaria, A.M.

    2006-01-01

    In this paper, the Adomian decomposition method (ADM) is applied to the famous Lorenz system. The ADM yields an analytical solution in terms of a rapidly convergent infinite power series with easily computable terms. Comparisons between the decomposition solutions and the fourth-order Runge-Kutta (RK4) numerical solutions are made for various time steps. In particular we look at the accuracy of the ADM as the Lorenz system changes from a non-chaotic system to a chaotic one

  20. Applying the Goal-Question-Indicator-Metric (GQIM) Method to Perform Military Situational Analysis

    Science.gov (United States)

    2016-05-11

    When developing situational awareness in support of military operations, the U.S. armed forces use a mnemonic, or memory aide, to... Applying the Goal-Question-Indicator-Metric (GQIM) Method to Perform Military Situational Analysis, Douglas Gray, May 2016. Acknowledgments: The subject matter covered in this technical note evolved from an excellent question from Capt. Tomomi Ogasawara, Japan Ground Self...
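    The GQIM method itself decomposes a goal into questions, indicators, and metrics. As a hedged illustration of that hierarchy (the structure below is a generic GQIM rendering, not taken from this technical note), it can be modeled as nested data:

```python
# A hypothetical GQIM decomposition: goal -> questions -> indicators -> metrics.
# All names here are illustrative assumptions, not content of the report.
gqim = {
    "goal": "Maintain situational awareness of the area of operations",
    "questions": [
        {
            "question": "Is the operational picture current?",
            "indicators": [
                {
                    "indicator": "Timeliness of incoming reports",
                    "metrics": ["median report age (minutes)",
                                "fraction of units reporting in last hour"],
                }
            ],
        }
    ],
}

def all_metrics(g):
    # Flatten the hierarchy to the measurable quantities at its leaves
    return [m for q in g["questions"]
            for i in q["indicators"]
            for m in i["metrics"]]
```

    Traversing to the leaves yields the concrete measurements to collect, which is the point of the method.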

  1. Applied Ecosystem Analysis - - a Primer : EDT the Ecosystem Diagnosis and Treatment Method.

    Energy Technology Data Exchange (ETDEWEB)

    Lestelle, Lawrence C.; Mobrand, Lars E.

    1996-05-01

    The aim of this document is to inform and instruct the reader about an approach to ecosystem management that is based upon salmon as an indicator species. It is intended to provide natural resource management professionals with the background information needed to answer questions about why and how to apply the approach. The methods and tools the authors describe are continually updated and refined, so this primer should be treated as a first iteration of a sequentially revised manual.

  2. Applied ecosystem analysis - a primer; the ecosystem diagnosis and treatment method

    International Nuclear Information System (INIS)

    Lestelle, L.C.; Mobrand, L.E.; Lichatowich, J.A.; Vogel, T.S.

    1996-05-01

    The aim of this document is to inform and instruct the reader about an approach to ecosystem management that is based upon salmon as an indicator species. It is intended to provide natural resource management professionals with the background information needed to answer questions about why and how to apply the approach. The methods and tools the authors describe are continually updated and refined, so this primer should be treated as a first iteration of a sequentially revised manual

  3. Health surveillance

    International Nuclear Information System (INIS)

    1981-01-01

    The Code includes a number of requirements for the health surveillance of employees associated with the mining and milling of radioactive ores. This guideline is particularly directed at determining the level of fitness of employees and prospective employees, detecting any symptom which might contraindicate exposure to the environment encountered in mine/mill situations, examination of any employee who may have been exposed to radiation in excess of defined limits and the accumulation and provision of data on the health of employees

  4. Rinderpest surveillance

    International Nuclear Information System (INIS)

    2003-01-01

    Rinderpest is probably the most lethal virus disease of cattle and buffalo and can destroy whole populations; damaging economies; undermining food security and ruining the livelihood of farmers and pastoralists. The disease can be eradicated by vaccination and control of livestock movement. The Department of Technical Co-operation is sponsoring a programme, with technical support from the Joint FAO/IAEA Division to provide advice, training and materials to thirteen states through the 'Support for Rinderpest Surveillance in West Asia' project. (IAEA)

  5. A Comparison of Parametric and Non-Parametric Methods Applied to a Likert Scale

    OpenAIRE

    Mircioiu, Constantin; Atkinson, Jeffrey

    2017-01-01

    A trenchant and passionate dispute over the use of parametric versus non-parametric methods for the analysis of Likert scale ordinal data has raged for the past eight decades. The answer is not a simple “yes” or “no” but is related to hypotheses, objectives, risks, and paradigms. In this paper, we took a pragmatic approach. We applied both types of methods to the analysis of actual Likert data on responses from different professional subgroups of European pharmacists regarding competencies fo...
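    The pragmatic comparison the abstract describes can be sketched with both test families side by side on simulated Likert responses. The subgroup distributions below are assumptions for illustration; the Welch t statistic and the normal-approximation Mann-Whitney statistic (without tie correction) are implemented directly in numpy.

```python
import numpy as np

rng = np.random.default_rng(0)

def avg_ranks(x):
    # Ranks 1..N, ties replaced by their average rank
    order = np.argsort(x, kind="mergesort")
    r = np.empty(len(x))
    sx = x[order]
    i, rank = 0, 1
    while i < len(x):
        j = i
        while j + 1 < len(x) and sx[j + 1] == sx[i]:
            j += 1
        r[order[i:j + 1]] = (2 * rank + (j - i)) / 2.0
        rank += j - i + 1
        i = j + 1
    return r

def welch_t(a, b):
    va, vb = a.var(ddof=1) / len(a), b.var(ddof=1) / len(b)
    return (a.mean() - b.mean()) / np.sqrt(va + vb)

def mann_whitney_z(a, b):
    # Normal approximation, no tie correction (adequate for a sketch)
    n1, n2 = len(a), len(b)
    r = avg_ranks(np.concatenate([a, b]))
    u1 = r[:n1].sum() - n1 * (n1 + 1) / 2.0
    mu, sd = n1 * n2 / 2.0, np.sqrt(n1 * n2 * (n1 + n2 + 1) / 12.0)
    return (u1 - mu) / sd

# Two hypothetical professional subgroups answering one 5-point Likert item
grp1 = rng.choice([1, 2, 3, 4, 5], size=100, p=[0.05, 0.10, 0.20, 0.35, 0.30])
grp2 = rng.choice([1, 2, 3, 4, 5], size=100, p=[0.25, 0.30, 0.25, 0.15, 0.05])
t_stat = welch_t(grp1.astype(float), grp2.astype(float))
z_stat = mann_whitney_z(grp1.astype(float), grp2.astype(float))
```

    With a clear group difference, both the parametric and the non-parametric statistic flag it, which is the kind of practical convergence the paper examines.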

  6. A method for finding the ridge between saddle points applied to rare event rate estimates

    DEFF Research Database (Denmark)

    Maronsson, Jon Bergmann; Jónsson, Hannes; Vegge, Tejs

    2012-01-01

    A method is presented for finding the ridge between first order saddle points on a multidimensional surface. For atomic scale systems, such saddle points on the energy surface correspond to atomic rearrangement mechanisms. Information about the ridge can be used to test the validity of the harmonic...... to the path. The method is applied to Al adatom diffusion on the Al(100) surface to find the ridge between 2-, 3- and 4-atom concerted displacements and hop mechanisms. A correction to the harmonic approximation of transition state theory was estimated by direct evaluation of the configuration integral along...

  7. Development of a tracking method for augmented reality applied to nuclear plant maintenance work

    International Nuclear Information System (INIS)

    Shimoda, Hiroshi; Maeshima, Masayuki; Nakai, Toshinori; Bian, Zhiqiang; Ishii, Hirotake; Yoshikawa, Hidekazu

    2005-01-01

    In this paper, a plant maintenance support method is described which employs a state-of-the-art information technology, Augmented Reality (AR), in order to improve the efficiency of NPP maintenance work and to prevent human error. Although AR has great potential to support various kinds of work in the real world, it is difficult to apply it to actual work support because the tracking method is the bottleneck for practical use. In this study, a bar code marker tracking method is proposed in order to apply an AR system to maintenance work support in the NPP field. The proposed method calculates the user's position and orientation in real time from two long markers captured by the user-mounted camera. The markers can be easily pasted on the pipes in the plant field, and they can be easily recognized at long distances, which reduces the number of markers that must be pasted in the work field. Experiments were conducted in a laboratory and in the plant field to evaluate the proposed method. The results show that (1) fast and stable tracking can be realized, (2) the position error in the camera view is less than 1%, which is almost perfect given the limitation of camera resolution, and (3) it is relatively difficult to catch two markers in one camera view, especially at short distances

  8. Lessons learned applying CASE methods/tools to Ada software development projects

    Science.gov (United States)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  9. Method to integrate clinical guidelines into the electronic health record (EHR) by applying the archetypes approach.

    Science.gov (United States)

    Garcia, Diego; Moro, Claudia Maria Cabral; Cicogna, Paulo Eduardo; Carvalho, Deborah Ribeiro

    2013-01-01

    Clinical guidelines are documents that assist healthcare professionals, facilitating and standardizing diagnosis, management, and treatment in specific areas. Computerized guidelines as decision support systems (DSS) attempt to increase the performance of tasks and facilitate the use of guidelines. Most DSS are not integrated into the electronic health record (EHR), requiring some degree of rework, especially related to data collection. This study's objective was to present a method for integrating clinical guidelines into the EHR. The study first developed a way to identify the data and rules contained in the guidelines, and then incorporated the rules into an archetype-based EHR. The proposed method was tested on anemia treatment in the Chronic Kidney Disease Guideline. The phases of the method are: data and rules identification; archetype elaboration; rule definition and inclusion in an inference engine; and DSS-EHR integration and validation. The main feature of the proposed method is that it is generic and can be applied to any type of guideline.

  10. Applying the response matrix method for solving coupled neutron diffusion and transport problems

    International Nuclear Information System (INIS)

    Sibiya, G.S.

    1980-01-01

    The numerical determination of the flux and power distribution in the design of large power reactors is quite a time-consuming procedure if the space under consideration is to be subdivided into very fine meshes. Many computing methods applied in reactor physics (such as the finite-difference method) require considerable computing time. In this thesis it is shown that the response matrix method can be successfully used as an alternative approach to solving the two-dimensional diffusion equation. Furthermore it is shown that sufficient accuracy is achieved by assuming a linear space dependence of the neutron currents on the boundaries of the geometries defined for the given space. (orig.) [de

  11. Health surveillance - myth and reality

    International Nuclear Information System (INIS)

    Sharp, C.

    1998-01-01

    This paper discusses the principles, health benefit and cost-effectiveness of health surveillance in the occupational setting, which apply to exposure to ionising radiations in the same manner as to other hazards in the workplace. It highlights the techniques for undertaking health surveillance, discusses their relative advantages and disadvantages and illustrates these in relation to specific hazards. The responsibilities of the medical staff and of the worker are also discussed. (author)

  12. Applying some methods to process the data coming from the nuclear reactions

    International Nuclear Information System (INIS)

    Suleymanov, M.K.; Abdinov, O.B.; Belashev, B.Z.

    2010-01-01

    Full text: Methods for a posteriori enhancement of spectral-line resolution are proposed for processing data coming from nuclear reactions. The methods have been applied to data from nuclear reactions at high energies. They make it possible to extract more detailed information on the structure of the spectra of particles emitted in nuclear reactions. Nuclear reactions are a main source of information on the structure and physics of atomic nuclei. Usually the spectra of the reaction fragments are complex, so it is not simple to extract the information needed for an investigation. In the talk we discuss methods for a posteriori enhancement of spectral-line resolution, which can be useful for processing complex data coming from nuclear reactions. We consider the Fourier transformation method and the maximum entropy method. Complex structures were identified by these methods; at least two selected points are indicated. Recently we presented a talk showing the results of analyzing the structure of the pseudorapidity spectra of charged relativistic particles with ≥ 0.7 measured in Au+Em and Pb+Em at AGS and SPS energies using the Fourier transformation and maximum entropy methods. The dependences of these spectra on the number of fast target protons were studied. The distributions visually show a plateau and a shoulder, that is, at least three selected points. The plateaus become wider in Pb+Em reactions. The existence of a plateau is necessary for parton models. The maximum entropy method could confirm the existence of the plateau and the shoulder in the distributions. The figure shows the results of applying the maximum entropy method; one can see that the method indicates several clearly selected points, some of which coincide with those observed visually. We would like to note that the Fourier transformation method could not

  13. The development and evaluation of a PDA-based method for public health surveillance data collection in developing countries

    DEFF Research Database (Denmark)

    Yu, Ping; de Courten, Maximilian; Pan, Elaine

    2009-01-01

    EpiData and Epi Info are often used together by public health agencies around the world, particularly in developing countries, to meet their needs for low-cost public health data management; however, the current open source data management technology lacks a mobile component to meet the needs of mobile public health data collectors. The goal of this project is to explore the opportunity of filling this gap through the development and trial of a personal digital assistant (PDA) based data collection/entry system. It evaluated whether such a system could increase efficiency and reduce data transcription errors for public health surveillance data collection in developing countries, represented by Fiji.

  14. Should methods of correction for multiple comparisons be applied in pharmacovigilance?

    Directory of Open Access Journals (Sweden)

    Lorenza Scotti

    2015-12-01

    Full Text Available Purpose. In pharmacovigilance, spontaneous reporting databases are devoted to the early detection of adverse event 'signals' of marketed drugs. A common limitation of these systems is the wide number of concurrently investigated associations, implying a high probability of generating positive signals simply by chance. However, it is not clear whether methods aimed at adjusting for the multiple testing problem are needed when at least some of the drug-outcome relationships under study are known. To this aim we applied a robust estimation method for the FDR (rFDR) particularly suitable in the pharmacovigilance context. Methods. We exploited the data available for the SAFEGUARD project to apply the rFDR estimation method to detect potential false positive signals of adverse reactions attributable to the use of non-insulin blood glucose lowering drugs. Specifically, the number of signals generated from the conventional disproportionality measures was compared before and after the application of the rFDR adjustment method. Results. Among the 311 evaluable pairs (i.e., drug-event pairs with at least one adverse event report), 106 (34%) signals were considered significant in the conventional analysis. Among them, 1 was classified as a false positive signal according to the rFDR method. Conclusions. The results of this study suggest that when a restricted number of drug-outcome pairs is considered and warnings about some of them are known, multiple comparison methods for recognizing false positive signals are not as useful as theoretical considerations would suggest.
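    The paper's robust rFDR estimator is not reproduced here, but the underlying idea of controlling the false discovery rate across many drug-event pairs can be illustrated with the standard Benjamini-Hochberg step-up procedure:

```python
import numpy as np

def benjamini_hochberg(pvals, q=0.05):
    # Return a boolean mask of discoveries controlling FDR at level q (BH step-up)
    p = np.asarray(pvals, dtype=float)
    m = len(p)
    order = np.argsort(p)
    thresh = q * np.arange(1, m + 1) / m        # i/m * q for the i-th smallest p
    below = p[order] <= thresh
    keep = np.zeros(m, dtype=bool)
    if below.any():
        kmax = np.max(np.nonzero(below)[0])     # largest i with p_(i) <= i*q/m
        keep[order[:kmax + 1]] = True           # reject all hypotheses up to it
    return keep
```

    Applied to the p-values of disproportionality tests for all drug-event pairs, this screens out the signals expected to arise by chance alone.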

  15. Solution of the neutron point kinetics equations with temperature feedback effects applying the polynomial approach method

    Energy Technology Data Exchange (ETDEWEB)

    Tumelero, Fernanda, E-mail: fernanda.tumelero@yahoo.com.br [Universidade Federal do Rio Grande do Sul (UFRGS), Porto Alegre, RS (Brazil). Programa de Pos-Graduacao em Engenharia Mecanica; Petersen, Claudio Z.; Goncalves, Glenio A.; Lazzari, Luana, E-mail: claudiopeteren@yahoo.com.br, E-mail: gleniogoncalves@yahoo.com.br, E-mail: luana-lazzari@hotmail.com [Universidade Federal de Pelotas (DME/UFPEL), Capao do Leao, RS (Brazil). Instituto de Fisica e Matematica

    2015-07-01

    In this work, we present a solution of the Neutron Point Kinetics Equations with temperature feedback effects applying the Polynomial Approach Method. For the solution, we consider one and six groups of delayed neutrons precursors with temperature feedback effects and constant reactivity. The main idea is to expand the neutron density, delayed neutron precursors and temperature as a power series considering the reactivity as an arbitrary function of the time in a relatively short time interval around an ordinary point. In the first interval one applies the initial conditions of the problem and the analytical continuation is used to determine the solutions of the next intervals. With the application of the Polynomial Approximation Method it is possible to overcome the stiffness problem of the equations. In such a way, one varies the time step size of the Polynomial Approach Method and performs an analysis about the precision and computational time. Moreover, we compare the method with different types of approaches (linear, quadratic and cubic) of the power series. The answer of neutron density and temperature obtained by numerical simulations with linear approximation are compared with results in the literature. (author)

  16. A methodological framework applied to the choice of the best method in replacement of nuclear systems

    International Nuclear Information System (INIS)

    Vianna Filho, Alfredo Marques

    2009-01-01

    The economic equipment replacement problem is a central question in Nuclear Engineering. On the one hand, new equipment is more attractive given its better performance, better reliability, lower maintenance cost, etc. New equipment, however, requires a higher initial investment. On the other hand, old equipment represents the opposite situation, with lower performance, lower reliability and especially higher maintenance costs, but in contrast lower financial and insurance costs. The weighing of all these costs can be done with deterministic and probabilistic methods applied to the study of equipment replacement. Two distinct types of problem are examined: replacement imposed by wear and replacement imposed by failures. To solve the problem of nuclear system replacement imposed by wear, deterministic methods are discussed. To solve the problem of nuclear system replacement imposed by failures, probabilistic methods are discussed. The aim of this paper is to present a methodological framework for choosing the most suitable method for the nuclear system replacement problem. (author)
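    One standard deterministic tool for replacement imposed by wear is the equivalent annual cost (EAC): for each candidate holding period, discount all costs to present value and annualize with the capital recovery factor, then keep the period with the lowest EAC. The sketch below uses purely hypothetical numbers and is not taken from the paper.

```python
def equivalent_annual_cost(price, salvage, op_cost, rate):
    """Return (best_n, best_eac): the keep-for-n-years policy minimizing EAC.

    salvage[n-1] is the resale value after n years; op_cost[t] is the
    operating/maintenance cost incurred at the end of year t+1.
    """
    best = None
    for n in range(1, len(op_cost) + 1):
        crf = rate * (1 + rate) ** n / ((1 + rate) ** n - 1)   # capital recovery factor
        pv_om = sum(op_cost[t] / (1 + rate) ** (t + 1) for t in range(n))
        pv = price - salvage[n - 1] / (1 + rate) ** n + pv_om
        eac = pv * crf
        if best is None or eac < best[1]:
            best = (n, eac)
    return best
```

    With rising maintenance costs and falling salvage value, the EAC curve is U-shaped and its minimum gives the economic replacement age.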

  17. Power secant method applied to natural frequency extraction of Timoshenko beam structures

    Directory of Open Access Journals (Sweden)

    C.A.N. Dias

    Full Text Available This work deals with an improved plane frame formulation whose exact dynamic stiffness matrix (DSM presents, uniquely, null determinant for the natural frequencies. In comparison with the classical DSM, the formulation herein presented has some major advantages: local mode shapes are preserved in the formulation so that, for any positive frequency, the DSM will never be ill-conditioned; in the absence of poles, it is possible to employ the secant method in order to have a more computationally efficient eigenvalue extraction procedure. Applying the procedure to the more general case of Timoshenko beams, we introduce a new technique, named "power deflation", that makes the secant method suitable for the transcendental nonlinear eigenvalue problems based on the improved DSM. In order to avoid overflow occurrences that can hinder the secant method iterations, limiting frequencies are formulated, with scaling also applied to the eigenvalue problem. Comparisons with results available in the literature demonstrate the strength of the proposed method. Computational efficiency is compared with solutions obtained both by FEM and by the Wittrick-Williams algorithm.
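    The paper's dynamic stiffness matrix and "power deflation" technique are not reproduced here, but the core idea of extracting natural frequencies by applying the secant method to a transcendental characteristic equation can be shown on a classical stand-in: the clamped-free Euler-Bernoulli beam, whose frequency equation is cos(λ)cosh(λ) + 1 = 0 with first root λ₁ ≈ 1.8751.

```python
import math

def secant(f, x0, x1, tol=1e-12, maxit=100):
    # Derivative-free secant iteration for a scalar root
    for _ in range(maxit):
        f0, f1 = f(x0), f(x1)
        if f1 == f0:
            break
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        if abs(x2 - x1) < tol:
            return x2
        x0, x1 = x1, x2
    return x1

# Characteristic equation of a clamped-free beam (a classical stand-in,
# not the paper's dynamic stiffness determinant)
char_eq = lambda lam: math.cos(lam) * math.cosh(lam) + 1.0
lam1 = secant(char_eq, 1.5, 2.0)
```

    The overflow danger the abstract mentions is visible here too: cosh grows exponentially, which is why the paper scales the problem and bounds the frequency search.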

  18. Error diffusion method applied to design combined CSG-BSG element used in ICF driver

    Science.gov (United States)

    Zhang, Yixiao; Yao, Xin; Gao, Fuhua; Guo, Yongkang; Wang, Lei; Hou, Xi

    2006-08-01

    In the final optics assembly of an Inertial Confinement Fusion (ICF) driver, Diffractive Optical Elements (DOEs) are applied to achieve some important functions, such as harmonic wave separation, beam sampling, beam smoothing and pulse compression. However, in order to optimize the system structure, decrease the energy loss and avoid damage from laser induction or self-focusing effects, the number of elements used in the ICF system, especially in the final optics assembly, should be minimized. The multiple exposure method has been proposed for this purpose, to fabricate the BSG and CSG on one surface of a silica plate. But the multiple etch processes utilized in this method are complex and introduce large alignment errors. The error diffusion method, based on pulse-density modulation, has been widely used in signal processing and computer generated holograms (CGH). In this paper, according to the error diffusion method in CGH and partially coherent imaging theory, we present a new method to design the coding mask of a combined CSG-BSG element with the error diffusion method. With the designed mask, only one exposure process is needed to fabricate the combined element, which greatly reduces the fabrication difficulty and avoids the alignment errors introduced by multiple etch processes. We illustrate the coding mask designed for the CSG-BSG element with this method and compare the intensity distribution of the spatial image in the partially coherent imaging system with the desired relief.
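    The mask-design procedure for the CSG-BSG element is not reproduced here, but the pulse-density modulation principle it builds on is the classic Floyd-Steinberg error diffusion: each continuous value is quantized and the quantization error is pushed onto unprocessed neighbors, so local average density tracks the input.

```python
import numpy as np

def floyd_steinberg(img):
    # Binarize a grayscale image in [0,1]; quantization error is diffused
    # to the right and lower neighbors with the 7/16, 3/16, 5/16, 1/16 weights
    f = img.astype(float).copy()
    h, w = f.shape
    out = np.zeros_like(f)
    for y in range(h):
        for x in range(w):
            old = f[y, x]
            new = 1.0 if old >= 0.5 else 0.0
            out[y, x] = new
            err = old - new
            if x + 1 < w:
                f[y, x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    f[y + 1, x - 1] += err * 3 / 16
                f[y + 1, x] += err * 5 / 16
                if x + 1 < w:
                    f[y + 1, x + 1] += err * 1 / 16
    return out
```

    The binary output preserves the local mean of the input, which is the property that lets a binary coding mask approximate a continuous relief in the paper's setting.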

  19. Solution and study of nodal neutron transport equation applying the LTSN-DiagExp method

    International Nuclear Information System (INIS)

    Hauser, Eliete Biasotto; Pazos, Ruben Panta; Vilhena, Marco Tullio de; Barros, Ricardo Carvalho de

    2003-01-01

    In this paper we report advances in the three-dimensional nodal discrete-ordinates approximations of the neutron transport equation for Cartesian geometry. We use the combined collocation method for the angular variables and a nodal approach for the spatial variables. By nodal approach we mean the iterated transverse integration of the S N equations. This procedure leads to a set of one-dimensional averaged angular fluxes in each spatial variable. The resulting system of equations is solved with the LTS N method, first applying the Laplace transform to the set of nodal S N equations and then obtaining the solution by symbolic computation. We include the LTS N method by diagonalization to solve the nodal neutron transport equation, and then we outline the convergence of these nodal-LTS N approximations with the help of a norm associated with the quadrature formula used to approximate the integral term of the neutron transport equation. (author)

  20. Single trial EEG classification applied to a face recognition experiment using different feature extraction methods.

    Science.gov (United States)

    Li, Yudu; Ma, Sen; Hu, Zhongze; Chen, Jiansheng; Su, Guangda; Dou, Weibei

    2015-01-01

    Research on brain machine interfaces (BMI) has developed very rapidly in recent years. Numerous feature extraction methods have successfully been applied to electroencephalogram (EEG) classification in various experiments. However, little effort has been spent on EEG based BMI systems regarding the cognition of familiar human faces. In this work, we have implemented and compared the classification performances of four common feature extraction methods, namely, common spatial pattern, principal component analysis, wavelet transform and interval features. High resolution EEG signals were collected from fifteen healthy subjects stimulated by an equal number of familiar and novel faces. Principal component analysis outperforms the other methods, with average classification accuracy reaching 94.2%, leading to possible real life applications. Our findings thereby may contribute to BMI systems for face recognition.
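    The best-performing pipeline in this study, PCA features plus a classifier, can be sketched end to end. The synthetic data below merely stand in for EEG epochs (the class structure, dimensions, and the in-sample nearest-centroid classifier are assumptions for illustration, not the paper's pipeline).

```python
import numpy as np

rng = np.random.default_rng(0)

def pca_fit(X, n_comp):
    # PCA via SVD of the centered data matrix
    mu = X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_comp]

def pca_transform(X, mu, comps):
    return (X - mu) @ comps.T

# Synthetic "epochs": the two classes differ along one latent direction
n, d = 200, 64
W = rng.normal(size=(d, d))                 # mixing, standing in for scalp projection
labels = np.repeat([0, 1], n // 2)
latent = rng.normal(size=(n, d)) * 0.5
latent[:, 0] = latent[:, 0] + labels * 3.0  # class-dependent shift in one latent dim
X = latent @ W.T

mu, comps = pca_fit(X, 5)
Z = pca_transform(X, mu, comps)

# Nearest-centroid classification on the 5 PCA features (in-sample, for brevity)
c0, c1 = Z[labels == 0].mean(0), Z[labels == 1].mean(0)
pred = (np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)).astype(int)
acc = (pred == labels).mean()
```

    Because the class difference dominates the variance, it lands in the leading principal components and a very simple classifier separates the conditions.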

  1. A reflective lens: applying critical systems thinking and visual methods to ecohealth research.

    Science.gov (United States)

    Cleland, Deborah; Wyborn, Carina

    2010-12-01

    Critical systems methodology has been advocated as an effective and ethical way to engage with the uncertainty and conflicting values common to ecohealth problems. We use two contrasting case studies, coral reef management in the Philippines and national park management in Australia, to illustrate the value of critical systems approaches in exploring how people respond to environmental threats to their physical and spiritual well-being. In both cases, we used visual methods--participatory modeling and rich picturing, respectively. The critical systems methodology, with its emphasis on reflection, guided an appraisal of the research process. A discussion of these two case studies suggests that visual methods can be usefully applied within a critical systems framework to offer new insights into ecohealth issues across a diverse range of socio-political contexts. With this article, we hope to open up a conversation with other practitioners to expand the use of visual methods in integrated research.

  2. A note on the accuracy of spectral method applied to nonlinear conservation laws

    Science.gov (United States)

    Shu, Chi-Wang; Wong, Peter S.

    1994-01-01

    Fourier spectral method can achieve exponential accuracy both on the approximation level and for solving partial differential equations if the solutions are analytic. For a linear partial differential equation with a discontinuous solution, Fourier spectral method produces poor point-wise accuracy without post-processing, but still maintains exponential accuracy for all moments against analytic functions. In this note we assess the accuracy of Fourier spectral method applied to nonlinear conservation laws through a numerical case study. We find that the moments with respect to analytic functions are no longer very accurate. However the numerical solution does contain accurate information which can be extracted by a post-processing based on Gegenbauer polynomials.
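    The note's premise on the approximation level can be illustrated numerically with a truncated Fourier series of a discontinuous function: pointwise accuracy is destroyed by the O(1) Gibbs overshoot at the jump, while moments against analytic functions remain spectrally accurate. The square-wave example below is a standard illustration, not the note's conservation-law case study.

```python
import numpy as np

def square_wave_partial_sum(x, N):
    # Truncated Fourier series of sign(x) on [-pi, pi): only odd sine modes survive
    s = np.zeros_like(x)
    for k in range(1, N + 1, 2):
        s += (4.0 / (np.pi * k)) * np.sin(k * x)
    return s

x = np.linspace(-np.pi, np.pi, 4001)
fN = square_wave_partial_sum(x, 201)

# Pointwise: the Gibbs overshoot next to the jump stays near 1.179 (not 1.0)
overshoot = fN.max()

# Moment against the analytic function sin(x): the periodic rectangle rule is
# exact for trigonometric polynomials; the true value of the integral is 4
h = x[1] - x[0]
moment = np.sum(fN[:-1] * np.sin(x[:-1])) * h
```

    The overshoot never shrinks as more modes are added, yet the moment is correct to machine precision, which is exactly the dichotomy the note examines for nonlinear problems.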

  3. Artificial intelligence methods applied for quantitative analysis of natural radioactive sources

    International Nuclear Information System (INIS)

    Medhat, M.E.

    2012-01-01

    Highlights: ► Basic description of artificial neural networks. ► Natural gamma ray sources and the problem of their detection. ► Application of a neural network for peak detection and activity determination. - Abstract: The artificial neural network (ANN) represents one of the artificial intelligence methods used for modeling and uncertainty estimation in different applications. The objective of the proposed work was to apply ANNs to identify isotopes and to predict the uncertainties of their activities for some natural radioactive sources. The method was tested on gamma-ray spectra emitted from natural radionuclides in soil samples, detected by high-resolution gamma-ray spectrometry based on HPGe (high purity germanium). The principle of the suggested method is described, including the definition of relevant input parameters, input data scaling and network training. It is clear that there is satisfactory agreement between the obtained and predicted results using the neural network.
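    A minimal version of the approach, a small network trained by plain gradient descent to detect a gamma peak in a noisy spectrum, can be sketched with numpy. The architecture, the synthetic spectra, and all parameters below are illustrative assumptions, not the paper's network or data.

```python
import numpy as np

rng = np.random.default_rng(1)

def make_spectrum(has_peak, n=64):
    # Synthetic spectrum: linear background + noise, optionally a Gaussian peak
    x = np.arange(n)
    s = 5.0 + 0.02 * x + rng.normal(0, 0.5, n)
    if has_peak:
        s += 4.0 * np.exp(-0.5 * ((x - 32) / 2.0) ** 2)
    return s

X = np.array([make_spectrum(i % 2 == 1) for i in range(400)])
y = (np.arange(400) % 2 == 1).astype(float)
X = (X - X.mean(0)) / X.std(0)              # channel-wise input scaling

# One hidden tanh layer, sigmoid output, gradient descent on cross-entropy
d, h = X.shape[1], 8
W1 = rng.normal(0, 0.1, (d, h)); b1 = np.zeros(h)
w2 = rng.normal(0, 0.1, h); b2 = 0.0
lr = 0.1
for _ in range(500):
    H = np.tanh(X @ W1 + b1)
    p = 1 / (1 + np.exp(-(H @ w2 + b2)))
    g = (p - y) / len(y)                    # dLoss/dlogit, averaged over samples
    gH = np.outer(g, w2) * (1 - H ** 2)     # backprop through tanh
    W1 -= lr * (X.T @ gH); b1 -= lr * gH.sum(0)
    w2 -= lr * (H.T @ g);  b2 -= lr * g.sum()

H = np.tanh(X @ W1 + b1)
p = 1 / (1 + np.exp(-(H @ w2 + b2)))
acc = ((p > 0.5) == (y == 1)).mean()
```

    Extending the output layer from peak presence to activity (and its uncertainty) is conceptually the regression step the paper performs.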

  4. Finite volume and finite element methods applied to 3D laminar and turbulent channel flows

    Science.gov (United States)

    Louda, Petr; Sváček, Petr; Kozel, Karel; Příhoda, Jaromír

    2014-12-01

    The work deals with numerical simulations of incompressible flow in channels with rectangular cross section. The rectangular cross section itself leads to the development of various secondary flow patterns, where the accuracy of the simulation is influenced by the numerical viscosity of the scheme and by turbulence modeling. In this work some developments of a stabilized finite element method are presented. Its results are compared with those of an implicit finite volume method, also described here, in laminar and turbulent flows. It is shown that numerical viscosity can cause errors of the same magnitude as those of different turbulence models. The finite volume method is also applied to 3D turbulent flow around a backward-facing step, and good agreement with 3D experimental results is obtained.

  5. Finite volume and finite element methods applied to 3D laminar and turbulent channel flows

    Energy Technology Data Exchange (ETDEWEB)

    Louda, Petr; Příhoda, Jaromír [Institute of Thermomechanics, Czech Academy of Sciences, Prague (Czech Republic); Sváček, Petr; Kozel, Karel [Czech Technical University in Prague, Fac. of Mechanical Engineering (Czech Republic)

    2014-12-10

    The work deals with numerical simulations of incompressible flow in channels with rectangular cross section. The rectangular cross section itself leads to the development of various secondary flow patterns, where the accuracy of the simulation is influenced by the numerical viscosity of the scheme and by turbulence modeling. In this work some developments of a stabilized finite element method are presented. Its results are compared with those of an implicit finite volume method, also described here, in laminar and turbulent flows. It is shown that numerical viscosity can cause errors of the same magnitude as those of different turbulence models. The finite volume method is also applied to 3D turbulent flow around a backward-facing step, and good agreement with 3D experimental results is obtained.

  6. The reduction method of statistic scale applied to study of climatic change

    International Nuclear Information System (INIS)

    Bernal Suarez, Nestor Ricardo; Molina Lizcano, Alicia; Martinez Collantes, Jorge; Pabon Jose Daniel

    2000-01-01

    In climate change studies, the global circulation models of the atmosphere (GCMAs) enable one to simulate the global climate, with the field variables being represented on grid points 300 km apart. One particular interest concerns the simulation of possible changes in rainfall and surface air temperature due to an assumed increase of greenhouse gases. However, the models yield the climatic projections on grid points that in most cases do not correspond to the sites of major interest. To achieve local estimates of the climatological variables, methods like the one known as statistical downscaling are applied. In this article we show a case in point by applying canonical correlation analysis (CCA) to the Guajira Region in the northeast of Colombia.
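Statistical downscaling via canonical correlation analysis can be sketched compactly: CCA reduces to an SVD of the cross-product of whitened predictor (grid-point) and predictand (station) matrices. The snippet below is a generic illustration on synthetic data sharing one large-scale mode, not the Guajira analysis itself; all names are ours.

```python
import numpy as np

def cca_first_correlation(X, Y):
    # CCA via whitening + SVD: the singular values of Ux.T @ Uy are the
    # canonical correlations between the column spaces of X and Y
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    Ux = np.linalg.svd(X, full_matrices=False)[0]
    Uy = np.linalg.svd(Y, full_matrices=False)[0]
    return float(np.linalg.svd(Ux.T @ Uy, compute_uv=False)[0])

rng = np.random.default_rng(1)
n = 500
mode = rng.normal(size=n)   # shared large-scale circulation mode
X = np.outer(mode, rng.normal(size=10)) + 0.5 * rng.normal(size=(n, 10))  # "grid points"
Y = np.outer(mode, rng.normal(size=3)) + 0.5 * rng.normal(size=(n, 3))    # "stations"

r1 = cca_first_correlation(X, Y)
print(r1)   # close to 1: the shared mode is recovered
```

In a downscaling application, the leading canonical pairs would then drive a regression from GCM fields to station-scale rainfall or temperature.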

  7. An Effective Method on Applying Feedback Error Learning Scheme to Functional Electrical Stimulation Controller

    Science.gov (United States)

    Watanabe, Takashi; Kurosawa, Kenji; Yoshizawa, Makoto

    A Feedback Error Learning (FEL) scheme was found to be applicable to joint angle control by Functional Electrical Stimulation (FES) in our previous study. However, the FEL-FES controller had a problem in learning the inverse dynamics model (IDM) in some cases. In this paper, methods of applying FEL to FES control were examined in controlling 1-DOF movement of the wrist joint by stimulating 2 muscles, through computer simulation under several control conditions with several subject models. The problems in applying FEL to the FES controller were suggested to lie in restricting stimulation intensity to positive values between the minimum and maximum intensities, and in cases of very small output values of the IDM. Learning of the IDM was greatly improved by taking the IDM output range into account and setting a minimum ANN output value when calculating the ANN connection weight changes.

  8. Parallel Implicit Runge-Kutta Methods Applied to Coupled Orbit/Attitude Propagation

    Science.gov (United States)

    Hatten, Noble; Russell, Ryan P.

    2017-12-01

    A variable-step Gauss-Legendre implicit Runge-Kutta (GLIRK) propagator is applied to coupled orbit/attitude propagation. Concepts previously shown to improve efficiency in 3DOF propagation are modified and extended to the 6DOF problem, including the use of variable-fidelity dynamics models. The impact of computing the stage dynamics of a single step in parallel is examined using up to 23 threads and 22 associated GLIRK stages; one thread is reserved for an extra dynamics function evaluation used in the estimation of the local truncation error. Efficiency is found to peak for typical examples when using approximately 8 to 12 stages for both serial and parallel implementations. Accuracy and efficiency compare favorably to explicit Runge-Kutta and linear-multistep solvers for representative scenarios. However, linear-multistep methods are found to be more efficient for some applications, particularly in a serial computing environment, or when parallelism can be applied across multiple trajectories.
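For reference, a fixed-step, serial, 2-stage Gauss-Legendre IRK step (the smallest member of the family the paper scales up to 22 stages) can be sketched as follows. The stage equations are solved by plain fixed-point iteration, and the test problem, step sizes and iteration count are ours; a production propagator would use variable steps and Newton-type stage solves.

```python
import numpy as np

S3 = np.sqrt(3.0)
A = np.array([[0.25,          0.25 - S3 / 6],
              [0.25 + S3 / 6, 0.25]])        # 2-stage Gauss-Legendre tableau
b = np.array([0.5, 0.5])                     # order 4

def f(y):               # toy ODE: y' = -y
    return -y

def glirk2_step(y, h):
    # solve the implicit stage equations k_i = f(y + h * sum_j A_ij k_j)
    # by fixed-point iteration (contractive for small enough h)
    k = np.array([f(y), f(y)])
    for _ in range(50):
        k = np.array([f(y + h * (A[i, 0] * k[0] + A[i, 1] * k[1])) for i in range(2)])
    return y + h * (b[0] * k[0] + b[1] * k[1])

def integrate(h, T=1.0):
    y = 1.0
    for _ in range(int(round(T / h))):
        y = glirk2_step(y, h)
    return y

exact = np.exp(-1.0)
e1 = abs(integrate(0.10) - exact)
e2 = abs(integrate(0.05) - exact)
print(e1, e2, e1 / e2)   # halving h should shrink the error ~2**4-fold
```

The fourth-order convergence shows up as an error ratio near 16 when the step is halved, consistent with the method's classical order.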

  9. ADVANTAGES AND DISADVANTAGES OF APPLYING EVOLVED METHODS IN MANAGEMENT ACCOUNTING PRACTICE

    Directory of Open Access Journals (Sweden)

    SABOU FELICIA

    2014-05-01

    Full Text Available The evolved methods of management accounting have been developed with the purpose of removing the disadvantages of the classical methods; they are methods adapted to the new market conditions, which provide much more useful cost-related information so that the management of the company is able to take certain strategic decisions. Out of the category of evolved methods, the most used is the standard-cost method, due to the advantages it presents, being widely used in calculating production costs in some developed countries. The main advantages of the standard-cost method are: in-advance knowledge of the production costs and of the measures that ensure compliance with these; systematic control over the costs with the help of the deviations calculated from the standard costs, thus allowing decisions to be made in due time as far as the elimination of the deviations and the improvement of the activity are concerned; and its use as a method of analysis, control and cost forecasting. Although the advantages of using standards are significant, there are a few disadvantages to the employment of the standard-cost method: sometimes difficulties can appear in establishing the deviations from the standard costs, and the method does not allow an accurate calculation of the fixed costs. As a result of the study, we can observe that the evolved methods of management accounting, as compared to the classical ones, present a series of advantages linked to better analysis, control and forecasting of costs, whereas the main disadvantage is related to the large amount of work necessary for these methods to be applied.

  10. ADVANTAGES AND DISADVANTAGES OF APPLYING EVOLVED METHODS IN MANAGEMENT ACCOUNTING PRACTICE

    Directory of Open Access Journals (Sweden)

    SABOU FELICIA

    2014-05-01

    Full Text Available The evolved methods of management accounting have been developed with the purpose of removing the disadvantages of the classical methods; they are methods adapted to the new market conditions, which provide much more useful cost-related information so that the management of the company is able to take certain strategic decisions. Out of the category of evolved methods, the most used is the standard-cost method, due to the advantages it presents, being widely used in calculating production costs in some developed countries. The main advantages of the standard-cost method are: in-advance knowledge of the production costs and of the measures that ensure compliance with these; systematic control over the costs with the help of the deviations calculated from the standard costs, thus allowing decisions to be made in due time as far as the elimination of the deviations and the improvement of the activity are concerned; and its use as a method of analysis, control and cost forecasting. Although the advantages of using standards are significant, there are a few disadvantages to the employment of the standard-cost method: sometimes difficulties can appear in establishing the deviations from the standard costs, and the method does not allow an accurate calculation of the fixed costs. As a result of the study, we can observe that the evolved methods of management accounting, as compared to the classical ones, present a series of advantages linked to better analysis, control and forecasting of costs, whereas the main disadvantage is related to the large amount of work necessary for these methods to be applied.
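The deviation (variance) analysis at the heart of the standard-cost method can be illustrated with a minimal sketch; the figures below are hypothetical, not drawn from the article:

```python
# standard-cost deviation (variance) analysis for a direct material
standard_qty, standard_price = 100, 5.00   # standard: 100 units at 5.00/unit
actual_qty, actual_price = 110, 4.80       # actual consumption and price

price_variance = (actual_price - standard_price) * actual_qty      # ~ -22.0 (favourable)
quantity_variance = (actual_qty - standard_qty) * standard_price   # +50.0 (unfavourable)
total_deviation = price_variance + quantity_variance               # ~ +28.0 overall

# the two deviations reconcile actual cost with standard cost
print(price_variance, quantity_variance, total_deviation)
```

Splitting the total deviation into price and quantity components is what lets management act "in due time" on the specific cause of an overrun.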

  11. A Rapid Coordinate Transformation Method Applied in Industrial Robot Calibration Based on Characteristic Line Coincidence.

    Science.gov (United States)

    Liu, Bailing; Zhang, Fumin; Qu, Xinghua; Shi, Xiaojia

    2016-02-18

    Coordinate transformation plays an indispensable role in industrial measurements, including photogrammetry, geodesy, laser 3-D measurement and robotics. The widely applied methods of coordinate transformation are generally based on solving the equations of point clouds. Despite the high accuracy, this might result in no solution due to the use of ill-conditioned matrices. In this paper, a novel coordinate transformation method is proposed, based not on equation solving but on geometric transformation. We construct characteristic lines to represent the coordinate systems. According to the spatial geometric relation, the characteristic lines can be made to coincide by a series of rotations and translations. The transformation matrix can be obtained using matrix transformation theory. Experiments are designed to compare the proposed method with other methods. The results show that the proposed method has the same high accuracy, but the operation is more convenient and flexible. A multi-sensor combined measurement system is also presented to improve the position accuracy of a robot with the calibration of the robot kinematic parameters. Experimental verification shows that the position accuracy of the robot manipulator is improved by 45.8% with the proposed method and robot calibration.
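The geometric idea of making characteristic lines coincide through rotations and translations can be sketched for a single line. This is our simplified illustration, not the paper's full procedure: a second, non-parallel line would be needed to fix the remaining rotation about the line itself.

```python
import numpy as np

def rotation_aligning(a, b):
    """Rodrigues rotation taking unit vector a onto unit vector b
    (assumes a and b are not antiparallel)."""
    v = np.cross(a, b)
    c = float(a @ b)
    K = np.array([[0.0, -v[2], v[1]],
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])      # cross-product matrix of v
    return np.eye(3) + K + K @ K / (1.0 + c)

# one characteristic line seen in two frames: direction d and a point p on it
d_a = np.array([1.0, 0.0, 0.0]);                 p_a = np.array([0.0, 0.0, 0.0])
d_b = np.array([0.0, 1.0, 1.0]) / np.sqrt(2.0);  p_b = np.array([1.0, 2.0, 3.0])

R = rotation_aligning(d_a, d_b)   # rotation bringing the directions into coincidence
t = p_b - R @ p_a                 # translation bringing the points into coincidence

print(np.allclose(R @ d_a, d_b), np.allclose(R @ p_a + t, p_b))
```

Unlike least-squares point-cloud solutions, this construction involves no linear system and therefore no ill-conditioned matrix, which is the advantage the abstract emphasizes.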

  12. A Rapid Coordinate Transformation Method Applied in Industrial Robot Calibration Based on Characteristic Line Coincidence

    Directory of Open Access Journals (Sweden)

    Bailing Liu

    2016-02-01

    Full Text Available Coordinate transformation plays an indispensable role in industrial measurements, including photogrammetry, geodesy, laser 3-D measurement and robotics. The widely applied methods of coordinate transformation are generally based on solving the equations of point clouds. Despite the high accuracy, this might result in no solution due to the use of ill-conditioned matrices. In this paper, a novel coordinate transformation method is proposed, based not on equation solving but on geometric transformation. We construct characteristic lines to represent the coordinate systems. According to the spatial geometric relation, the characteristic lines can be made to coincide by a series of rotations and translations. The transformation matrix can be obtained using matrix transformation theory. Experiments are designed to compare the proposed method with other methods. The results show that the proposed method has the same high accuracy, but the operation is more convenient and flexible. A multi-sensor combined measurement system is also presented to improve the position accuracy of a robot with the calibration of the robot kinematic parameters. Experimental verification shows that the position accuracy of the robot manipulator is improved by 45.8% with the proposed method and robot calibration.

  13. A Rapid Coordinate Transformation Method Applied in Industrial Robot Calibration Based on Characteristic Line Coincidence

    Science.gov (United States)

    Liu, Bailing; Zhang, Fumin; Qu, Xinghua; Shi, Xiaojia

    2016-01-01

    Coordinate transformation plays an indispensable role in industrial measurements, including photogrammetry, geodesy, laser 3-D measurement and robotics. The widely applied methods of coordinate transformation are generally based on solving the equations of point clouds. Despite the high accuracy, this might result in no solution due to the use of ill-conditioned matrices. In this paper, a novel coordinate transformation method is proposed, based not on equation solving but on geometric transformation. We construct characteristic lines to represent the coordinate systems. According to the spatial geometric relation, the characteristic lines can be made to coincide by a series of rotations and translations. The transformation matrix can be obtained using matrix transformation theory. Experiments are designed to compare the proposed method with other methods. The results show that the proposed method has the same high accuracy, but the operation is more convenient and flexible. A multi-sensor combined measurement system is also presented to improve the position accuracy of a robot with the calibration of the robot kinematic parameters. Experimental verification shows that the position accuracy of the robot manipulator is improved by 45.8% with the proposed method and robot calibration. PMID:26901203

  14. Projector methods applied to numerical integration of the SN transport equation

    International Nuclear Information System (INIS)

    Hristea, V.; Covaci, St.

    2003-01-01

    We are developing two methods of integration for the SN transport equation in x-y geometry, based on the projector technique. By cellularization of the phase space and by choosing a finite basis of orthogonal functions to characterize the angular flux, the non-selfadjoint transport equation is reduced to a cellular automaton. This automaton is completely described by the transition matrix T. Within this paper two distinct methods of projection are described. One of them uses the transversal integration technique. As an alternative to this, we applied the projector method to the integral SN transport equation. We show that the constant spatial approximation of the integral SN transport equation does not lead to negative fluxes. One of the problems with the projector method, namely the appearance of numerical instability for small intervals, is solved by the Padé representation of the elements of the matrix T. Numerical tests presented here compare the numerical performance of the algorithms obtained by the two projection methods. The Padé representation was also taken into account for both algorithm types. (authors)

  15. Efficient combination of acceleration techniques applied to high frequency methods for solving radiation and scattering problems

    Science.gov (United States)

    Lozano, Lorena; Algar, Ma Jesús; García, Eliseo; González, Iván; Cátedra, Felipe

    2017-12-01

    An improved ray-tracing method applied to high-frequency techniques such as the Uniform Theory of Diffraction (UTD) is presented. The main goal is to increase the speed of the analysis of complex structures while considering a vast number of observation directions and taking into account multiple bounces. The method is based on a combination of the Angular Z-Buffer (AZB), the Space Volumetric Partitioning (SVP) algorithm and the A∗ heuristic search method to treat multiple bounces. In addition, a Master Point strategy was developed to efficiently analyze a large number of near-field points or far-field directions. This technique can be applied to electromagnetic radiation problems, scattering analysis, propagation in urban or indoor environments and the mutual coupling between antennas. Due to its efficiency, it is suitable for studying the radiation patterns of large antennas and even their interactions with complex environments, including satellites, ships, aircraft, cities or other electrically large complex bodies. The new technique proves extremely efficient in these applications, even when considering multiple bounces.
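The A∗ heuristic search the method relies on is the standard best-first algorithm. A generic textbook sketch on a small occupancy grid (unrelated to the AZB/SVP ray-tracing data structures of the paper) looks like this:

```python
import heapq

def astar(grid, start, goal):
    """A* shortest path on a 4-connected grid; grid[r][c] == 1 means blocked."""
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
    open_heap = [(h(start), 0, start)]   # entries: (f = g + h, g, node)
    g = {start: 0}
    while open_heap:
        f, cost, node = heapq.heappop(open_heap)
        if node == goal:
            return cost
        if cost > g.get(node, float("inf")):
            continue                     # stale heap entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ncost = cost + 1
                if ncost < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = ncost
                    heapq.heappush(open_heap, (ncost + h((nr, nc)), ncost, (nr, nc)))
    return None                          # goal unreachable

grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1],
        [0, 0, 0, 0]]
print(astar(grid, (0, 0), (4, 3)))   # shortest detour length (13 here)
```

An admissible heuristic (here Manhattan distance) is what lets A∗ prune candidate bounce sequences without losing the optimal one, which is the efficiency argument being made for multiple-bounce ray tracing.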

  16. Methodical basis of training of cadets for the military applied heptathlon competitions

    Directory of Open Access Journals (Sweden)

    R.V. Anatskyi

    2017-12-01

    Full Text Available The purpose of the research is to develop the methodical bases of training cadets for military applied heptathlon competitions. Material and methods: second- and third-year cadets aged 19-20 (n=20) participated in the research. Cadets were selected on the basis of their best results in the exercises included in the program of military applied heptathlon competitions (100 m run, 50 m freestyle swimming, Kalashnikov rifle shooting, pull-ups, obstacle course, grenade throwing, 3000 m run). Preparation took place at a training center. All trainings were organized and carried out according to the following methodical basics: in a one-week preparation microcycle, cadets had two trainings a day on five days (one training on Saturday; Sunday was a rest day), and the selected exercises were performed with individual loads. Results: sport scores demonstrated top results in the 100 m run, 3000 m run and pull-ups. The indices for the exercise "obstacle course" were much lower than expected. Rather low results were demonstrated in swimming and shooting. Conclusions: the results of the research indicate the necessity of improving the quality of cadets' weapons proficiency and their physical readiness to perform the exercises requiring the complex demonstration of all physical qualities.

  17. Nutritional surveillance.

    Science.gov (United States)

    Mason, J B; Mitchell, J T

    1983-01-01

    The concept of nutritional surveillance is derived from disease surveillance, and means "to watch over nutrition, in order to make decisions that lead to improvements in nutrition in populations". Three distinct objectives have been defined for surveillance systems, primarily in relation to problems of malnutrition in developing countries: to aid long-term planning in health and development; to provide input for programme management and evaluation; and to give timely warning of the need for intervention to prevent critical deteriorations in food consumption. Decisions affecting nutrition are made at various administrative levels, and the uses of different types of nutritional surveillance information can be related to national policies, development programmes, public health and nutrition programmes, and timely warning and intervention programmes. The information should answer specific questions, for example concerning the nutritional status and trends of particular population groups. Defining the uses and users of the information is the first essential step in designing a system; this is illustrated with reference to agricultural and rural development planning, the health sector, and nutrition and social welfare programmes. The most usual data outputs are nutritional outcome indicators (e.g., prevalence of malnutrition among preschool children), disaggregated by descriptive or classifying variables, of which the commonest is simply administrative area. Often, additional "status" indicators, such as quality of housing or water supply, are presented at the same time. On the other hand, timely warning requires earlier indicators of the possibility of nutritional deterioration, and agricultural indicators are often the most appropriate. Data come from two main types of source: administrative (e.g., clinics and schools) and household sample surveys. Each source has its own advantages and disadvantages: for example, administrative data often already exist, and can be

  18. Surveillance of the environmental radioactivity

    International Nuclear Information System (INIS)

    Schneider, Th.; Gitzinger, C.; Jaunet, P.; Eberbach, F.; Clavel, B.; Hemidy, P.Y.; Perrier, G.; Kiper, Ch.; Peres, J.M.; Josset, M.; Calvez, M.; Leclerc, M.; Leclerc, E.; Aubert, C.; Levelut, M.N.; Debayle, Ch.; Mayer, St.; Renaud, Ph.; Leprieur, F.; Petitfrere, M.; Catelinois, O.; Monfort, M.; Baron, Y.; Target, A.

    2008-01-01

    The objective of these days was to present the organisation of the surveillance of the environmental radioactivity and to allow experience sharing and a dialogue on this subject between the different actors of radiation protection in France. The different presentations were as follows: evolution and stakes of the surveillance of radioactivity in environment; the part of the European commission, regulatory aspects; the implementation of the surveillance: the case of Germany; strategy and logic of environmental surveillance around the EDF national centers of energy production; environmental surveillance: F.B.F.C. site of Romans on Isere; steps of the implementation 'analysis for release decree at the F.B.F.C./C.E.R.C.A. laboratory of Romans; I.R.S.N. and the environmental surveillance: situation and perspectives; the part of a non institutional actor, the citizenship surveillance done by A.C.R.O.; harmonization of sampling methods: the results of inter operators G.T. sampling; sustainable observatory of environment: data traceability and samples conservation; inter laboratories tests of radioactivity measurements; national network of environmental radioactivity measurement: laboratories agreements; the networks of environmental radioactivity telemetry: modernization positioning; programme of observation and surveillance of surface environment and installations of the H.A.-M.A.V.L. project (high activity and long life medium activity); evolution of radionuclides concentration in environment and adaptation of measurements techniques to the surveillance needs; the national network of radioactivity measurement in environment; modes of data restoration of surveillance: the results of the Loire environment pilot action; method of sanitary impacts estimation in the area of ionizing radiations; the radiological impact of atmospheric nuclear tests in French Polynesia; validation of models by the measure; network of measurement and alert management of the atmospheric

  19. Symanzik's method applied to fractional quantum Hall edge states

    Energy Technology Data Exchange (ETDEWEB)

    Blasi, A.; Ferraro, D.; Maggiore, N.; Magnoli, N. [Dipartimento di Fisica, Universita di Genova (Italy); LAMIA-INFM-CNR, Genova (Italy); Sassetti, M.

    2008-11-15

    The method of separability, introduced by Symanzik, is applied in order to describe the effect of a boundary for a fractional quantum Hall liquid in the Laughlin series. An Abelian Chern-Simons theory with plane boundary is considered and the Green functions both in the bulk and on the edge are constructed, following a rigorous, perturbative, quantum field theory treatment. We show that the conserved boundary currents find an explicit interpretation in terms of the continuity equation with the electron density satisfying the Tomonaga-Luttinger commutation relation. (Abstract Copyright [2008], Wiley Periodicals, Inc.)

  20. Method for applying a photoresist layer to a substrate having a preexisting topology

    Science.gov (United States)

    Morales, Alfredo M.; Gonzales, Marcela

    2004-01-20

    The present invention describes a method for preventing a photoresist layer from delaminating, or peeling away from, the surface of a substrate that already contains an etched three-dimensional structure such as a hole or a trench. The process comprises establishing a saturated vapor phase of the solvent media used to formulate the photoresist layer above the surface of the coated substrate, as the applied photoresist is heated in order to "cure" or drive off the retained solvent constituent within the layer. By controlling the rate and manner in which solvent is removed from the photoresist layer, the layer is stabilized and kept from differentially shrinking and peeling away from the substrate.

  1. The Trojan Horse Method Applied to the Astrophysically Relevant Proton Capture Reactions on Li Isotopes

    Science.gov (United States)

    Tumino, A.; Spitaleri, C.; Musumarra, A.; Pellegriti, M. G.; Pizzone, R. G.; Rinollo, A.; Romano, S.; Pappalardo, L.; Bonomo, C.; Del Zoppo, A.; Di Pietro, A.; Figuera, P.; La Cognata, M.; Lamia, L.; Cherubini, S.; Rolfs, C.; Typel, S.

    2005-12-01

    A study of the 7Li(p,α)4He, 6Li(d,α)4He and 6Li(p,α)3He reactions was performed in the framework of the Trojan Horse Method applied to the d(7Li,αα)n, 6Li(6Li,αα)4He and d(6Li,α3He)n three-body reactions, respectively. Their bare astrophysical S-factors were extracted, and from comparison with the behavior of the screened direct data, an independent estimate of the screening potential was obtained.

  2. Making Design Decisions Visible: Applying the Case-Based Method in Designing Online Instruction

    Directory of Open Access Journals (Sweden)

    Heng Luo

    2011-01-01

    Full Text Available The instructional intervention in this design case is a self-directed online tutorial that applies the case-based method to teach educators how to design and conduct entrepreneurship programs for elementary school students. In this article, the authors describe the major decisions made in each phase of the design and development process, explicate the rationales behind them, and demonstrate their effect on the production of the tutorial. Based on such analysis, the guidelines for designing case-based online instruction are summarized for the design case.

  3. A Review of Auditing Methods Applied to the Content of Controlled Biomedical Terminologies

    Science.gov (United States)

    Zhu, Xinxin; Fan, Jung-Wei; Baorto, David M.; Weng, Chunhua; Cimino, James J.

    2012-01-01

    Although controlled biomedical terminologies have been with us for centuries, it is only in the last couple of decades that close attention has been paid to the quality of these terminologies. The result of this attention has been the development of auditing methods that apply formal methods to assessing whether terminologies are complete and accurate. We have performed an extensive literature review to identify published descriptions of these methods and have created a framework for characterizing them. The framework considers manual, systematic and heuristic methods that use knowledge (within or external to the terminology) to measure quality factors of different aspects of the terminology content (terms, semantic classification, and semantic relationships). The quality factors examined included concept orientation, consistency, non-redundancy, soundness and comprehensive coverage. We reviewed 130 studies that were retrieved based on keyword search on publications in PubMed, and present our assessment of how they fit into our framework. We also identify which terminologies have been audited with the methods and provide examples to illustrate each part of the framework. PMID:19285571

  4. A Comparison of Parametric and Non-Parametric Methods Applied to a Likert Scale.

    Science.gov (United States)

    Mircioiu, Constantin; Atkinson, Jeffrey

    2017-05-10

    A trenchant and passionate dispute over the use of parametric versus non-parametric methods for the analysis of Likert scale ordinal data has raged for the past eight decades. The answer is not a simple "yes" or "no" but is related to hypotheses, objectives, risks, and paradigms. In this paper, we took a pragmatic approach. We applied both types of methods to the analysis of actual Likert data on responses from different professional subgroups of European pharmacists regarding competencies for practice. Results obtained show that with "large" (>15) numbers of responses and similar (but clearly not normal) distributions from different subgroups, parametric and non-parametric analyses give in almost all cases the same significant or non-significant results for inter-subgroup comparisons. Parametric methods were more discriminant in the cases of non-similar conclusions. Considering that the largest differences in opinions occurred in the upper part of the 4-point Likert scale (ranks 3 "very important" and 4 "essential"), a "score analysis" based on this part of the data was undertaken. This transformation of the ordinal Likert data into binary scores produced a graphical representation that was visually easier to understand as differences were accentuated. In conclusion, in this case of Likert ordinal data with high response rates, restraining the analysis to non-parametric methods leads to a loss of information. The addition of parametric methods, graphical analysis, analysis of subsets, and transformation of data leads to more in-depth analyses.
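One pragmatic way to run both kinds of analysis without distributional tables is a permutation test: the same resampling scheme can drive a parametric-style statistic (difference of means) and a non-parametric one (Mann-Whitney U with tie handling, appropriate for the heavily tied Likert scale). The sketch below uses invented Likert data, not the pharmacists' survey:

```python
import numpy as np

rng = np.random.default_rng(42)

# synthetic 4-point Likert responses for two professional subgroups
# (illustrative only; subgroup B leans toward "very important"/"essential")
a = rng.choice([1, 2, 3, 4], size=80, p=[0.30, 0.40, 0.20, 0.10])
b = rng.choice([1, 2, 3, 4], size=80, p=[0.05, 0.15, 0.35, 0.45])

def mean_diff(x, y):                 # parametric-style statistic
    return x.mean() - y.mean()

def mw_u_centred(x, y):              # Mann-Whitney U, centred; ties count 1/2
    diff = x[:, None] - y[None, :]
    u = (diff > 0).sum() + 0.5 * (diff == 0).sum()
    return u - len(x) * len(y) / 2.0

def perm_pvalue(x, y, stat, n_perm=2000):
    pooled = np.concatenate([x, y])
    observed = abs(stat(x, y))
    hits = 0
    for _ in range(n_perm):
        p = rng.permutation(pooled)
        hits += abs(stat(p[:len(x)], p[len(x):])) >= observed
    return (hits + 1) / (n_perm + 1)

p_param = perm_pvalue(a, b, mean_diff)
p_nonparam = perm_pvalue(a, b, mw_u_centred)
print(p_param, p_nonparam)   # with this clear shift, both flag the difference
```

With a strong shift and decent group sizes both statistics agree, mirroring the paper's finding that the parametric and non-parametric routes usually reach the same conclusion on such data.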

  5. Geometric methods for estimating representative sidewalk widths applied to Vienna's streetscape surfaces database

    Science.gov (United States)

    Brezina, Tadej; Graser, Anita; Leth, Ulrich

    2017-04-01

    Space, and in particular public space for movement and leisure, is a valuable and scarce resource, especially in today's growing urban centres. The distribution and absolute amount of urban space—especially the provision of sufficient pedestrian areas, such as sidewalks—is considered crucial for shaping living and mobility options as well as transport choices. Ubiquitous urban data collection and today's IT capabilities offer new possibilities for providing a relation-preserving overview and for keeping track of infrastructure changes. This paper presents three novel methods for estimating representative sidewalk widths and applies them to the official Viennese streetscape surface database. The first two methods use individual pedestrian area polygons and their geometrical representations of minimum circumscribing and maximum inscribing circles to derive a representative width of these individual surfaces. The third method utilizes aggregated pedestrian areas within the buffered street axis and results in a representative width for the corresponding road axis segment. Results are displayed as city-wide means in a 500 by 500 m grid and spatial autocorrelation based on Moran's I is studied. We also compare the results between methods as well as to previous research, existing databases and guideline requirements on sidewalk widths. Finally, we discuss possible applications of these methods for monitoring and regression analysis and suggest future methodological improvements for increased accuracy.
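The maximum-inscribing-circle idea behind the per-polygon width estimates can be approximated with a brute-force sketch: sample points inside the polygon, take the largest distance to the boundary as the inscribed radius, and report twice that as the representative width. This simplified version (ours; convex polygons and grid search only, not an exact geometric construction) illustrates the principle:

```python
import numpy as np

def point_segment_dist(p, a, b):
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))

def inscribed_width(poly, resolution=100):
    """Representative width = diameter of the (approximate) maximum inscribed
    circle, found by grid search; convex counter-clockwise polygons only."""
    poly = np.asarray(poly, dtype=float)
    edges = list(zip(poly, np.roll(poly, -1, axis=0)))
    xmin, ymin = poly.min(axis=0)
    xmax, ymax = poly.max(axis=0)
    best = 0.0
    for x in np.linspace(xmin, xmax, resolution):
        for y in np.linspace(ymin, ymax, resolution):
            p = np.array([x, y])
            # inside test: point lies left of every CCW edge (2-D cross product)
            inside = all((b[0] - a[0]) * (p[1] - a[1])
                         - (b[1] - a[1]) * (p[0] - a[0]) >= 0 for a, b in edges)
            if inside:
                best = max(best, min(point_segment_dist(p, a, b) for a, b in edges))
    return 2.0 * best

# a 10 m x 4 m rectangular sidewalk surface: representative width ~ 4 m
rect = [(0.0, 0.0), (10.0, 0.0), (10.0, 4.0), (0.0, 4.0)]
width = inscribed_width(rect)
print(width)
```

Real sidewalk polygons are rarely convex, so a production implementation would use a GIS library's inscribed-circle routine, but the notion of "width" being measured is the same.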

  6. Complex Method Mixed with PSO Applying to Optimization Design of Bridge Crane Girder

    Directory of Open Access Journals (Sweden)

    He Yan

    2017-01-01

    Full Text Available In engineering design, the basic complex method does not have enough global search ability for nonlinear optimization problems, so a complex method mixed with particle swarm optimization (PSO) is presented in this paper: the optimal particle, evaluated from the fitness function of the particle swarm, displaces the complex vertex in order to realize the optimality principle of the largest distance from the complex center. This method is applied to the optimization design problem of the box girder of a bridge crane with constraint conditions. First, a mathematical model of the girder optimization is set up, in which the cross-section area of the box girder of the bridge crane is taken as the objective function, its four size parameters as design variables, and the girder's mechanical performance, manufacturing process, boundary sizes and other requirements as constraint conditions. Then the complex method mixed with PSO is used to solve the optimization design problem of the crane box girder as a constrained optimization problem, and its optimal results achieve the goal of lightweight design and reduced crane manufacturing cost. Practical engineering calculation and comparative analysis with the basic complex method show that the method is reliable, practical and efficient.
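A plain PSO loop of the kind embedded in such a hybrid can be sketched on a stand-in constrained problem; the objective, penalty term and all parameters below are ours, not the girder model:

```python
import numpy as np

rng = np.random.default_rng(7)

def objective(p):
    # stand-in for the girder cross-section area, with a quadratic penalty
    # for violating a hypothetical strength constraint x + y >= 4
    x, y = p
    area = (x - 3.0) ** 2 + (y - 2.0) ** 2
    penalty = 1e3 * max(0.0, 4.0 - (x + y)) ** 2
    return area + penalty

n, iters = 30, 300
pos = rng.uniform(0.0, 6.0, (n, 2))        # particle positions (design variables)
vel = np.zeros((n, 2))
pbest = pos.copy()                          # personal bests
pbest_val = np.array([objective(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()    # global best

for _ in range(iters):
    r1, r2 = rng.random((n, 1)), rng.random((n, 1))
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    vals = np.array([objective(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved] = pos[improved]
    pbest_val[improved] = vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print(gbest, objective(gbest))   # near the optimum (3, 2), where the constraint holds
```

In the paper's hybrid, the swarm's best particle is used to displace a vertex of the complex, combining PSO's global search with the complex method's constrained local refinement.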

  7. Knowledge-Based Trajectory Error Pattern Method Applied to an Active Force Control Scheme

    Directory of Open Access Journals (Sweden)

    Endra Pitowarno, Musa Mailah, Hishamuddin Jamaluddin

    2012-08-01

    Full Text Available The active force control (AFC) method is known as a robust control scheme that dramatically enhances the performance of a robot arm, particularly in compensating for disturbance effects. The main task of the AFC method is to estimate the inertia matrix in the feedback loop to provide the correct (motor) torque required to cancel out these disturbances. Several intelligent control schemes have already been introduced to enhance the estimation of the inertia matrix, such as those using neural networks, iterative learning and fuzzy logic. In this paper, we propose an alternative scheme called the Knowledge-Based Trajectory Error Pattern Method (KBTEPM) to suppress the trajectory tracking error of the AFC scheme. The knowledge is developed from the trajectory tracking error characteristic based on previous experimental results of the crude approximation method. It produces a unique, new and desirable error pattern when a trajectory command is forced. An experimental study was performed using simulation work on the AFC scheme with KBTEPM applied to a two-link planar manipulator, for which a set of rule-based algorithms is derived. A number of previous AFC schemes are also reviewed as benchmarks. The simulation results show that the AFC-KBTEPM scheme reduces the trajectory tracking error significantly, even in the presence of the introduced disturbances. Key Words: active force control, estimated inertia matrix, robot arm, trajectory error pattern, knowledge-based.

  8. Efficient alpha particle detection by CR-39 applying 50 Hz-HV electrochemical etching method

    International Nuclear Information System (INIS)

    Sohrabi, M.; Soltani, Z.

    2016-01-01

    Alpha particles can be detected by CR-39 by applying either chemical etching (CE), electrochemical etching (ECE), or combined pre-etching and ECE, usually through a multi-step HF-HV ECE process at temperatures much higher than room temperature. By applying pre-etching, characteristic responses of fast-neutron-induced recoil tracks in CR-39 under HF-HV ECE versus KOH normality (N) have shown two high-sensitivity peaks around 5–6 and 15–16 N and a large-diameter peak with a minimum sensitivity around 10–11 N at 25°C. On the other hand, the 50 Hz-HV ECE method recently advanced in our laboratory detects alpha particles with high efficiency and a broad registration energy range with small ECE tracks in polycarbonate (PC) detectors. By taking advantage of the CR-39 sensitivity to alpha particles, the efficacy of the 50 Hz-HV ECE method and the exotic responses of CR-39 under different KOH normalities, detection characteristics of 0.8 MeV alpha particle tracks were studied in 500 μm CR-39 for different fluences, ECE durations and KOH normalities. Alpha registration efficiency increased with ECE duration, reaching 90 ± 2% after 6–8 h, beyond which a plateau is reached. Alpha track density versus fluence is linear up to 10⁶ tracks cm⁻². The efficiency and mean track diameter versus alpha fluence up to 10⁶ alphas cm⁻² decrease as the fluence increases. Background track density and minimum detection limit are linear functions of ECE duration and increase as normality increases. The CR-39 processed for the first time in this study by the 50 Hz-HV ECE method proved to provide a simple, efficient and practical alpha detection method at room temperature. - Highlights: • Alpha particles of 0.8 MeV were detected in CR-39 by the 50 Hz-HV ECE method. • Efficiency/track diameter was studied vs fluence and time for 3 KOH normalities. • Background track density and minimum detection limit vs duration were studied. • A new simple, efficient and low-cost alpha detection method

  9. Simple hyperaemia test as a screening method in the postoperative surveillance of infrainguinal in situ vein bypasses

    DEFF Research Database (Denmark)

    Nielsen, Tina G; Sillesen, H; Schroeder, T V

    1995-01-01

    OBJECTIVES: To develop a simple protocol for ultrasound Duplex surveillance of infrainguinal vein bypasses. DESIGN: The value of three Doppler waveform parameters, obtained from a single point of the bypass, for identification of stenoses was studied in 91 in situ vein bypasses. Midgraft peak systolic velocity (PSV), pulsatility index (PI) and the ratio of hyperaemic and resting time-averaged mean velocities (TAMV ratio = TAMVhyperaemia/TAMVrest) were correlated with the presence and severity of stenoses as assessed by conventional Duplex scanning and ankle-brachial index (ABI) measurements. The optimal value of the waveform parameters for discrimination between bypasses with and without evidence of stenoses was determined by receiver operating characteristic (ROC) analysis. MAIN RESULTS: Complete Duplex scanning of the entire graft revealed an increase in the peak systolic velocity by a factor...
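The screening idea above can be made concrete with a small sketch: compute the hyperaemia ratio (TAMVhyperaemia/TAMVrest) for each bypass and pick the cut-off maximizing Youden's J, a common way to read an optimal threshold off a ROC analysis. The data values and the "classify stenosis when ratio is low" convention are invented for illustration, not taken from the study.

```python
# Toy ROC-style threshold selection for the hyperaemia ratio.

def tamv_ratio(tamv_hyper, tamv_rest):
    return tamv_hyper / tamv_rest

def best_cutoff(ratios, has_stenosis):
    """Stenotic grafts are assumed to show a blunted hyperaemic response
    (lower ratio), so we classify 'stenosis' when ratio <= cutoff and pick
    the cutoff maximizing Youden's J = sensitivity + specificity - 1."""
    best_j, best_c = -1.0, None
    for c in sorted(set(ratios)):
        tp = sum(1 for r, s in zip(ratios, has_stenosis) if s and r <= c)
        fn = sum(1 for r, s in zip(ratios, has_stenosis) if s and r > c)
        tn = sum(1 for r, s in zip(ratios, has_stenosis) if not s and r > c)
        fp = sum(1 for r, s in zip(ratios, has_stenosis) if not s and r <= c)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_c = j, c
    return best_c, best_j

ratios = [1.1, 1.3, 1.4, 1.8, 2.2, 2.5, 2.9, 3.1]      # invented ratios
stenosis = [True, True, True, False, False, False, False, False]
print(best_cutoff(ratios, stenosis))  # (1.4, 1.0) with this toy data
```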

  10. Faraday Rotation of Automatic Dependent Surveillance-Broadcast (ADS-B) Signals as a Method of Ionospheric Characterization

    Science.gov (United States)

    Cushley, A. C.; Kabin, K.; Noël, J.-M.

    2017-10-01

    Radio waves propagating through plasma in the Earth's ambient magnetic field experience Faraday rotation; the plane of the electric field of a linearly polarized wave changes as a function of the distance travelled through a plasma. Linearly polarized radio waves at 1090 MHz frequency are emitted by Automatic Dependent Surveillance Broadcast (ADS-B) devices that are installed on most commercial aircraft. These radio waves can be detected by satellites in low Earth orbits, and the change of the polarization angle caused by propagation through the terrestrial ionosphere can be measured. In this manuscript we discuss how these measurements can be used to characterize the ionospheric conditions. In the present study, we compute the amount of Faraday rotation from a prescribed total electron content value and two of the profile parameters of the NeQuick ionospheric model.

  11. Faraday rotation of Automatic Dependent Surveillance Broadcast (ADS-B) signals as a method of ionospheric characterization

    Science.gov (United States)

    Cushley, A. C.; Kabin, K.; Noel, J. M. A.

    2017-12-01

    Radio waves propagating through plasma in the Earth's ambient magnetic field experience Faraday rotation; the plane of the electric field of a linearly polarized wave changes as a function of the distance travelled through a plasma. Linearly polarized radio waves at 1090 MHz frequency are emitted by Automatic Dependent Surveillance Broadcast (ADS-B) devices which are installed on most commercial aircraft. These radio waves can be detected by satellites in low earth orbits, and the change of the polarization angle caused by propagation through the terrestrial ionosphere can be measured. In this work we discuss how these measurements can be used to characterize the ionospheric conditions. In the present study, we compute the amount of Faraday rotation from a prescribed total electron content value and two of the profile parameters of the NeQuick model.
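The two records above describe the same calculation. A back-of-envelope version uses the standard SI rotation-measure constant and a thin-shell approximation, psi = RM_const · lambda² · B_parallel · TEC. The magnetic-field value below is an assumed mid-latitude number, and the single-shell approximation is far cruder than the NeQuick profile integration the study performs.

```python
import math

# Faraday rotation of a 1090 MHz ADS-B signal for a given slant TEC.
E = 1.602176634e-19      # elementary charge, C
M_E = 9.1093837015e-31   # electron mass, kg
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
C = 2.99792458e8         # speed of light, m/s

# RM constant e^3 / (8 pi^2 eps0 m_e^2 c^3) ~ 2.6e-13 in SI units
RM_CONST = E**3 / (8 * math.pi**2 * EPS0 * M_E**2 * C**3)

def faraday_rotation(tec_electrons_m2, b_parallel_tesla, freq_hz):
    """Rotation angle in radians under a single-shell approximation."""
    lam = C / freq_hz
    return RM_CONST * lam**2 * b_parallel_tesla * tec_electrons_m2

# 10 TECU slant TEC, 50 microtesla field component along the ray:
psi = faraday_rotation(10 * 1e16, 50e-6, 1090e6)
print(math.degrees(psi))  # a few degrees of rotation
```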

  12. A METHOD FOR PREPARING A SUBSTRATE BY APPLYING A SAMPLE TO BE ANALYSED

    DEFF Research Database (Denmark)

    2017-01-01

    The invention relates to a method for preparing a substrate (105a) comprising a sample reception area (110) and a sensing area (111). The method comprises the steps of: 1) applying a sample on the sample reception area; 2) rotating the substrate around a predetermined axis; 3) during rotation......, at least part of the liquid travels from the sample reception area to the sensing area due to capillary forces acting between the liquid and the substrate; and 4) removing the wave of particles and liquid formed at one end of the substrate. The sensing area is closer to the predetermined axis than...... the sample reception area. The sample comprises a liquid part and particles suspended therein....

  13. Super-convergence of Discontinuous Galerkin Method Applied to the Navier-Stokes Equations

    Science.gov (United States)

    Atkins, Harold L.

    2009-01-01

    The practical benefits of the hyper-accuracy properties of the discontinuous Galerkin method are examined. In particular, we demonstrate that some flow attributes exhibit super-convergence even in the absence of any post-processing technique. Theoretical analysis suggests that flow features dominated by global propagation speeds and decay or growth rates should be super-convergent. Several discrete forms of the discontinuous Galerkin method are applied to the simulation of unsteady viscous flow over a two-dimensional cylinder. Convergence of the period of the naturally occurring oscillation is examined and shown to converge at order 2p+1, where p is the polynomial degree of the discontinuous Galerkin basis. Comparisons are made between the different discretizations and with theoretical analysis.
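The observed rate behind a statement like "converges at 2p+1" is typically extracted from a grid-refinement study: for errors e1, e2 on meshes with spacings h1, h2, the order is log(e1/e2)/log(h1/h2). The error values below are synthetic, generated to follow a fifth-order rate (p = 2), purely to illustrate the bookkeeping.

```python
import math

# Extract an observed convergence order from pairs of (mesh spacing, error).

def observed_order(e1, e2, h1, h2):
    return math.log(e1 / e2) / math.log(h1 / h2)

p = 2
h = [0.1, 0.05, 0.025]
errors = [h_i ** (2 * p + 1) for h_i in h]  # synthetic O(h^5) errors

for (h1, e1), (h2, e2) in zip(zip(h, errors), zip(h[1:], errors[1:])):
    print(observed_order(e1, e2, h1, h2))  # ~5.0 for each pair
```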

  14. SANS contrast variation method applied in experiments on ferrofluids at MURN instrument of IBR-2 reactor

    Science.gov (United States)

    Balasoiu, Maria; Kuklin, Alexander

    2012-03-01

    Separate determination of the nuclear and magnetic contributions to the scattering intensity by means of a contrast variation method, applied in small angle neutron scattering experiments with nonpolarized neutrons on ferrofluids in the early 1990s at the MURN instrument, is reviewed. The nuclear scattering contribution gives the features of the colloidal particle dimensions, the surfactant shell structure and the degree of solvent penetration into the macromolecular layer. The magnetic scattering part is consistent with models in which the particle surface is supposed to have a nonmagnetic layer. Details of the experimental "Grabcev method" for obtaining separate nuclear and magnetic contributions to the small angle neutron scattering intensity of unpolarized neutrons are emphasized for the case of a high-quality, ultrastable benzene-based ferrofluid with magnetite nanoparticles.

  15. SANS contrast variation method applied in experiments on ferrofluids at MURN instrument of IBR-2 reactor

    International Nuclear Information System (INIS)

    Balasoiu, Maria; Kuklin, Alexander

    2012-01-01

    Separate determination of the nuclear and magnetic contributions to the scattering intensity by means of a contrast variation method, applied in small angle neutron scattering experiments with nonpolarized neutrons on ferrofluids in the early 1990s at the MURN instrument, is reviewed. The nuclear scattering contribution gives the features of the colloidal particle dimensions, the surfactant shell structure and the degree of solvent penetration into the macromolecular layer. The magnetic scattering part is consistent with models in which the particle surface is supposed to have a nonmagnetic layer. Details of the experimental 'Grabcev method' for obtaining separate nuclear and magnetic contributions to the small angle neutron scattering intensity of unpolarized neutrons are emphasized for the case of a high-quality, ultrastable benzene-based ferrofluid with magnetite nanoparticles.

  16. Infrared thermography inspection methods applied to the target elements of W7-X divertor

    Energy Technology Data Exchange (ETDEWEB)

    Missirlian, M. [Association Euratom-CEA, CEA/DSM/DRFC, CEA/Cadarache, F-13108 Saint Paul Lez Durance (France)], E-mail: marc.missirlian@cea.fr; Traxler, H. [PLANSEE SE, Technology Center, A-6600 Reutte (Austria); Boscary, J. [Max-Planck-Institut fuer Plasmaphysik, Euratom Association, Boltzmannstr. 2, D-85748 Garching (Germany); Durocher, A.; Escourbiac, F.; Schlosser, J. [Association Euratom-CEA, CEA/DSM/DRFC, CEA/Cadarache, F-13108 Saint Paul Lez Durance (France); Schedler, B.; Schuler, P. [PLANSEE SE, Technology Center, A-6600 Reutte (Austria)

    2007-10-15

    Non-destructive examination (NDE) is one of the key issues in developing highly loaded plasma-facing components (PFCs) for next-generation fusion devices such as W7-X and ITER. The most critical step is certainly the fabrication and the examination of the bond between the armour and the heat sink. Two inspection systems based on infrared thermography methods, namely transient thermography (SATIR-CEA) and pulsed thermography (ARGUS-PLANSEE), are being developed and have been applied to the pre-series of target elements of the W7-X divertor. Results obtained from qualification experiments performed on target elements with artificial calibrated defects demonstrated the capability of the two techniques and raised the efficiency of inspection to a level appropriate for industrial application.

  17. Infrared thermography inspection methods applied to the target elements of W7-X divertor

    International Nuclear Information System (INIS)

    Missirlian, M.; Traxler, H.; Boscary, J.; Durocher, A.; Escourbiac, F.; Schlosser, J.; Schedler, B.; Schuler, P.

    2007-01-01

    Non-destructive examination (NDE) is one of the key issues in developing highly loaded plasma-facing components (PFCs) for next-generation fusion devices such as W7-X and ITER. The most critical step is certainly the fabrication and the examination of the bond between the armour and the heat sink. Two inspection systems based on infrared thermography methods, namely transient thermography (SATIR-CEA) and pulsed thermography (ARGUS-PLANSEE), are being developed and have been applied to the pre-series of target elements of the W7-X divertor. Results obtained from qualification experiments performed on target elements with artificial calibrated defects demonstrated the capability of the two techniques and raised the efficiency of inspection to a level appropriate for industrial application.

  18. Data Analytics of Mobile Serious Games: Applying Bayesian Data Analysis Methods

    Directory of Open Access Journals (Sweden)

    Heide Lukosch

    2018-03-01

    Full Text Available Traditional teaching methods in the field of resuscitation training show some limitations, while teaching the right actions in critical situations could increase the number of people saved after a cardiac arrest. For our study, we developed a mobile game to support the transfer of theoretical knowledge on resuscitation. The game was tested at three schools of further education, and data were collected from 171 players. To analyze this large data set, drawn from different sources and of varying quality, different types of data modeling and analysis had to be applied. This approach showed its usefulness in analyzing the large set of data from different sources. It revealed some interesting findings, such as that female players outperformed male ones, and that the game, by fostering informal, self-directed learning, is as efficient as the traditional formal learning method.

  19. Performance comparison of two efficient genomic selection methods (gsbay & MixP) applied in aquacultural organisms

    Science.gov (United States)

    Su, Hailin; Li, Hengde; Wang, Shi; Wang, Yangfan; Bao, Zhenmin

    2017-02-01

    Genomic selection is increasingly popular in animal and plant breeding industries around the world, as it can be applied early in life without impacting selection candidates. The objective of this study was to bring the advantages of genomic selection to scallop breeding. Two different genomic selection tools, MixP and gsbay, were applied to the genomic evaluation of simulated data and Zhikong scallop (Chlamys farreri) field data. The results were compared with the genomic best linear unbiased prediction (GBLUP) method, which has been applied widely. Our results showed that both MixP and gsbay could accurately estimate single-nucleotide polymorphism (SNP) marker effects, and thereby could be applied for the analysis of genomic estimated breeding values (GEBV). In simulated data from different scenarios, the accuracy of the GEBV obtained ranged from 0.20 to 0.78 with MixP, from 0.21 to 0.67 with gsbay, and from 0.21 to 0.61 with GBLUP. Estimations made by MixP and gsbay are expected to be more reliable than those made by GBLUP. Predictions made by gsbay were more robust, while with MixP the computation is much faster, especially when dealing with large-scale data. These results suggest that both algorithms, as implemented by MixP and gsbay, are feasible for carrying out genomic selection in scallop breeding, and that more genotype data will be necessary to produce genomic estimated breeding values with higher accuracy for the industry.
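The accuracies quoted above are, in simulation studies, correlations between genomic estimated breeding values (GEBV) and the true simulated breeding values. A minimal Pearson-correlation helper makes that metric concrete; the two vectors below are invented toy values, not data from the study.

```python
import math

# Accuracy of genomic prediction as the Pearson correlation between
# predicted GEBV and true breeding values.

def accuracy(gebv, true_bv):
    n = len(gebv)
    mg = sum(gebv) / n
    mt = sum(true_bv) / n
    cov = sum((g - mg) * (t - mt) for g, t in zip(gebv, true_bv))
    sg = math.sqrt(sum((g - mg) ** 2 for g in gebv))
    st = math.sqrt(sum((t - mt) ** 2 for t in true_bv))
    return cov / (sg * st)

gebv = [0.9, 1.4, 0.2, 2.1, 1.0]      # invented predictions
true_bv = [1.0, 1.2, 0.1, 2.4, 0.8]   # invented true values
print(round(accuracy(gebv, true_bv), 3))
```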

  20. Estimating the Impacts of Local Policy Innovation: The Synthetic Control Method Applied to Tropical Deforestation

    Science.gov (United States)

    Sills, Erin O.; Herrera, Diego; Kirkpatrick, A. Justin; Brandão, Amintas; Dickson, Rebecca; Hall, Simon; Pattanayak, Subhrendu; Shoch, David; Vedoveto, Mariana; Young, Luisa; Pfaff, Alexander

    2015-01-01

    Quasi-experimental methods increasingly are used to evaluate the impacts of conservation interventions by generating credible estimates of counterfactual baselines. These methods generally require large samples for statistical comparisons, presenting a challenge for evaluating innovative policies implemented within a few pioneering jurisdictions. Single jurisdictions often are studied using comparative methods, which rely on analysts’ selection of best case comparisons. The synthetic control method (SCM) offers one systematic and transparent way to select cases for comparison, from a sizeable pool, by focusing upon similarity in outcomes before the intervention. We explain SCM, then apply it to one local initiative to limit deforestation in the Brazilian Amazon. The municipality of Paragominas launched a multi-pronged local initiative in 2008 to maintain low deforestation while restoring economic production. This was a response to having been placed, due to high deforestation, on a federal “blacklist” that increased enforcement of forest regulations and restricted access to credit and output markets. The local initiative included mapping and monitoring of rural land plus promotion of economic alternatives compatible with low deforestation. The key motivation for the program may have been to reduce the costs of blacklisting. However its stated purpose was to limit deforestation, and thus we apply SCM to estimate what deforestation would have been in a (counterfactual) scenario of no local initiative. We obtain a plausible estimate, in that deforestation patterns before the intervention were similar in Paragominas and the synthetic control, which suggests that after several years, the initiative did lower deforestation (significantly below the synthetic control in 2012). This demonstrates that SCM can yield helpful land-use counterfactuals for single units, with opportunities to integrate local and expert knowledge and to test innovations and permutations on

  1. Estimating the Impacts of Local Policy Innovation: The Synthetic Control Method Applied to Tropical Deforestation.

    Science.gov (United States)

    Sills, Erin O; Herrera, Diego; Kirkpatrick, A Justin; Brandão, Amintas; Dickson, Rebecca; Hall, Simon; Pattanayak, Subhrendu; Shoch, David; Vedoveto, Mariana; Young, Luisa; Pfaff, Alexander

    2015-01-01

    Quasi-experimental methods increasingly are used to evaluate the impacts of conservation interventions by generating credible estimates of counterfactual baselines. These methods generally require large samples for statistical comparisons, presenting a challenge for evaluating innovative policies implemented within a few pioneering jurisdictions. Single jurisdictions often are studied using comparative methods, which rely on analysts' selection of best case comparisons. The synthetic control method (SCM) offers one systematic and transparent way to select cases for comparison, from a sizeable pool, by focusing upon similarity in outcomes before the intervention. We explain SCM, then apply it to one local initiative to limit deforestation in the Brazilian Amazon. The municipality of Paragominas launched a multi-pronged local initiative in 2008 to maintain low deforestation while restoring economic production. This was a response to having been placed, due to high deforestation, on a federal "blacklist" that increased enforcement of forest regulations and restricted access to credit and output markets. The local initiative included mapping and monitoring of rural land plus promotion of economic alternatives compatible with low deforestation. The key motivation for the program may have been to reduce the costs of blacklisting. However its stated purpose was to limit deforestation, and thus we apply SCM to estimate what deforestation would have been in a (counterfactual) scenario of no local initiative. We obtain a plausible estimate, in that deforestation patterns before the intervention were similar in Paragominas and the synthetic control, which suggests that after several years, the initiative did lower deforestation (significantly below the synthetic control in 2012). This demonstrates that SCM can yield helpful land-use counterfactuals for single units, with opportunities to integrate local and expert knowledge and to test innovations and permutations on policies.
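The core of SCM is finding nonnegative donor weights, summing to one, that reproduce the treated unit's pre-intervention outcome series; the post-period gap between the treated unit and its synthetic counterpart is then read as the estimated effect. The toy below does this by brute-force grid search over a three-donor simplex. All data are invented, and real SCM implementations use constrained quadratic programming with predictor matching, not this simplification.

```python
# Toy synthetic control: fit simplex weights on pre-period outcomes,
# then compare post-period outcomes.

def fit_scm_weights(treated_pre, donors_pre, step=0.01):
    """Grid-search the 3-donor probability simplex for the weights
    minimizing pre-period squared error."""
    best_w, best_err = None, float("inf")
    n_steps = round(1 / step)
    for i in range(n_steps + 1):
        for j in range(n_steps + 1 - i):
            w = (i * step, j * step, 1 - i * step - j * step)
            err = sum((sum(wk * d[t] for wk, d in zip(w, donors_pre))
                       - treated_pre[t]) ** 2
                      for t in range(len(treated_pre)))
            if err < best_err:
                best_w, best_err = w, err
    return best_w

# Pre-period deforestation (arbitrary units); the treated unit happens to
# track an equal mix of donors 0 and 1 before the intervention.
donors_pre = [[10, 9, 8, 8], [6, 6, 5, 4], [20, 22, 21, 23]]
treated_pre = [8, 7.5, 6.5, 6]

w = fit_scm_weights(treated_pre, donors_pre)
synthetic_post = sum(wj * post for wj, post in zip(w, [7, 3, 24]))
treated_post = 3.0
print(treated_post - synthetic_post)  # negative gap = lower deforestation
```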

  2. Applying Mathematical Optimization Methods to an ACT-R Instance-Based Learning Model.

    Directory of Open Access Journals (Sweden)

    Nadia Said

    Full Text Available Computational models of cognition provide an interface to connect advanced mathematical tools and methods to empirically supported theories of behavior in psychology, cognitive science, and neuroscience. In this article, we consider a computational model of instance-based learning, implemented in the ACT-R cognitive architecture. We propose an approach for obtaining mathematical reformulations of such cognitive models that improve their computational tractability. For the well-established Sugar Factory dynamic decision making task, we conduct a simulation study to analyze central model parameters. We show how mathematical optimization techniques can be applied to efficiently identify optimal parameter values with respect to different optimization goals. Beyond these methodological contributions, our analysis reveals the sensitivity of this particular task with respect to initial settings and yields new insights into how average human performance deviates from potential optimal performance. We conclude by discussing possible extensions of our approach as well as future steps towards applying more powerful derivative-based optimization methods.

  3. Labile soil phosphorus as influenced by methods of applying radioactive phosphorus

    International Nuclear Information System (INIS)

    Selvaratnam, V.V.; Andersen, A.J.; Thomsen, J.D.; Gissel-Nielsen, G.

    1980-03-01

    The influence of different methods of applying radioactive phosphorus on the E- and L-values was studied in four soil types using barley, buckwheat, and rye grass for the L-value determination. The four soils differed greatly in their E- and L-values. The experiment was carried out both with and without carrier-P. The presence of carrier-P had no influence on the E-values, while carrier-P in some cases gave a lower L-value. Both E- and L-values depended on the method of application. When the ³²P was applied to a small soil or sand sample and dried before mixing with the total amount of soil, the E-values were higher than with direct application, most likely because of a stronger fixation to the soil/sand particles. This was not the case for the L-values, which are based on a much longer equilibration time. On the contrary, direct application of the ³²P solution to the whole amount of soil gave higher L-values because of a non-homogeneous distribution of the ³²P in the soil. (author)

  4. The National Athletic Treatment, Injury and Outcomes Network (NATION): Methods of the Surveillance Program, 2011-2012 Through 2013-2014.

    Science.gov (United States)

    Dompier, Thomas P; Marshall, Stephen W; Kerr, Zachary Y; Hayden, Ross

    2015-08-01

    Previous epidemiologic researchers have examined time-loss (TL) injuries in high school student-athletes, but little is known about the frequency of non-time-loss (NTL) injuries in these athletes. To describe the methods of the National Athletic Treatment, Injury and Outcomes Network (NATION) Surveillance Program and provide descriptive epidemiology of TL and NTL injuries across athletes in 27 high school sports. Descriptive epidemiology study. Aggregate injury and exposure data collected from 147 high schools in 26 states. High school student-athletes participating in 13 boys' sports and 14 girls' sports during the 2011-2012 through 2013-2014 academic years. Athletic trainers documented injuries and exposures using commercially available injury-tracking software packages. Standard injury-tracking software was modified by the software vendors to conform to the surveillance needs of this project. The modified software exported a set of common data elements, stripped of personally identifiable information, to a centralized automated verification and validation system before they were included in the centralized research database. Dependent measures were injury and exposure frequencies and injury rates with 95% confidence intervals stratified by sport, sex, and injury type (TL or NTL). Over the 3-year period, a total of 2337 team seasons across 27 sports resulted in 47 014 injuries and 5 146 355 athlete-exposures. The NTL injuries accounted for 38 765 (82.45%) and TL injuries for 8249 (17.55%) of the total. The NTL injuries accounted for a substantial amount of the total number of injuries sustained by high school student-athletes. This project demonstrates the feasibility of creating large-scale injury surveillance systems using commercially available injury-tracking software.
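The dependent measures above include injury rates with 95% confidence intervals. A common convention for such rates is injuries per 1000 athlete-exposures with a normal-approximation interval on the log scale; the sketch below uses that convention and the aggregate totals reported in the record, though NATION's exact CI formula may differ.

```python
import math

# Injury rate per 1000 athlete-exposures with a log-scale 95% CI,
# treating the injury count as Poisson.

def injury_rate_ci(injuries, athlete_exposures, per=1000, z=1.96):
    rate = injuries / athlete_exposures * per
    half = z / math.sqrt(injuries)  # SE of log(rate) for a Poisson count
    return rate, rate * math.exp(-half), rate * math.exp(half)

# Totals reported above: 47 014 injuries over 5 146 355 athlete-exposures.
rate, lo, hi = injury_rate_ci(47014, 5146355)
print(f"{rate:.2f} per 1000 AE (95% CI {lo:.2f}-{hi:.2f})")
```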

  5. Analysis of coupled neutron-gamma radiations, applied to shieldings in multigroup albedo method

    International Nuclear Information System (INIS)

    Dunley, Leonardo Souza

    2002-01-01

    The principal mathematical tools frequently available for calculations in Nuclear Engineering, including coupled neutron-gamma radiation shielding problems, involve the full Transport Theory or Monte Carlo techniques. The Multigroup Albedo Method applied to shieldings is characterized by following the radiations through distinct layers of materials, allowing the determination of the neutron and gamma fractions reflected from, transmitted through and absorbed in the irradiated media when a neutron stream hits the first layer of material, independently of flux calculations. The method is thus a complementary tool of great didactic value due to its clarity and simplicity in solving neutron and/or gamma shielding problems. The outstanding results achieved in previous works motivated the elaboration and development of the study presented in this dissertation. The radiation balance resulting from the incidence of a neutron stream onto a shielding composed of 'm' slab layers that are non-multiplying for neutrons was determined by the Albedo method, considering 'n' energy groups for neutrons and 'g' energy groups for gammas. It was assumed that there is no upscattering of neutrons or gammas; however, neutrons of any energy group are able to produce gammas of all energy groups. The ANISN code, for an angular quadrature of order S₂, was used as a standard for comparison of the results obtained by the Albedo method, so it was necessary to choose an identical system configuration for both the ANISN and Albedo methods: six neutron energy groups, eight gamma energy groups, and three slab layers (iron - aluminum - manganese). The excellent results expressed in comparative tables show great agreement between the values determined by the deterministic code adopted as standard and the values determined by the computational program created using the Albedo method and the algorithm developed for coupled neutron
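The layer-by-layer bookkeeping the albedo method performs can be illustrated in a drastically simplified one-group form: given reflection and transmission probabilities of individual slabs, combine them while accounting for the infinite series of inter-layer bounces. The numbers are invented, and the multigroup coupled neutron-gamma treatment of the dissertation is far richer than this sketch.

```python
# One-group albedo-style combination of slab layers.

def combine_layers(r1, t1, r2, t2):
    """Reflection/transmission of two slabs in series (one group,
    symmetric slabs). The 1/(1 - r1*r2) factor sums the geometric
    series of back-and-forth reflections at the interface."""
    bounce = 1.0 / (1.0 - r1 * r2)
    r = r1 + t1 * t1 * r2 * bounce
    t = t1 * t2 * bounce
    return r, t

# Three identical slabs, each reflecting 20% and transmitting 70%
# (the remaining 10% absorbed):
r, t = 0.2, 0.7
r12, t12 = combine_layers(r, t, r, t)
r123, t123 = combine_layers(r12, t12, r, t)
absorbed = 1.0 - r123 - t123
print(r123, t123, absorbed)
```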

  6. In silico toxicology: comprehensive benchmarking of multi-label classification methods applied to chemical toxicity data

    KAUST Repository

    Raies, Arwa B.

    2017-12-05

    One goal of toxicity testing, among others, is identifying harmful effects of chemicals. Given the high demand for toxicity tests, it is necessary to conduct these tests for multiple toxicity endpoints for the same compound. Current computational toxicology methods aim at developing models mainly to predict a single toxicity endpoint. When chemicals cause several toxicity effects, one model is generated to predict toxicity for each endpoint, which can be labor- and computationally intensive when the number of toxicity endpoints is large. Additionally, this approach does not take into consideration possible correlation between the endpoints. Therefore, there has been a recent shift in computational toxicity studies toward generating predictive models able to predict several toxicity endpoints by utilizing correlations between these endpoints. Applying such correlations jointly with compounds' features may improve model performance and reduce the number of required models. This can be achieved through multi-label classification methods. These methods have not undergone comprehensive benchmarking in the domain of predictive toxicology. Therefore, we performed extensive benchmarking and analysis of over 19,000 multi-label classification models generated using combinations of the state-of-the-art methods. The methods have been evaluated from different perspectives using various metrics to assess their effectiveness. We were able to illustrate variability in the performance of the methods under several conditions. This review will help researchers to select the most suitable method for the problem at hand and provide a baseline for evaluating new approaches. Based on this analysis, we provided recommendations for potential future directions in this area.
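Two of the standard metrics used to evaluate multi-label classifiers in benchmarks like the one above are Hamming loss (fraction of wrong label assignments) and subset accuracy (fraction of samples whose full label vector is exactly right). The toy predictions below, for three compounds and three toxicity endpoints, are invented for illustration.

```python
# From-scratch multi-label evaluation metrics.

def hamming_loss(y_true, y_pred):
    """Fraction of individual label cells that disagree."""
    n_labels = sum(len(row) for row in y_true)
    wrong = sum(t != p for row_t, row_p in zip(y_true, y_pred)
                for t, p in zip(row_t, row_p))
    return wrong / n_labels

def subset_accuracy(y_true, y_pred):
    """Fraction of samples whose whole label vector is exactly correct."""
    exact = sum(row_t == row_p for row_t, row_p in zip(y_true, y_pred))
    return exact / len(y_true)

y_true = [[1, 0, 1], [0, 1, 0], [1, 1, 0]]
y_pred = [[1, 0, 1], [0, 1, 1], [1, 0, 0]]
print(hamming_loss(y_true, y_pred))     # 2 wrong cells out of 9
print(subset_accuracy(y_true, y_pred))  # 1 exact match out of 3
```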

  7. The Cn method applied to problems with an anisotropic diffusion law

    International Nuclear Information System (INIS)

    Grandjean, P.M.

    A two-dimensional Cn calculation has been applied to homogeneous media subjected to the Rayleigh impact law. Results obtained with collision probability and Chandrasekhar calculations are compared to those from the Cn method. Introducing into the transport equation an expansion of the outgoing angular flux (or possibly the entrance flux), truncated on a polynomial basis, gives two Cn systems of algebraic linear equations for the expansion coefficients. The matrix elements of these equations are the moments of the Green function in an infinite medium. The Green function is obtained through the Fourier transformation of the integro-differential equation, and its moments are derived from their Fourier transforms through a numerical integration in the complex plane. The method has been used for calculating the albedo of semi-infinite media, the extrapolation length of the Milne problem, and the albedo and transmission factor of a slab (a concise study of convergence is presented). For the collision probability method, a system of integro-differential equations for the moments of the angular flux inside the medium has been derived; it is solved numerically by approximating the bulk flux with step functions. The albedo of a semi-infinite medium has also been computed through the semi-analytical Chandrasekhar method, in which the outgoing flux is expressed as a function of the entrance flux by means of an integral whose kernel is derived numerically. [fr]

  8. Study on Feasibility of Applying Function Approximation Moment Method to Achieve Reliability-Based Design Optimization

    International Nuclear Information System (INIS)

    Huh, Jae Sung; Kwak, Byung Man

    2011-01-01

    Robust optimization and reliability-based design optimization are methodologies employed to take the uncertainties of a system into account at the design stage. To apply such methodologies to industrial problems, accurate and efficient methods for estimating statistical moments and failure probability are required; furthermore, the results of the sensitivity analysis, which determines the search direction during the optimization process, should also be accurate. The aim of this study is to incorporate the function approximation moment method into the sensitivity analysis formulation, which is expressed in integral form, to verify the accuracy of the sensitivity results, and to solve a typical reliability-based design optimization problem. These results are compared with those of other moment methods, and the feasibility of the function approximation moment method is verified. The integral-form sensitivity formulation is efficient because no additional function evaluations are needed once the failure probability or statistical moments have been calculated.

  9. LOGICAL CONDITIONS ANALYSIS METHOD FOR DIAGNOSTIC TEST RESULTS DECODING APPLIED TO COMPETENCE ELEMENTS PROFICIENCY

    Directory of Open Access Journals (Sweden)

    V. I. Freyman

    2015-11-01

    Full Text Available Subject of Research. Representation features of education results for competence-based educational programs are analyzed. The importance of decoding and proficiency estimation for elements and components of discipline parts of competences is shown, and the purpose and objectives of the research are formulated. Methods. The paper applies methods of mathematical logic, Boolean algebra, and parametric analysis to the results of complex diagnostic tests that control the proficiency of discipline competence elements. Results. A method of logical conditions analysis is developed. It makes it possible to formulate logical conditions for determining the proficiency of each discipline competence element controlled by a complex diagnostic test. The normalized test result is divided into non-crossing zones, and for each zone a logical condition about the proficiency of the controlled elements is formulated. Summarized characteristics of the test result zones are given. An example of forming logical conditions for a diagnostic test with preset features is provided. Practical Relevance. The proposed method of logical conditions analysis is applied in the decoding algorithm of proficiency test diagnosis for discipline competence elements. It makes it possible to automate the search for elements with insufficient proficiency, and it is also usable for estimating the education results of a discipline or a component of a competence-based educational program.
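
The zone-based decoding described above can be sketched as a lookup from a normalized test result to a logical verdict about each controlled element. The zone boundaries and the two elements E1 and E2 below are hypothetical, not taken from the paper:

```python
# Hypothetical zone table: each non-crossing zone of the normalized result
# carries a logical condition about the proficiency of the controlled elements.
ZONES = [
    ((0.0, 0.4), {"E1": False, "E2": False}),   # low result: neither element mastered
    ((0.4, 0.8), {"E1": True,  "E2": False}),   # mid result: only the basic element
    ((0.8, 1.01), {"E1": True, "E2": True}),    # high result: both elements mastered
]

def decode(normalized_result):
    """Map a normalized test result in [0, 1] to element-proficiency conclusions."""
    for (lo, hi), verdict in ZONES:
        if lo <= normalized_result < hi:
            return verdict
    raise ValueError("result outside [0, 1]")

print(decode(0.55))  # → {'E1': True, 'E2': False}
```

Scanning the zone table in this way is what allows the search for elements with insufficient proficiency to be automated.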

  10. An IMU-to-Body Alignment Method Applied to Human Gait Analysis

    Directory of Open Access Journals (Sweden)

    Laura Susana Vargas-Valencia

    2016-12-01

    Full Text Available This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need of any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are correctly measured independently of previous sensor placement on the joint, thus validating the proposed procedure. In the cases of a simplified joint and a real gait test with human volunteers, the method also performs correctly, although secondary plane errors appear when compared with the simulation results. We believe that such errors are caused by limitations of the current inertial measurement unit (IMU technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis.

  11. An IMU-to-Body Alignment Method Applied to Human Gait Analysis.

    Science.gov (United States)

    Vargas-Valencia, Laura Susana; Elias, Arlindo; Rocon, Eduardo; Bastos-Filho, Teodiano; Frizera, Anselmo

    2016-12-10

    This paper presents a novel calibration procedure as a simple, yet powerful, method to place and align inertial sensors with body segments. The calibration can be easily replicated without the need of any additional tools. The proposed method is validated in three different applications: a computer mathematical simulation; a simplified joint composed of two semi-spheres interconnected by a universal goniometer; and a real gait test with five able-bodied subjects. Simulation results demonstrate that, after the calibration method is applied, the joint angles are correctly measured independently of previous sensor placement on the joint, thus validating the proposed procedure. In the cases of a simplified joint and a real gait test with human volunteers, the method also performs correctly, although secondary plane errors appear when compared with the simulation results. We believe that such errors are caused by limitations of the current inertial measurement unit (IMU) technology and fusion algorithms. In conclusion, the presented calibration procedure is an interesting option to solve the alignment problem when using IMUs for gait analysis.
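
A common building block of such IMU-to-body calibration is estimating the rotation that maps a sensed reference direction, e.g. gravity during a static pose, onto a known body-segment axis. The sketch below uses Rodrigues' rotation formula with a made-up accelerometer reading; it is a simplified stand-in, since a full calibration procedure must also resolve the heading (yaw) ambiguity that gravity alone cannot fix:

```python
import numpy as np

def rotation_aligning(u, v):
    """Rotation matrix taking unit vector u onto unit vector v (Rodrigues' formula)."""
    u = u / np.linalg.norm(u)
    v = v / np.linalg.norm(v)
    w = np.cross(u, v)                      # rotation axis (unnormalized)
    c = float(np.dot(u, v))                 # cosine of the rotation angle
    if np.isclose(c, -1.0):
        raise ValueError("opposite vectors: rotation axis is ambiguous")
    K = np.array([[0.0, -w[2], w[1]],
                  [w[2], 0.0, -w[0]],
                  [-w[1], w[0], 0.0]])      # skew-symmetric cross-product matrix
    return np.eye(3) + K + K @ K / (1.0 + c)

# Hypothetical gravity reading from a slightly tilted IMU during a static pose.
g_sensor = np.array([0.20, -0.10, 9.75])
body_up = np.array([0.0, 0.0, 1.0])         # segment vertical axis in the body frame
R = rotation_aligning(g_sensor, body_up)
print(R @ (g_sensor / np.linalg.norm(g_sensor)))   # maps sensed gravity onto body_up
```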

  12. Applying of whole-tree harvesting method; Kokopuujuontomenetelmaen soveltaminen aines- ja energiapuun hankintaan

    Energy Technology Data Exchange (ETDEWEB)

    Vesisenaho, T. [VTT Energy, Jyvaeskylae (Finland); Liukkonen, S. [VTT Manufacturing Technology, Espoo (Finland)

    1997-12-01

    The objective of this project is to apply the whole-tree harvesting method to Finnish timber harvesting conditions in order to lower the harvesting costs of energy wood and timber in spruce-dominant final cuttings. In Finnish conditions timber harvesting is normally based on the log-length method. Because of small landings and the high share of thinning cuttings, whole-tree skidding methods cannot be utilised extensively. The share of stands which could be harvested with the whole-tree skidding method turned out to be about 10 % of the total harvesting amount of 50 mill. m{sup 3}. The corresponding harvesting potential of energy wood is 0.25 Mtoe. The aim of the structural measurements made in this project was to obtain information on the effect of different hauling methods on the structural response of the tractor, and thus reveal the possible special requirements that whole-tree skidding places on forest tractor design. Altogether 7 strain-gauge-based sensors were mounted on the rear frame structures and drive shafts of the forest tractor. Five strain gauges measured local strains in critical details, and two sensors measured the torque moments of the front and rear bogie drive shafts. The revolution speed of the rear drive shaft was also recorded. Signal time histories, maximum peaks, time-at-level distributions and rainflow distributions were gathered in different hauling modes. From these, maximum values, average stress levels and fatigue life estimates were calculated for each mode, and the different methods were compared from the structural point of view.

  13. Brucellosis Prevention Program: Applying “Child to Family Health Education” Method

    Directory of Open Access Journals (Sweden)

    H. Allahverdipour

    2010-04-01

    Full Text Available Introduction & Objective: Pupils have efficient potential to increase community awareness and promote community health through participating in health education programs. The child-to-family health education program is one of the communicative strategies applied in this field trial study. Because of the high prevalence of Brucellosis in Hamadan province, Iran, the aim of this study was to promote families' knowledge and preventive behaviors regarding Brucellosis in rural areas by using the child-to-family health education method. Materials & Methods: In this nonequivalent control group design study, three rural schools were chosen (one as intervention and two others as control). At first, the knowledge and behavior of families about Brucellosis were determined using a designed questionnaire. The families were then educated through the child-to-family procedure: the students first gained the information and were then instructed to teach their parents what they had learned. Three months after the last education session, the level of knowledge and behavior changes of the families about Brucellosis was determined and analyzed by paired t-test. Results: The results showed significant improvement in the knowledge of the mothers. The mothers' knowledge about the signs of Brucellosis in humans increased from 1.81 to 3.79 (t = -21.64, p < 0.001), and their knowledge about the signs of Brucellosis in animals increased from 1.48 to 2.82 (t = -10.60, p < 0.001). Conclusion: The child-to-family health education program is an effective and accessible method that would be useful in most communities, and the students' potential could be applied effectively in health promotion programs.

  14. Evaluation of cleaning methods applied in home environments after renovation and remodeling activities

    International Nuclear Information System (INIS)

    Yiin, L.-M.; Lu, S.-E.; Sannoh, Sulaiman; Lim, B.S.; Rhoads, G.G.

    2004-01-01

    We conducted a cleaning trial in 40 northern New Jersey homes where home renovation and remodeling (R and R) activities were undertaken. Two cleaning protocols were used in the study: a specific method recommended by the US Department of Housing and Urban Development (HUD), in the 1995 'Guidelines for the Evaluation and Control of Lead-Based Paint Hazards in Housing', using a high-efficiency particulate air (HEPA)-filtered vacuum cleaner and a tri-sodium phosphate solution (TSP); and an alternative method using a household vacuum cleaner and a household detergent. Eligible homes were built before the 1970s with potential lead-based paint and had recent R and R activities without thorough cleaning. The two cleaning protocols were randomly assigned to the participants' homes and followed the HUD-recommended three-step procedure: vacuuming, wet washing, and repeat vacuuming. Wipe sampling was conducted on floor surfaces or windowsills before and after cleaning to evaluate the efficacy. All floor and windowsill data indicated that both methods (TSP/HEPA and non-TSP/non-HEPA) were effective in reducing lead loading on the surfaces (P<0.001). When cleaning was applied to surfaces with initial lead loading above the clearance standards, the reductions were even greater, above 95% for either cleaning method. The mixed-effect model analysis showed no significant difference between the two methods. Baseline lead loading was found to be associated with lead loading reduction significantly on floors (P<0.001) and marginally on windowsills (P=0.077). Such relations were different between the two cleaning methods significantly on floors (P<0.001) and marginally on windowsills (P=0.066), with the TSP/HEPA method being favored for higher baseline levels and the non-TSP/non-HEPA method for lower baseline levels. For the 10 homes with lead abatement, almost all post-cleaning lead loadings were below the standards using either cleaning method. 
Based on our results, we recommend that

  15. Method for pulse to pulse dose reproducibility applied to electron linear accelerators

    International Nuclear Information System (INIS)

    Ighigeanu, D.; Martin, D.; Oproiu, C.; Cirstea, E.; Craciun, G.

    2002-01-01

    An original method is presented for obtaining programmed single beam shots and pulse trains with programmed pulse number, pulse repetition frequency, pulse duration and pulse dose. It is particularly useful for automatic control of the absorbed dose rate level and of the irradiation process, as well as in pulse radiolysis studies, single-pulse dose measurement, or research experiments where pulse-to-pulse dose reproducibility is required. The method is applied to the electron linear accelerators ALIN-10 (6.23 MeV, 82 W) and ALID-7 (5.5 MeV, 670 W), built in NILPRP. To implement the method, the accelerator triggering system (ATS) consists of two branches: the gun branch and the magnetron branch. The ATS, which synchronizes all the system units, delivers trigger pulses at a programmed repetition rate (up to 250 pulses/s) to the gun (80 kV, 10 A, 4 ms) and the magnetron (45 kV, 100 A, 4 ms). An accelerated electron beam exists only when the electron gun and magnetron pulses overlap. The method consists of controlling this overlap so as to deliver the beam in the desired sequence; the control is implemented by discrete pulse-position modulation of the gun and/or magnetron pulses. The instabilities of the gun and magnetron transient regimes are avoided by operating the accelerator with no accelerated beam for a certain time. At the operator's 'beam start' command, the ATS brings the electron gun and magnetron pulses into coincidence and the linac beam is generated. The pulse-to-pulse absorbed dose variation is thus considerably reduced. Programmed absorbed dose, irradiation time, beam pulse number or other external events may interrupt the coincidence between the gun and magnetron pulses. Slow absorbed dose variation is compensated by control of the pulse duration and repetition frequency. Two methods are reported in the electron linear accelerators' development for obtaining pulse-to-pulse dose reproducibility: the method
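
The overlap control described above can be sketched as simple coincidence logic: a beam shot occurs only where a gun pulse and a magnetron pulse coincide, so shifting selected magnetron pulses out of coincidence (discrete pulse-position modulation) suppresses those shots. All timing values below are illustrative, not the accelerator's actual parameters:

```python
def beam_pulses(gun_times, mag_times, width_us=4.0):
    """A beam shot occurs only where gun and magnetron pulses overlap; shifting
    one pulse train (discrete pulse-position modulation) suppresses shots."""
    shots = []
    for t in gun_times:
        if any(abs(t - m) < width_us for m in mag_times):
            shots.append(t)
    return shots

# Ten candidate pulses at 250 Hz (4 ms period, times in microseconds); the
# magnetron pulses for shots 3 and 7 are shifted out of coincidence.
gun = [i * 4000.0 for i in range(10)]
mag = [t if i not in (3, 7) else t + 100.0 for i, t in enumerate(gun)]
print(beam_pulses(gun, mag))
```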

  16. Surveillance and Critical Theory

    Directory of Open Access Journals (Sweden)

    Christian Fuchs

    2015-09-01

    Full Text Available In this comment, the author reflects on surveillance from a critical theory approach, his involvement in surveillance research and projects, and the status of the study of surveillance. The comment ascertains a lack of critical thinking about surveillance, questions the existence of something called “surveillance studies” as opposed to a critical theory of society, and reflects on issues such as Edward Snowden’s revelations, and Foucault and Marx in the context of surveillance.

  17. Analyzing Information Seeking and Drug-Safety Alert Response by Health Care Professionals as New Methods for Surveillance.

    Science.gov (United States)

    Callahan, Alison; Pernek, Igor; Stiglic, Gregor; Leskovec, Jure; Strasberg, Howard R; Shah, Nigam Haresh

    2015-08-20

    Patterns in general consumer online search logs have been used to monitor health conditions and to predict health-related activities, but the multiple contexts within which consumers perform online searches make significant associations difficult to interpret. Physician information-seeking behavior has typically been analyzed through survey-based approaches and literature reviews. Activity logs from health care professionals using online medical information resources are thus a valuable yet relatively untapped resource for large-scale medical surveillance. To analyze health care professionals' information-seeking behavior and assess the feasibility of measuring drug-safety alert response from the usage logs of an online medical information resource. Using two years (2011-2012) of usage logs from UpToDate, we measured the volume of searches related to medical conditions with significant burden in the United States, as well as the seasonal distribution of those searches. We quantified the relationship between searches and resulting page views. Using a large collection of online mainstream media articles and Web log posts we also characterized the uptake of a Food and Drug Administration (FDA) alert via changes in UpToDate search activity compared with general online media activity related to the subject of the alert. Diseases and symptoms dominate UpToDate searches. Some searches result in page views of only short duration, while others consistently result in longer-than-average page views. The response to an FDA alert for Celexa, characterized by a change in UpToDate search activity, differed considerably from general online media activity. Changes in search activity appeared later and persisted longer in UpToDate logs. The volume of searches and page view durations related to Celexa before the alert also differed from those after the alert. 
Understanding the information-seeking behavior associated with online evidence sources can offer insight into the information

  18. Postgraduate Education in Quality Improvement Methods: Initial Results of the Fellows' Applied Quality Training (FAQT) Curriculum.

    Science.gov (United States)

    Winchester, David E; Burkart, Thomas A; Choi, Calvin Y; McKillop, Matthew S; Beyth, Rebecca J; Dahm, Phillipp

    2016-06-01

    Training in quality improvement (QI) is a pillar of the Next Accreditation System of the Accreditation Council for Graduate Medical Education and a growing expectation of physicians for maintenance of certification. Despite this, many postgraduate medical trainees are not receiving training in QI methods. We created the Fellows Applied Quality Training (FAQT) curriculum for cardiology fellows using both didactic and applied components with the goal of increasing confidence to participate in future QI projects. Fellows completed didactic training from the Institute for Healthcare Improvement's Open School and then designed and completed a project to improve quality of care or patient safety. Self-assessments were completed by the fellows before, during, and after the first year of the curriculum. The primary outcome for our curriculum was the median score reported by the fellows regarding their self-confidence to complete QI activities. Self-assessments were completed by 23 fellows. The majority of fellows (15 of 23, 65.2%) reported no prior formal QI training. Median score on baseline self-assessment was 3.0 (range, 1.85-4), which was significantly increased to 3.27 (range, 2.23-4; P = 0.004) on the final assessment. The distribution of scores reported by the fellows indicates that 30% were slightly confident at conducting QI activities on their own, which was reduced to 5% after completing the FAQT curriculum. An interim assessment was conducted after the fellows completed didactic training only; median scores were not different from the baseline (median, 3.0; P = 0.51). After completion of the FAQT, cardiology fellows reported higher self-confidence to complete QI activities. The increase in self-confidence seemed to be limited to the applied component of the curriculum, with no significant change after the didactic component.

  19. High-Resolution Seismic Methods Applied to Till Covered Hard Rock Environments

    International Nuclear Information System (INIS)

    Bergman, Bjoern

    2005-01-01

    Reflection seismic and seismic tomography methods can be used to image the upper kilometer of hard bedrock and the loose unconsolidated sediments covering it. The development of these two methods and their application, as well as the identification of issues concerning their usage, are the main focus of the thesis. Data used for this development were acquired at three sites in Sweden: Forsmark, 140 km north of Stockholm; the Oskarshamn area in southern Sweden; and the northern part of the Siljan Ring impact crater area. The reflection seismic data were acquired with long source-receiver offsets relative to some of the targeted depths to be imaged. In the initial processing, standard steps were applied, but the uppermost parts of the sections were not always clear. The longer offsets imply that pre-stack migration is necessary in order to image the uppermost bedrock as clearly as possible. Careful choice of filters and velocity functions improves the pre-stack migrated image, allowing better correlation with near-surface geological information. The seismic tomography method has been enhanced to calculate, simultaneously with the velocity inversion, optimal corrections to the picked first-break travel times in order to compensate for the delays caused by the seismic waves passing through the loose sediments covering the bedrock. The reflection seismic processing used in this thesis has produced high-quality images of the upper kilometers, and in one example from the Forsmark site, the image of the uppermost 250 meters of the bedrock has been improved. The three-dimensional orientation of reflections has been determined at the Oskarshamn site; correlation with borehole data shows that many of these reflections originate from fracture zones. The developed seismic tomography method produces high-detail velocity models for the site in the Siljan impact area and for the Forsmark site. 
In Forsmark, detailed estimates of the bedrock topography were calculated with the use of

  20. Applying system engineering methods to site characterization research for nuclear waste repositories

    International Nuclear Information System (INIS)

    Woods, T.W.

    1985-01-01

    Nuclear research and engineering projects can benefit from the use of system engineering methods. This paper is a brief overview illustrating how system engineering methods could be applied in structuring a site characterization effort for a candidate nuclear waste repository. System engineering is simply an orderly process that has been widely used to transform a recognized need into a fully defined system. Such a system may be physical or abstract, natural or man-made, hardware or procedural, as appropriate to the system's need or objective. It is a way of mentally visualizing all the constituent elements and their relationships necessary to fulfill a need, in compliance with all the constraining requirements attendant to that need. Such a systems approach provides completeness, order, clarity, and direction. Admittedly, system engineering can be burdensome and inappropriate for project objectives with simple and familiar solutions that are easily held and controlled mentally. However, some type of documented and structured approach is needed for objectives that dictate extensive, unique, or complex programs, and/or the creation of state-of-the-art machines and facilities. System engineering methods have been used extensively and successfully in such cases. The scientific method has served well in ordering countless technical undertakings that address a specific question. Similarly, conventional construction and engineering job methods will continue to be quite adequate for organizing routine building projects. Nuclear waste repository site characterization projects, however, involve multiple complex research questions and regulatory requirements that interface with each other and with advanced engineering and subsurface construction techniques. There is little doubt that system engineering is an appropriate orchestrating process to structure such diverse elements into a cohesive, well-defined project.

  1. Resampling method for applying density-dependent habitat selection theory to wildlife surveys.

    Science.gov (United States)

    Tardy, Olivia; Massé, Ariane; Pelletier, Fanie; Fortin, Daniel

    2015-01-01

    Isodar theory can be used to evaluate fitness consequences of density-dependent habitat selection by animals. A typical habitat isodar is a regression curve plotting competitor densities in two adjacent habitats when individual fitness is equal. Despite the increasing use of habitat isodars, their application remains largely limited to areas composed of pairs of adjacent habitats that are defined a priori. We developed a resampling method that uses data from wildlife surveys to build isodars in heterogeneous landscapes without having to predefine habitat types. The method consists in randomly placing blocks over the survey area and dividing those blocks in two adjacent sub-blocks of the same size. Animal abundance is then estimated within the two sub-blocks. This process is done 100 times. Different functional forms of isodars can be investigated by relating animal abundance and differences in habitat features between sub-blocks. We applied this method to abundance data of raccoons and striped skunks, two of the main hosts of rabies virus in North America. Habitat selection by raccoons and striped skunks depended on both conspecific abundance and the difference in landscape composition and structure between sub-blocks. When conspecific abundance was low, raccoons and striped skunks favored areas with relatively high proportions of forests and anthropogenic features, respectively. Under high conspecific abundance, however, both species preferred areas with rather large corn-forest edge densities and corn field proportions. Based on random sampling techniques, we provide a robust method that is applicable to a broad range of species, including medium- to large-sized mammals with high mobility. The method is sufficiently flexible to incorporate multiple environmental covariates that can reflect key requirements of the focal species. We thus illustrate how isodar theory can be used with wildlife surveys to assess density-dependent habitat selection over large
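
The core resampling step (randomly place a block over the survey area, split it into two equal adjacent sub-blocks, count animals in each half, repeat 100 times) can be sketched as follows. The survey points, area, and block dimensions are invented; a real application would also attach habitat covariates to each sub-block:

```python
import random

random.seed(1)

# Toy survey: animal sightings as (x, y) points on a 100 x 100 survey area.
sightings = [(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(300)]

def resample_blocks(points, n_blocks=100, block_w=20.0, block_h=10.0):
    """Randomly place blocks, split each into two adjacent sub-blocks of equal
    size, and count animals in each half (the isodar's paired abundances)."""
    pairs = []
    for _ in range(n_blocks):
        x0 = random.uniform(0, 100 - block_w)
        y0 = random.uniform(0, 100 - block_h)
        half = block_w / 2
        left = sum(1 for x, y in points
                   if x0 <= x < x0 + half and y0 <= y < y0 + block_h)
        right = sum(1 for x, y in points
                    if x0 + half <= x < x0 + block_w and y0 <= y < y0 + block_h)
        pairs.append((left, right))
    return pairs

pairs = resample_blocks(sightings)
print(len(pairs), pairs[0])
```

Regressing the paired abundances against differences in habitat features between the sub-blocks then yields the isodar without predefining habitat types.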

  2. Photonic simulation method applied to the study of structural color in Myxomycetes.

    Science.gov (United States)

    Dolinko, Andrés; Skigin, Diana; Inchaussandague, Marina; Carmaran, Cecilia

    2012-07-02

    We present a novel simulation method to investigate the multicolored effect of Diachea leucopoda (Physarales order, Myxomycetes class), a microorganism with a characteristic pointillistic iridescent appearance. It has been shown that this appearance is of structural origin and is produced within the peridium (the protective layer that encloses the mass of spores), which is basically a corrugated sheet of transparent material. The main characteristics of the observed color were previously explained in terms of interference effects using a simple model of a homogeneous planar slab. In this paper we apply a novel simulation method to investigate the electromagnetic response of such a structure in more detail, i.e., taking into account the inhomogeneities of the biological material within the peridium and its curvature. We show that both features, which could not be considered within the simplified model, affect the observed color. The proposed method has great potential for the study of biological structures, which present a high degree of complexity in their geometrical shapes as well as in the materials involved.
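
The simple homogeneous planar slab model mentioned above reduces, at normal incidence, to the standard Airy reflectance of a thin film, whose wavelength dependence is what produces iridescence. The refractive index and thickness below are assumed for illustration, not measured peridium values:

```python
import cmath
import math

def slab_reflectance(n_film, d_nm, wavelength_nm, n_in=1.0, n_out=1.0):
    """Normal-incidence reflectance of a homogeneous planar slab (Airy formula)."""
    r12 = (n_in - n_film) / (n_in + n_film)      # front-face Fresnel coefficient
    r23 = (n_film - n_out) / (n_film + n_out)    # back-face Fresnel coefficient
    beta = 2 * math.pi * n_film * d_nm / wavelength_nm   # one-way phase in the film
    phase = cmath.exp(2j * beta)
    r = (r12 + r23 * phase) / (1 + r12 * r23 * phase)
    return abs(r) ** 2

# A 500 nm transparent layer (n = 1.45 assumed): reflectance maxima and minima
# shift with wavelength, which is what produces iridescent structural color.
for wl in (450, 550, 650):
    print(wl, round(slab_reflectance(1.45, 500.0, wl), 4))
```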

  3. Impact of gene patents on diagnostic testing: a new patent landscaping method applied to spinocerebellar ataxia.

    Science.gov (United States)

    Berthels, Nele; Matthijs, Gert; Van Overwalle, Geertrui

    2011-11-01

    Recent reports in Europe and the United States raise concern about the potential negative impact of gene patents on the freedom to operate of diagnosticians and on the access of patients to genetic diagnostic services. Patents, historically seen as legal instruments to trigger innovation, could cause undesired side effects in the public health domain. Clear empirical evidence on the alleged hindering effect of gene patents is still scarce. We therefore developed a patent categorization method to determine which gene patents could indeed be problematic. The method is applied to patents relevant for genetic testing of spinocerebellar ataxia (SCA). The SCA test is probably the most widely used DNA test in (adult) neurology, as well as one of the most challenging due to the heterogeneity of the disease. Typically tested as a gene panel covering the five common SCA subtypes, we show that the patenting of SCA genes and testing methods and the associated licensing conditions could have far-reaching consequences on legitimate access to this gene panel. Moreover, with genetic testing being increasingly standardized, simply ignoring patents is unlikely to hold out indefinitely. This paper aims to differentiate among so-called 'gene patents' by lifting out the truly problematic ones. In doing so, awareness is raised among all stakeholders in the genetic diagnostics field who are not necessarily familiar with the ins and outs of patenting and licensing.

  4. IAEA-ASSET's root cause analysis method applied to sodium leakage incident at Monju

    International Nuclear Information System (INIS)

    Watanabe, Norio; Hirano, Masashi

    1997-08-01

    The present study applied the ASSET (Analysis and Screening of Safety Events Team) methodology, which identifies occurrences such as component failures and operator errors, their respective direct/root causes, and corrective actions, to the analysis of the sodium leakage incident at Monju, based on the reports published mainly by the Science and Technology Agency. The aims were the systematic identification of direct/root causes and corrective actions, and a discussion of the effectiveness and problems of the ASSET methodology. The results revealed the following seven occurrences and showed the direct/root causes and contributing factors for each: failure of the thermometer well tube, delayed reactor manual trip, inadequate continuous monitoring of leakage, misjudgment of the leak rate, a non-required operator action (turbine trip), retarded emergency sodium drainage, and retarded securing of the ventilation system. Most of the occurrences stemmed from deficiencies in the emergency operating procedures (EOPs), which were mainly caused by defects in the EOP preparation process and operator training programs. The corrective actions already proposed in the published reports were reviewed, issues to be studied further were identified, and possible corrective actions for these issues were discussed. The present study also demonstrated the effectiveness of the ASSET methodology and pointed out some problems, for example in delineating causal relations among occurrences, in applying it to the detailed and systematic analysis of event direct/root causes and the determination of concrete measures. (J.P.N.)

  5. The Application of Intensive Longitudinal Methods to Investigate Change: Stimulating the Field of Applied Family Research.

    Science.gov (United States)

    Bamberger, Katharine T

    2016-03-01

    The use of intensive longitudinal methods (ILM)-rapid in situ assessment at micro timescales-can be overlaid on RCTs and other study designs in applied family research. Particularly, when done as part of a multiple timescale design-in bursts over macro timescales-ILM can advance the study of the mechanisms and effects of family interventions and processes of family change. ILM confers measurement benefits in accurately assessing momentary and variable experiences and captures fine-grained dynamic pictures of time-ordered processes. Thus, ILM allows opportunities to investigate new research questions about intervention effects on within-subject (i.e., within-person, within-family) variability (i.e., dynamic constructs) and about the time-ordered change process that interventions induce in families and family members beginning with the first intervention session. This paper discusses the need and rationale for applying ILM to family intervention evaluation, new research questions that can be addressed with ILM, example research using ILM in the related fields of basic family research and the evaluation of individual-based interventions. Finally, the paper touches on practical challenges and considerations associated with ILM and points readers to resources for the application of ILM.

  6. A method of applying two-pump system in automatic transmissions for energy conservation

    Directory of Open Access Journals (Sweden)

    Peng Dong

    2015-06-01

In order to improve hydraulic efficiency, modern automatic transmissions tend to incorporate an electric oil pump in their hydraulic system. The electric oil pump can support the mechanical oil pump for cooling, lubrication, and maintaining the line pressure at low engine speeds. In addition, the start-stop function can be realized by means of the electric oil pump, so fuel consumption can be further reduced. This article proposes a method of applying a two-pump system (one electric oil pump and one mechanical oil pump) in automatic transmissions based on forward driving simulation. A mathematical model for calculating the transmission power loss is developed. The power loss is converted into heat, which requires oil flow for cooling and lubrication. A leakage model is developed to calculate the leakage of the hydraulic system. To satisfy the flow requirement, a flow-based control strategy for the electric oil pump is developed. Simulation results for different driving cycles show that there is a best combination of electric oil pump size and mechanical oil pump size with respect to energy conservation. Moreover, the two-pump system can also satisfy the requirement of the start-stop function. This research is valuable for the forward design of a two-pump system in automatic transmissions with respect to energy conservation and the start-stop function.
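
The flow-based control idea described in the abstract can be sketched as follows; the function names, coefficients and units below are illustrative assumptions, not the paper's actual model. The electric oil pump (EOP) is commanded to supply whatever part of the cooling/lubrication and leakage flow demand the engine-driven mechanical pump cannot deliver.

```python
def required_flow(power_loss_kw, leakage_lpm, k_cool=2.0):
    """Cooling/lubrication flow demand (L/min), assumed proportional to the
    transmission power loss, plus the hydraulic leakage flow."""
    return k_cool * power_loss_kw + leakage_lpm

def mechanical_pump_flow(engine_rpm, disp_l_per_rev=0.016, vol_eff=0.9):
    """Delivery of the engine-driven mechanical pump, proportional to engine speed."""
    return engine_rpm * disp_l_per_rev * vol_eff

def eop_flow_command(engine_rpm, power_loss_kw, leakage_lpm):
    """Electric oil pump covers the flow deficit; at engine stop (0 rpm,
    the start-stop case) it must supply the whole demand."""
    deficit = required_flow(power_loss_kw, leakage_lpm) - mechanical_pump_flow(engine_rpm)
    return max(0.0, deficit)
```

At idle-stop the mechanical pump delivers nothing, so the EOP command equals the full demand; at cruising speed the deficit is zero and the EOP can be switched off.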

  7. Improvement in the DTVG detection method as applied to cast austeno-ferritic steels

    International Nuclear Information System (INIS)

    Francois, D.

    1996-05-01

Initially, the so-called DTVG method was developed to improve detection and (lengthwise) dimensioning of cracks in austenitic steel assembly welds. The results obtained during the study and the structural similarity between austenitic and austeno-ferritic steels led us to carry out research into adapting the method on a sample whose material is representative of the cast steels used in PWR primary circuit bends. The method was first adapted for use on thick-wall cast austeno-ferritic steel structures and was validated for zero ultrasonic beam incidence and for a flat sample with machine-finished reflectors. A second study was carried out, notably to allow for non-zero ultrasonic beam incidence and to look at the method's validity when applied to a non-flat geometry. There were three principal goals to the research: adapting the process to take into account the special case of oblique ultrasonic beam incidence (B image handling), examining the effect of non-flat geometry on the detection method, and evaluating the performance of the method on actual defects (shrinkage cavities). We began by focusing on solving the problem of oblique incidence. Having decided on automatic refracted-angle determination, the problem could only be solved by locking the algorithm onto an image representative of the suspect material comprising an indicator. We then used a simple geometric model to quantify the deformation of the indicators on a B-scan image due to a non-flat transducer/part interface. Finally, tests were carried out on measurements acquired from flat samples containing artificial and real defects so that the overall performance of the developed method could be assessed. This work has allowed the DTVG detection method to be adapted for use with B-scan images acquired at a non-zero ultrasonic beam incidence angle.
Moreover, we have been able to show that for similar geometries to those of the cast bends and for deep defects the deformation of the indicators due

  8. Applying RP-FDM Technology to Produce Prototype Castings Using the Investment Casting Method

    Directory of Open Access Journals (Sweden)

    M. Macků

    2012-09-01

The research focused on the production of prototype castings, which is mapped out starting from the drawing documentation up to the production of the casting itself. The FDM method was applied for the production of the 3D pattern. The main objective was to find out what dimensional changes happened during the individual production stages, starting from 3D pattern printing through silicone mould production, wax pattern casting, shell making, melting the wax out of the shells and drying, up to the production of the final casting itself. Five measurements of selected dimensions were made during production, and these were processed and evaluated mathematically. The results were a determination of shrinkage and a proposal of measures to maintain the dimensional stability of the final casting so as to meet the requirements specified by the customer.

  9. Comparison of gradient methods for gain tuning of a PD controller applied on a quadrotor system

    Science.gov (United States)

    Kim, Jinho; Wilkerson, Stephen A.; Gadsden, S. Andrew

    2016-05-01

Many mechanical and electrical systems utilize the proportional-integral-derivative (PID) control strategy. The concept of PID control is classical, but it is easy to implement and yields very good tracking performance. Unmanned aerial vehicles (UAVs) are currently experiencing significant growth in popularity, and due to the advantages of PID controllers, UAVs implement them for improved stability and performance. An important consideration for the system is the selection of PID gain values in order to achieve a safe flight and a successful mission. A number of different algorithms can be used for real-time tuning of gains. This paper presents two gain-tuning algorithms, based on the method of steepest descent and on Newton's minimization of an objective function, and compares the results of applying these two algorithms in conjunction with a PD controller on a quadrotor system.
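
The two tuning updates can be sketched on a surrogate objective; the quadratic J below merely stands in for the tracking-error objective that would be evaluated on the quadrotor response, and the names, step size, and finite-difference scheme are assumptions.

```python
import numpy as np

def J(k):
    """Surrogate objective standing in for the flight tracking error J(kp, kd)."""
    kp, kd = k
    return (kp - 4.0) ** 2 + 2.0 * (kd - 1.5) ** 2

def grad(f, k, h=1e-5):
    """Central finite-difference gradient."""
    g = np.zeros_like(k)
    for i in range(len(k)):
        e = np.zeros_like(k); e[i] = h
        g[i] = (f(k + e) - f(k - e)) / (2 * h)
    return g

def hessian(f, k, h=1e-4):
    """Finite-difference Hessian built from gradient differences."""
    n = len(k); H = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        H[:, i] = (grad(f, k + e) - grad(f, k - e)) / (2 * h)
    return H

def steepest_descent(f, k, eta=0.1, iters=200):
    """Gain update along the negative gradient with fixed step eta."""
    for _ in range(iters):
        k = k - eta * grad(f, k)
    return k

def newton(f, k, iters=20):
    """Gain update using curvature information: k <- k - H^-1 g."""
    for _ in range(iters):
        k = k - np.linalg.solve(hessian(f, k), grad(f, k))
    return k
```

On a quadratic surrogate both converge to the same gains; Newton's method does so in far fewer iterations, which is the trade-off (per-step cost vs. iteration count) the paper compares.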

  10. Study of different ultrasonic focusing methods applied to non destructive testing

    International Nuclear Information System (INIS)

    El Amrani, M.

    1995-01-01

The work presented in this thesis concerns the study of different ultrasonic focusing techniques applied to nondestructive testing (mechanical focusing and electronic focusing) and compares their capabilities. We have developed a model to predict the ultrasonic field radiated into a solid by water-coupled transducers. The model is based upon the Rayleigh integral formulation, modified to take into account the refraction at the liquid-solid interface. The model has been validated by numerous experiments in various configurations. Using this model and the associated software, we have developed new methods to optimize focused transducers and studied the characteristics of the beams generated by transducers using various focusing techniques. (author). 120 refs., 95 figs., 4 appends
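
For reference, the Rayleigh integral on which such field models are based expresses the radiated pressure as a sum of point-source contributions over the transducer surface $S$ (in the modified model of the abstract, the refraction at the liquid-solid interface is handled separately, e.g. through transmission coefficients along each ray path):

```latex
p(\mathbf{r}, t) = \frac{\rho}{2\pi} \int_{S}
\frac{\dot{v}_n\!\left(t - \lvert \mathbf{r}-\mathbf{r}_s\rvert / c\right)}
{\lvert \mathbf{r}-\mathbf{r}_s\rvert}\, \mathrm{d}S ,
```

where $v_n$ is the normal velocity of the surface element at $\mathbf{r}_s$, $\rho$ the fluid density, and $c$ the sound speed in the coupling liquid.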

  11. Adding randomness controlling parameters in GRASP method applied in school timetabling problem

    Directory of Open Access Journals (Sweden)

    Renato Santos Pereira

    2017-09-01

This paper studies the influence of randomness controlling parameters (RCP) in the first stage of the GRASP method applied to the graph coloring problem, specifically school timetabling problems in a public high school. The algorithm (with the inclusion of RCP) was based on critical variables identified through focus groups, whose weights can be adjusted by the user in order to meet institutional needs. The results of the computational experiment, with 11 years of data (66 observations) processed at the same high school, show that the inclusion of RCP significantly lowers the distance between initial solutions and local minima. The acceptance and use of the solutions found allow us to conclude that the modified GRASP, as constructed, can make a positive contribution to the timetabling problem of the school in question.
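
A minimal sketch of how a randomness controlling parameter enters the GRASP construction phase for graph coloring; the candidate ranking (saturation degree) and the parameter's role here are generic GRASP conventions and an assumption, not the paper's focus-group-weighted model.

```python
import random

def grasp_coloring(adj, alpha, seed=0):
    """One GRASP construction for graph coloring.
    alpha in [0, 1] is the randomness controlling parameter:
    0 = pure greedy choice, 1 = pure random choice."""
    rng = random.Random(seed)
    colors = {}
    uncolored = set(adj)
    while uncolored:
        # Rank candidates by saturation degree (distinct neighbour colors).
        sat = {v: len({colors[u] for u in adj[v] if u in colors}) for v in uncolored}
        best, worst = max(sat.values()), min(sat.values())
        # Restricted candidate list: everything within alpha of the best score.
        threshold = best - alpha * (best - worst)
        rcl = [v for v in uncolored if sat[v] >= threshold]
        v = rng.choice(rcl)
        # Assign the smallest color not used by already-colored neighbours.
        used = {colors[u] for u in adj[v] if u in colors}
        colors[v] = min(c for c in range(len(adj)) if c not in used)
        uncolored.remove(v)
    return colors
```

Tuning alpha trades greediness against diversification of the initial solutions, which is precisely the quantity the paper measures via the distance between initial solutions and local minima.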

  13. Applied methods for mitigation of damage by stress corrosion in BWR type reactors

    International Nuclear Information System (INIS)

    Hernandez C, R.; Diaz S, A.; Gachuz M, M.; Arganis J, C.

    1998-01-01

Boiling water nuclear reactors (BWR) have presented stress corrosion problems, mainly in components and pipes of the primary system, with negative impacts on the performance of power generation plants as well as increased radiation exposure of the personnel involved. This problem has driven research programs aimed at finding alternative solutions for controlling the phenomenon. Among the most relevant results, control of the reactor water chemistry stands out, particularly of the impurity concentration and the oxidizing radiolysis products, as well as care in materials selection and the reduction of stress levels. The present work presents the methods that can be applied to diminish stress corrosion problems in BWR reactors. (Author)

  14. Applied methods and techniques for mechatronic systems modelling, identification and control

    CERN Document Server

    Zhu, Quanmin; Cheng, Lei; Wang, Yongji; Zhao, Dongya

    2014-01-01

Applied Methods and Techniques for Mechatronic Systems brings together relevant studies in mechatronic systems with the latest research from interdisciplinary theoretical studies, computational algorithm development and exemplary applications. Readers can easily tailor the techniques in this book to accommodate their ad hoc applications. The clear structure of each paper (background, motivation, quantitative development with equations, and case studies/illustrations/tutorials with curves, tables, etc.) is also helpful. The book is mainly aimed at graduate students, professors and academic researchers in related fields, but it will also be helpful to engineers and scientists from industry. Lei Liu is a lecturer at Huazhong University of Science and Technology (HUST), China; Quanmin Zhu is a professor at University of the West of England, UK; Lei Cheng is an associate professor at Wuhan University of Science and Technology, China; Yongji Wang is a professor at HUST; Dongya Zhao is an associate professor at China University o...

  15. Borehole-to-borehole geophysical methods applied to investigations of high level waste repository sites

    International Nuclear Information System (INIS)

    Ramirez, A.L.

    1983-01-01

    This discussion focuses on the use of borehole to borehole geophysical measurements to detect geological discontinuities in High Level Waste (HLW) repository sites. The need for these techniques arises from: (a) the requirement that a HLW repository's characteristics and projected performance be known with a high degree of confidence; and (b) the inadequacy of other geophysical methods in mapping fractures. Probing configurations which can be used to characterize HLW sites are described. Results from experiments in which these techniques were applied to problems similar to those expected at repository sites are briefly discussed. The use of a procedure designed to reduce uncertainty associated with all geophysical exploration techniques is proposed; key components of the procedure are defined

  16. [Proposal for an evaluation method of the psycho-social risks (stress) and for the orientation of health surveillance].

    Science.gov (United States)

    Tangredi, G; Monaco, M R; Scano, L; Perfetti, B

    2007-01-01

Interest in the problems linked to stress in work environments has until now been limited to consideration of its effects on health, even though Legislative Decree (D.Lgs) 626/94 obliges the employer to evaluate the psycho-social risk as well, as confirmed by a ruling of the European Court of Justice. In the absence of validated instruments in the literature, the present survey aims to test and evaluate an approach for the identification of causes, thereby creating a model for the evaluation of this risk, also following the indications published in the SIMLII consensus document of 2005, which can be used by Prevention and Protection Services and by Competent Physicians in the area of risk evaluation and the obligations deriving from it. The model for evaluating the risk deriving from work organization (stress), the object of the present survey, was tested on a sample of 268 employees in 13 municipal administrations, belonging to categories known from the literature to be at risk of stress (traffic officers and nursery school teachers) and distributed across 23 homogeneous organizational structures. The assessed risk was included in the risk evaluation (VDR) document, and indications for health surveillance were formulated.

  17. Fuzzy logic and optical correlation-based face recognition method for patient monitoring application in home video surveillance

    Science.gov (United States)

    Elbouz, Marwa; Alfalou, Ayman; Brosseau, Christian

    2011-06-01

Home automation is being implemented in more and more domiciles of the elderly and disabled in order to maintain their independence and safety. For that purpose, we propose and validate a surveillance video system that detects various posture-based events. One of the novel points of this system is the use of adapted VanderLugt correlator (VLC) and joint transform correlator (JTC) techniques to make decisions on the identity of a patient and his three-dimensional (3-D) position, in order to overcome the problem of crowded environments. We propose a fuzzy logic technique to reach decisions on the subject's behavior. Our system focuses on accuracy, convenience, and cost, and in addition does not require any devices attached to the subject. The system permits one to study and model subject responses to behavioral change intervention, because several levels of alarm can be incorporated according to the different situations considered. Our algorithm performs a fast 3-D recovery of the subject's head position by locating the eyes within the face image, and involves model-based prediction and optical correlation techniques to guide the tracking procedure. Object detection is based on the (hue, saturation, value) color space. The system also involves an adapted fuzzy logic control algorithm to make a decision based on the information given to the system. Furthermore, the principles described here are applicable to a very wide range of situations and robust enough to be implementable in ongoing experiments.

  18. Applied mechanics of the Puricelli osteotomy: a linear elastic analysis with the finite element method

    Directory of Open Access Journals (Sweden)

    de Paris Marcel

    2007-11-01

Abstract Background Surgical orthopedic treatment of the mandible depends on the development of techniques resulting in adequate healing processes. In a new technical and conceptual alternative recently introduced by Puricelli, osteotomy is performed in a more distal region, next to the mental foramen. The method results in an increased area of bone contact, allowing larger sliding rates between bone segments. This work aimed to investigate the mechanical stability of the Puricelli osteotomy design. Methods Laboratory tests complied with an Applied Mechanics protocol, in which results from the Control group (without osteotomy) were compared with those from Test I (Obwegeser-Dal Pont osteotomy) and Test II (Puricelli osteotomy) groups. Edentulous mandible prototypes were scanned using computerized tomography, and the digitalized images were used to build voxel-based finite element models. A new code was developed for solving the voxel-based finite element equations, using a preconditioned conjugate gradients iterative solver. The magnitude of displacement and von Mises equivalent stress fields were compared among the three groups. Results In Test group I, the maximum stress was seen in the region of the rigid internal fixation plate, with a value greater than those of the Test II and Control groups. In Test group II, the maximum stress was in the same region as in the Control group, but lower. The results of this comparative study using finite element analysis suggest that the Puricelli osteotomy presents better mechanical stability than the original Obwegeser-Dal Pont technique. The increased area of the proximal segment and the consequent decrease of the lever arm applied to the mandible in the modified technique yielded lower stress values, and consequently greater stability of the bone segments.
Conclusion This work showed that Puricelli osteotomy of the mandible results in greater mechanical stability when compared to the original technique introduced by Obwegeser and Dal Pont.
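
The voxel-based finite element system mentioned in the abstract is solved with a conjugate gradients iterative solver. A minimal sketch of such a solver (with a Jacobi diagonal preconditioner, on a generic symmetric positive definite system; the study's actual code and preconditioner are not specified) might look like:

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=1000):
    """Jacobi-preconditioned conjugate gradients for a symmetric positive
    definite system A x = b, as arises from voxel-based FE stiffness matrices."""
    M_inv = 1.0 / np.diag(A)          # diagonal (Jacobi) preconditioner
    x = np.zeros_like(b)
    r = b - A @ x                     # residual
    z = M_inv * r                     # preconditioned residual
    p = z.copy()                      # search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p     # conjugate direction update
        rz = rz_new
    return x
```

In practice the voxel structure means A is never assembled densely; matrix-vector products are computed element by element, but the iteration itself is the same.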

  19. Infrared thermography inspection methods applied to the target elements of W7-X Divertor

    International Nuclear Information System (INIS)

    Missirlian, M.; Durocher, A.; Schlosser, J.; Farjon, J.-L.; Vignal, N.; Traxler, H.; Schedler, B.; Boscary, J.

    2006-01-01

As the heat exhaust capability and lifetime of plasma-facing components (PFC) during in-situ operation are linked to manufacturing quality, a set of non-destructive tests must be performed during the R&D and manufacturing phases. Within this framework, advanced non-destructive examination (NDE) methods are one of the key issues in achieving a high level of quality and reliability of joining techniques in the production of high heat flux components, and also in developing and building PFCs for the next generation of fusion devices. In this frame, two NDE infrared thermographic approaches, recently applied to the qualification of the CFC target elements of the W7-X divertor during the first series production, are discussed in this paper. The first one, developed by CEA (SATIR facility) and used successfully for the control of the mass-produced actively cooled PFCs on Tore Supra, is based on transient thermography, where the testing protocol consists in inducing a thermal transient within the heat sink structure by an alternating hot/cold water flow. The second one, recently developed by PLANSEE (ARGUS facility), is based on pulsed thermography, where the component is heated externally by a single powerful flash of light. Results obtained in qualification experiments performed during the first series production of W7-X divertor components, representing about thirty mock-ups with artificial and manufacturing defects, demonstrated the capabilities of these two methods and raised the efficiency of inspection to a level appropriate for industrial application. This comparative study, associated with a cross-checking analysis between the high heat flux performance tests and these infrared thermography inspection methods, showed good reproducibility and allowed a detection limit specific to each method to be set. Finally, the detectability of relevant defects showed excellent coincidence with thermal images obtained from high heat flux

  20. Oil Spill Trajectories from HF Radars: Applied Dynamical Systems Methods vs. a Lagrangian Stochastic Model

    Science.gov (United States)

    Emery, B. M.; Washburn, L.; Mezic, I.; Loire, S.; Arbabi, H.; Ohlmann, C.; Harlan, J.

    2016-02-01

We apply several analysis methods to HF radar ocean surface current maps to investigate improvements in trajectory modeling. Results from a Lagrangian stochastic model (LSM) are compared with methods based on dynamical systems theory: hypergraphs and Koopman mode analysis. The LSM produces trajectories by integrating Eulerian fields from the HF radar, and accounts for sub-grid-scale velocity variability by including a random component based on the Lagrangian decorrelation time. Hypergraphs also integrate the HF radar maps in time, showing areas of strain, strain-rotation, and mixing, by plotting the relative strengths of the eigenvalues of the gradient of the time-averaged Lagrangian velocity. Koopman mode analysis decomposes the velocity field into modes of variability, similarly to EOF or Fourier analysis, though each Koopman mode varies in time with a distinct frequency. Each method simulates oil drift from the oil spill of May 2015 that occurred within the coverage area of the HF radars, in the Santa Barbara Channel near Refugio Beach, CA. Preliminary results indicate some skill in determining the transport of oil when compared to publicly available observations of oil in the Santa Barbara Channel. These simulations have not shown a connection between the Refugio spill site and oil observations in Santa Monica Bay, near Los Angeles, CA, though accumulation zones shown by the hypergraphs correlate in time and space with these observations. Improvements in HF radar coverage and accuracy were observed during the spill thanks to the deployment of an additional HF radar site near Gaviota, CA. Presently we are collecting observations of oil on beaches and in the ocean, determining the role of winds in the oil movement, and refining the methods. Some HF radar data are being post-processed to incorporate recent antenna calibrations for sites in Santa Monica Bay. We will evaluate the effects of the newly processed data on the analysis results.
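
The LSM step described above can be sketched as the mean HF-radar current plus a first-order Markov velocity fluctuation with a prescribed decorrelation time; the specific parameterization below (exponentially decorrelating fluctuation with variance sigma squared) is a generic assumption, not necessarily the authors' exact model.

```python
import numpy as np

def lsm_trajectory(eulerian_uv, x0, dt, n_steps, sigma, t_l, seed=0):
    """Lagrangian stochastic trajectory: at each step the particle moves with
    the interpolated Eulerian velocity plus a sub-grid fluctuation u' that
    decorrelates over the Lagrangian time scale t_l."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    up = np.zeros(2)                    # sub-grid velocity fluctuation (u', v')
    path = [x.copy()]
    for _ in range(n_steps):
        u_mean = np.asarray(eulerian_uv(x))
        # First-order Markov (red noise) update of the fluctuation.
        up = up * (1 - dt / t_l) + sigma * np.sqrt(2 * dt / t_l) * rng.standard_normal(2)
        x = x + (u_mean + up) * dt
        path.append(x.copy())
    return np.array(path)
```

With sigma set to zero this reduces to straightforward integration of the Eulerian field; the random component is what spreads an ensemble of particles released at the spill site.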

  1. Balancing a U-Shaped Assembly Line by Applying Nested Partitions Method

    Energy Technology Data Exchange (ETDEWEB)

    Bhagwat, Nikhil V. [Iowa State Univ., Ames, IA (United States)

    2005-01-01

In this study, we applied the Nested Partitions method to a U-line balancing problem and conducted experiments to evaluate the application. From the results, it is quite evident that the Nested Partitions method provided near-optimal solutions (optimal in some cases). Moreover, the execution time is quite short compared to the Branch and Bound algorithm. However, for larger data sets, the algorithm took significantly longer to execute. One of the reasons could be the way in which the random samples are generated. In the present study, a random sample is a solution in itself, which requires assignment of tasks to various stations. The time taken to assign tasks to stations is directly proportional to the number of tasks; thus, if the number of tasks increases, the time taken to generate random samples for the different regions also increases. The performance index for the Nested Partitions method in the present study was the number of stations in the random solutions (samples) generated. The total idle time of the samples could be used as another performance index. The ULINO method is known to have used a combination of bounds to come up with good solutions; this approach of combining different performance indices could be used to evaluate the random samples and obtain even better solutions. Here, we used deterministic time values for the tasks. In industries where the majority of tasks are performed manually, the stochastic version of the problem could be of vital importance. Experimenting with different objective functions (the number of stations was used in this study) could be of significance to industries where the cost associated with the creation of a new station is not the same; for such industries, the results obtained using the present approach will not be of much value. Labor costs, task incompletion costs, or a combination of these can be effectively used as alternative objective functions.
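
The Nested Partitions loop itself (partition the most promising region, sample each subregion and the surrounding region, move to the best, backtrack if the surrounding region wins) can be sketched generically. The version below minimizes over binary vectors rather than full station assignments, and the sampling counts and backtrack-to-root rule are simplified assumptions.

```python
import random

def nested_partitions(f, n, samples=20, seed=0):
    """Minimise f over binary vectors of length n by Nested Partitions.
    A region is the set of vectors sharing a fixed prefix; each iteration
    partitions the current region by the next bit, samples each subregion
    and the surrounding region, and moves to the most promising one."""
    rng = random.Random(seed)

    def sample_with_prefix(prefix):
        return prefix + [rng.randint(0, 1) for _ in range(n - len(prefix))]

    def sample_surrounding(prefix):
        while True:
            x = [rng.randint(0, 1) for _ in range(n)]
            if x[:len(prefix)] != prefix:
                return x

    prefix, best_x = [], None
    while len(prefix) < n:
        candidates = []
        for bit in (0, 1):                       # the two subregions
            sub = prefix + [bit]
            xs = [sample_with_prefix(sub) for _ in range(samples)]
            candidates.append((min(xs, key=f), sub))
        if prefix:                               # surrounding region exists below the root
            xs = [sample_surrounding(prefix) for _ in range(samples)]
            candidates.append((min(xs, key=f), []))   # winning here backtracks to the root
        x, region = min(candidates, key=lambda c: f(c[0]))
        if best_x is None or f(x) < f(best_x):
            best_x = x                           # keep the incumbent across backtracks
        prefix = region                          # [] means backtrack to the root
    return best_x
```

In the U-line application the "vectors" would instead be full task-to-station assignments and f the number of stations (or total idle time), but the region/sample/backtrack skeleton is the same.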

  2. Surveillance for equity in primary health care: policy implications from international experience.

    Science.gov (United States)

    Taylor, C E

    1992-12-01

Experience around the world shows that health agencies can promote community-based surveillance for equity to focus low-cost interventions on priority needs. Social inequities which have seemed intractable can be resolved if care responds directly to demonstrated need. The concept of promoting equity as a basic principle of primary health care has an interesting psychological twist: the ethical imperative of equity can strengthen services when linked with the practical management tool of surveillance. Moral conviction in applying this social justice norm can facilitate action which is made efficient by the realism of statistically based methods of surveillance. If international agencies condition their aid on surveillance for equity, their assistance is more likely to go to those in greatest need. This is a more efficient and effective way of tracking their money than the previous tendency to set up vertical programmes, which generally have poor sustainability. Surveillance helps mobilize political will and community participation by providing practical data for local, district and national decision-makers. The many field demonstrations of successful surveillance for equity tend to have been brushed off by development experts who say they are difficult to replicate nationally. The Model County Project in China shows how a systematic extension process can test procedures in experimental areas and adapt them for general implementation. Surveillance can help bureaucracies maintain the capacity for flexible and prompt response as decentralization promotes decision-making by local units which are held responsible for meeting equity targets. Surveillance for equity provides a mechanism to ensure such accountability.

  3. Privacy Implications of Surveillance Systems

    DEFF Research Database (Denmark)

    Thommesen, Jacob; Andersen, Henning Boje

    2009-01-01

This paper presents a model for assessing the privacy "cost" of a surveillance system. Surveillance systems collect and provide personal information or observations of people by means of surveillance technologies such as databases, video or location tracking. Such systems can be designed for various purposes, even as a service for those being observed, but in any case they will to some degree invade their privacy. The model provided here can indicate how invasive any particular system may be, and can be used to compare the invasiveness of different systems. Applying a functional approach, the model is established by first considering the social function of privacy in everyday life, which in turn lets us determine which domains will be considered as private, and finally identify the different types of privacy invasion. This underlying model (function, domain, invasion) then serves...

  4. Design and fabrication of facial prostheses for cancer patient applying computer aided method and manufacturing (CADCAM)

    Science.gov (United States)

    Din, Tengku Noor Daimah Tengku; Jamayet, Nafij; Rajion, Zainul Ahmad; Luddin, Norhayati; Abdullah, Johari Yap; Abdullah, Abdul Manaf; Yahya, Suzana

    2016-12-01

Facial defects are either congenital or caused by trauma or cancer, and most of them affect the person's appearance. Emotional pressure and low self-esteem are problems commonly related to patients with facial defects. To overcome these problems, a silicone prosthesis is designed to cover the defective part. This study describes the techniques for designing and fabricating a facial prosthesis applying the computer aided method and manufacturing (CADCAM). The steps of fabricating the facial prosthesis were based on a patient case. The patient was diagnosed with Gorlin-Goltz syndrome and came to Hospital Universiti Sains Malaysia (HUSM) for a prosthesis. The 3D image of the patient was reconstructed from CT data using the MIMICS software. Based on the 3D image, the intercanthal and zygomatic measurements of the patient were compared with available data in the database to find a suitable nose shape. A normal nose shape for the patient was retrieved from the nasal digital library. A mirror-imaging technique was used to mirror the facial part. The final design of the facial prosthesis, including eye, nose and cheek, was superimposed to verify the result virtually. After the final design was confirmed, the mould was designed. The mould of the nasal prosthesis was printed using an Objet 3D printer. Silicone casting was done using the 3D-printed mould. The final prosthesis produced by this computer aided method was acceptable for facial rehabilitation, providing a better quality of life.

  5. Goal oriented soil mapping: applying modern methods supported by local knowledge: A review

    Science.gov (United States)

    Pereira, Paulo; Brevik, Eric; Oliva, Marc; Estebaranz, Ferran; Depellegrin, Daniel; Novara, Agata; Cerda, Artemi; Menshov, Oleksandr

    2017-04-01

In recent years the amount of available soil data has increased considerably. This has facilitated the production of better and more accurate maps, important for sustainable land management (Pereira et al., 2017). Despite these advances, human knowledge remains extremely important for understanding the natural characteristics of the landscape. The knowledge accumulated and transmitted generation after generation is priceless and should be considered a valuable data source for soil mapping and modelling. Local knowledge and wisdom can complement the new advances in soil analysis. In addition, farmers are the most interested in the participation and incorporation of their knowledge in the models, since they are the end users of the studies that soil scientists produce. Integration of local communities' vision and understanding of nature is assumed to be an important step in the implementation of decision makers' policies. Despite this, many challenges remain regarding the integration of local and scientific knowledge, since in some cases there is no spatial correlation between folk and scientific classifications, which may be attributed to the different cultural variables that influence local soil classification. The objective of this work is to review how modern soil methods have incorporated local knowledge in their models. References Pereira, P., Brevik, E., Oliva, M., Estebaranz, F., Depellegrin, D., Novara, A., Cerda, A., Menshov, O. (2017) Goal oriented soil mapping: applying modern methods supported by local knowledge. In: Pereira, P., Brevik, E., Munoz-Rojas, M., Miller, B. (Eds.) Soil mapping and process modelling for sustainable land use management. Elsevier. ISBN: 9780128052006

  6. [An experimental assessment of methods for applying intestinal sutures in intestinal obstruction].

    Science.gov (United States)

    Akhmadudinov, M G

    1992-04-01

The results of various methods of applying intestinal sutures in obturation were studied. Three series of experiments were conducted on 30 dogs: resection of the intestine after obstruction with the formation of anastomoses by means of a double-row suture (Albert-Schmieden-Lambert) in the first series (10 dogs), by a single-row suture after V. M. Mateshuk in the second series, and by a single-row stretching suture suggested by the author in the third series. The postoperative complications and the parameters of physical airtightness of the intestinal anastomosis were studied dynamically in the experimental animals. Incompetence of the anastomosis sutures occurred in 6 animals in the first series, 4 in the second, and one in the third. Adhesions occurred in all animals of the first and second series and in 2 of the third series. Six dogs of the first series died, 4 of the second, and one of the third. Study of the dynamics of the results showed a direct connection between the complications and the parameters of the physical airtightness of the anastomosis, and between the latter and the method of intestinal suture. Comparatively better results were noted when the anastomosis was formed by means of the suggested continuous stretching suture passed through the serous, muscular, and submucous coats of the intestine.

  7. Medical surveillance of occupationally exposed workers

    Energy Technology Data Exchange (ETDEWEB)

    2007-05-15

    The guide covers medical surveillance of workers engaged in radiation work and their fitness for this work, protection of the foetus and infant during the worker's pregnancy or breastfeeding, and medical surveillance measures to be taken when the dose limit has been exceeded. The guide also covers recognition of practitioners responsible for medical surveillance of category A workers, medical certificates to be issued to workers, and preservation and transfer of medical records. The medical surveillance requirements specified in this Guide cover the use of radiation and nuclear energy. The guide also applies to exposure to natural radiation in accordance with section 28 of the Finnish Radiation Decree

  8. Medical surveillance of occupationally exposed workers

    International Nuclear Information System (INIS)

    2007-05-01

    The guide covers medical surveillance of workers engaged in radiation work and their fitness for this work, protection of the foetus and infant during the worker's pregnancy or breastfeeding, and medical surveillance measures to be taken when the dose limit has been exceeded. The guide also covers recognition of practitioners responsible for medical surveillance of category A workers, medical certificates to be issued to workers, and preservation and transfer of medical records. The medical surveillance requirements specified in this Guide cover the use of radiation and nuclear energy. The guide also applies to exposure to natural radiation in accordance with section 28 of the Finnish Radiation Decree

  9. A new sub-equation method applied to obtain exact travelling wave solutions of some complex nonlinear equations

    International Nuclear Information System (INIS)

    Zhang Huiqun

    2009-01-01

    By using new coupled Riccati equations, a direct algebraic method, which had been applied to obtain exact travelling wave solutions of some complex nonlinear equations, is improved. The exact travelling wave solutions of the complex KdV equation, the Boussinesq equation and the Klein-Gordon equation are then investigated using the improved method. The method presented in this paper can also be applied to construct exact travelling wave solutions of other complex nonlinear equations.

  10. Hazard surveillance for workplace magnetic fields. 1: Walkaround sampling method for measuring ambient field magnitude; 2: Field characteristics from waveform measurements

    Energy Technology Data Exchange (ETDEWEB)

    Methner, M.M.; Bowman, J.D.

    1998-03-01

    Recent epidemiologic research has suggested that exposure to extremely low frequency (ELF) magnetic fields (MF) may be associated with leukemia, brain cancer, spontaneous abortions, and Alzheimer's disease. A walkaround sampling method for measuring ambient ELF-MF levels was developed for use in conducting occupational hazard surveillance. This survey was designed to determine the range of MF levels at different industrial facilities so they could be categorized by MF levels and identified for possible subsequent personal exposure assessments. Industries were selected based on their annual electric power consumption in accordance with the hypothesis that large power consumers would have higher ambient MFs when compared with lower power consumers. Sixty-two facilities within thirteen 2-digit Standard Industrial Classifications (SIC) were selected based on their willingness to participate. A traditional industrial hygiene walkaround survey was conducted to identify MF sources, with a special emphasis on work stations.

  11. Comparison of virological methods applied on african swine fever diagnosis in Brazil, 1978

    Directory of Open Access Journals (Sweden)

    Tânia Rosária Pereira Freitas

    2015-10-01

    Full Text Available ABSTRACT. Freitas T.R.P., Souza A.C., Esteves E.G. & Lyra T.M.P. [Comparison of virological methods applied on African swine fever diagnosis in Brazil, 1978.] Comparação dos métodos virológicos aplicados no diagnóstico da peste suína africana no Brasil, 1978. Revista Brasileira de Medicina Veterinária, 37(3):255-263, 2015. Laboratório Nacional Agropecuário, Ministério da Agricultura, Pecuária e Abastecimento, Avenida Rômulo Joviano, s/n, Caixa postal 35/50, Pedro Leopoldo, MG 33600-000, Brasil. taniafrei@hotmail.com The techniques of leucocyte haemadsorption (HAD) for African swine fever (ASF) virus isolation and fluorescent antigen tissue samples (FATS) for virus antigen detection were introduced in the ASF eradication campaign in the country. The complementarity of the two techniques was studied by considering the results obtained when HAD and FATS were applied concomitantly to the same pig tissue samples. The results for 22, 56 and 30 pig samples from the States of Rio de Janeiro (RJ), São Paulo (SP) and Paraná (PR), respectively, showed that 11 samples (50%) in RJ, 28 (50%) in SP and 15 (50%) in PR were positive by HAD, while 18 (82%) in RJ, 33 (58%) in SP and 17 (57%) in PR were positive by FATS. Of the 108 samples submitted to both tests, 83 (76.85%) were positive in at least one test, which characterized ASF positivity. Among the positive samples, 28 (34%) gave negative HAD results and 15 (18%) gave negative FATS results. The benefit of applying both tests simultaneously was the reduction of false-negative results, giving a more accurate laboratory diagnosis of ASF, besides demonstrating the complementarity of the tests. This aspect is of fundamental importance for a disease eradication program, which must avoid false-negative results. Evidence of low-virulence ASFV strains in Brazilian ASF outbreaks, as well as the distribution of ASF outbreaks across the mesoregions of each State, is also discussed.

  12. A mixed methods evaluation of team-based learning for applied pathophysiology in undergraduate nursing education.

    Science.gov (United States)

    Branney, Jonathan; Priego-Hernández, Jacqueline

    2018-02-01

    It is important for nurses to have a thorough understanding of the biosciences such as pathophysiology that underpin nursing care. These courses include content that can be difficult to learn. Team-based learning is emerging as a strategy for enhancing learning in nurse education due to the promotion of individual learning as well as learning in teams. In this study we sought to evaluate the use of team-based learning in the teaching of applied pathophysiology to undergraduate student nurses. A mixed methods observational study. In a year two, undergraduate nursing applied pathophysiology module circulatory shock was taught using Team-based Learning while all remaining topics were taught using traditional lectures. After the Team-based Learning intervention the students were invited to complete the Team-based Learning Student Assessment Instrument, which measures accountability, preference and satisfaction with Team-based Learning. Students were also invited to focus group discussions to gain a more thorough understanding of their experience with Team-based Learning. Exam scores for answers to questions based on Team-based Learning-taught material were compared with those from lecture-taught material. Of the 197 students enrolled on the module, 167 (85% response rate) returned the instrument, the results from which indicated a favourable experience with Team-based Learning. Most students reported higher accountability (93%) and satisfaction (92%) with Team-based Learning. Lectures that promoted active learning were viewed as an important feature of the university experience which may explain the 76% exhibiting a preference for Team-based Learning. Most students wanted to make a meaningful contribution so as not to let down their team and they saw a clear relevance between the Team-based Learning activities and their own experiences of teamwork in clinical practice. Exam scores on the question related to Team-based Learning-taught material were comparable to those

  13. Method developments approaches in supercritical fluid chromatography applied to the analysis of cosmetics.

    Science.gov (United States)

    Lesellier, E; Mith, D; Dubrulle, I

    2015-12-04

    necessary, two-step gradient elution. The developed methods were then applied to real cosmetic samples to assess the method specificity with regard to matrix interferences, and calibration curves were plotted to evaluate quantification. Besides, depending on the matrix and on the studied compounds, the importance of the detector type, UV or ELSD (evaporative light-scattering detection), and of the particle size of the stationary phase is discussed. Copyright © 2015 Elsevier B.V. All rights reserved.

  14. Surveillance and threat detection prevention versus mitigation

    CERN Document Server

    Kirchner, Richard

    2014-01-01

    Surveillance and Threat Detection offers readers a complete understanding of the terrorist/criminal cycle, and how to interrupt that cycle to prevent an attack. Terrorists and criminals often rely on pre-attack and pre-operational planning and surveillance activities that can last a period of weeks, months, or even years. Identifying and disrupting this surveillance is key to prevention of attacks. The systematic capture of suspicious events and the correlation of those events can reveal terrorist or criminal surveillance, allowing security professionals to employ appropriate countermeasures and identify the steps needed to apprehend the perpetrators. The results will dramatically increase the probability of prevention while streamlining protection assets and costs. Readers of Surveillance and Threat Detection will draw from real-world case studies that apply to their real-world security responsibilities. Ultimately, readers will come away with an understanding of how surveillance detection at a high-value, f...

  15. Who is Surveilling Whom?

    DEFF Research Database (Denmark)

    Mortensen, Mette

    2014-01-01

    This article concerns the particular form of counter-surveillance termed “sousveillance”, which aims to turn surveillance at the institutions responsible for surveillance. Drawing on the theoretical perspectives “mediatization” and “aerial surveillance,” the article studies WikiLeaks’ publication...

  16. Diapo, applying advanced AI methods to diagnosis of PWR reactor coolant pump

    International Nuclear Information System (INIS)

    Porcheron, M.; Ricard, B.

    1993-01-01

    Electricite de France has decided to increase the capabilities of its monitoring and diagnostic architecture with the development of an AI system for reactor coolant pump diagnostic support. This development is carried out in cooperation with the equipment constructor Jeumont Schneider Industries. The diagnostic system will eventually be included in an integrated surveillance architecture. We present the architecture of the system and the basics of the knowledge model used. The main data for diagnosis are the sensor readings provided by the pump monitoring system. Diagnostic reasoning is based on the cooperation of two main activities: a heuristic search among typical symptomatic situations, which leads to the formulation of hypotheses, and a "deep" causal analysis that consists of backtracking from identified situations to initial faults or causes. This approach fits well with field-expert reasoning and provides powerful diagnostic capabilities that help to overcome the conventional limitations of expert systems based entirely on heuristic knowledge. (authors). 9 figs., 11 refs

  17. How to reach the poor? Surveillance in low-income countries, lessons from experiences in Cambodia and Madagascar.

    Science.gov (United States)

    Goutard, F L; Binot, A; Duboz, R; Rasamoelina-Andriamanivo, H; Pedrono, M; Holl, D; Peyre, M I; Cappelle, J; Chevalier, V; Figuié, M; Molia, S; Roger, F L

    2015-06-01

    Surveillance of animal diseases in developing countries faces many constraints. Innovative tools and methods to enhance surveillance in remote and neglected areas should be defined, assessed and applied in close connection with local farmers, national stakeholders and international agencies. The authors performed a narrative synthesis of their own publications about surveillance in Madagascar and Cambodia. They analysed the data in light of their fieldwork experiences in the two countries' very challenging environments. The burden of animal and zoonotic diseases (e.g. avian influenza, African swine fever, Newcastle disease, Rift Valley fever) is huge in both countries which are among the poorest in the world. Being poor countries implies a lack of human and financial means to ensure effective surveillance of emerging and endemic diseases. Several recent projects have shown that new approaches can be proposed and tested in the field. Several advanced participatory approaches are promising and could be part of an innovative method for improving the dialogue among different actors in a surveillance system. Thus, participatory modelling, developed for natural resources management involving local stakeholders, could be applied to health management, including surveillance. Data transmission could benefit from the large mobile-phone coverage in these countries. Ecological studies and advances in the field of livestock surveillance should guide methods for enhancing wildlife monitoring and surveillance. Under the umbrella of the One Health paradigm, and in the framework of a risk-based surveillance concept, a combination of participatory methods and modern technologies could help to overcome the constraints present in low-income countries. These unconventional approaches should be merged in order to optimise surveillance of emerging and endemic diseases in challenging environments. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Non-parametric order statistics method applied to uncertainty propagation in fuel rod calculations

    International Nuclear Information System (INIS)

    Arimescu, V.E.; Heins, L.

    2001-01-01

    method, which is computationally efficient, is presented for the evaluation of the global statement. It is proved that, r, the expected fraction of fuel rods exceeding a certain limit is equal to the (1-r)-quantile of the overall distribution of all possible values from all fuel rods. In this way, the problem is reduced to that of estimating a certain quantile of the overall distribution, and the same techniques used for a single rod distribution can be applied again. A simplified test case was devised to verify and validate the methodology. The fuel code was replaced by a transfer function dependent on two input parameters. The function was chosen so that analytic results could be obtained for the distribution of the output. This offers a direct validation for the statistical procedure. Also, a sensitivity study has been performed to analyze the effect on the final outcome of the sampling procedure, simple Monte Carlo and Latin Hypercube Sampling. Also, the effect on the accuracy and bias of the statistical results due to the size of the sample was studied and the conclusion was reached that the results of the statistical methodology are typically conservative. In the end, an example of applying these statistical techniques to a PWR reload is presented together with the improvements and new insights the statistical methodology brings to fuel rod design calculations. (author)
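The core of the non-parametric approach described above, estimating a quantile of the output distribution directly from sampled code runs, can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the stand-in transfer function, sample size and quantile level are made-up values.

```python
import random

def order_statistic_quantile(samples, q):
    """Non-parametric (order-statistic) estimate of the q-quantile:
    simply the appropriate member of the sorted sample."""
    s = sorted(samples)
    k = min(len(s) - 1, int(q * len(s)))  # 0-based rank of the order statistic
    return s[k]

random.seed(42)
# Stand-in "transfer function" of two uncertain inputs, echoing the paper's
# simplified test case in which the fuel code is replaced by a known function.
outputs = [2.0 * random.gauss(1.0, 0.1) + random.uniform(0.0, 0.5)
           for _ in range(1000)]

# Estimate the 0.95-quantile of the output distribution from the sample alone,
# with no assumption about its distributional form.
q95 = order_statistic_quantile(outputs, 0.95)
```

Because the estimate makes no distributional assumption, the same routine applies whether the samples come from simple Monte Carlo or from Latin Hypercube Sampling, the two schemes compared in the abstract.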

  19. Analytical Methods INAA and PIXE Applied to Characterization of Airborne Particulate Matter in Bandung, Indonesia

    Directory of Open Access Journals (Sweden)

    D.D. Lestiani

    2011-08-01

    Full Text Available Urbanization and industrial growth have deteriorated air quality and are a major cause of air pollution. Air pollution through fine and ultra-fine particles is a serious threat to human health. The sources of air pollution must be known quantitatively, through elemental characterization, in order to design appropriate air quality management. Suitable methods for analysing airborne particulate matter, such as nuclear analytical techniques, are much needed to solve the air pollution problem. The objectives of this study are to apply nuclear analytical techniques to airborne particulate samples collected in Bandung, to assess their accuracy, and to ensure the reliability of the analytical results through a comparison of instrumental neutron activation analysis (INAA) and particle-induced X-ray emission (PIXE). Particle samples in the PM2.5 and PM2.5-10 ranges were collected in Bandung twice a week for 24 hours using a Gent stacked filter unit. The results showed that there was generally a systematic difference between the INAA and PIXE results, with the values obtained by PIXE lower than those determined by INAA. INAA is generally more sensitive and reliable than PIXE for Na, Al, Cl, V, Mn, Fe, Br and I, and therefore the INAA data are preferred, while PIXE usually gives better precision than INAA for Mg, K, Ca, Ti and Zn. Nevertheless, both techniques provide reliable results and complement each other. INAA is still a prospective method, while PIXE, with its special capabilities, is a promising tool that can contribute where INAA falls short in the determination of lead, sulphur and silicon. The combination of INAA and PIXE can advantageously be used in air pollution studies to extend the number of important elements measured as key elements in source apportionment.

  20. Stochastic Methods Applied to Power System Operations with Renewable Energy: A Review

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z. [Argonne National Lab. (ANL), Argonne, IL (United States); Liu, C. [Argonne National Lab. (ANL), Argonne, IL (United States); Electric Reliability Council of Texas (ERCOT), Austin, TX (United States); Botterud, A. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-08-01

    Renewable energy resources have been rapidly integrated into power systems in many parts of the world, contributing to a cleaner and more sustainable supply of electricity. Wind and solar resources also introduce new challenges for system operations and planning in terms of economics and reliability because of their variability and uncertainty. Operational strategies based on stochastic optimization have been developed recently to address these challenges. In general terms, these stochastic strategies either embed uncertainties into the scheduling formulations (e.g., the unit commitment [UC] problem) in probabilistic forms or develop more appropriate operating reserve strategies to take advantage of advanced forecasting techniques. Other approaches to address uncertainty are also proposed, where operational feasibility is ensured within an uncertainty set of forecasting intervals. In this report, a comprehensive review is conducted to present the state of the art through Spring 2015 in the area of stochastic methods applied to power system operations with high penetration of renewable energy. Chapters 1 and 2 give a brief introduction and overview of power system and electricity market operations, as well as the impact of renewable energy and how this impact is typically considered in modeling tools. Chapter 3 reviews relevant literature on operating reserves and specifically probabilistic methods to estimate the need for system reserve requirements. Chapter 4 looks at stochastic programming formulations of the UC and economic dispatch (ED) problems, highlighting benefits reported in the literature as well as recent industry developments. Chapter 5 briefly introduces alternative formulations of UC under uncertainty, such as robust, chance-constrained, and interval programming. Finally, in Chapter 6, we conclude with the main observations from our review and important directions for future work.
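As a toy illustration of the scenario-based scheduling formulations the review covers, the sketch below picks a day-ahead generation schedule that minimizes expected cost over a handful of wind scenarios. All numbers (demand, costs, scenario probabilities) are invented, and the exhaustive grid search stands in for the LP/MIP solvers used in real unit commitment and economic dispatch models.

```python
# Toy scenario-based scheduling: choose a day-ahead conventional schedule g (MW)
# to cover demand net of uncertain wind, minimizing expected cost.
demand = 100.0
wind_scenarios = [(0.3, 10.0), (0.5, 25.0), (0.2, 40.0)]  # (probability, MW)
GEN_COST = 20.0         # $/MWh for scheduled generation
BALANCING_COST = 200.0  # $/MWh for last-minute recourse energy

def expected_cost(g):
    """First-stage cost plus probability-weighted second-stage (recourse) cost."""
    cost = GEN_COST * g
    for prob, wind in wind_scenarios:
        shortfall = max(0.0, demand - wind - g)  # recourse needed in this scenario
        cost += prob * BALANCING_COST * shortfall
    return cost

# Exhaustive search over a 0.5 MW grid; a real model would solve an LP/MIP.
best_g = min((0.5 * i for i in range(201)), key=expected_cost)
```

Here the expensive recourse pushes the schedule up to 90 MW, enough to cover even the low-wind scenario; cheaper balancing energy (or an operating-reserve product) would shift the optimum down.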

  1. The Global Survey Method Applied to Ground-level Cosmic Ray Measurements

    Science.gov (United States)

    Belov, A.; Eroshenko, E.; Yanke, V.; Oleneva, V.; Abunin, A.; Abunina, M.; Papaioannou, A.; Mavromichalaki, H.

    2018-04-01

    The global survey method (GSM) technique unites simultaneous ground-level observations of cosmic rays in different locations and allows us to obtain the main characteristics of cosmic-ray variations outside of the atmosphere and magnetosphere of Earth. This technique has been developed and applied in numerous studies over many years by the Institute of Terrestrial Magnetism, Ionosphere and Radiowave Propagation (IZMIRAN). We here describe the IZMIRAN version of the GSM in detail. With this technique, the hourly data of the world-wide neutron-monitor network from July 1957 until December 2016 were processed, and further processing is enabled upon the receipt of new data. The result is a database of homogeneous and continuous hourly characteristics of the density variations (an isotropic part of the intensity) and the 3D vector of the cosmic-ray anisotropy. It includes all of the effects that could be identified in galactic cosmic-ray variations that were caused by large-scale disturbances of the interplanetary medium in more than 50 years. These results in turn became the basis for a database on Forbush effects and interplanetary disturbances. This database allows correlating various space-environment parameters (the characteristics of the Sun, the solar wind, et cetera) with cosmic-ray parameters and studying their interrelations. We also present features of the coupling coefficients for different neutron monitors that enable us to make a connection from ground-level measurements to primary cosmic-ray variations outside the atmosphere and the magnetosphere. We discuss the strengths and weaknesses of the current version of the GSM as well as further possible developments and improvements. The method developed allows us to minimize the problems of the neutron-monitor network, which are typical for experimental physics, and to considerably enhance its advantages.
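The inversion at the heart of a GSM-style analysis, fitting an isotropic density variation plus a 3D anisotropy vector to the count-rate variations of many stations through their coupling coefficients, can be sketched as a linear least-squares problem. The coupling coefficients and parameter values below are synthetic placeholders, not IZMIRAN's actual coefficients.

```python
import numpy as np

# Each station's count-rate variation is modelled as a linear combination of the
# isotropic density variation A0 and the anisotropy vector (Ax, Ay, Az), weighted
# by station-specific coupling coefficients. All numbers here are synthetic.
rng = np.random.default_rng(1)
n_stations = 12
C = np.hstack([
    rng.uniform(0.5, 1.0, (n_stations, 1)),   # isotropic coupling per station
    rng.uniform(-0.5, 0.5, (n_stations, 3)),  # anisotropy couplings per station
])

true_params = np.array([-2.5, 0.4, -0.1, 0.2])  # [A0, Ax, Ay, Az], in percent
observed = C @ true_params + rng.normal(0.0, 0.02, n_stations)  # noisy variations

# GSM-style inversion: one least-squares fit pools all stations into global
# cosmic-ray parameters for the hour.
params, *_ = np.linalg.lstsq(C, observed, rcond=None)
```

Pooling many stations is what suppresses the instrumental problems of any single neutron monitor, the advantage the abstract highlights.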

  2. Multicriterial Hierarchy Methods Applied in Consumption Demand Analysis. The Case of Romania

    Directory of Open Access Journals (Sweden)

    Constantin Bob

    2008-03-01

    Full Text Available The basic information for computing the quantitative statistical indicators that characterize the demand for industrial products and services is collected by national statistics organizations through a series of statistical surveys (most of them periodic and partial). The data source used in the present paper is a statistical investigation organized by the National Institute of Statistics, the "Family budgets survey", which collects information regarding household composition, income, expenditure, consumption and other aspects of the population's living standard. In 2005, in Romania, a person spent monthly on average 391.2 RON, meaning about 115.1 Euros, on consumed food products and beverages, as well as non-food products, services, investments and other taxes. 23% of this sum was spent on food products and beverages, 21.6% on non-food goods and 18.1% on payment for different services. There are discrepancies between the different development regions of Romania regarding the composition of total household expenditure. For this reason, in the present paper we applied statistical methods for ranking the various development regions of Romania, using the share of household expenditure on categories of products and services as ranking criteria.

  3. Applying the Communicative Methodic in Learning Lithuanian as a Second Language

    Directory of Open Access Journals (Sweden)

    Vaida Buivydienė

    2011-04-01

    Full Text Available One of the strengths of European countries is their multilingual nature, as the European Council has stressed during various international projects. Every citizen of Europe should be given the opportunity to learn languages throughout life, as languages open new perspectives in the modern world. Besides, learning languages brings tolerance and understanding between people from different cultures. Based on experience in foreign language teaching, the article presents the idea that the communicative method of language learning should also be applied to teaching Lithuanian as a foreign language. Under the international SOCRATES exchange programme, many students and teachers from abroad come to Lithuanian higher schools (VGTU included) every year. They should also be provided with opportunities to gain the best language learning, cultural and educational experience. Most of the students who came to VGTU pointed to Lithuanian language learning as one of the subjects they would choose. This calls for organizing interesting and useful short Lithuanian language courses. The survey carried out at VGTU and the analysis of the materials gathered lead to the conclusion that the communicative approach to language teaching is best suited to the needs and interests of learners aiming to master survival Lithuanian.

  4. A new method of identifying target groups for pronatalist policy applied to Australia.

    Directory of Open Access Journals (Sweden)

    Mengni Chen

    Full Text Available A country's total fertility rate (TFR) depends on many factors. Attributing changes in TFR to changes of policy is difficult, as they could easily be correlated with changes in the unmeasured drivers of TFR. A case in point is Australia where both pronatalist effort and TFR increased in lock step from 2001 to 2008 and then decreased. The global financial crisis or other unobserved confounders might explain both the reducing TFR and pronatalist incentives after 2008. Therefore, it is difficult to estimate causal effects of policy using econometric techniques. The aim of this study is to instead look at the structure of the population to identify which subgroups most influence TFR. Specifically, we build a stochastic model relating TFR to the fertility rates of various subgroups and calculate elasticity of TFR with respect to each rate. For each subgroup, the ratio of its elasticity to its group size is used to evaluate the subgroup's potential cost effectiveness as a pronatalist target. In addition, we measure the historical stability of group fertility rates, which measures propensity to change. Groups with a high effectiveness ratio and also high propensity to change are natural policy targets. We applied this new method to Australian data on fertility rates broken down by parity, age and marital status. The results show that targeting parity 3+ is more cost-effective than lower parities. This study contributes to the literature on pronatalist policies by investigating the targeting of policies, and generates important implications for formulating cost-effective policies.

  5. A new method of identifying target groups for pronatalist policy applied to Australia.

    Science.gov (United States)

    Chen, Mengni; Lloyd, Chris J; Yip, Paul S F

    2018-01-01

    A country's total fertility rate (TFR) depends on many factors. Attributing changes in TFR to changes of policy is difficult, as they could easily be correlated with changes in the unmeasured drivers of TFR. A case in point is Australia where both pronatalist effort and TFR increased in lock step from 2001 to 2008 and then decreased. The global financial crisis or other unobserved confounders might explain both the reducing TFR and pronatalist incentives after 2008. Therefore, it is difficult to estimate causal effects of policy using econometric techniques. The aim of this study is to instead look at the structure of the population to identify which subgroups most influence TFR. Specifically, we build a stochastic model relating TFR to the fertility rates of various subgroups and calculate elasticity of TFR with respect to each rate. For each subgroup, the ratio of its elasticity to its group size is used to evaluate the subgroup's potential cost effectiveness as a pronatalist target. In addition, we measure the historical stability of group fertility rates, which measures propensity to change. Groups with a high effectiveness ratio and also high propensity to change are natural policy targets. We applied this new method to Australian data on fertility rates broken down by parity, age and marital status. The results show that targeting parity 3+ is more cost-effective than lower parities. This study contributes to the literature on pronatalist policies by investigating the targeting of policies, and generates important implications for formulating cost-effective policies.
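A stylized version of the elasticity calculation described in these two records can be sketched as follows, under the simplifying assumption that TFR is a share-weighted sum of subgroup fertility rates (the authors' model is stochastic and richer). All group names, shares and rates below are illustrative placeholders, not Australian data.

```python
# Stylized model: TFR as a share-weighted sum of subgroup fertility rates.
groups = {
    # name: (population share, subgroup fertility rate)
    "married, parity 0-1": (0.30, 1.9),
    "married, parity 2":   (0.25, 2.1),
    "married, parity 3+":  (0.15, 2.8),
    "unmarried":           (0.30, 0.9),
}

tfr = sum(share * rate for share, rate in groups.values())

def elasticity(share, rate):
    # d(TFR)/d(rate) * rate / TFR for the linear model above
    return share * rate / tfr

# Rank groups by elasticity per unit of group size: the "effectiveness ratio"
# the paper uses to flag cost-effective pronatalist targets.
ranking = sorted(
    ((name, elasticity(share, rate) / share)
     for name, (share, rate) in groups.items()),
    key=lambda item: item[1],
    reverse=True,
)
```

In this linear toy model the effectiveness ratio reduces to the group's rate divided by TFR, so the small high-parity group ranks first, in line with the paper's finding that targeting parity 3+ is the most cost-effective option.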

  6. Analysis of Typing Methods for Epidemiological Surveillance of both Methicillin-Resistant and Methicillin-Susceptible Staphylococcus aureus Strains

    OpenAIRE

    Faria, Nuno A.; Carrico, João A.; Oliveira, Duarte C.; Ramirez, Mário; de Lencastre, Hermínia

    2007-01-01

    Sequence-based methods for typing Staphylococcus aureus, such as multilocus sequence typing (MLST) and spa typing, have increased interlaboratory reproducibility, portability, and speed in obtaining results, but pulsed-field gel electrophoresis (PFGE) remains the method of choice in many laboratories due to the extensive experience with this methodology and the large body of data accumulated using the technique. Comparisons between typing methods have been overwhelmingly based on a qualitati...

  7. A Study of the Efficiency of Spatial Indexing Methods Applied to Large Astronomical Databases

    Science.gov (United States)

    Donaldson, Tom; Berriman, G. Bruce; Good, John; Shiao, Bernie

    2018-01-01

    Spatial indexing of astronomical databases generally uses quadrature methods, which partition the sky into cells used to create an index (usually a B-tree) written as a database column. We report the results of a study to compare the performance of two common indexing methods, HTM and HEALPix, on Solaris and Windows database servers installed with a PostgreSQL database, and a Windows Server installed with MS SQL Server. The indexing was applied to the 2MASS All-Sky Catalog and to the Hubble Source Catalog. On each server, the study compared indexing performance by submitting 1 million queries at each index level with random sky positions and random cone-search radii, computed on a logarithmic scale between 1 arcsec and 1 degree, and measuring the time to complete the query and write the output. These simulated queries, intended to model realistic use patterns, were run in a uniform way on many combinations of indexing method and indexing level. The query times in all simulations are strongly I/O-bound and are linear with the number of records returned for large numbers of sources. There are, however, considerable differences between simulations, which reveal that hardware I/O throughput is a more important factor in managing the performance of a DBMS than the choice of indexing scheme. The choice of index itself is relatively unimportant: for comparable index levels, the performance is consistent within the scatter of the timings. At small index levels (large cells; e.g. level 4; cell size 3.7 deg), there is large scatter in the timings because of wide variations in the number of sources found in the cells. At larger index levels, performance improves and scatter decreases, but the improvement at level 8 (14 arcmin) and higher is masked to some extent by the timing scatter caused by the range of query sizes. At very high levels (20; 0.0004 arcsec), the granularity of the cells becomes so high that a large number of extraneous empty cells begin to degrade
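The cell-based indexing and cone-search pattern described above can be sketched in miniature. The rectangular RA/Dec cells below are a simple stand-in for HTM or HEALPix (which use hierarchical, roughly equal-area cells), and the sketch deliberately ignores RA wraparound and cell narrowing toward the poles; all catalog positions are toy values.

```python
import math
from collections import defaultdict

LEVEL = 6  # deeper levels -> smaller cells, as with HTM/HEALPix index levels

def cell_id(ra, dec, level):
    """Map a sky position (degrees) to a rectangular RA/Dec cell."""
    n = 2 ** level
    ra_cell, dec_cell = 360.0 / (4 * n), 180.0 / (2 * n)
    return (int(ra // ra_cell), int((dec + 90.0) // dec_cell))

def ang_sep(ra1, dec1, ra2, dec2):
    """Angular separation in degrees (spherical law of cosines)."""
    r1, d1, r2, d2 = map(math.radians, (ra1, dec1, ra2, dec2))
    c = (math.sin(d1) * math.sin(d2)
         + math.cos(d1) * math.cos(d2) * math.cos(r1 - r2))
    return math.degrees(math.acos(max(-1.0, min(1.0, c))))

# Build the index: cell id -> sources, analogous to a B-tree on an index column.
catalog = [(10.0, -5.0), (10.1, -5.05), (200.0, 40.0)]  # toy (ra, dec) sources
index = defaultdict(list)
for ra, dec in catalog:
    index[cell_id(ra, dec, LEVEL)].append((ra, dec))

def cone_search(ra0, dec0, radius):
    """Visit only cells near the centre, then filter by exact separation."""
    n = 2 ** LEVEL
    ra_cell, dec_cell = 360.0 / (4 * n), 180.0 / (2 * n)
    i0, j0 = cell_id(ra0, dec0, LEVEL)
    span_i, span_j = int(radius / ra_cell) + 1, int(radius / dec_cell) + 1
    return [p
            for di in range(-span_i, span_i + 1)
            for dj in range(-span_j, span_j + 1)
            for p in index.get((i0 + di, j0 + dj), ())
            if ang_sep(ra0, dec0, *p) <= radius]
```

A call such as `cone_search(10.0, -5.0, 0.2)` returns the two nearby toy sources while never touching the cell holding the distant one, which is the I/O saving the index exists to provide.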

  8. Surveillance of healthcare-associated infection in hospitalised South ...

    African Journals Online (AJOL)

    Background. In 2012, the South African (SA) National Department of Health mandated surveillance of healthcare-associated infection (HAI), but made no recommendations of appropriate surveillance methods. Methods. Prospective clinical HAI surveillance (the reference method) was conducted at Tygerberg Children's ...

  9. Modern structure of methods and techniques of marketing research, applied by the world and Ukrainian research companies

    Directory of Open Access Journals (Sweden)

    Bezkrovnaya Yulia

    2015-08-01

    Full Text Available The article presents the results of an empirical justification of the structure of methods and techniques of marketing research into consumer decisions, applied by global and Ukrainian research companies.

  10. High Quality Camera Surveillance System

    OpenAIRE

    Helaakoski, Ari

    2015-01-01

    Oulu University of Applied Sciences, Information Technology. Author: Ari Helaakoski. Title of the master’s thesis: High Quality Camera Surveillance System. Supervisor: Kari Jyrkkä. Term and year of completion: Spring 2015. Number of pages: 31. This master’s thesis was commissioned by iProtoXi Oy and it was done for one iProtoXi customer. The aim of the thesis was to make a camera surveillance system using a high-quality camera with pan and tilt capability. It should b...

  11. Performance Evaluations for Super-Resolution Mosaicing on UAS Surveillance Videos

    Directory of Open Access Journals (Sweden)

    Aldo Camargo

    2013-05-01

    Full Text Available Abstract Unmanned Aircraft Systems (UAS have been widely applied for reconnaissance and surveillance by exploiting information collected from the digital imaging payload. The super-resolution (SR mosaicing of low-resolution (LR UAS surveillance video frames has become a critical requirement for UAS video processing and is important for further effective image understanding. In this paper we develop a novel super-resolution framework, which does not require the construction of sparse matrices. The proposed method implements image operations in the spatial domain and applies an iterated back-projection to construct super-resolution mosaics from the overlapping UAS surveillance video frames. The Steepest Descent method, the Conjugate Gradient method and the Levenberg-Marquardt algorithm are used to numerically solve the nonlinear optimization problem for estimating a super-resolution mosaic. A quantitative performance comparison in terms of computation time and visual quality of the super-resolution mosaics through the three numerical techniques is presented.

  12. Analysis of flow boiling heat transfer in narrow annular gaps applying the design of experiments method

    Directory of Open Access Journals (Sweden)

    Gunar Boye

    2015-06-01

    Full Text Available The axial heat transfer coefficient during flow boiling of n-hexane was measured using infrared thermography to determine the axial wall temperature in three geometrically similar annular gaps with different widths (s = 1.5 mm, s = 1 mm, s = 0.5 mm). During the design and evaluation process, the methods of statistical experimental design were applied. The following factors/parameters were varied: the heat flux q̇ = 30–190 kW/m², the mass flux ṁ = 30–700 kg/(m²s), the vapor quality ẋ = 0.2–0.7, and the subcooled inlet temperature T_U = 20–60 K. The test sections with gap widths of s = 1.5 mm and s = 1 mm had very similar heat transfer characteristics. The heat transfer coefficient increases significantly in the range of subcooled boiling, and after reaching a maximum at the transition to saturated flow boiling, it drops almost monotonically with increasing vapor quality. With a gap width of 0.5 mm, however, the heat transfer coefficient in the range of saturated flow boiling first has a downward trend and then increases at higher vapor qualities. For each test section, two correlations between the heat transfer coefficient and the operating parameters have been created. The comparison also shows a clear trend of an increasing heat transfer coefficient with increasing heat flux for the test sections s = 1.5 mm and s = 1.0 mm, but with increasing vapor quality, this trend is reversed for the 0.5 mm test section.

  13. A new method of identifying target groups for pronatalist policy applied to Australia

    Science.gov (United States)

    Chen, Mengni; Lloyd, Chris J.

    2018-01-01

    A country’s total fertility rate (TFR) depends on many factors. Attributing changes in TFR to changes of policy is difficult, as they could easily be correlated with changes in the unmeasured drivers of TFR. A case in point is Australia where both pronatalist effort and TFR increased in lock step from 2001 to 2008 and then decreased. The global financial crisis or other unobserved confounders might explain both the reducing TFR and pronatalist incentives after 2008. Therefore, it is difficult to estimate causal effects of policy using econometric techniques. The aim of this study is to instead look at the structure of the population to identify which subgroups most influence TFR. Specifically, we build a stochastic model relating TFR to the fertility rates of various subgroups and calculate elasticity of TFR with respect to each rate. For each subgroup, the ratio of its elasticity to its group size is used to evaluate the subgroup’s potential cost effectiveness as a pronatalist target. In addition, we measure the historical stability of group fertility rates, which measures propensity to change. Groups with a high effectiveness ratio and also high propensity to change are natural policy targets. We applied this new method to Australian data on fertility rates broken down by parity, age and marital status. The results show that targeting parity 3+ is more cost-effective than lower parities. This study contributes to the literature on pronatalist policies by investigating the targeting of policies, and generates important implications for formulating cost-effective policies. PMID:29425220
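The targeting logic summarized above can be sketched numerically. This is a toy illustration, not the authors' stochastic model: it assumes the simplest linear decomposition TFR = Σ_g w_g·f_g over subgroups (share w_g, fertility rate f_g), under which the elasticity of TFR with respect to f_g is w_g·f_g/TFR and the cost-effectiveness ratio used for targeting reduces to f_g/TFR. Group names and rates are made up.

```python
def tfr_elasticities(groups):
    """groups: dict name -> (population share w_g, fertility rate f_g).
    With TFR modeled as sum_g w_g * f_g, the elasticity of TFR with
    respect to f_g is w_g * f_g / TFR, and the ratio of elasticity to
    group size (the targeting criterion) simplifies to f_g / TFR."""
    tfr = sum(w * f for w, f in groups.values())
    out = {}
    for name, (w, f) in groups.items():
        elast = w * f / tfr              # relative change in TFR per
        out[name] = {                    # relative change in f_g
            "elasticity": elast,
            "ratio": elast / w,          # cost-effectiveness per head
        }
    return out

# Hypothetical subgroups: (share of women, fertility rate contribution)
groups = {"low_parity": (0.5, 1.0), "parity_3plus": (0.3, 2.0), "other": (0.2, 0.5)}
```

Under this linear model the elasticities sum to 1, and the highest-ratio group (here the high-fertility `parity_3plus` group) is the natural policy target — consistent with the paper's finding that targeting parity 3+ is more cost-effective than lower parities.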

  14. Non-destructive scanning for applied stress by the continuous magnetic Barkhausen noise method

    Science.gov (United States)

    Franco Grijalba, Freddy A.; Padovese, L. R.

    2018-01-01

    This paper reports the use of a non-destructive continuous magnetic Barkhausen noise technique to detect applied stress on steel surfaces. The stress profile generated in a sample of 1070 steel subjected to a three-point bending test is analyzed. The influence of different parameters such as pickup coil type, scanner speed, applied magnetic field and frequency band analyzed on the effectiveness of the technique is investigated. A moving smoothing window based on a second-order statistical moment is used to analyze the time signal. The findings show that the technique can be used to detect applied stress profiles.

  15. Krylov Subspace and Multigrid Methods Applied to the Incompressible Navier-Stokes Equations

    Science.gov (United States)

    Vuik, C.; Wesseling, P.; Zeng, S.

    1996-01-01

    We consider numerical solution methods for the incompressible Navier-Stokes equations discretized by a finite volume method on staggered grids in general coordinates. We use Krylov subspace and multigrid methods as well as their combinations. Numerical experiments are carried out on a scalar and a vector computer. Robustness and efficiency of these methods are studied. It appears that good methods result from suitable combinations of GCR and multigrid methods.

  16. GSFC Supplier Surveillance

    Science.gov (United States)

    Kelly, Michael P.

    2011-01-01

    Topics covered include: Develop Program/Project Quality Assurance Surveillance Plans. The work activities performed by the developer and/or his suppliers are subject to evaluation and audit by government-designated representatives. CSO supports the project by selecting on-site supplier representatives by one of several methods: (1) a Defense Contract Management Agency (DCMA) person via a Letter Of Delegation (LOD), (2) an independent assurance contractor (IAC) via a contract Audits, Assessments, and Assurance (A3) Contract Code 300 Mission Assurance Support Contract (MASC)

  17. The prevalence and impact of overuse injuries in five Norwegian sports: Application of a new surveillance method

    NARCIS (Netherlands)

    Clarsen, B.; Bahr, R.; Heymans, M.W.; Engedahl, M.; Midtsundstad, G.; Rosenlund, L.; Thorsen, G.; Myklebust, G.

    2015-01-01

    Little is known about the true extent and severity of overuse injuries in sport, largely because of methodological challenges involved in recording them. This study assessed the prevalence of overuse injuries among Norwegian athletes from five sports using a newly developed method designed

  18. Research on applying neutron transport Monte Carlo method in materials with continuously varying cross sections

    International Nuclear Information System (INIS)

    Li, Zeguang; Wang, Kan; Zhang, Xisi

    2011-01-01

    In the traditional Monte Carlo method, the material properties in a certain cell are assumed to be constant, but this is no longer applicable in continuously varying materials, where the material's nuclear cross-sections vary over the particle's flight path. Three Monte Carlo methods — the sub-stepping method, the delta-tracking method and the direct sampling method — are discussed in this paper to solve problems with continuously varying materials. After verification and comparison of these methods on 1-D models, their basic characteristics are discussed and the delta-tracking method is chosen as the main method for problems with continuously varying materials, especially 3-D problems. To overcome the drawbacks of the original delta-tracking method, an improved delta-tracking method is proposed in this paper to make the method more efficient for problems where the material's cross-sections vary sharply over the particle's flight path. To use this method in practical calculations, we implemented the improved delta-tracking method in the 3-D Monte Carlo code RMC developed by the Department of Engineering Physics, Tsinghua University. Two problems based on the Godiva system were constructed and calculated using both the improved delta-tracking method and the sub-stepping method, and the results demonstrated the effectiveness of the improved delta-tracking method. (author)
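The core of the delta-tracking (Woodcock) scheme discussed above fits in a few lines. This is a generic sketch, not the RMC implementation or the paper's improved variant: it assumes a 1-D medium with a user-supplied cross-section function `sigma(x)` and a majorant `sigma_maj >= sigma(x)` everywhere; both names are placeholders. Tentative flight distances are sampled with the constant majorant, and a collision is accepted as real with probability sigma(x)/sigma_maj, otherwise treated as virtual and the flight continues.

```python
import math
import random

def delta_track(sigma, sigma_maj, x0=0.0, rng=random):
    """Sample the position of the next real collision in a medium with
    continuously varying total cross-section sigma(x), using
    delta-tracking with a constant majorant sigma_maj >= max sigma(x)."""
    x = x0
    while True:
        # Tentative flight distance sampled from the majorant: Exp(sigma_maj).
        x += -math.log(rng.random()) / sigma_maj
        # Accept as a real collision with probability sigma(x) / sigma_maj;
        # otherwise it is a virtual collision and the particle flies on.
        if rng.random() < sigma(x) / sigma_maj:
            return x
```

A quick sanity check: for a constant sigma(x) = 0.5 with sigma_maj = 1.0, the sampled distances must reproduce the analytic mean free path 1/0.5 = 2.0 — which also illustrates the method's known weakness the improved variant targets: when sigma(x) is far below the majorant, most sampled collisions are virtual and rejected.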

  19. The Density-Enthalpy Method Applied to Model Two–phase Darcy Flow

    NARCIS (Netherlands)

    Ibrahim, D.

    2012-01-01

    In this thesis, we use a more recent method to numerically solve two-phase fluid flow problems. The method was developed at TNO and is presented by Arendsen et al. in [1] for spatially homogeneous systems. We will refer to this method as the density-enthalpy method (DEM) because the

  20. Non-regularized inversion method from light scattering applied to ferrofluid magnetization curves for magnetic size distribution analysis

    International Nuclear Information System (INIS)

    Rijssel, Jos van; Kuipers, Bonny W.M.; Erné, Ben H.

    2014-01-01

    A numerical inversion method known from the analysis of light scattering by colloidal dispersions is now applied to magnetization curves of ferrofluids. The distribution of magnetic particle sizes or dipole moments is determined without assuming that the distribution is unimodal or of a particular shape. The inversion method enforces positive number densities via a non-negative least squares procedure. It is tested successfully on experimental and simulated data for ferrofluid samples with known multimodal size distributions. The created computer program MINORIM is made available on the web. - Highlights: • A method from light scattering is applied to analyze ferrofluid magnetization curves. • A magnetic size distribution is obtained without prior assumption of its shape. • The method is tested successfully on ferrofluids with a known size distribution. • The practical limits of the method are explored with simulated data including noise. • This method is implemented in the program MINORIM, freely available online

  1. Exploring novel diabetes surveillance methods: a comparison of administrative, laboratory and pharmacy data case definitions using THIN.

    Science.gov (United States)

    Khokhar, Bushra; Quan, Hude; Kaplan, Gilaad G; Butalia, Sonia; Rabi, Doreen

    2017-07-28

    The objective of this study was to identify patients with diabetes in a comprehensive primary care electronic medical records database using a number of different case definitions (clinical, pharmacy and laboratory definitions, and a combination thereof) and to understand the differences in the patient populations captured by each definition. Data for this population-based retrospective cohort study were obtained from The Health Improvement Network (THIN). THIN is a longitudinal, primary care medical records database of over 9 million patients in the UK. The primary outcome was a diagnosis of diabetes, defined by the presence of a diabetes Read code, an abnormal laboratory result, or a prescription for an oral anti-diabetic drug or insulin. A 2-year washout period was applied prior to the index of diabetes to avoid inclusion of prevalent cases for each case definition. This study demonstrated that different case definitions of diabetes identify different sub-populations of patients. When the cohorts were compared on measures of central tendency, they were reasonably similar to each other. However, the distributions of the cohorts, when grouped by age category and sex, reveal differences. For example, using the pharmacy case definition results in a bimodal distribution among women, with one mode between the 1-19 year and 35-39 age categories and another between 60-64 and 85 years; the histogram becomes more normally distributed when metformin is removed from the case definition. Our results suggest that clinical, pharmacy and laboratory case definitions identify different sub-populations, and that using multiple case definitions is likely required to optimally identify the entire diabetes population within THIN. Our study also suggests that the age and sex of patients may affect the indexing of diabetes in THIN, and it is critical to better understand these variations.
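The case-definition-plus-washout logic described above can be sketched as set operations. This is an illustrative simplification with hypothetical field names (`read_code`, `abnormal_lab`, `prescription`), not the THIN schema or the authors' exact algorithm: each patient record carries the date of the first qualifying event per definition, and a patient enters a cohort only if that index event falls after a 2-year washout from registration, to exclude prevalent cases.

```python
from datetime import date, timedelta

def diabetes_cohorts(patients, washout_days=730):
    """patients: list of dicts with 'id', 'registered' (date) and optional
    first-event dates under the keys 'read_code', 'abnormal_lab',
    'prescription'. Returns one cohort (set of ids) per case definition,
    plus their union under 'any'."""
    defs = ("read_code", "abnormal_lab", "prescription")
    cohorts = {d: set() for d in defs}
    for p in patients:
        # Index events inside the washout window are treated as prevalent.
        cutoff = p["registered"] + timedelta(days=washout_days)
        for d in defs:
            idx = p.get(d)
            if idx is not None and idx >= cutoff:
                cohorts[d].add(p["id"])
    cohorts["any"] = set().union(*(cohorts[d] for d in defs))
    return cohorts
```

Comparing the per-definition sets (sizes, overlaps, and age/sex breakdowns of the symmetric differences) is then the natural way to reproduce the paper's observation that each definition captures a different sub-population.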

  2. Material point methods applied to one-dimensional shock waves and dual domain material point method with sub-points

    Science.gov (United States)

    Dhakal, Tilak R.; Zhang, Duan Z.

    2016-11-01

    Using a simple one-dimensional shock problem as an example, the present paper investigates numerical properties of the original material point method (MPM), the generalized interpolation material point (GIMP) method, the convected particle domain interpolation (CPDI) method, and the dual domain material point (DDMP) method. For a weak isothermal shock of ideal gas, the MPM cannot be used with accuracy. With a small number of particles per cell, GIMP and CPDI produce reasonable results. However, as the number of particles increases the methods fail to converge and produce pressure spikes. The DDMP method behaves in an opposite way. With a small number of particles per cell, DDMP results are unsatisfactory. As the number of particles increases, the DDMP results converge to correct solutions, but the large number of particles needed for convergence makes the method very expensive to use in these types of shock wave problems in two- or three-dimensional cases. The cause for producing the unsatisfactory DDMP results is identified. A simple improvement to the method is introduced by using sub-points. With this improvement, the DDMP method produces high quality numerical solutions with a very small number of particles. Although in the present paper, the numerical examples are one-dimensional, all derivations are for multidimensional problems. With the technique of approximately tracking particle domains of CPDI, the extension of this sub-point method to multidimensional problems is straightforward. This new method preserves the conservation properties of the DDMP method, which conserves mass and momentum exactly and conserves energy to the second order in both spatial and temporal discretizations.

  3. FOUR SQUARE WRITING METHOD APPLIED IN PRODUCT AND PROCESS BASED APPROACHES COMBINATION TO TEACHING WRITING DISCUSSION TEXT

    Directory of Open Access Journals (Sweden)

    Vina Agustiana

    2017-12-01

    Full Text Available The Four Square Writing Method (FSWM) is a writing method which helps students organize concepts for writing by using a graphic organizer. This study examines the influence of applying FSWM, in a combination of product- and process-based approaches to teaching the writing of discussion texts, on students’ writing skill, the teaching-learning process, and students’ attitude toward the implementation of the method. The study applies a mixed-method embedded design. 26 EFL students of a private university in West Java, Indonesia, were involved. Three kinds of instruments were used, namely tests (pre- and post-test), field notes, and questionnaires. Data from the students’ writing tests were analyzed statistically to identify the influence of the writing method on students’ writing skill; data from the field notes were analyzed qualitatively to examine the learning activities while the writing method was implemented; and data from the questionnaires were analyzed with descriptive statistics to explore students’ attitude toward the method. According to the paired t-test, the writing method is effective in improving students’ writing skill, since the significance level (two-tailed) is less than alpha (0.000 < 0.05). Furthermore, the field notes show that the steps applied and the graphic organizer used in the writing method lead students to compose discussion texts that meet the demands of the genre. In addition, the questionnaire results show a highly positive attitude toward the treatment, with a mean score of 4.32.

  4. A global method for calculating plant CSR ecological strategies applied across biomes world-wide

    NARCIS (Netherlands)

    Pierce, S.; Negreiros, D.; Cerabolini, B.E.L.; Kattge, J.; Díaz, S.; Kleyer, M.; Shipley, B.; Wright, S.J.; Soudzilovskaia, N.A.; Onipchenko, V.G.; van Bodegom, P.M.; Frenette-Dussault, C.; Weiher, E.; Pinho, B.X.; Cornelissen, J.H.C.; Grime, J.P.; Thompson, K.; Hunt, R.; Wilson, P.J.; Buffa, G.; Nyakunga, O.C.; Reich, P.B.; Caccianiga, M.; Mangili, F.; Ceriani, R.M.; Luzzaro, A.; Brusa, G.; Siefert, A.; Barbosa, N.P.U.; Chapin III, F.S.; Cornwell, W.K.; Fang, Jingyun; Wilson Fernandez, G.; Garnier, E.; Le Stradic, S.; Peñuelas, J.; Melo, F.P.L.; Slaviero, A.; Tabarrelli, M.; Tampucci, D.

    2017-01-01

    Competitor, stress-tolerator, ruderal (CSR) theory is a prominent plant functional strategy scheme previously applied to local floras. Globally, the wide geographic and phylogenetic coverage of available values of leaf area (LA), leaf dry matter content (LDMC) and specific leaf area (SLA)

  5. Structure analysis of interstellar clouds - II. Applying the Delta-variance method to interstellar turbulence

    NARCIS (Netherlands)

    Ossenkopf, V.; Krips, M.; Stutzki, J.

    Context. The Delta-variance analysis is an efficient tool for measuring the structural scaling behaviour of interstellar turbulence in astronomical maps. It has been applied both to simulations of interstellar turbulence and to observed molecular cloud maps. In Paper I we proposed essential

  6. Ambient Surveillance by Probabilistic-Possibilistic Perception

    NARCIS (Netherlands)

    Bittermann, M.S.; Ciftcioglu, O.

    2013-01-01

    A method for quantifying ambient surveillance is presented, which is based on probabilistic-possibilistic perception. The human surveillance of a scene through observing camera sensed images on a monitor is modeled in three steps. First immersion of the observer is simulated by modeling perception

  7. Evaluation of Two Fitting Methods Applied for Thin-Layer Drying of Cape Gooseberry Fruits

    Directory of Open Access Journals (Sweden)

    Erkan Karacabey

    Full Text Available Drying data of cape gooseberry were used to compare two fitting methods, namely the 2-step and 1-step methods. Literature data were also used to confirm the results. To demonstrate the applicability of these methods, two primary models (Page, two-term exponential) were selected, with a linear equation as the secondary model. As is well known from previous modelling studies on drying, the 2-step method requires at least two regressions: one for the primary model and one for the secondary model (when only one environmental condition, such as temperature, is varied). On the other hand, one regression is enough for the 1-step method. Although previous studies on kinetic modelling of the drying of foods were based on the 2-step method, this study indicates that the 1-step method may also be a good alternative, with advantages such as producing an informative figure and reducing calculation time.
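The primary-model regression at the heart of the 2-step method can be sketched for the Page model. This is an illustrative simplification, not the authors' procedure: the Page model MR = exp(-k·tⁿ) is log-linearized as ln(-ln MR) = ln k + n·ln t and fitted by ordinary least squares per drying condition; the 1-step alternative would instead substitute the secondary model (e.g. k as a linear function of temperature) into the primary model and fit all conditions in a single nonlinear regression.

```python
import math

def fit_page_linearized(times, moisture_ratios):
    """Fit the Page thin-layer drying model MR = exp(-k * t**n) by
    linearizing: ln(-ln MR) = ln k + n * ln t, then ordinary least
    squares. Requires 0 < MR < 1 and t > 0. Returns (k, n)."""
    xs = [math.log(t) for t in times]
    ys = [math.log(-math.log(mr)) for mr in moisture_ratios]
    m = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    n = (m * sxy - sx * sy) / (m * sxx - sx * sx)   # slope -> exponent n
    k = math.exp((sy - n * sx) / m)                 # intercept -> ln k
    return k, n
```

In the 2-step method this regression is repeated at each temperature and the fitted k values are then regressed against temperature with the secondary (linear) model — which is exactly where the extra regression, and the extra accumulation of fitting error, comes from.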

  8. SOA-surveillance Nederland

    NARCIS (Netherlands)

    Rijlaarsdam J; Bosman A; Laar MJW van de; CIE

    2000-01-01

    In May 1999 a working group was started to evaluate the current surveillance systems for sexually transmitted diseases (STD) and to make suggestions for a renewed effective and efficient STD surveillance system in the Netherlands. The surveillance system has to provide insight into the prevalence

  9. Containment and surveillance devices

    International Nuclear Information System (INIS)

    Campbell, J.W.; Johnson, C.S.; Stieff, L.R.

    The growing acceptance of containment and surveillance as a means to increase safeguards effectiveness has provided impetus to the development of improved surveillance and containment devices. Five recently developed devices are described. The devices include one photographic and two television surveillance systems and two high security seals that can be verified while installed

  10. Effective surveillance for homeland security balancing technology and social issues

    CERN Document Server

    Flammini, Francesco; Franceschetti, Giorgio

    2013-01-01

    Effective Surveillance for Homeland Security: Balancing Technology and Social Issues provides a comprehensive survey of state-of-the-art methods and tools for the surveillance and protection of citizens and critical infrastructures against natural and deliberate threats. Focusing on current technological challenges involving multi-disciplinary problem analysis and systems engineering approaches, it provides an overview of the most relevant aspects of surveillance systems in the framework of homeland security. Addressing both advanced surveillance technologies and the related socio-ethical issues, the book consists of 21 chapters written by international experts from the various sectors of homeland security. Part I, Surveillance and Society, focuses on the societal dimension of surveillance-stressing the importance of societal acceptability as a precondition to any surveillance system. Part II, Physical and Cyber Surveillance, presents advanced technologies for surveillance. It considers developing technologie...

  11. Establishing seasonal and alert influenza thresholds in Cambodia using the WHO method: implications for effective utilization of influenza surveillance in the tropics and subtropics.

    Science.gov (United States)

    Ly, Sovann; Arashiro, Takeshi; Ieng, Vanra; Tsuyuoka, Reiko; Parry, Amy; Horwood, Paul; Heng, Seng; Hamid, Sarah; Vandemaele, Katelijn; Chin, Savuth; Sar, Borann; Arima, Yuzo

    2017-01-01

    To establish seasonal and alert thresholds and transmission intensity categories for influenza to provide timely triggers for preventive measures or upscaling control measures in Cambodia. Using Cambodia's influenza-like illness (ILI) and laboratory-confirmed influenza surveillance data from 2009 to 2015, three parameters were assessed to monitor influenza activity: the proportion of ILI patients among all outpatients, proportion of ILI samples positive for influenza and the product of the two. With these parameters, four threshold levels (seasonal, moderate, high and alert) were established and transmission intensity was categorized based on a World Health Organization alignment method. Parameters were compared against their respective thresholds. Distinct seasonality was observed using the two parameters that incorporated laboratory data. Thresholds established using the composite parameter, combining syndromic and laboratory data, had the least number of false alarms in declaring season onset and were most useful in monitoring intensity. Unlike in temperate regions, the syndromic parameter was less useful in monitoring influenza activity or for setting thresholds. Influenza thresholds based on appropriate parameters have the potential to provide timely triggers for public health measures in a tropical country where monitoring and assessing influenza activity has been challenging. Based on these findings, the Ministry of Health plans to raise general awareness regarding influenza among the medical community and the general public. Our findings have important implications for countries in the tropics/subtropics and in resource-limited settings, and categorized transmission intensity can be used to assess severity of potential pandemic influenza as well as seasonal influenza.
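The composite parameter and threshold levels described above can be sketched as follows. This is a deliberately simplified variant, not the WHO alignment method used in the study (which averages aligned historical seasonal curves): here the four cut-offs are placed at the mean and at multiples of the standard deviation of historical weekly composite values, purely to illustrate how a composite of syndromic and laboratory data is turned into transmission-intensity categories.

```python
from statistics import mean, stdev

def weekly_composite(ili_pct, positivity_pct):
    """Composite parameter: proportion of outpatients with ILI (%)
    multiplied by the proportion of ILI samples positive for influenza (%)."""
    return [a * b / 100.0 for a, b in zip(ili_pct, positivity_pct)]

def thresholds(history):
    """Derive seasonal/moderate/high/alert cut-offs from historical
    weekly composite values (simplified mean + k*SD placement)."""
    mu, sd = mean(history), stdev(history)
    return {"seasonal": mu,
            "moderate": mu + 1 * sd,
            "high": mu + 2 * sd,
            "alert": mu + 3 * sd}

def intensity(value, th):
    """Categorize a weekly composite value against the thresholds."""
    for level in ("alert", "high", "moderate", "seasonal"):
        if value >= th[level]:
            return level
    return "below seasonal"
```

Crossing the "seasonal" cut-off declares season onset; the upper categories provide the graded triggers for upscaling control measures that the abstract describes.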

  12. Acoustical monitoring of diesel engines in reverberant environment; Methodes de surveillance acoustique des diesels en milieu reverberant

    Energy Technology Data Exchange (ETDEWEB)

    Mein, M.

    1995-10-01

    Feedback from emergency diesel generators in nuclear power plants shows that some malfunctions, mainly affecting the fuel-injection or distribution system of the engine, can be heard and detected by experienced maintenance agents. This study addresses the feasibility of acoustical monitoring of these diesel engines, taking into account the reverberant environment of the machine. The operating cycle of the diesel is composed of transient events (injection, combustion, valve closure...) which generate highly non-stationary acoustical signals. The detection of a malfunction appearing on such transients requires the use of adapted signal processing techniques. Visual analysis of the phenomena is first performed using time-frequency and time-scale representations. The second step will be parametric modeling of acoustical signatures for the extraction of characteristic parameters, in order to characterize the fault and to use an automatic classification system. The last part of the study will concern the evaluation of the robustness of the detection methods with regard to acoustical reverberation. (author). 10 refs., 6 figs.

  13. Wallops Ship Surveillance System

    Science.gov (United States)

    Smith, Donna C.

    2011-01-01

    Approved as a Wallops control center backup system, the Wallops Ship Surveillance Software is a day-of-launch risk analysis tool for spaceport activities. The system calculates impact probabilities and displays ship locations relative to boundary lines. It enables rapid analysis of possible flight paths to preclude the need to cancel launches and allow execution of launches in a timely manner. Its design is based on low-cost, large-customer-base elements including personal computers, the Windows operating system, C/C++ object-oriented software, and network interfaces. In conformance with the NASA software safety standard, the system is designed to ensure that it does not falsely report a safe-for-launch condition. To improve the current ship surveillance method, the system is designed to prevent delay of launch under a safe-for-launch condition. A single workstation is designated the controller of the official ship information and the official risk analysis. Copies of this information are shared with other networked workstations. The program design is divided into five subsystems areas: 1. Communication Link -- threads that control the networking of workstations; 2. Contact List -- a thread that controls a list of protected item (ocean vessel) information; 3. Hazard List -- threads that control a list of hazardous item (debris) information and associated risk calculation information; 4. Display -- threads that control operator inputs and screen display outputs; and 5. Archive -- a thread that controls archive file read and write access. Currently, most of the hazard list thread and parts of other threads are being reused as part of a new ship surveillance system, under the SureTrak project.

  14. Methods for characterizing fine particulate matter using ground observations and remotely sensed data: potential use for environmental public health surveillance.

    Science.gov (United States)

    Al-Hamdan, Mohammad Z; Crosson, William L; Limaye, Ashutosh S; Rickman, Douglas L; Quattrochi, Dale A; Estes, Maurice G; Qualters, Judith R; Sinclair, Amber H; Tolsma, Dennis D; Adeniyi, Kafayat A; Niskar, Amanda Sue

    2009-07-01

    This study describes and demonstrates different techniques for surface fitting daily environmental hazards data for particulate matter with aerodynamic diameter less than or equal to 2.5 microns (PM2.5), for the purpose of integrating respiratory health and environmental data for the Centers for Disease Control and Prevention (CDC) pilot study of Health and Environment Linked for Information Exchange (HELIX)-Atlanta. It presents a methodology for estimating daily spatial surfaces of ground-level PM2.5 concentrations using the B-Spline and inverse distance weighting (IDW) surface-fitting techniques, leveraging National Aeronautics and Space Administration (NASA) Moderate Resolution Imaging Spectroradiometer (MODIS) data to complement U.S. Environmental Protection Agency (EPA) ground observation data. The study used measurements of ambient PM2.5 from the EPA database for the year 2003 as well as PM2.5 estimates derived from NASA's satellite data. Hazard data have been processed to derive the surrogate PM2.5 exposure estimates. This paper shows that merging MODIS remote sensing data with surface observations of PM2.5 not only provides a more complete daily representation of PM2.5 than either dataset alone would allow, but also reduces the errors in the estimated PM2.5 surfaces. The results also show that although the IDW technique can introduce some numerical artifacts, due to its interpolating nature, which assumes that maxima and minima can occur only at the observation points, the daily IDW PM2.5 surfaces generally had smaller errors with respect to observations than the B-Spline surfaces. Finally, the methods discussed in this paper establish a foundation for environmental public health linkage and association studies, for which determining the concentrations of an environmental hazard such as PM2.5 with high accuracy is critical.
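The IDW technique referenced above is simple enough to sketch directly. This is a generic textbook implementation, not the study's code: stations and coordinates are hypothetical, and the power parameter defaults to 2 (inverse-square weighting).

```python
def idw(known, x, y, power=2.0):
    """Inverse distance weighted estimate at (x, y) from station
    observations known = [(xi, yi, value)]. The estimate is exact at
    stations, and extrema can only occur at observation points --
    the interpolation artifact the study notes for IDW surfaces."""
    num = den = 0.0
    for xi, yi, v in known:
        d2 = (x - xi) ** 2 + (y - yi) ** 2
        if d2 == 0.0:
            return v                       # exactly at a station
        w = 1.0 / d2 ** (power / 2.0)      # weight = 1 / distance**power
        num += w * v
        den += w
    return num / den
```

For example, a point equidistant from two stations reporting 10 and 20 µg/m³ is estimated at exactly 15, regardless of the power: the weights are equal, so the estimate is their mean.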

  15. Geochronology and geochemistry by the nuclear tracks method: some examples of use in applied geology

    International Nuclear Information System (INIS)

    Poupeau, G.; Soliani Junior, E.

    1988-01-01

    This article discusses some applications of the 'nuclear tracks method' in geochronology, geochemistry and geophysics. In geochronology, after a brief presentation of the principles of 'fission track' dating and the kinds of geological events measurable by this method, some applications in metallogeny and in petroleum geology are shown. In geochemistry, uses of the 'fission tracks' method are related to mining prospecting and uranium prospecting. In geophysics, an important application is earthquake prediction, through continuous monitoring of Rn-222 emanations. (author) [pt

  16. The development of a curved beam element model applied to finite elements method

    International Nuclear Information System (INIS)

    Bento Filho, A.

    1980-01-01

    A procedure for the evaluation of the stiffness matrix of a thick curved beam element is developed by means of the minimum potential energy principle, applied to finite elements. The displacement field is prescribed through polynomial expansions, and the interpolation model is determined by comparison of results obtained with a sample of different expansions. As a limiting case of the curved beam, three cases of straight beams with different dimensional ratios are analysed employing the proposed approach. Finally, an interpolation model is proposed and applied to a curved beam with large curvature. Displacements and internal stresses are determined and the results are compared with those found in the literature. (Author) [pt

  17. Surveillance strategies for detecting chronic wasting disease in free-ranging deer and elk: results of a CWD surveillance workshop

    Science.gov (United States)

    Samuel, Michael D.; Joly, Damien O.; Wild, Margaret A.; Wright, Scott D.; Otis, David L.; Werge, Rob W.; Miller, Michael W.

    2003-01-01

    Chronic Wasting Disease (CWD), a fatal brain disease of North American deer and elk, has recently emerged as an important wildlife management issue. Interest and concern over the spread of this disease and its potential impact on free-ranging cervid populations have increased with the discovery of the disease in numerous states and provinces. Current studies suggest that CWD may adversely affect these highly visible, socially desirable, and economically valuable species. Despite the lack of evidence that CWD affects humans or livestock, a significant concern has been the perceived risk to humans and livestock. Uncertainty about whether CWD poses a health risk to hunters and their families who consume venison has resulted in testing of free-ranging cervids for CWD. In response to many of these concerns, wildlife management agencies across the nation have undertaken surveillance programs to detect CWD in their cervid populations. The nation-wide costs of an extensive CWD surveillance program have been estimated at several million dollars. This document provides guidance on the development and conduct of scientifically sound surveillance programs to detect CWD in free-ranging deer and elk populations. These guidelines will not apply equally to all jurisdictions. In many cases local circumstances, resources, area(s) of concern, disease risk, animal and landscape ecology, political, social, and many other factors will influence the objectives, design, and conduct of CWD surveillance programs. Part I of this report discusses the importance of management goals, strategies, and disease risks in developing a surveillance program. Part II describes surveillance methods, steps in designing a sampling strategy to detect CWD, alternative collection methods, and statistical considerations. Part III describes costs (personnel, time, and money) associated with implementation of these plans that will influence program design. Part IV outlines research that is needed to further

  18. Applying formal method to design of nuclear power plant embedded protection system

    International Nuclear Information System (INIS)

    Kim, Jin Hyun; Kim, Il Gon; Sung, Chang Hoon; Choi, Jin Young; Lee, Na Young

    2001-01-01

    Nuclear power embedded protection systems are typical safety-critical systems, which detect failures and shut down the operation of the nuclear reactor. Because failures of these systems are so dangerous, safety and reliability are absolutely required. A nuclear power embedded protection system should therefore undergo complete verification and validation from the design stage onward. Various V and V methods have been provided for developing embedded systems, and design using formal methods in particular is being studied in other advanced countries. In this paper, we introduce design methods for nuclear power embedded protection systems using various formal methods, in various respects, following the nuclear power plant software development guideline

  19. Concepts for risk-based surveillance in the field of veterinary medicine and veterinary public health: Review of current approaches

    Directory of Open Access Journals (Sweden)

    Knopf Lea

    2006-02-01

    Full Text Available Abstract Background Emerging animal and zoonotic diseases and increasing international trade have resulted in an increased demand for veterinary surveillance systems. However, human and financial resources available to support government veterinary services are becoming more and more limited in many countries world-wide. Intuitively, issues that present higher risks merit higher priority for surveillance resources as investments will yield higher benefit-cost ratios. The rapid rate of acceptance of this core concept of risk-based surveillance has outpaced the development of its theoretical and practical bases. Discussion The principal objectives of risk-based veterinary surveillance are to identify surveillance needs to protect the health of livestock and consumers, to set priorities, and to allocate resources effectively and efficiently. An important goal is to achieve a higher benefit-cost ratio with existing or reduced resources. We propose to define risk-based surveillance systems as those that apply risk assessment methods in different steps of traditional surveillance design for early detection and management of diseases or hazards. In risk-based designs, the public health, economic and trade consequences of diseases play an important role in the selection of diseases or hazards. Furthermore, certain strata of the population of interest have a higher probability of being sampled for detection of diseases or hazards. Evaluation of risk-based surveillance systems should demonstrate that the efficacy of risk-based systems is equal to or higher than that of traditional systems; however, the efficiency (benefit-cost ratio) should be higher in risk-based surveillance systems. Summary Risk-based surveillance considerations are useful to support both strategic and operational decision making. This article highlights applications of risk-based surveillance systems in the veterinary field including food safety. Examples are provided for risk-based hazard selection, risk
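    The core risk-based idea described in this record, giving higher-risk strata a higher probability of being sampled, can be illustrated with a small allocation sketch. The stratum names, population sizes and relative-risk weights below are hypothetical, not from the article.

```python
# strata: stratum name -> (population size, relative risk); numbers invented.
def allocate_samples(strata, total_samples):
    """Allocate a fixed surveillance sampling budget in proportion to
    population size x relative risk, so animals in higher-risk strata
    have a higher probability of being sampled."""
    weights = {s: n * rr for s, (n, rr) in strata.items()}
    total_w = sum(weights.values())
    return {s: round(total_samples * w / total_w) for s, w in weights.items()}

strata = {
    "border_region": (5_000, 4.0),   # contact with imports: elevated risk
    "mixed_farms":   (20_000, 2.0),
    "closed_herds":  (75_000, 1.0),
}
allocation = allocate_samples(strata, 1000)
print(allocation)   # {'border_region': 148, 'mixed_farms': 296, 'closed_herds': 556}
```

    Under a uniform (traditional) design the small border-region stratum would receive only 50 of the 1000 samples; the risk weighting nearly triples its per-animal sampling probability.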

  20. A robust moving mesh finite volume method applied to 1D hyperbolic conservation laws from magnetohydrodynamics

    NARCIS (Netherlands)

    Dam, A. van; Zegeling, P.A.

    2006-01-01

    In this paper we describe a one-dimensional adaptive moving mesh method and its application to hyperbolic conservation laws from magnetohydrodynamics (MHD). The method is robust, because it employs automatic control of mesh adaptation when a new model is considered, without manually-set

  1. Statistical methods applied to gamma-ray spectroscopy algorithms in nuclear security missions.

    Science.gov (United States)

    Fagan, Deborah K; Robinson, Sean M; Runkle, Robert C

    2012-10-01

    Gamma-ray spectroscopy is a critical research and development priority for a range of nuclear security missions, specifically the interdiction of special nuclear material involving the detection and identification of gamma-ray sources. We categorize existing methods by the statistical methods on which they rely and identify methods that have yet to be considered. Current methods estimate the effect of counting uncertainty but in many cases do not address larger sources of decision uncertainty, which may be significantly more complex. Thus, significantly improving algorithm performance may require greater coupling between the problem physics that drives data acquisition and the statistical methods that analyze such data. Untapped statistical methods, such as Bayesian model averaging and hierarchical and empirical Bayes methods, could reduce decision uncertainty by rigorously and comprehensively incorporating all sources of uncertainty. Application of such methods should further meet the needs of nuclear security missions by improving upon the existing numerical infrastructure for which these analyses have not been conducted. Copyright © 2012 Elsevier Ltd. All rights reserved.
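    As an illustration of one of the "untapped" methods named in this record, a minimal Bayesian model averaging sketch follows, using the common BIC approximation to posterior model probabilities (weights proportional to exp(-BIC/2)). The BIC scores and per-model predictions are invented for the example.

```python
import numpy as np

def bma_predict(bics, predictions):
    """Approximate Bayesian model averaging: posterior model weights
    proportional to exp(-BIC/2), then a weight-pooled prediction."""
    bics = np.asarray(bics, dtype=float)
    w = np.exp(-0.5 * (bics - bics.min()))   # shift by min for numerical stability
    w /= w.sum()
    return w, float(w @ np.asarray(predictions, dtype=float))

# Three candidate models' BIC scores and point predictions (invented numbers).
weights, pooled = bma_predict([102.3, 100.1, 107.8], [5.0, 6.0, 4.5])
# the model with the lowest BIC dominates, but no model is discarded outright
```

    The pooled prediction carries the uncertainty over model choice itself, rather than conditioning on a single "best" model, which is the sense in which such methods incorporate all sources of uncertainty.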

  2. Adjoint Weighting Methods Applied to Monte Carlo Simulations of Applications and Experiments in Nuclear Criticality

    Energy Technology Data Exchange (ETDEWEB)

    Kiedrowski, Brian C. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2014-03-11

    The goals of this project are to develop Monte Carlo radiation transport methods and simulation software for engineering analysis that are robust, efficient and easy to use; and provide computational resources to assess and improve the predictive capability of radiation transport methods and nuclear data.

  3. T2-01 A Method for Prioritizing Chemical Hazards in Food applied to Antibiotics

    NARCIS (Netherlands)

    Asselt, van E.D.; Spiegel, van der M.; Noordam, M.Y.; Pikkemaat, M.G.; Fels, van der H.J.

    2014-01-01

    Introduction: Part of risk based control is the prioritization of hazard-food combinations for monitoring food safety. There are currently many methods for ranking microbial hazards ranging from quantitative to qualitative methods, but there is hardly any information available for prioritizing

  4. A New Machine Classification Method Applied to Human Peripheral Blood Leukocytes.

    Science.gov (United States)

    Rorvig, Mark E.; And Others

    1993-01-01

    Discusses pattern classification of images by computer and describes the Two Domain Method in which expert knowledge is acquired using multidimensional scaling of judgments of dissimilarities and linear mapping. An application of the Two Domain Method that tested its power to discriminate two patterns of human blood leukocyte distribution is…

  5. Studying the properties of Variational Data Assimilation Methods by Applying a Set of Test-Examples

    DEFF Research Database (Denmark)

    Thomsen, Per Grove; Zlatev, Zahari

    2007-01-01

    data assimilation methods are used. The main idea, on which the variational data assimilation methods are based, is pretty general. A functional is formed by using a weighted inner product of differences of model results and measurements. The value of this functional is to be minimized. Forward...

  6. The quasi-exactly solvable potentials method applied to the three-body problem

    International Nuclear Information System (INIS)

    Chafa, F.; Chouchaoui, A.; Hachemane, M.; Ighezou, F.Z.

    2007-01-01

    The quasi-exactly solvable potentials method is used to determine the energies and the corresponding exact eigenfunctions for three families of potentials playing an important role in the description of interactions occurring between three particles of equal mass. The obtained results may also be used as a test in evaluating the performance of numerical methods

  7. A linear perturbation computation method applied to hydrodynamic instability growth predictions in ICF targets

    International Nuclear Information System (INIS)

    Clarisse, J.M.; Boudesocque-Dubois, C.; Leidinger, J.P.; Willien, J.L.

    2006-01-01

    A linear perturbation computation method is used to compute hydrodynamic instability growth in model implosions of inertial confinement fusion direct-drive and indirect-drive designed targets. Accurate descriptions of linear perturbation evolutions for Legendre mode numbers up to several hundreds have thus been obtained in a systematic way, motivating further improvements of the physical modeling currently handled by the method. (authors)

  8. Heterogeneity among violence-exposed women: applying person-oriented research methods.

    Science.gov (United States)

    Nurius, Paula S; Macy, Rebecca J

    2008-03-01

    Variability of experience and outcomes among violence-exposed people pose considerable challenges toward developing effective prevention and treatment protocols. To address these needs, the authors present an approach to research and a class of methodologies referred to as person oriented. Person-oriented tools support assessment of meaningful patterns among people that distinguish one group from another, subgroups for whom different interventions are indicated. The authors review the conceptual base of person-oriented methods, outline their distinction from more familiar variable-oriented methods, present descriptions of selected methods as well as empirical applications of person-oriented methods germane to violence exposure, and conclude with discussion of implications for future research and translation between research and practice. The authors focus on violence against women as a population, drawing on stress and coping theory as a theoretical framework. However, person-oriented methods hold utility for investigating diversity among violence-exposed people's experiences and needs across populations and theoretical foundations.
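    Person-oriented methods typically identify subgroups through pattern or cluster analysis of whole profiles, in contrast to variable-oriented analysis of single measures. The following is a generic illustration only, with synthetic two-dimensional "coping profile" scores and a small k-means implementation; it is not the authors' procedure or data.

```python
import numpy as np

# Synthetic profiles: two invented subgroups of 30 people, each measured on
# two coping-related scores (purely illustrative, not study data).
rng = np.random.default_rng(1)
group_a = rng.normal(loc=[1.0, 4.0], scale=0.3, size=(30, 2))
group_b = rng.normal(loc=[4.0, 1.0], scale=0.3, size=(30, 2))
profiles = np.vstack([group_a, group_b])

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: alternate nearest-center assignment and center update."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centers) ** 2).sum(axis=2), axis=1)
        centers = np.array([X[labels == j].mean(axis=0) if np.any(labels == j)
                            else centers[j] for j in range(k)])
    return labels, centers

labels, centers = kmeans(profiles, k=2)
# each recovered cluster corresponds to one of the two generating subgroups
```

    The recovered clusters are the "meaningful patterns among people" of the abstract: subgroups for whom different interventions might be indicated.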

  9. Applying cognitive developmental psychology to middle school physics learning: The rule assessment method

    Science.gov (United States)

    Hallinen, Nicole R.; Chi, Min; Chin, Doris B.; Prempeh, Joe; Blair, Kristen P.; Schwartz, Daniel L.

    2013-01-01

    Cognitive developmental psychology often describes children's growing qualitative understanding of the physical world. Physics educators may be able to use the relevant methods to advantage for characterizing changes in students' qualitative reasoning. Siegler developed the "rule assessment" method for characterizing levels of qualitative understanding for two factor situations (e.g., volume and mass for density). The method assigns children to rule levels that correspond to the degree they notice and coordinate the two factors. Here, we provide a brief tutorial plus a demonstration of how we have used this method to evaluate instructional outcomes with middle-school students who learned about torque, projectile motion, and collisions using different instructional methods with simulations.

  10. A multiparametric method of interpolation using WOA05 applied to anthropogenic CO2 in the Atlantic

    Directory of Open Access Journals (Sweden)

    Anton Velo

    2010-11-01

    Full Text Available This paper describes the development of a multiparametric interpolation method and its application to anthropogenic carbon (CANT) in the Atlantic, calculated by two estimation methods using the CARINA database. The multiparametric interpolation proposed uses potential temperature (θ), salinity, conservative ‘NO’ and ‘PO’ as conservative parameters for the gridding, and the World Ocean Atlas (WOA05) as a reference for the grid structure and the indicated parameters. We thus complement CARINA data with the WOA05 database in an attempt to obtain better gridded values by keeping the physical-biogeochemical sea structures. The algorithms developed here also have the prerequisite of being simple and easy to implement. To test the improvements achieved, a comparison between the proposed multiparametric method and a pure spatial interpolation for an independent parameter (O2) was made. As an application case study, CANT estimations by two methods (φCTº and TrOCA) were performed on the CARINA database and then gridded by both interpolation methods (spatial and multiparametric). Finally, a calculation of CANT inventories for the whole Atlantic Ocean was performed with the gridded values and using ETOPO2v2 as the sea bottom. Thus, the inventories were between 55.1 and 55.2 Pg-C with the φCTº method and between 57.9 and 57.6 Pg-C with the TrOCA method.

  11. THE COST MANAGEMENT BY APPLYING THE STANDARD COSTING METHOD IN THE FURNITURE INDUSTRY-Case study

    Directory of Open Access Journals (Sweden)

    Radu Mărginean

    2013-06-01

    Full Text Available Among the modern calculation methods used in managerial accounting, with wide applicability in industrial production, is the standard costing method. This managerial approach to cost calculation has real value in managerial accounting because of its usefulness in forecasting production costs, helping managers in the decision-making process. The standard costing method is one of the modern managerial accounting methods used in many enterprises with production activity. As the research objective of this paper, we propose studying the possibility of implementing this modern cost calculation method in a company from the Romanian furniture industry, using real financial data. To achieve this aim, we drew on specialized literature in the field of managerial accounting, showing the strengths and weaknesses of the method. The case study demonstrates that the standard costing method is fully applicable in our case and, in conclusion, has real value in the cost management process for enterprises in the Romanian furniture industry.
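    The core standard costing calculation, comparing pre-set standard costs with actual costs and splitting the difference into price and quantity variances, can be sketched briefly. The material, quantities and prices below are invented, not from the case study.

```python
def material_variances(std_qty, std_price, act_qty, act_price):
    """Split the total material cost variance into a price component
    (valued at actual quantity) and a quantity/usage component (valued
    at standard price). Positive values are unfavourable."""
    price_var = (act_price - std_price) * act_qty
    qty_var = (act_qty - std_qty) * std_price
    total = act_qty * act_price - std_qty * std_price
    assert abs(total - (price_var + qty_var)) < 1e-9   # variances reconcile
    return price_var, qty_var, total

# e.g. lumber for one hypothetical furniture batch: standard 100 m^2 at
# 12 EUR/m^2, actual 110 m^2 at 11.50 EUR/m^2
print(material_variances(100, 12.0, 110, 11.50))   # (-55.0, 120.0, 65.0)
```

    The favourable price variance (-55) partly offsets the unfavourable usage variance (+120), the kind of decomposition that supports the forecasting and decision-making role described above.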

  12. Applying the Taguchi Method to River Water Pollution Remediation Strategy Optimization

    Directory of Open Access Journals (Sweden)

    Tsung-Ming Yang

    2014-04-01

    Full Text Available Optimization methods usually obtain the travel direction of the solution by substituting the solutions into the objective function. However, if the solution space is too large, this search method may be time consuming. In order to address this problem, this study incorporated the Taguchi method into the solution space search process of the optimization method, and used the characteristics of the Taguchi method to sequence the effects of the variation of decision variables on the system. Based on the level of effect, this study determined the impact factor of decision variables and the optimal solution for the model. The integration of the Taguchi method and the solution optimization method successfully obtained the optimal solution of the optimization problem, while significantly reducing the solution computing time and enhancing the river water quality. The results suggested that the basin with the greatest water quality improvement effectiveness is the Dahan River. Under the optimal strategy of this study, the severe pollution length was reduced from 18 km to 5 km.

  13. Applying the Taguchi method to river water pollution remediation strategy optimization.

    Science.gov (United States)

    Yang, Tsung-Ming; Hsu, Nien-Sheng; Chiu, Chih-Chiang; Wang, Hsin-Ju

    2014-04-15

    Optimization methods usually obtain the travel direction of the solution by substituting the solutions into the objective function. However, if the solution space is too large, this search method may be time consuming. In order to address this problem, this study incorporated the Taguchi method into the solution space search process of the optimization method, and used the characteristics of the Taguchi method to sequence the effects of the variation of decision variables on the system. Based on the level of effect, this study determined the impact factor of decision variables and the optimal solution for the model. The integration of the Taguchi method and the solution optimization method successfully obtained the optimal solution of the optimization problem, while significantly reducing the solution computing time and enhancing the river water quality. The results suggested that the basin with the greatest water quality improvement effectiveness is the Dahan River. Under the optimal strategy of this study, the severe pollution length was reduced from 18 km to 5 km.
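    The Taguchi screening idea described in these two records, ranking decision variables by the size of their main effects computed from an orthogonal array of runs, can be sketched as follows. The two-factor array and the response function are illustrative stand-ins, not the study's water-quality model.

```python
import numpy as np

# L4-style orthogonal array: two decision variables, two levels each.
L4 = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])

def response(a, b):
    """Hypothetical stand-in for the simulated objective (e.g. a pollution index)."""
    return 10.0 - 4.0 * a - 1.0 * b

y = np.array([response(a, b) for a, b in L4])

# Main effect of each factor: the range of mean responses across its levels.
effects = []
for j in range(L4.shape[1]):
    level_means = [y[L4[:, j] == lvl].mean() for lvl in (0, 1)]
    effects.append(float(abs(level_means[1] - level_means[0])))

print(effects)   # [4.0, 1.0] -> factor 0 is the dominant "impact factor"
```

    Ranking variables this way lets the optimization search concentrate on the dominant factors first, which is how the Taguchi step reduces the solution-space search time described in the abstract.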

  14. APPLIED BEHAVIOUR ANALYZE METHOD INCREASE SOCIAL INTERACTION CHILDREN WITH AUTISME, 2-5 YEARS OLD

    Directory of Open Access Journals (Sweden)

    Khoridatul Bahiyah

    2017-07-01

    Full Text Available Introduction: Autism is a social interaction disorder in children; affected children seem to live in their own world. The ABA method is a technique for reducing behavioural and social interaction disorders in children with autism. The aim of this research was to evaluate the correlation between implementation of the ABA method, together with the parents' role, and social interaction development in children with autism. Method: This research used a cross-sectional design with purposive sampling. There were 22 respondents who met the inclusion criteria. The independent variable was the ABA method and the dependent variable was social interaction development. Data were collected using a questionnaire and observation, then analysed using the Spearman rho correlation with significance level α≤0.05. Result: The result showed that there was a correlation between the ABA method and social interaction development in children with autism (p<0.30). Discussion: It can be concluded that the ABA method is correlated with social interaction in children with autism. It is recommended that the ABA method be used as a technique to reduce social interaction disorder in children with autism.

  15. A local expansion method applied to fast plasma boundary reconstruction for EAST

    Science.gov (United States)

    Guo, Yong; Xiao, Bingjia; Luo, Zhengping

    2011-10-01

    A fast plasma boundary reconstruction technique based on a local expansion method is designed for EAST. It represents the poloidal flux distribution in the vacuum region by a limited number of expansions. The plasma boundary reconstructed by the local expansion method is consistent with EFIT/RT-EFIT results for an arbitrary plasma configuration. On a Linux server with Intel (R) Xeon (TM) CPU 3.2 GHz, the method completes one plasma boundary reconstruction in about 150 µs. This technique is sufficiently reliable and fast for real-time shape control.

  16. Between visibility and surveillance

    DEFF Research Database (Denmark)

    Uldam, Julie

    As activists move from alternative media platforms to commercial social media platforms they face increasing challenges in protecting their online security and privacy. While government surveillance of activists is well-documented in both scholarly research and the media, corporate surveillance of activists remains under-researched. This presentation explores visibility as a prerequisite and an obstacle to political participation. The dual capacity of visibility in social media enables both surveillance and counter-surveillance by making not only the surveilled actor, but also the surveilling actor visible. It thus enables activists to monitor and expose corporate misconduct, but simultaneously renders them vulnerable to surveillance from corporations. In this presentation, I examine these practices and discuss their implications for political participation by drawing on examples of companies

  17. Surveillance and Communication

    DEFF Research Database (Denmark)

    Bøge, Ask Risom; Albrechtslund, Anders; Lauritsen, Peter

    2017-01-01

    eyes of cameras are but one of many important aspects of the surveillance society. In particular, surveillance has become intrinsic to our digitally mediated communication. Many are constantly engaged in forms of social surveillance as they observe what friends, family, celebrities, love interests, and acquaintances are up to on social media. In turn, they also leave trails of digital footprints that may be collected and analyzed by governments, businesses, or hackers. The imperceptible nature of this new surveillance raises some pressing concerns about our digital lives as our data doubles increasingly … are particularly relevant to this topic and audience. The fourth section outlines a variety of themes in which surveillance of communication is being studied. Organized under the headings Tracking; Mass Surveillance; Media; and Art, Fiction, and Popular Culture, this section provides a survey in surveillance

  18. Review on characterization methods applied to HTR-fuel element components

    International Nuclear Information System (INIS)

    Koizlik, K.

    1976-02-01

    One of the difficulties which, on the whole, is of no special scientific interest but which poses many technical problems for the development and production of HTR fuel elements is the proper characterization of the element and its components. Consequently, much work has been done during the past years to develop characterization procedures for the fuel, the fuel kernel, the pyrocarbon for the coatings, the matrix and graphite, and their components binder and filler. This paper gives a status report on the characterization procedures applied to HTR fuel in KFA and cooperating institutions. (orig.) [de

  19. A pulse stacking method of particle counting applied to position sensitive detection

    International Nuclear Information System (INIS)

    Basilier, E.

    1976-03-01

    A position sensitive particle counting system is described. A cyclic readout imaging device serves as an intermediate information buffer. Pulses are allowed to stack in the imager at very high counting rates. Imager noise is completely discriminated to provide very wide dynamic range. The system has been applied to a detector using cascaded microchannel plates. Pulse height spread produced by the plates causes some loss of information. The loss is comparable to the input loss of the plates. The improvement in maximum counting rate is several hundred times over previous systems that do not permit pulse stacking. (Auth.)

  20. Establishing seasonal and alert influenza thresholds in Cambodia using the WHO method: implications for effective utilization of influenza surveillance in the tropics and subtropics

    Directory of Open Access Journals (Sweden)

    Sovann Ly

    2017-03-01

    Full Text Available Objective: To establish seasonal and alert thresholds and transmission intensity categories for influenza to provide timely triggers for preventive measures or upscaling control measures in Cambodia. Methods: Using Cambodia’s influenza-like illness (ILI) and laboratory-confirmed influenza surveillance data from 2009 to 2015, three parameters were assessed to monitor influenza activity: the proportion of ILI patients among all outpatients, the proportion of ILI samples positive for influenza and the product of the two. With these parameters, four threshold levels (seasonal, moderate, high and alert) were established and transmission intensity was categorized based on a World Health Organization alignment method. Parameters were compared against their respective thresholds. Results: Distinct seasonality was observed using the two parameters that incorporated laboratory data. Thresholds established using the composite parameter, combining syndromic and laboratory data, had the fewest false alarms in declaring season onset and were most useful in monitoring intensity. Unlike in temperate regions, the syndromic parameter was less useful in monitoring influenza activity or for setting thresholds. Conclusion: Influenza thresholds based on appropriate parameters have the potential to provide timely triggers for public health measures in a tropical country where monitoring and assessing influenza activity has been challenging. Based on these findings, the Ministry of Health plans to raise general awareness regarding influenza among the medical community and the general public. Our findings have important implications for countries in the tropics/subtropics and in resource-limited settings, and categorized transmission intensity can be used to assess the severity of potential pandemic influenza as well as seasonal influenza.
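    A percentile-based thresholding scheme in the spirit of the WHO alignment method described here can be sketched as follows. The synthetic data, the composite-parameter construction and the specific percentile choices are illustrative assumptions, not Cambodia's actual values.

```python
import numpy as np

# Six illustrative past seasons (52 weeks each) of a composite parameter,
# e.g. ILI proportion x influenza percent positive; synthetic data only.
rng = np.random.default_rng(0)
seasons = rng.gamma(shape=2.0, scale=1.5, size=(6, 52))

# Seasonal threshold from all historical weekly values; intensity thresholds
# from the distribution of historical seasonal peaks.
seasonal = np.percentile(seasons, 40)
moderate, high, alert = np.percentile(seasons.max(axis=1), [40, 90, 97.5])

def intensity(value):
    """Categorize a week's composite value against the four thresholds."""
    if value < seasonal:
        return "below seasonal"
    if value < moderate:
        return "low"
    if value < high:
        return "moderate"
    if value < alert:
        return "high"
    return "alert"
```

    Each new surveillance week is then compared against its threshold: crossing the seasonal threshold declares season onset, while crossing the high or alert thresholds triggers upscaled control measures.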

  1. Textbook finite element methods applied to linear wave propagation problems involving conversion and absorption

    International Nuclear Information System (INIS)

    Appert, K.; Vaclavik, J.; Villard, L.; Hellsten, T.

    1986-01-01

    A system of two second-order ordinary differential equations describing wave propagation in a hot plasma is solved numerically by the finite element method involving standard linear and cubic elements. Evanescent short-wavelength modes do not constitute a problem because of the variational nature of the method. It is straightforward to generalize the method to systems of equations with more than two equations. The performance of the method is demonstrated on known physical situations and is measured by investigating the convergence properties. Cubic elements perform much better than linear ones. In an application it is shown that global plasma oscillations might have an importance for the linear wave conversion in the ion-cyclotron range of frequency. (orig.)
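    The textbook finite element machinery referred to in this record can be illustrated on the simplest possible case. The sketch below uses standard linear elements to solve -u'' = 1 on [0, 1] with homogeneous Dirichlet conditions (the hot-plasma system in the record is a coupled pair of such second-order equations); for this model problem the linear-element solution is exact at the nodes.

```python
import numpy as np

n = 10                                   # number of linear elements on [0, 1]
x = np.linspace(0.0, 1.0, n + 1)
h = x[1] - x[0]

K = np.zeros((n + 1, n + 1))             # global stiffness matrix
f = np.zeros(n + 1)                      # global load vector (source term = 1)
for e in range(n):                       # assemble per-element contributions
    K[e:e + 2, e:e + 2] += np.array([[1.0, -1.0], [-1.0, 1.0]]) / h
    f[e:e + 2] += h / 2.0                # integral of each hat function times 1

u = np.zeros(n + 1)                      # impose u(0) = u(1) = 0, solve interior
u[1:-1] = np.linalg.solve(K[1:-1, 1:-1], f[1:-1])

exact = x * (1.0 - x) / 2.0              # analytic solution of -u'' = 1
```

    The variational character noted in the abstract is visible here: the discrete system comes from minimizing an energy functional over piecewise-linear trial functions, which is why short-wavelength evanescent behaviour poses no special difficulty.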

  2. Applying Formal Methods to an Information Security Device: An Experience Report

    National Research Council Canada - National Science Library

    Kirby, Jr, James; Archer, Myla; Heitmeyer, Constance

    1999-01-01

    .... This paper describes a case study in which the SCR method was used to specify and analyze a different class of system, a cryptographic system called CD, which must satisfy a large set of security properties...

  3. Review on applied foods and analyzed methods in identification testing of irradiated foods

    International Nuclear Information System (INIS)

    Kim, Kwang Hoon; Lee, Hoo Chul; Park, Sung Hyun; Kim, Soo Jin; Kim, Kwan Soo; Jeong, Il Yun; Lee, Ju Woon; Yook, Hong Sun

    2010-01-01

    Identification methods for irradiated foods have been adopted as official tests by the EU and Codex. The PSL, TL, ESR and GC/MS methods were registered in the Korea food code in 2009 and put into force as a control system for verifying the labelling of food irradiation. However, the most generally applicable methods, PSL and TL, are restricted to the foods approved for irradiation in Korea. Unlike these, the applicable items of the ESR and GC/MS methods include foods for which irradiation is not permitted in Korea. According to recent research data, numerous food groups could be brought under effective legal control through identification testing, and additional permission for irradiation of these items is called for. In particular, the prohibition of irradiation for meats and seafoods is not harmonized with international standards and gives rise to trade friction and industrial restrictions because domestic regulation is unprepared. Hence, extending the domestic legal permission for food irradiation can contribute to the development of related industries, reduce trade friction and enhance international competitiveness

  4. Numerical analysis of the immersed boundary method applied to the flow around a forced oscillating cylinder

    International Nuclear Information System (INIS)

    Pinto, L C; Silvestrini, J H; Schettini, E B C

    2011-01-01

    In the present paper, the Navier-Stokes and continuity equations for incompressible flow around an oscillating cylinder were numerically solved. Sixth-order compact difference schemes were used to compute the spatial derivatives, while the time advance was carried out through a second-order accurate Adams-Bashforth scheme. In order to represent the obstacle in the flow, the immersed boundary method was adopted; in this method a force term representing the body is added to the Navier-Stokes equations. The simulations produced results for the hydrodynamic coefficients and vortex wakes in agreement with previous experimental and numerical works, and the physical lock-in phenomenon was identified. Comparing different methods of imposing the IBM, no alterations in the vortex shedding mode were observed. The immersed boundary techniques used here can represent the surface of an oscillating cylinder in the flow.

  5. Development of characterization methods applied to radioactive wastes and waste packages

    International Nuclear Information System (INIS)

    Guy, C.; Bienvenu, Ph.; Comte, J.; Excoffier, E.; Dodi, A.; Gal, O.; Gmar, M.; Jeanneau, F.; Poumarede, B.; Tola, F.; Moulin, V.; Jallu, F.; Lyoussi, A.; Ma, J.L.; Oriol, L.; Passard, Ch.; Perot, B.; Pettier, J.L.; Raoux, A.C.; Thierry, R.

    2004-01-01

    This document is a compilation of R and D studies carried out in the framework of axis 3 of the December 1991 law about the conditioning and storage of high-level and long-lived radioactive wastes and waste packages, and relative to the methods of characterization of these wastes. This R and D work has made it possible to implement and qualify new methods (characterization of long-lived radioelements, high energy imaging..) and also to improve the existing methods by lowering detection limits and reducing uncertainties in measured data. This document is the result of the scientific production of several CEA laboratories that use complementary techniques: destructive methods and radiochemical analyses, photo-fission and active photonic interrogation, high energy imaging systems, neutron interrogation, gamma spectroscopy and active and passive imaging techniques. (J.S.)

  6. Neutron tomography as a reverse engineering method applied to the IS-60 Rover gas turbine

    CSIR Research Space (South Africa)

    Roos, TH

    2011-09-01

    Full Text Available Probably the most common method of reverse engineering in mechanical engineering involves measuring the physical geometry of a component using a coordinate measuring machine (CMM). Neutron tomography, in contrast, is used primarily as a non...

  7. Considerations on the question of applying ion exchange or reverse osmosis methods in boiler feedwater processing

    International Nuclear Information System (INIS)

    Marquardt, K.; Dengler, H.

    1976-01-01

    This consideration shows that the method of reverse osmosis presents in many cases an interesting and economical alternative to partial and total desalination plants using ion exchangers. The essential advantages of reverse osmosis are a higher degree of automation, no additional salt load in the discharged waste water, a small constructional volume of the plant, as well as favourable operational costs with increasing salt content of the raw water to be processed. As there is a relatively high salt breakthrough compared to the ion exchange method, the future tendency in boiler feedwater processing will be more towards a combination of reverse osmosis and post-purification through continuous ion exchange methods. (orig./LH) [de

  8. Overcoming the Problems of Inconsistent International Migration data : A New Method Applied to Flows in Europe

    NARCIS (Netherlands)

    de Beer, Joop; Raymer, James; van der Erf, Rob; van Wissen, Leo

    2010-01-01

    Due to differences in definitions and measurement methods, cross-country comparisons of international migration patterns are difficult and confusing. Emigration numbers reported by sending countries tend to differ from the corresponding immigration numbers reported by receiving countries. In this

  9. Overcoming the problems of inconsistent international migration data: a new method applied to flows in Europe

    NARCIS (Netherlands)

    de Beer, J.A.A.; Raymer, J.; van der Erf, R.F.; van Wissen, L.J.G.

    2010-01-01

    Due to differences in definitions and measurement methods, crosscountry comparisons of international migration patterns are difficult and confusing. Emigration numbers reported by sending countries tend to differ from the corresponding immigration numbers reported by receiving countries. In this

  10. Applied Warfighter Ergonomics: A Research Method for Evaluating Military Individual Equipment

    National Research Council Canada - National Science Library

    Takagi, Koichi

    2005-01-01

    The objective of this research effort is to design and implement a laboratory and establish a research method focused on scientific evaluation of human factors considerations for military individual...

  11. Computational performance of Free Mesh Method applied to continuum mechanics problems

    Science.gov (United States)

    YAGAWA, Genki

    2011-01-01

    The free mesh method (FMM) is a kind of meshless method intended for particle-like finite element analysis of problems that are difficult to handle using global mesh generation, or a node-based finite element method that employs a local mesh generation technique and a node-by-node algorithm. The aim of the present paper is to review some unique numerical solutions of fluid and solid mechanics obtained by employing FMM as well as the Enriched Free Mesh Method (EFMM), which is a new version of FMM, including compressible flow and the sounding mechanism in air-reed instruments as applications to fluid mechanics, and automatic remeshing for slow crack growth, the dynamic behavior of solids, as well as large-scale eigen-frequency analysis of an engine block as applications to solid mechanics. PMID:21558753

  12. Galvanokinetic polarization method applied to the pitting corrosion study of stainless steels

    International Nuclear Information System (INIS)

    Le Xuan, Q.; Vu Quang, K.

    1992-01-01

    The galvanokinetic (GK) polarisation method was used to study the pitting corrosion of 316L stainless steel in chloride solution. The effect of the current scan rate on the pitting characteristic parameters was pointed out. Specific relations between the current scan rate and some pitting characteristic parameters, such as the critical current density I_c, the stable current density I_s, the critical time t_c, and the stable time t_s, were established. Some advantages of the GK polarisation method are discussed

  13. A novel method for applying reduced graphene oxide directly to electronic textiles from yarns to fabrics.

    Science.gov (United States)

    Yun, Yong Ju; Hong, Won G; Kim, Wan-Joong; Jun, Yongseok; Kim, Byung Hoon

    2013-10-25

    Conductive, flexible, and durable reduced graphene oxide (RGO) textiles with a facile preparation method are presented. BSA proteins serve as universal adhesives for improving the adsorption of GO onto any textile, irrespective of the material and the surface conditions. Using this method, we successfully prepared various RGO textiles based on nylon-6 yarns, cotton yarns, polyester yarns, and nonwoven fabrics. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Sustainable Assessment of Aerosol Pollution Decrease Applying Multiple Attribute Decision-Making Methods

    Directory of Open Access Journals (Sweden)

    Audrius Čereška

    2016-06-01

    Full Text Available Air pollution with various materials, particularly with aerosols, increases with the advances in technological development. This is a complicated global problem. One of the priorities in achieving sustainable development is the reduction of harmful technological effects on the environment and human health. It is the responsibility of researchers to search for effective methods of reducing pollution. Reliable results can be obtained by combining the approaches used in various fields of science and technology. This paper aims to demonstrate the effectiveness of multiple attribute decision-making (MADM) methods in investigating and solving environmental pollution problems. The paper presents a study of the evaporation of a toxic liquid based on the MADM methods. A schematic view of the test setup is presented. The density, viscosity, and rate of the released vapor flow are measured, and the dependence of the variation of the solution concentration on its temperature is determined in the experimental study. The concentration of the hydrochloric acid solution (HAS) varies in the range from 28% to 34%, while the liquid is heated from 50 to 80 °C. The variations in the parameters are analyzed using the well-known VIKOR and COPRAS MADM methods. For determining the criteria weights, a new CILOS (Criterion Impact LOSs) method is used. The experimental results are arranged in priority order using the MADM methods. Based on the obtained data, the technological parameters of production ensuring minimum environmental pollution can be chosen.

  15. Assessment of Pansharpening Methods Applied to WorldView-2 Imagery Fusion

    Directory of Open Access Journals (Sweden)

    Hui Li

    2017-01-01

    Full Text Available Since WorldView-2 (WV-2) images are widely used in various fields, there is a high demand for high-quality pansharpened WV-2 images for different application purposes. Given the novelty of the WV-2 multispectral (MS) and panchromatic (PAN) bands, the performance of eight state-of-the-art pansharpening methods for WV-2 imagery was assessed in this study on six datasets from three WV-2 scenes, using both quality indices and information indices, along with visual inspection. The normalized difference vegetation index, the normalized difference water index, and the morphological building index, which are widely used in applications related to land cover classification and the extraction of vegetation areas, buildings, and water bodies, were employed in this work to evaluate the performance of the different pansharpening methods in terms of information presentation ability. The experimental results show that the Haze- and Ratio-based method, the adaptive Gram-Schmidt method, and the Generalized Laplacian pyramid (GLP) methods using the enhanced spectral distortion minimal model and the enhanced context-based decision model are good choices for producing fused WV-2 images used for image interpretation and the extraction of urban buildings. The two GLP-based methods are better choices than the other methods if the fused images will be used for applications related to vegetation and water bodies.
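The vegetation and water indices used in the information-based evaluation are standard normalized band ratios; a minimal per-pixel sketch, with made-up reflectance values for illustration (the band names follow common remote-sensing usage, not any code from the paper):

```python
def ndvi(nir, red):
    """Normalized difference vegetation index for one pixel."""
    return (nir - red) / (nir + red)

def ndwi(green, nir):
    """Normalized difference water index (McFeeters formulation)."""
    return (green - nir) / (green + nir)

# Illustrative reflectance values, not from the WV-2 datasets:
veg = ndvi(nir=0.45, red=0.08)      # healthy vegetation -> strongly positive
water = ndwi(green=0.12, nir=0.03)  # open water -> positive
```

Comparing such index maps computed from the fused image against those from the original MS image is one way to quantify how well a pansharpening method preserves spectral information.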

  16. Three-Dimensional CST Parameterization Method Applied in Aircraft Aeroelastic Analysis

    Directory of Open Access Journals (Sweden)

    Hua Su

    2017-01-01

    Full Text Available The class/shape transformation (CST) method has the advantages of adjustable design variables and a powerful parametric geometric shape design ability, and it has been widely used in aerodynamic design and optimization processes. Three-dimensional CST is an extension for complex aircraft and can generate diverse three-dimensional aircraft and the corresponding mesh automatically and quickly. This paper proposes a parametric structural modeling method based on gridding feature extraction from the aerodynamic mesh generated by the three-dimensional CST method. This novel method can create a parametric structural model for the fuselage and wing and keeps the coordination between the aerodynamic mesh and the structural mesh. Based on the generated aerodynamic and structural models, an automatic process for aeroelastic modeling and solving is presented, with the panel method as the aerodynamic solver and NASTRAN as the structural solver. A reusable launch vehicle (RLV) is used to illustrate the process of aeroelastic modeling and solving. The result shows that this method can generate aeroelastic models for diverse complex three-dimensional aircraft automatically and reduce the difficulty of aeroelastic analysis dramatically. It provides an effective approach for making use of aeroelastic analysis at the conceptual design phase for modern aircraft.
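The 2-D building block underlying the CST parameterization can be sketched as a class function multiplied by a Bernstein-polynomial shape function, zeta(psi) = C(psi)·S(psi) + psi·dz_te; the exponents and coefficients below are illustrative assumptions (a round-nose, sharp-trailing-edge airfoil), not values from the paper.

```python
from math import comb

def cst_curve(psi, coeffs, n1=0.5, n2=1.0, dz_te=0.0):
    """Evaluate a 2-D CST curve at chordwise station psi in [0, 1].

    C(psi) = psi**n1 * (1 - psi)**n2 is the class function
    (n1=0.5, n2=1.0 gives a rounded nose and sharp trailing edge);
    S(psi) is a Bernstein-polynomial shape function whose
    coefficients are the design variables."""
    n = len(coeffs) - 1
    c = psi ** n1 * (1.0 - psi) ** n2
    s = sum(a * comb(n, i) * psi ** i * (1.0 - psi) ** (n - i)
            for i, a in enumerate(coeffs))
    return c * s + psi * dz_te

# Illustrative shape coefficients (assumed, not the paper's):
coeffs = [0.17, 0.15, 0.16]
upper = [cst_curve(x / 50.0, coeffs) for x in range(51)]
```

Because the coefficients enter linearly through the Bernstein basis, adding coefficients refines the surface without changing its endpoints, which is what makes CST attractive for optimization.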

  17. Applying the Network Simulation Method for testing chaos in a resistively and capacitively shunted Josephson junction model

    Directory of Open Access Journals (Sweden)

    Fernando Gimeno Bellver

    Full Text Available In this paper, we explore the chaotic behavior of resistively and capacitively shunted Josephson junctions via the so-called Network Simulation Method. Such a numerical approach establishes a formal equivalence between physical transport processes and electrical networks, and hence it can be applied to deal efficiently with a wide range of differential systems. The generality underlying that electrical equivalence allows circuit theory to be applied to several scientific and technological problems. In this work, the Fast Fourier Transform has been applied for chaos detection purposes and the calculations have been carried out in PSpice, an electrical circuit software package. Overall, it holds that such a numerical approach makes it possible to solve Josephson differential models quickly. An empirical application regarding the study of the Josephson model completes the paper. Keywords: Electrical analogy, Network Simulation Method, Josephson junction, Chaos indicator, Fast Fourier Transform

  18. A Method for Evaluation and Comparison of Parallel Robots for Safe Human Interaction, Applied to Robotic TMS

    NARCIS (Netherlands)

    de Jong, Jan Johannes; Stienen, Arno; van der Wijk, V.; Wessels, Martijn; van der Kooij, Herman

    2012-01-01

    Transcranial magnetic stimulation (TMS) is a noninvasive method to modify behaviour of neurons in the brain. TMS is applied by running large currents through a coil close to the scalp. For consistent results it is required to maintain the coil position within millimetres of the targeted location,

  19. Solution of Ge(111)-(4x4)-Ag structure using direct methods applied to X-ray diffraction data

    DEFF Research Database (Denmark)

    Collazo-Davila, C.; Grozea, D.; Marks, L.D.

    1998-01-01

    A structure model for the Ge(111)-(4 x 4)-Ag surface is proposed. The model was derived by applying direct methods to surface X-ray diffraction data. It is a missing top layer reconstruction with six Ag atoms placed on Ge substitutional sites in one triangular subunit of the surface unit cell. A ...

  20. GLYCOHEMOGLOBIN - COMPARISON OF 12 ANALYTICAL METHODS, APPLIED TO LYOPHILIZED HEMOLYSATES BY 101 LABORATORIES IN AN EXTERNAL QUALITY ASSURANCE PROGRAM

    NARCIS (Netherlands)

    WEYKAMP, CW; PENDERS, TJ; MUSKIET, FAJ; VANDERSLIK, W

    Stable lyophilized ethylenediaminetetra-acetic acid (EDTA)-blood haemolysates were applied in an external quality assurance programme (SKZL, The Netherlands) for glycohaemoglobin assays in 101 laboratories using 12 methods. The mean intralaboratory day-to-day coefficient of variation (CV),

  1. Comparison of 15 evaporation methods applied to a small mountain lake in the northeastern USA

    Science.gov (United States)

    Rosenberry, D.O.; Winter, T.C.; Buso, D.C.; Likens, G.E.

    2007-01-01

    Few detailed evaporation studies exist for small lakes or reservoirs in mountainous settings. A detailed evaporation study was conducted at Mirror Lake, a 0.15 km² lake in New Hampshire, northeastern USA, as part of a long-term investigation of lake hydrology. Evaporation was determined using 14 alternate evaporation methods during six open-water seasons and compared with values from the Bowen-ratio energy-budget (BREB) method, considered the standard. Values from the Priestley-Taylor, deBruin-Keijman, and Penman methods compared most favorably with BREB-determined values. Differences from BREB values averaged 0.19, 0.27, and 0.20 mm d⁻¹, respectively, and results were within 20% of BREB values during more than 90% of the 37 monthly comparison periods. All three methods require measurement of net radiation, air temperature, change in heat stored in the lake, and vapor pressure, making them relatively data intensive. Several of the methods had substantial bias when compared with BREB values and were subsequently modified to eliminate bias. Methods that rely only on measurement of air temperature, or air temperature and solar radiation, were relatively cost-effective options for measuring evaporation at this small New England lake, outperforming some methods that require measurement of a greater number of variables. It is likely that the atmosphere above Mirror Lake was affected by occasional formation of separation eddies on the lee side of nearby high terrain, although those influences do not appear to be significant to measured evaporation from the lake when averaged over monthly periods. © 2007 Elsevier B.V. All rights reserved.
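As a sketch of why the Priestley-Taylor method is comparatively data-light, the whole model fits in a few lines; the FAO-56-style constants and the example inputs below are assumptions for illustration, not values from the Mirror Lake study.

```python
from math import exp

def priestley_taylor(t_air_c, rn, g=0.0, alpha=1.26):
    """Evaporation (mm/day) from the Priestley-Taylor equation:
    lambda*E = alpha * slope/(slope + gamma) * (Rn - G),
    with Rn and G in MJ m-2 day-1.

    Constants follow common FAO-56 conventions (assumed here):
    gamma ~ 0.066 kPa/degC near sea level, lambda = 2.45 MJ/kg."""
    # Saturation vapour pressure (kPa) and its slope (kPa/degC)
    es = 0.6108 * exp(17.27 * t_air_c / (t_air_c + 237.3))
    slope = 4098.0 * es / (t_air_c + 237.3) ** 2
    gamma = 0.066
    lam = 2.45
    return alpha * slope / (slope + gamma) * (rn - g) / lam

# Illustrative mid-summer inputs: 20 degC air, Rn = 15 MJ m-2 day-1
e = priestley_taylor(t_air_c=20.0, rn=15.0)
```

Only temperature and an energy-balance term are needed, which matches the study's observation that the method is accurate yet less instrument-hungry than full aerodynamic formulations.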

  2. Unsupervised nonlinear dimensionality reduction machine learning methods applied to multiparametric MRI in cerebral ischemia: preliminary results

    Science.gov (United States)

    Parekh, Vishwa S.; Jacobs, Jeremy R.; Jacobs, Michael A.

    2014-03-01

    The evaluation and treatment of acute cerebral ischemia requires a technique that can determine the total area of tissue at risk for infarction using diagnostic magnetic resonance imaging (MRI) sequences. Typical MRI data sets consist of T1- and T2-weighted imaging (T1WI, T2WI) along with advanced MRI parameters of diffusion-weighted imaging (DWI) and perfusion-weighted imaging (PWI) methods. Each of these parameters has distinct radiological-pathological meaning. For example, DWI interrogates the movement of water in the tissue and PWI gives an estimate of the blood flow; both are critical measures during the evolution of stroke. In order to integrate these data and give an estimate of the tissue at risk or damaged, we have developed advanced machine learning methods based on unsupervised non-linear dimensionality reduction (NLDR) techniques. NLDR methods are a class of algorithms that use mathematically defined manifolds for statistical sampling of multidimensional classes to generate a discrimination rule of guaranteed statistical accuracy, and they can generate a two- or three-dimensional map, which represents the prominent structures of the data and provides an embedded image of meaningful low-dimensional structures hidden in their high-dimensional observations. In this manuscript, we develop NLDR methods on high-dimensional MRI data sets of preclinical animals and clinical patients with stroke. On analyzing the performance of these methods, we observed a high degree of similarity between the multiparametric embedded images from the NLDR methods and the ADC and perfusion maps. It was also observed that the embedded scattergram of abnormal (infarcted or at-risk) tissue can be visualized, providing a mechanism for automatic methods to delineate potential stroke volumes and early tissue at risk.

  3. Finite difference applied to the reconstruction method of the nuclear power density distribution

    International Nuclear Information System (INIS)

    Pessoa, Paulo O.; Silva, Fernando C.; Martinez, Aquilino S.

    2016-01-01

    Highlights: • A method for reconstruction of the power density distribution is presented. • The method uses discretization by finite differences of the 2D neutron diffusion equation. • The discretization is performed on homogeneous meshes with the dimensions of a fuel cell. • The discretization is combined with flux distributions on the four node surfaces. • The maximum errors in reconstruction occur in the peripheral water region. - Abstract: In this reconstruction method, the two-dimensional (2D) neutron diffusion equation is discretized by finite differences, applied to two energy groups (2G) and meshes with fuel-pin cell dimensions. The Nodal Expansion Method (NEM) makes use of surface discontinuity factors of the node and provides, for the reconstruction method, the effective multiplication factor of the problem and the four surface average fluxes in homogeneous nodes with the size of a fuel assembly (FA). The reconstruction process combines the 2D diffusion equation discretized by finite differences with the flux distributions on the four surfaces of the nodes. These distributions are obtained for each surface from a fourth-order one-dimensional (1D) polynomial expansion with five coefficients to be determined. The conditions necessary for determining the coefficients are three average fluxes on consecutive surfaces of three nodes and two fluxes at the corners between these three surface fluxes. Corner fluxes of the node are determined using a third-order 1D polynomial expansion with four coefficients. This reconstruction method uses heterogeneous nuclear parameters directly, providing the heterogeneous neutron flux distribution and the detailed nuclear power density distribution within the FAs. The results obtained with this method have good accuracy and efficiency when compared with reference values.

  4. Control method of dioxines and furans fallout around a UIOM; Methode de surveillance des retombees des dioxines et furanes autour d'une UIOM

    Energy Technology Data Exchange (ETDEWEB)

    Durif, M.

    2001-12-15

    This report presents the different methods used to evaluate pollution by dioxins and furans around an emission source. Based on the advantages and disadvantages of each method, a protocol was defined to establish a baseline survey around a future site of a domestic waste incineration unit (UIOM). This protocol will also make it possible to monitor and identify the origins of fallout around the installation once it is operating. (A.L.B.)

  5. Boundary element method applied to a gas-fired pin-fin-enhanced heat pipe

    Energy Technology Data Exchange (ETDEWEB)

    Andraka, C.E.; Knorovsky, G.A.; Drewien, C.A.

    1998-02-01

    The thermal conduction of a portion of an enhanced surface heat exchanger for a gas fired heat pipe solar receiver was modeled using the boundary element and finite element methods (BEM and FEM) to determine the effect of weld fillet size on performance of a stud welded pin fin. A process that could be utilized by others for designing the surface mesh on an object of interest, performing a conversion from the mesh into the input format utilized by the BEM code, obtaining output on the surface of the object, and displaying visual results was developed. It was determined that the weld fillet on the pin fin significantly enhanced the heat performance, improving the operating margin of the heat exchanger. The performance of the BEM program on the pin fin was measured (as computational time) and used as a performance comparison with the FEM model. Given similar surface element densities, the BEM method took longer to get a solution than the FEM method. The FEM method creates a sparse matrix that scales in storage and computation as the number of nodes (N), whereas the BEM method scales as N² in storage and N³ in computation.

  6. On the spectral nodal methods applied to discrete ordinates eigenvalue problems in Cartesian geometry

    International Nuclear Information System (INIS)

    Abreu, Marcos P. de; Alves Filho, Hermes; Barros, Ricardo C.

    2001-01-01

    We describe hybrid spectral nodal methods for discrete ordinates (SN) eigenvalue problems in Cartesian geometry. These coarse-mesh methods are based on three ingredients: the use of the standard discretized spatial balance SN equations; the use of the non-standard spectral diamond (SD) auxiliary equations in the multiplying regions of the domain, e.g. fuel assemblies; and the use of the non-standard spectral Green's function (SGF) auxiliary equations in the non-multiplying regions of the domain, e.g., the reflector. In slab geometry the hybrid SD-SGF method generates numerical results that are completely free of spatial truncation errors. In X,Y-geometry, we obtain a system of two 'slab-geometry' SN equations for the node-edge average angular fluxes by transverse-integrating the X,Y-geometry SN equations separately in the y- and then in the x-directions within an arbitrary node of the spatial grid set up on the domain. In this paper, we approximate the transverse leakage terms by constants. These are the only approximations considered in the SD-SGF-constant nodal method, as the source terms, which include scattering and possibly fission events, are treated exactly. We show numerical results for typical model problems to illustrate the accuracy of spectral nodal methods for coarse-mesh SN criticality calculations. (author)

  7. A combined approach of AHP and TOPSIS methods applied in the field of integrated software systems

    Science.gov (United States)

    Berdie, A. D.; Osaci, M.; Muscalagiu, I.; Barz, C.

    2017-05-01

    Adopting the most appropriate technology for developing applications on an integrated software system for enterprises may result in great savings both in cost and in hours of work. This paper proposes a research study for the determination of a hierarchy among three SAP (System Applications and Products in Data Processing) technologies. The technologies Web Dynpro (WD), Floorplan Manager (FPM), and CRM WebClient UI (CRM WCUI) are evaluated against multiple criteria in terms of the performance obtained through the implementation of the same web business application. To establish the hierarchy, a multi-criteria analysis model that combines the AHP (Analytic Hierarchy Process) and TOPSIS (Technique for Order Preference by Similarity to Ideal Solution) methods was proposed. This model was built with the help of the SuperDecision software, which is based on the AHP method and determines the weights for the selected sets of criteria. The TOPSIS method was used to obtain the final ranking and the technology hierarchy.
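The TOPSIS half of such a combined model can be sketched independently of the SAP case study; the decision matrix, weights, and criterion directions below are invented for illustration (in the paper's model the weights would come from the AHP step):

```python
from math import sqrt

def topsis(matrix, weights, benefit):
    """Rank alternatives with TOPSIS: vector-normalise the decision
    matrix, apply criterion weights, then score each alternative by
    its relative closeness to the ideal (best) and anti-ideal (worst)
    solutions. `benefit[j]` is True for higher-is-better criteria."""
    m, n = len(matrix), len(matrix[0])
    norms = [sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [max(col) if benefit[j] else min(col)
             for j, col in enumerate(zip(*v))]
    worst = [min(col) if benefit[j] else max(col)
             for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos = sqrt(sum((x - b) ** 2 for x, b in zip(row, ideal)))
        d_neg = sqrt(sum((x - w) ** 2 for x, w in zip(row, worst)))
        scores.append(d_neg / (d_pos + d_neg))
    return scores

# Illustrative 3-alternative, 2-criterion example (cost, quality):
scores = topsis([[250, 16], [200, 20], [300, 11]],
                weights=[0.5, 0.5], benefit=[False, True])
```

The alternative with the highest closeness score (here the second one, which is cheapest and best-performing) heads the final ranking.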

  8. A generic method for assignment of reliability scores applied to solvent accessibility predictions

    DEFF Research Database (Denmark)

    Petersen, Bent; Petersen, Thomas Nordahl; Andersen, Pernille

    2009-01-01

    Estimation of the reliability of specific real value predictions is nontrivial and the efficacy of this is often questionable. It is important to know if you can trust a given prediction and therefore the best methods associate a prediction with a reliability score or index. For discrete qualitative predictions, the reliability is conventionally estimated as the difference between output scores of selected classes. Such an approach is not feasible for methods that predict a biological feature as a single real value rather than a classification. As a solution to this challenge, we have developed a method that predicts the relative exposure of the amino acids. The method assigns a reliability score to each surface accessibility prediction as an inherent part of the training process. This is in contrast to the most commonly used procedures, where reliabilities are obtained by post-processing the output.

  9. The increase in the starting torque of PMSM motor by applying of FOC method

    Science.gov (United States)

    Plachta, Kamil

    2017-05-01

    The article presents a field oriented control (FOC) method for a permanent magnet synchronous motor equipped with optical sensors. This method allows regulation of the torque and rotational speed of the electric motor over a wide range. The paper presents a mathematical model of the electric motor and the vector control method. Optical sensors have a shorter response time than inductive sensors, which allows a faster response of the electronic control system to changes in motor load. The motor driver is based on a digital signal processor which performs advanced mathematical operations in real time. The application of the Clarke and Park transformations in the software defines the angle of the rotor position. The presented solution provides smooth adjustment of the rotational speed in the first operating zone and reduces the dead zone of the torque in the second and third operating zones.
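The Clarke and Park transformations mentioned in the abstract map the three measured phase currents into the rotor-fixed d-q frame on which FOC regulates torque; a minimal amplitude-invariant sketch (the rotor angle below is an arbitrary test value, not from the paper):

```python
from math import cos, sin, sqrt, pi

def clarke(ia, ib, ic):
    """Amplitude-invariant Clarke transform: three phase currents
    to the stationary alpha-beta frame."""
    i_alpha = (2.0 * ia - ib - ic) / 3.0
    i_beta = (ib - ic) / sqrt(3.0)
    return i_alpha, i_beta

def park(i_alpha, i_beta, theta):
    """Park transform: rotate the alpha-beta vector into the
    rotor d-q frame at electrical angle theta."""
    i_d = i_alpha * cos(theta) + i_beta * sin(theta)
    i_q = -i_alpha * sin(theta) + i_beta * cos(theta)
    return i_d, i_q

# Balanced sinusoidal phase currents map to constant d-q values,
# which is what lets FOC use simple PI current controllers.
theta = 0.7
ia = cos(theta)
ib = cos(theta - 2.0 * pi / 3.0)
ic = cos(theta + 2.0 * pi / 3.0)
i_d, i_q = park(*clarke(ia, ib, ic), theta)
```

For the balanced currents above the result is i_d = 1 and i_q = 0: the rotating three-phase quantities become DC values in the rotor frame.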

  10. Whole-Genome Regression and Prediction Methods Applied to Plant and Animal Breeding

    Science.gov (United States)

    de los Campos, Gustavo; Hickey, John M.; Pong-Wong, Ricardo; Daetwyler, Hans D.; Calus, Mario P. L.

    2013-01-01

    Genomic-enabled prediction is becoming increasingly important in animal and plant breeding and is also receiving attention in human genetics. Deriving accurate predictions of complex traits requires implementing whole-genome regression (WGR) models where phenotypes are regressed on thousands of markers concurrently. Methods exist that allow implementing these large-p with small-n regressions, and genome-enabled selection (GS) is being implemented in several plant and animal breeding programs. The list of available methods is long, and the relationships between them have not been fully addressed. In this article we provide an overview of available methods for implementing parametric WGR models, discuss selected topics that emerge in applications, and present a general discussion of lessons learned from simulation and empirical data analysis in the last decade. PMID:22745228
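A whole-genome regression of the kind surveyed shrinks many marker effects jointly; below is a toy ridge-penalized sketch solved by plain gradient descent, an illustrative stand-in for the closed-form or Bayesian solvers the article discusses (the genotype matrix and phenotypes are made up):

```python
def ridge_gd(x, y, lam, lr=0.1, iters=2000):
    """Fit marker effects b minimising (1/n)||Xb - y||^2 + lam*||b||^2
    by full-batch gradient descent. Real WGR software uses closed-form
    mixed-model or MCMC machinery; this is only a didactic sketch."""
    n, p = len(x), len(x[0])
    b = [0.0] * p
    for _ in range(iters):
        # Residuals r = Xb - y, then the gradient of the objective
        r = [sum(x[i][j] * b[j] for j in range(p)) - y[i] for i in range(n)]
        grads = [2.0 * sum(x[i][j] * r[i] for i in range(n)) / n
                 + 2.0 * lam * b[j] for j in range(p)]
        b = [bj - lr * g for bj, g in zip(b, grads)]
    return b

# Tiny illustrative "genotype" matrix and phenotypes (not real data);
# with lam = 0 the exact solution is b = [1, 2].
x = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = ridge_gd(x, y=[1.0, 2.0, 3.0], lam=0.0)
```

With lam > 0 all effects shrink toward zero, which is what makes the large-p, small-n problem well-posed in genomic prediction.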

  11. Two Thermoeconomic Diagnosis Methods Applied to Representative Operating Data of a Commercial Transcritical Refrigeration Plant

    DEFF Research Database (Denmark)

    Ommen, Torben Schmidt; Sigthorsson, Oskar; Elmegaard, Brian

    2017-01-01

    ... on the magnitude of the uncertainties. Two different uncertainty scenarios were evaluated, as the use of repeated measurements yields a lower magnitude of uncertainty. The two methods show similar performance in the presented study for both of the considered measurement uncertainty scenarios. However, only in the low measurement uncertainty scenario are both methods applicable to locate the causes of the malfunctions. For both scenarios an outlier limit was found, which determines if it was possible to reject a high relative indicator based on measurement uncertainty. For high uncertainties, the threshold value of the relative indicator was 35, whereas for low uncertainties one of the methods resulted in a threshold at 8. Additionally, the contribution of different measuring instruments to the relative indicator in two central components was analysed. It shows that the contribution was component...

  12. Applying Process Improvement Methods to Clinical and Translational Research: Conceptual Framework and Case Examples.

    Science.gov (United States)

    Daudelin, Denise H; Selker, Harry P; Leslie, Laurel K

    2015-12-01

    There is growing appreciation that process improvement holds promise for improving quality and efficiency across the translational research continuum but frameworks for such programs are not often described. The purpose of this paper is to present a framework and case examples of a Research Process Improvement Program implemented at Tufts CTSI. To promote research process improvement, we developed online training seminars, workshops, and in-person consultation models to describe core process improvement principles and methods, demonstrate the use of improvement tools, and illustrate the application of these methods in case examples. We implemented these methods, as well as relational coordination theory, with junior researchers, pilot funding awardees, our CTRC, and CTSI resource and service providers. The program focuses on capacity building to address common process problems and quality gaps that threaten the efficient, timely and successful completion of clinical and translational studies. © 2015 The Authors. Clinical and Translational Science published by Wiley Periodicals, Inc.

  13. Boundary Element Method Applied to Added Mass Coefficient Calculation of the Skewed Marine Propellers

    Directory of Open Access Journals (Sweden)

    Yari Ehsan

    2016-04-01

    Full Text Available The paper mainly aims to study the computation of added mass coefficients for marine propellers. A three-dimensional boundary element method (BEM) is developed to predict the propeller added mass and moment of inertia coefficients. Only a few experimental data sets are available as a validation reference; here the method is validated against experimental measurements of a B-series marine propeller. The behavior of the predicted added mass coefficients under variation of the geometric and flow parameters of the propeller is calculated and analyzed. BEM is more accurate in obtaining added mass coefficients than other fast numerical methods. All added mass coefficients are nondimensionalized by the fluid density, propeller diameter, and rotational velocity. The obtained results reveal that the diameter, expanded area ratio, and thickness have a dominant influence on the increase of the added mass coefficients.

  14. An Efficient Acoustic Density Estimation Method with Human Detectors Applied to Gibbons in Cambodia.

    Directory of Open Access Journals (Sweden)

    Darren Kidney

    Full Text Available Some animal species are hard to see but easy to hear. Standard visual methods for estimating population density for such species are often ineffective or inefficient, but methods based on passive acoustics show more promise. We develop spatially explicit capture-recapture (SECR) methods for territorial vocalising species, in which humans act as an acoustic detector array. We use SECR and estimated bearing data from a single-occasion acoustic survey of a gibbon population in northeastern Cambodia to estimate the density of calling groups. The properties of the estimator are assessed using a simulation study, in which a variety of survey designs are also investigated. We then present a new form of the SECR likelihood for multi-occasion data which accounts for the stochastic availability of animals. In the context of gibbon surveys this allows model-based estimation of the proportion of groups that produce territorial vocalisations on a given day, thereby enabling the density of groups, instead of the density of calling groups, to be estimated. We illustrate the performance of this new estimator by simulation. We show that it is possible to estimate density reliably from human acoustic detections of visually cryptic species using SECR methods. For gibbon surveys we also show that incorporating observers' estimates of bearings to detected groups substantially improves estimator performance. Using the new form of the SECR likelihood we demonstrate that estimates of availability, in addition to population density and detection function parameters, can be obtained from multi-occasion data, and that the detection function parameters are not confounded with the availability parameter. This acoustic SECR method provides a means of obtaining reliable density estimates for territorial vocalising species. It is also efficient in terms of data requirements since it only requires routine survey data.
We anticipate that the low-tech field requirements will
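A common building block of acoustic SECR models like the one described above is a half-normal detection function, with detection by an array combined through the complement of the joint miss probability. The sketch below is illustrative only: the listener layout, intercept g0, and spatial scale σ are assumed values, not parameters from the study.

```python
import math

def p_detect(dist_m, g0=0.9, sigma=600.0):
    """Half-normal detection function commonly used in SECR models:
    probability that a call at distance dist_m (metres) is heard
    by a single listener."""
    return g0 * math.exp(-dist_m**2 / (2 * sigma**2))

def p_any(call_xy, posts, g0=0.9, sigma=600.0):
    """Probability that at least one listening post detects the call:
    one minus the product of the individual miss probabilities."""
    p_miss = 1.0
    for px, py in posts:
        d = math.hypot(call_xy[0] - px, call_xy[1] - py)
        p_miss *= 1.0 - p_detect(d, g0, sigma)
    return 1.0 - p_miss

# Three listening posts 500 m apart along a line; a calling group
# sits 300 m from the middle post.
posts = [(0.0, 0.0), (500.0, 0.0), (1000.0, 0.0)]
print(round(p_any((500.0, 300.0), posts), 3))
```

An array detects far more reliably than any single listener, which is why multi-post designs improve the effective sampling area in such surveys.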

  15. Analysis of Steel Wire Rope Diagnostic Data Applying Multi-Criteria Methods

    Directory of Open Access Journals (Sweden)

    Audrius Čereška

    2018-02-01

    Full Text Available Steel ropes are complex flexible structures used in many technical applications, such as elevators, cable cars, and funicular cabs. Due to their specific design and critical safety requirements, diagnostics of ropes remains an important issue. The number of broken wires in a steel rope is limited by safety standards when ropes are used in human lifting and carrying installations. Loose wires raise two practical issues: first, they signal the end of the lifetime of the entire rope, independently of wear, lubrication, or incorrect winding on the drums or through pulleys; and second, a loose wire can stick in a tight pulley-support gap and degrade the rope structure up to birdcage formation. Normal rope operation should not generate broken wires, so a growing number of them indicates a need for maintenance of the rope installation. This paper presents a methodology for steel rope diagnostics and the results of analysis using multi-criteria analysis methods. The experimental part of the research was performed using an original test bench that detects broken wires on the rope surface from its vibrations. Diagnostics was performed in the range of frequencies from 60 to 560 Hz with a pitch of 50 Hz. The significant outcome was that the vibration amplitudes at broken wires differed from the vibration parameters of the intact rope surface. Subsequent analysis of the experimental results revealed the most significant values of the diagnostic parameters. The diagnostic power of the parameters was evaluated using multi-criteria decision-making (MCDM) methods. Several decision-making methods are needed because their efficiency with respect to the physical phenomena of the evaluated processes is unknown. The significance of the methods was evaluated using objective methods derived from the structure of the presented data. Some of these methods were proposed by the authors of this paper. Implementation of MCDM in diagnostic data analysis and definition of the
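One of the simplest members of the MCDM family the abstract refers to is simple additive weighting: normalise each criterion, then rank alternatives by a weighted score. The sketch below applies it to ranking excitation frequencies by diagnostic quality; the candidate frequencies, the two criteria, their scores, and the weights are all hypothetical illustrations, not data from the paper.

```python
def weighted_sum_rank(alternatives, weights):
    """Simple additive weighting (a basic MCDM method): min-max
    normalise each criterion column to [0, 1], then rank the
    alternatives by their weighted sum of normalised scores."""
    cols = list(zip(*(scores for _, scores in alternatives)))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]

    def score(vals):
        return sum(w * (v - l) / (h - l) if h > l else 0.0
                   for w, v, l, h in zip(weights, vals, lo, hi))

    return sorted(((name, round(score(vals), 3))
                   for name, vals in alternatives),
                  key=lambda t: t[1], reverse=True)

# Hypothetical diagnostic quality of three excitation frequencies,
# scored on (amplitude contrast at the broken wire, repeatability).
candidates = [("60 Hz", (0.8, 0.6)), ("260 Hz", (2.4, 0.9)), ("560 Hz", (1.5, 0.4))]
ranking = weighted_sum_rank(candidates, weights=(0.7, 0.3))
print(ranking)
```

In practice several such methods (e.g. TOPSIS, SAW, COPRAS) are run side by side, precisely because, as the abstract notes, their relative efficiency for a given physical process is not known in advance.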

  16. An Analysis of Methods Section of Research Reports in Applied Linguistics

    Directory of Open Access Journals (Sweden)

    Patrícia Marcuzzo

    2011-10-01

    Full Text Available This work aims to identify the analytical categories and research procedures adopted in the analysis of research articles in Applied Linguistics/EAP, in order to propose a systematization of research procedures in Genre Analysis. For that purpose, 12 research reports and interviews with four authors were analyzed. The analysis showed that the studies concentrate on either the macrostructure or the microstructure of research articles in different fields. Studies of the microstructure report exclusively the analysis of grammatical elements, while studies of the macrostructure investigate language with the purpose of identifying patterns of organization in written discourse. If the objective of these studies is in fact to develop genre analysis that contributes to the teaching of reading and writing in EAP, they should include an ethnographic perspective that analyzes genre in its context.

  17. Color changes in wood during heating: kinetic analysis by applying a time-temperature superposition method

    Science.gov (United States)

    Matsuo, Miyuki; Yokoyama, Misao; Umemura, Kenji; Gril, Joseph; Yano, Ken'ichiro; Kawai, Shuichi

    2010-04-01

    This paper deals with the kinetics of the color properties of hinoki (Chamaecyparis obtusa Endl.) wood. Specimens cut from the wood were heated at 90-180°C as an accelerated aging treatment. Because the specimens were completely dried and heated in the presence of oxygen, the effects of thermal oxidation on wood color change could be evaluated. Color properties measured with a spectrophotometer showed similar behavior irrespective of the treatment temperature, each temperature on its own time scale. Kinetic analysis using the time-temperature superposition principle, which uses the whole data set, was successfully applied to the color changes. The calculated apparent activation energies for L*, a*, b*, and ΔE*ab were 117, 95, 114, and 113 kJ/mol, respectively, which are similar to literature values obtained for other properties of wood, such as its physical and mechanical properties.
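In a time-temperature superposition analysis like the one above, curves at different temperatures are shifted onto a master curve, and the apparent activation energy follows from the Arrhenius dependence of the shift factors, ln(a_T) = (Ea/R)(1/T - 1/T_ref). The sketch below recovers Ea from synthetic shift factors; the reference temperature and the round-number Ea are illustrative, not the paper's fitted values.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def activation_energy(temps_c, shift_factors, t_ref_c=120.0):
    """Estimate the apparent activation energy from time-temperature
    superposition shift factors via the Arrhenius relation
    ln(a_T) = (Ea/R) * (1/T - 1/T_ref), using a least-squares
    slope through the origin.  Returns Ea in J/mol."""
    t_ref = t_ref_c + 273.15
    xs = [1.0 / (t + 273.15) - 1.0 / t_ref for t in temps_c]
    ys = [math.log(a) for a in shift_factors]
    slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
    return slope * R

# Synthetic shift factors consistent with Ea = 115 kJ/mol (illustrative,
# chosen near the 113-117 kJ/mol range reported in the abstract).
ea = 115e3
temps = [90.0, 120.0, 150.0, 180.0]
shifts = [math.exp(ea / R * (1 / (t + 273.15) - 1 / (120.0 + 273.15)))
          for t in temps]
print(round(activation_energy(temps, shifts) / 1000, 1))  # kJ/mol
```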

  18. Benthic microalgal production in the Arctic: Applied methods and status of the current database

    DEFF Research Database (Denmark)

    Glud, Ronnie Nøhr; Woelfel, Jana; Karsten, Ulf

    2009-01-01

    The current database on benthic microalgal production in Arctic waters comprises 10 peer-reviewed and three unpublished studies. Here, we compile and discuss these datasets, along with the applied measurement approaches used. The latter is essential for robust comparative analysis and to clarify the often very confusing terminology in the existing literature. Our compilation demonstrates that i) benthic microalgae contribute significantly to coastal ecosystem production in the Arctic, and ii) benthic microalgal production on average exceeds pelagic productivity by a factor of 1.5 for water depths down to 30 m. We have established relationships between irradiance, water depth and benthic microalgal productivity that can be used to extrapolate results from quantitative experimental studies to the entire Arctic region. Two different approaches estimated that current benthic microalgal production...
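An irradiance-depth-productivity extrapolation of the kind described can be sketched with Beer-Lambert light attenuation feeding a saturating photosynthesis-irradiance (P-E) response. All parameter values below (surface irradiance, attenuation coefficient, Pmax, Ek) are illustrative assumptions, not the relationships fitted in the study.

```python
import math

def irradiance_at_depth(e0, kd, z):
    """Beer-Lambert attenuation of surface irradiance e0
    (umol photons m-2 s-1) through a water column with
    attenuation coefficient kd (m-1) to depth z (m)."""
    return e0 * math.exp(-kd * z)

def benthic_production(e, p_max=50.0, e_k=40.0):
    """Saturating P-E response (Jassby-Platt tanh form), mg C m-2 d-1.
    p_max and e_k are purely illustrative parameter choices."""
    return p_max * math.tanh(e / e_k)

# Production declines with depth as light is attenuated.
for depth in (5, 15, 30):
    e = irradiance_at_depth(400.0, 0.15, depth)
    print(depth, round(benthic_production(e), 1))
```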

  19. Microbeam high-resolution diffraction and x-ray standing wave methods applied to semiconductor structures

    International Nuclear Information System (INIS)

    Kazimirov, A; Bilderback, D H; Huang, R; Sirenko, A; Ougazzaden, A

    2004-01-01

    A new approach to conditioning x-ray microbeams for high angular resolution x-ray diffraction and scattering techniques is introduced. We combined focusing optics (a one-bounce imaging capillary) and post-focusing collimating optics (a miniature Si(004) channel-cut crystal) to generate an x-ray microbeam with a size of 10 μm and an ultimate angular resolution of 14 μrad. The microbeam was used to analyse the strain in sub-micron-thick InGaAsP epitaxial layers grown on an InP(100) substrate by the selective area growth technique in narrow openings between the oxide stripes. For structures in which the diffraction peaks from the substrate and the film overlap, the x-ray standing wave technique was applied for precise measurements of the strain, with a Δd/d resolution of better than 10⁻⁴. (rapid communication)
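The link between angular resolution and resolvable lattice strain follows from differentiating Bragg's law: Δd/d = -cot(θ)·Δθ. The sketch below evaluates this for the reported 14 μrad divergence; the Bragg angle used (Si(004) near 34.6°, i.e. assuming roughly Cu Kα energy) is an illustrative assumption, not a value stated in the abstract.

```python
import math

def strain_resolution(theta_deg, delta_theta_rad):
    """Smallest resolvable lattice strain for a given Bragg angle and
    angular resolution, from the differentiated Bragg law
    |Δd/d| = cot(θ) * Δθ."""
    theta = math.radians(theta_deg)
    return abs(delta_theta_rad / math.tan(theta))

# 14 μrad beam divergence at a Bragg angle of ~34.6 degrees
print(f"{strain_resolution(34.6, 14e-6):.1e}")
```

This is why a well-collimated microbeam can resolve strains in the 10⁻⁵ range by diffraction alone, while the standing wave technique pushes the resolution further when substrate and film peaks overlap.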

  20. A comparison of several practical smoothing methods applied to Auger electron energy distributions and line scans

    International Nuclear Information System (INIS)

    Yu, K.S.; Prutton, M.; Larson, L.A.; Pate, B.B.; Poppa, H.

    1982-01-01

    Data-fitting routines utilizing nine-point least-squares quadratic, stiff spline, and piecewise least-squares polynomial methods have been compared on noisy Auger spectra and line scans. The spline-smoothing technique has been found to be the most useful and practical, allowing information to be extracted with excellent integrity from model Auger data having close to unity signal-to-noise ratios. Automatic determination of stiffness parameters is described. A comparison of the relative successes of these smoothing methods, using artificial data, is given. Applications of spline smoothing are presented to illustrate its effectiveness for difference spectra and for noisy Auger line scans. (orig.)
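The "nine-point least-squares quadratic" in the comparison above is the classic Savitzky-Golay smoothing filter (window 9, polynomial order 2), whose fixed convolution coefficients are tabulated in the literature. The sketch below is a minimal pure-Python version for illustration; the noisy test signal is synthetic, and endpoint handling is deliberately naive.

```python
import random

def savitzky_golay9(y):
    """Nine-point least-squares quadratic smoothing (Savitzky-Golay,
    window 9, polynomial order 2) using the standard tabulated
    coefficients.  The four points at each end are left unsmoothed."""
    c = [-21, 14, 39, 54, 59, 54, 39, 14, -21]
    norm = 231.0
    out = list(y)
    for i in range(4, len(y) - 4):
        out[i] = sum(ck * y[i + k - 4] for k, ck in enumerate(c)) / norm
    return out

# A noisy constant level, loosely mimicking a low signal-to-noise
# Auger line scan: smoothing damps the noise without shifting the mean.
random.seed(0)
signal = [1.0 + random.gauss(0, 0.3) for _ in range(40)]
smooth = savitzky_golay9(signal)
```

A useful property of the order-2 filter is that it reproduces any quadratic trend exactly, so genuine peak shapes are far less distorted than they would be by a plain moving average.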