WorldWideScience

Sample records for methods proposed update

  1. 76 FR 28194 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2011-05-16

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...

  2. 75 FR 27228 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2010-05-14

    ... DEFENSE NUCLEAR FACILITIES SAFETY BOARD 10 CFR Part 1703 Proposed FOIA Fee Schedule Update AGENCY... publishing its proposed Freedom of Information Act (FOIA) Fee Schedule Update and solicits comments from... on the proposed fee schedule should be mailed or delivered to the Office of the General Counsel...

  3. 77 FR 33980 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2012-06-08

    ... 1703 Proposed FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Notice... the Board's proposed FOIA Fee Schedule Update published in the Federal Register of June 1, 2012. The...: The FOIA requires each Federal agency covered by the Act to specify a schedule of fees applicable to...

  4. 77 FR 32433 - Proposed FOIA Fee Schedule Update

    Science.gov (United States)

    2012-06-01

    ... 1703 Proposed FOIA Fee Schedule Update AGENCY: Defense Nuclear Facilities Safety Board. ACTION: Notice... Defense Nuclear Facilities Safety Board is publishing its proposed Freedom of Information Act (FOIA) Fee.... on or before July 2, 2012. ADDRESSES: Comments on the proposed fee schedule should be mailed or...

  5. Test Methods for Evaluating Solid Waste, Physical/Chemical Methods. First Update. (3rd edition)

    International Nuclear Information System (INIS)

    Friedman; Sellers.

    1988-01-01

    The proposed Update is for Test Methods for Evaluating Solid Waste, Physical/Chemical Methods, SW-846, Third Edition. Attached to the report is a list of methods included in the proposed update indicating whether the method is a new method, a partially revised method, or a totally revised method. Do not discard or replace any of the current pages in the SW-846 manual until the proposed Update I package is promulgated. Until promulgation of the update package, the methods in the update package are not officially part of the SW-846 manual and thus do not carry the status of EPA-approved methods. In addition to the proposed Update, six finalized methods are included for immediate inclusion into the Third Edition of SW-846. Four methods, originally proposed October 1, 1984, will be finalized in a soon-to-be-released rulemaking. They are, however, being submitted to subscribers for the first time in the update. These methods are 7211, 7381, 7461, and 7951. Two other methods were finalized in the 2nd Edition of SW-846. They were inadvertently omitted from the 3rd Edition and are not being proposed as new. These methods are 7081 and 7761.

  6. Technical Notes: Notes and Proposed Guidelines on Updated ...

    African Journals Online (AJOL)

    In light of recent expansion in the planning and construction of major building structures as well as other infrastructures such as railways, mass housing, dams, bridges, etc., this paper reviews the extent of seismic hazard in Ethiopia and proposes a review and update of the current outdated and - in most cases ...

  7. Nonparametric methods in actigraphy: An update

    Directory of Open Access Journals (Sweden)

    Bruno S.B. Gonçalves

    2014-09-01

    Full Text Available Circadian rhythmicity in humans has been well studied using actigraphy, a method of measuring gross motor movement. As actigraphic technology continues to evolve, it is important for data analysis to keep pace with new variables and features. Our objective is to study the behavior of two variables, interdaily stability (IS) and intradaily variability (IV), used to describe the rest-activity rhythm. Simulated data and actigraphy data of humans, rats, and marmosets were used in this study. We modified the calculation of IV and IS by varying the time intervals of analysis, and for each variable we calculated the average value (IVm and ISm) across the results for each time interval. Simulated data showed that (1) synchronization analysis depends on sample size, and (2) fragmentation is independent of the amplitude of the generated noise. We were able to obtain a significant difference in the fragmentation patterns of stroke patients using the IVm variable, whereas this difference was not identified using IV60. Rhythmic synchronization of activity and rest was significantly higher in young subjects than in adults with Parkinson's disease when using the ISm variable; however, this difference was not seen using IS60. We propose an updated format to calculate rhythmic fragmentation, including two additional optional variables. These alternative methods of nonparametric analysis aim to more precisely detect sleep–wake cycle fragmentation and synchronization.
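
    The interdaily stability (IS) and intradaily variability (IV) indices named above have commonly used nonparametric definitions. The sketch below implements those textbook formulas with NumPy; it is not the authors' IVm/ISm variants, whose exact set of aggregation intervals is not given in the abstract, though averaging the indices over several bin sizes would approximate that idea.

    ```python
    import numpy as np

    def interdaily_stability(x, samples_per_day):
        """IS: variance of the mean daily profile divided by the overall variance (0..1)."""
        x = np.asarray(x, dtype=float)
        n = x.size - x.size % samples_per_day          # use only whole days
        x = x[:n]
        profile = x.reshape(-1, samples_per_day).mean(axis=0)
        num = n * np.sum((profile - x.mean()) ** 2)
        den = samples_per_day * np.sum((x - x.mean()) ** 2)
        return num / den

    def intradaily_variability(x):
        """IV: mean squared successive difference divided by the overall variance (~0..2)."""
        x = np.asarray(x, dtype=float)
        n = x.size
        num = n * np.sum(np.diff(x) ** 2)
        den = (n - 1) * np.sum((x - x.mean()) ** 2)
        return num / den

    # averaging IS/IV over several aggregation intervals (e.g. 1- to 60-minute bins)
    # is one way to approximate the ISm/IVm idea described in the abstract
    ```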

  8. A Kriging Model Based Finite Element Model Updating Method for Damage Detection

    Directory of Open Access Journals (Sweden)

    Xiuming Yang

    2017-10-01

    Full Text Available Model updating is an effective means of damage identification, and surrogate modeling has attracted considerable attention for saving computational cost in finite element (FE) model updating, especially for large-scale structures. In this context, a surrogate model of frequency is normally constructed for damage identification, while the frequency response function (FRF) is rarely used as it usually changes dramatically with the updating parameters. This paper presents a new surrogate-model-based model updating method taking advantage of the measured FRFs. The Frequency Domain Assurance Criterion (FDAC) is used to build the objective function, whose nonlinear response surface is constructed by the Kriging model. Then, the efficient global optimization (EGO) algorithm is introduced to obtain the model updating results. The proposed method has good accuracy and robustness, which have been verified by a numerical simulation of a cantilever and experimental test data of a laboratory three-story structure.
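
    The Frequency Domain Assurance Criterion named above is a MAC-like correlation between analytical and measured FRF vectors at matching frequency lines. The sketch below shows one common way to compute it and to fold it into a scalar updating objective; the exact objective used in the paper, the Kriging surrogate, and the EGO search are not reproduced, and the array shapes are assumptions.

    ```python
    import numpy as np

    def fdac(h_model, h_test):
        """MAC-like correlation between two complex FRF column vectors (1.0 = identical shape)."""
        ha, hx = np.ravel(h_model), np.ravel(h_test)
        return np.abs(np.vdot(ha, hx)) ** 2 / (np.real(np.vdot(ha, ha)) * np.real(np.vdot(hx, hx)))

    def fdac_objective(frf_model, frf_test):
        """Average (1 - FDAC) over matching frequency lines.

        frf_model, frf_test: complex arrays of shape (n_response_dofs, n_frequency_lines)
        evaluated at the same frequencies; smaller values mean a better match.
        """
        n_lines = frf_model.shape[1]
        return sum(1.0 - fdac(frf_model[:, k], frf_test[:, k]) for k in range(n_lines)) / n_lines
    ```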

  9. A parallel orbital-updating based plane-wave basis method for electronic structure calculations

    International Nuclear Information System (INIS)

    Pan, Yan; Dai, Xiaoying; Gironcoli, Stefano de; Gong, Xin-Gao; Rignanese, Gian-Marco; Zhou, Aihui

    2017-01-01

    Highlights: • We propose three parallel orbital-updating based plane-wave basis methods for electronic structure calculations. • These new methods avoid the generation of large-scale eigenvalue problems and thus reduce the computational cost. • These new methods allow for two-level parallelization, which is particularly interesting for large-scale parallelization. • Numerical experiments show that these new methods are reliable and efficient for large-scale calculations on modern supercomputers. - Abstract: Motivated by the recently proposed parallel orbital-updating approach in the real-space method, we propose a parallel orbital-updating based plane-wave basis method for electronic structure calculations, i.e., for solving the corresponding eigenvalue problems. In addition, we propose two new modified parallel orbital-updating methods. Compared to traditional plane-wave methods, our methods allow for two-level parallelization, which is particularly interesting for large-scale parallelization. Numerical experiments show that these new methods are more reliable and efficient for large-scale calculations on modern supercomputers.

  10. Improved Quasi-Newton method via PSB update for solving systems of nonlinear equations

    Science.gov (United States)

    Mamat, Mustafa; Dauda, M. K.; Waziri, M. Y.; Ahmad, Fadhilah; Mohamad, Fatma Susilawati

    2016-10-01

    The Newton method has some shortcomings, which include the computation of the Jacobian matrix, which may be difficult or even impossible, and the need to solve the Newton system in every iteration. Also, a common setback with some quasi-Newton methods is that they need to compute and store an n × n matrix at each iteration, which is computationally costly for large-scale problems. To overcome such drawbacks, an improved method for solving systems of nonlinear equations via the PSB (Powell-Symmetric-Broyden) update is proposed. In the proposed method, the approximate Jacobian inverse Hk of PSB is updated; the resulting gain in efficiency and the low memory storage requirement are the main aims of this paper. The preliminary numerical results show that the proposed method is practically efficient when applied to some benchmark problems.
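
    The PSB update itself has a standard closed form. Below is a minimal NumPy sketch of a PSB-type quasi-Newton iteration for F(x) = 0 that updates a Jacobian approximation B and solves a linear system at each step (with a simple backtracking safeguard); the paper's variant updates the inverse approximation Hk directly to save memory, which is not reproduced here. The starting matrix, tolerances, and the two-equation test system are illustrative assumptions.

    ```python
    import numpy as np

    def psb_update(B, s, y):
        """Powell-Symmetric-Broyden update of a Jacobian approximation B."""
        r = y - B @ s
        ss = s @ s
        return B + (np.outer(r, s) + np.outer(s, r)) / ss - (r @ s) * np.outer(s, s) / ss ** 2

    def quasi_newton_psb(F, x0, tol=1e-8, max_iter=200):
        """Solve F(x) = 0 with a PSB-updated Jacobian approximation (identity start)."""
        x = np.asarray(x0, dtype=float)
        B = np.eye(x.size)
        fx = F(x)
        for _ in range(max_iter):
            if np.linalg.norm(fx) < tol:
                break
            s = np.linalg.solve(B, -fx)
            # simple backtracking on the residual norm to stabilise the step
            alpha = 1.0
            while alpha > 1e-4 and np.linalg.norm(F(x + alpha * s)) >= np.linalg.norm(fx):
                alpha *= 0.5
            s = alpha * s
            x_new = x + s
            f_new = F(x_new)
            B = psb_update(B, s, f_new - fx)
            x, fx = x_new, f_new
        return x

    # example use on a small system (x = (1, 2) is an exact root of both equations)
    root = quasi_newton_psb(lambda x: np.array([x[0] ** 2 + x[1] - 3.0,
                                                x[0] + x[1] ** 2 - 5.0]),
                            np.array([1.5, 1.5]))
    ```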

  11. A review of methods for updating forest monitoring system estimates

    Science.gov (United States)

    Hector Franco-Lopez; Alan R. Ek; Andrew P. Robinson

    2000-01-01

    Intensifying interest in forests and the development of new monitoring technologies have induced major changes in forest monitoring systems in the last few years, including major revisions in the methods used for updating. This paper describes the methods available for projecting stand- and plot-level information, emphasizing advantages and disadvantages, and the...

  12. Aircraft engine sensor fault diagnostics using an on-line OBEM update method.

    Directory of Open Access Journals (Sweden)

    Xiaofeng Liu

    Full Text Available This paper proposes a method to update the on-line health reference baseline of the On-Board Engine Model (OBEM) to maintain the effectiveness of an in-flight aircraft sensor Fault Detection and Isolation (FDI) system, in which a Hybrid Kalman Filter (HKF) is incorporated. Generated by a rapid in-flight engine degradation, a large health condition mismatch between the engine and the OBEM can corrupt the performance of the FDI. Therefore, it is necessary to update the OBEM online when a rapid degradation occurs, but the FDI system will lose estimation accuracy if the estimation and update are running simultaneously. To solve this problem, the health reference baseline for a nonlinear OBEM was updated using the proposed channel controller method. Simulations based on the turbojet engine Linear-Parameter Varying (LPV) model demonstrated the effectiveness of the proposed FDI system in the presence of substantial degradation, and the channel controller can ensure that the update process finishes without interference from a single sensor fault.

  13. Updated Methods for Seed Shape Analysis

    Directory of Open Access Journals (Sweden)

    Emilio Cervantes

    2016-01-01

    Full Text Available Morphological variation in seed characters includes differences in seed size and shape. Seed shape is an important trait in plant identification and classification. In addition, it has agronomic importance because it reflects genetic, physiological, and ecological components and affects yield, quality, and market price. The use of digital technologies, together with the development of quantification and modeling methods, allows a better description of seed shape. Image processing systems are used in the automatic determination of seed size and shape, becoming a basic tool in the study of diversity. Seed shape is determined by a variety of indexes (circularity, roundness, and J index). The comparison of the seed images to a geometrical figure (circle, cardioid, ellipse, ellipsoid, etc.) provides a precise quantification of shape. The methods of shape quantification based on these models are useful for an accurate description, allowing comparison between genotypes or along developmental phases as well as establishing the level of variation in different sets of seeds.
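
    The shape indexes named above have simple closed forms once the seed outline's area, perimeter and major axis have been measured from an image. The sketch below implements circularity and roundness in their common definitions and a percent-overlap index in the spirit of the J index; the exact J index definition used by the authors should be checked against their papers, so treat that function and the example numbers as assumptions.

    ```python
    import math

    def circularity(area, perimeter):
        """4*pi*A / P**2: equals 1.0 for a perfect circle, smaller for irregular outlines."""
        return 4.0 * math.pi * area / perimeter ** 2

    def roundness(area, major_axis_length):
        """4*A / (pi * L**2): compares the outline with a circle whose diameter is the major axis."""
        return 4.0 * area / (math.pi * major_axis_length ** 2)

    def overlap_index(shared_area, seed_area, model_area):
        """Percent overlap between the seed image and a fitted model figure (circle, cardioid, ...).

        Sketched here as shared area over the area of the union; the published J index
        definition may differ in detail.
        """
        union = seed_area + model_area - shared_area
        return 100.0 * shared_area / union

    # example: outline area 120 mm^2, perimeter 42 mm, major axis 14 mm,
    # sharing 110 mm^2 with a fitted ellipse of area 125 mm^2
    print(circularity(120.0, 42.0), roundness(120.0, 14.0), overlap_index(110.0, 120.0, 125.0))
    ```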

  14. FE Model Updating on an In-Service Self-Anchored Suspension Bridge with Extra-Width Using Hybrid Method

    Directory of Open Access Journals (Sweden)

    Zhiyuan Xia

    2017-02-01

    Full Text Available Nowadays, many more extra-wide bridges are needed for vehicle throughput. In order to obtain a precise finite element (FE) model of such complex bridge structures, a practical hybrid updating method integrating Gaussian mutation particle swarm optimization (GMPSO), a Kriging meta-model and Latin hypercube sampling (LHS) was proposed. After the efficiency and accuracy of the hybrid method had been demonstrated through the model updating of a damaged simply supported beam, the proposed method was applied to the model updating of an extra-wide self-anchored suspension bridge, which proved necessary considering the results of the ambient vibration test. The results of the bridge model updating showed that both the mode frequencies and the mode shapes of the updated model agreed relatively well with the experimental structure. The successful model updating of this bridge fills a gap in the model updating of complex self-anchored suspension bridges. Moreover, the updating process provides a reference for other model updating problems for complex bridge structures.
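
    Of the three ingredients named in the abstract, the Latin hypercube sampling step is the easiest to illustrate. The sketch below uses SciPy's qmc module to draw a space-filling design over three hypothetical updating parameters (stiffness and mass scaling factors invented for illustration); each sample would then be evaluated in the FE model and the parameter-response pairs used to train the Kriging meta-model that the GMPSO search operates on.

    ```python
    import numpy as np
    from scipy.stats import qmc

    # hypothetical updating parameters: deck stiffness, cable stiffness and deck mass
    # scaling factors, each allowed to vary +/-20 % around the nominal FE value
    lower = np.array([0.8, 0.8, 0.8])
    upper = np.array([1.2, 1.2, 1.2])

    sampler = qmc.LatinHypercube(d=3, seed=42)
    design = qmc.scale(sampler.random(n=50), lower, upper)   # shape (50, 3)

    # each row of `design` would be run through the FE model; the resulting
    # (parameters, frequencies/mode shapes) pairs train the Kriging surrogate,
    # and a global optimiser such as GMPSO then searches the surrogate instead
    # of the expensive FE model
    print(design[:3])
    ```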

  15. Updating National Topographic Data Base Using Change Detection Methods

    Science.gov (United States)

    Keinan, E.; Felus, Y. A.; Tal, Y.; Zilberstien, O.; Elihai, Y.

    2016-06-01

    The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB in the Survey of Israel.
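
    The abstract lists high-resolution Digital Surface Model analysis as one building block of the automated process. The sketch below shows only that step in a simplified form: differencing two co-registered DSM rasters and keeping sufficiently large connected blobs as change candidates. The thresholds, the function name and the NumPy/SciPy workflow are assumptions; the real pipeline also involves multispectral classification, segmentation and object analysis.

    ```python
    import numpy as np
    from scipy import ndimage

    def dsm_change_candidates(dsm_old, dsm_new, height_threshold=2.5, min_pixels=20):
        """Boolean mask of candidate changes between two co-registered DSM rasters (metres)."""
        diff = np.asarray(dsm_new, dtype=float) - np.asarray(dsm_old, dtype=float)
        mask = np.abs(diff) > height_threshold            # e.g. new buildings or demolitions
        labels, n_blobs = ndimage.label(mask)             # connected components
        if n_blobs == 0:
            return mask
        sizes = ndimage.sum(mask, labels, index=np.arange(1, n_blobs + 1))
        big_labels = np.flatnonzero(sizes >= min_pixels) + 1
        return np.isin(labels, big_labels)                # drop small, noisy blobs
    ```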

  16. Modified methods for growing 3-D skin equivalents: an update.

    Science.gov (United States)

    Lamb, Rebecca; Ambler, Carrie A

    2014-01-01

    Artificial epidermis can be reconstituted in vitro by seeding primary epidermal cells (keratinocytes) onto a supportive substrate and then growing the developing skin equivalent at the air-liquid interface. In vitro skin models are widely used to study skin biology and for industrial drug and cosmetic testing. Here, we describe updated methods for growing 3-dimensional skin equivalents using de-vitalized, de-epidermalized dermis (DED) substrates including methods for DED substrate preparation, cell seeding, growth conditions, and fixation procedures.

  17. UPDATING NATIONAL TOPOGRAPHIC DATA BASE USING CHANGE DETECTION METHODS

    Directory of Open Access Journals (Sweden)

    E. Keinan

    2016-06-01

    Full Text Available The traditional method for updating a topographic database on a national scale is a complex process that requires human resources, time and the development of specialized procedures. In many National Mapping and Cadaster Agencies (NMCA), the updating cycle takes a few years. Today, the reality is dynamic and changes occur every day; therefore, users expect the existing database to portray the current reality. Global mapping projects which are based on community volunteers, such as OSM, update their database every day based on crowdsourcing. In order to fulfil users' requirements for rapid updating, a new methodology that maps major interest areas while preserving associated decoding information should be developed. Until recently, automated processes did not yield satisfactory results, and a typical process included comparing images from different periods. The success rates in identifying the objects were low, and most were accompanied by a high percentage of false alarms. As a result, the automatic process required significant editorial work that made it uneconomical. In recent years, the development of technologies in mapping, advancement in image processing algorithms and computer vision, together with the development of digital aerial cameras with a NIR band and Very High Resolution satellites, allow the implementation of a cost-effective automated process. The automatic process is based on high-resolution Digital Surface Model analysis, Multi Spectral (MS) classification, MS segmentation, object analysis and shape forming algorithms. This article reviews the results of a novel change detection methodology as a first step for updating the NTDB in the Survey of Israel.

  18. Analysis of Stress Updates in the Material-point Method

    DEFF Research Database (Denmark)

    Andersen, Søren; Andersen, Lars

    2009-01-01

    The material-point method (MPM) is a new numerical method for analysis of large strain engineering problems. The MPM applies a dual formulation, where the state of the problem (mass, stress, strain, velocity etc.) is tracked using a finite set of material points while the governing equations are solved on a background computational grid. Several references state that one of the main advantages of the material-point method is the easy application of complicated material behaviour, as the constitutive response is updated individually for each material point. However, as discussed here, the MPM way...

  19. Two updating methods for dissipative models with non symmetric matrices

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Aubry, D.

    1997-01-01

    In this paper the feasibility of extending two updating methods to rotating machinery models is considered; the particularity of rotating machinery models is that they use non-symmetric stiffness and damping matrices. It is shown that the two methods described here, the inverse Eigen-sensitivity method and the error in constitutive relation method, can be adapted to such models given some modifications. As far as the inverse sensitivity method is concerned, an error function based on the differences between the calculated and measured right-hand Eigen mode shapes and between the calculated and measured Eigen values is used. Concerning the error in constitutive relation method, the equation which defines the error has to be modified because the stiffness matrix is not positive definite. The advantage of this modification is that, in some cases, it is possible to focus the updating process on some specific model parameters. Both methods were validated on a simple test model consisting of a two-bearing and disc rotor system. (author)

  20. EOP Improvement Proposal for SGTR based on The OPR PSA Update

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Hee; Cho, Jae Hyun; Kim, Dong San; Yang, Joon Eon [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This updating process also focused on enhancing the PSA quality and respecting the as-built and as-operated conditions of the target plants. For this purpose, the EOP (Emergency Operating Procedure) and AOP (Abnormal Operating Procedure) of the target plant were reviewed in detail, and various thermal-hydraulic (T/H) analyses were also performed to analyze the realistic PSA accident sequence model. In this paper, the unreasonable points of the SGTR (Steam Generator Tube Rupture) EOP from a PSA perspective were identified, and EOP improvement items are proposed to enhance safety and operator convenience for the target plant.

  1. A Proposed Arabic Handwritten Text Normalization Method

    Directory of Open Access Journals (Sweden)

    Tarik Abu-Ain

    2014-11-01

    Full Text Available Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, which include slope correction, text padding, skew correction, and straightening of the writing line. In this regard, text normalization plays an important role in many procedures such as text segmentation, feature extraction and character recognition. In the present article, a new method for text baseline detection, straightening, and slant correction for Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, component segmentation is done, followed by component thinning; then, the direction features of the skeletons are extracted, and the candidate baseline regions are determined. After that, the correct baseline region is selected, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.
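
    The paper's baseline detection works on skeleton direction features, which are not reproduced here. As a simpler illustration of locating a writing line, the sketch below estimates a baseline row from the horizontal ink projection of a binarised component and derives the vertical shifts that would align several components on a common writing line; the function names and the projection heuristic are assumptions, not the authors' algorithm.

    ```python
    import numpy as np

    def estimate_baseline_row(binary_image):
        """Estimate the writing-line (baseline) row of a binarised text component.

        binary_image: 2-D array, nonzero where ink is present. A common first
        approximation takes the row with the maximum horizontal projection; the
        skeleton-direction analysis described in the paper refines this per component.
        """
        projection = (np.asarray(binary_image) > 0).sum(axis=1)
        return int(np.argmax(projection))

    def alignment_shifts(baseline_rows):
        """Vertical shifts that align each component's baseline with the global writing line."""
        target = int(np.median(baseline_rows))
        return [target - r for r in baseline_rows]
    ```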

  2. An update on neurotoxin products and administration methods.

    Science.gov (United States)

    Lanoue, Julien; Dong, Joanna; Do, Timothy; Goldenberg, Gary

    2016-09-01

    Since onabotulinumtoxinA for nonsurgical aesthetic enhancement of glabellar lines was initially reported, the popularity of botulinum neurotoxin (BoNT) products among both clinicians and consumers has rapidly grown, and we have seen several additional BoNT formulations enter the market. As the demand for minimally invasive cosmetic procedures continues to increase, we will see the introduction of additional formulations of BoNT products as well as new delivery devices and administration techniques. In this article, we provide a brief update on current and upcoming BoNT products and also review the literature on novel administration methods based on recently published studies.

  3. Fuzzy cross-model cross-mode method and its application to update the finite element model of structures

    International Nuclear Information System (INIS)

    Liu Yang; Xu Dejian; Li Yan; Duan Zhongdong

    2011-01-01

    As a novel updating technique, the cross-model cross-mode (CMCM) method possesses high efficiency and the capability of flexibly selecting updating parameters. However, the success of this method depends on the accuracy of the measured modal shapes. Usually, the measured modal shapes are inaccurate since many kinds of measurement noise are inevitable. Furthermore, complete test modal shapes are required by the CMCM method, so calculation errors may be introduced into the measured modal shapes by applying modal expansion or model reduction techniques. Therefore, this algorithm is faced with the challenge of updating the finite element (FE) model of practical complex structures. In this study, the fuzzy CMCM method is proposed in order to weaken the effect of errors in the measured modal shapes on the updated results. Then two simulated examples are used to compare the performance of the fuzzy CMCM method with the CMCM method. The test results show that the proposed method is more promising than the CMCM method for updating the FE model of practical structures.

  4. A hierarchical updating method for finite element model of airbag buffer system under landing impact

    Directory of Open Access Journals (Sweden)

    He Huan

    2015-12-01

    Full Text Available In this paper, we propose an impact finite element (FE) model for an airbag landing buffer system. First, an impact FE model has been formulated for a typical airbag landing buffer system. We use the independence of the structure FE model from the full impact FE model to develop a hierarchical updating scheme for the recovery module FE model and the airbag system FE model. Second, we define impact responses at key points to compare the computational and experimental results, resolving the inconsistency between the experimental data sampling frequency and experimental triggering. To determine the typical characteristics of the impact dynamics response of the airbag landing buffer system, we present impact response confidence factors (IRCFs) to evaluate how consistent the computational and experimental results are. An error function is defined between the experimental and computational results at key points of the impact response (KPIR) to serve as a modified objective function. A radial basis function (RBF) is introduced to construct updating variables for a surrogate model for updating the objective function, thereby converting the FE model updating problem into a soluble optimization problem. Finally, the developed method has been validated using an experimental and computational study on the impact dynamics of a classic airbag landing buffer system.
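
    The abstract describes building a radial basis function surrogate of an objective formed from errors at key points of the impact response. The sketch below shows that general pattern with SciPy's RBFInterpolator on invented data: the parameter values, objective numbers and choice of thin-plate-spline kernel are all assumptions, and the real workflow would generate the training pairs from FE impact runs and then optimise over the surrogate.

    ```python
    import numpy as np
    from scipy.interpolate import RBFInterpolator

    # hypothetical training data: rows of updating parameters (e.g. airbag vent area
    # factor and fabric stiffness factor) and the corresponding objective value built
    # from the error between computed and measured key-point impact responses (KPIR)
    params = np.array([[0.8, 0.9], [1.0, 1.0], [1.2, 1.1], [0.9, 1.2], [1.1, 0.8]])
    objective = np.array([0.42, 0.10, 0.35, 0.27, 0.31])

    surrogate = RBFInterpolator(params, objective, kernel="thin_plate_spline")

    # the surrogate replaces the expensive impact FE run inside the optimisation loop
    candidate = np.array([[0.95, 1.05]])
    print(surrogate(candidate))
    ```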

  5. Proposal for secondary ion beams and update of data taking schedule for 2009-2013

    CERN Document Server

    Abgrall, N; Andrieu, B; Anticic, T; Antoniou, N; Argyriades, J; Asryan, A G; Baatar, B; Blondel, A; Blumer, J; Boldizsar, L; Bravar, A; Brzychczyk, J; Bunyatov, S A; Choi, K U; Christakoglou, P; Chung, P; Cleymans, J; Derkach, D A; Diakonos, F; Dominik, W; Dumarchez, J; Engel, R; Ereditato, A; Feofilov, G A; Ferrero, A; Fodor, Z; Gazdzicki, M; Golubeva, M; Grebieszkow, K; Guber, F; Hasegawa, T; Haungs, A; Hess, M; Igolkin, S; Ivanov, A S; Ivashkin, A; Kadija, K; Katrynska, N; Kielczewska, D; Kikola, D; Kim, J H; Kobayashi, T; Kolesnikov, V I; Kolev, D; Kolevatov, R S; Kondratiev, V P; Kurepin, A; Lacey, R; Laszlo, A; Lehmann, S; Lungwitz, B; Lyubushkin, V V; Maevskaya, A; Majka, Z; Malakhov, A I; Marchionni, A; Marcinek, A; Maris, I; Matveev, V; Melkumov, G L; Meregaglia, A; Messina, M; Meurer, C; Mijakowski, P; Mitrovski, M; Montaruli, T; Mrówczynski, St; Murphy, S; Nakadaira, T; Naumenko, P A; Nikolic, V; Nishikawa, K; Palczewski, T; Pálla, G; Panagiotou, A D; Peryt, W; Petridis, A; Planeta, R; Pluta, J; Popov, B A; Posiadala, M; Przewlocki, P; Rauch, W; Ravonel, M; Renfordt, R; Röhrich, D; Rondio, E; Rossi, B; Roth, M; Rubbia, A; Rybczynski, M; Sadovskii, A; Sakashita, K; Schuster, T; Sekiguchi, T; Seyboth, P; Shileev, K; Sissakian, A N; Skrzypczak, E; Slodkowski, M; Sorin, A S; Staszel, P; Stefanek, G; Stepaniak, J; Strabel, C; Ströbele, H; Susa, T; Szentpétery, I; Szuba, M; Taranenko, A; Tsenov, R; Ulrich, R; Unger, M; Vassiliou, M; Vechernin, V V; Vesztergombi, G; Wlodarczyk, Z; Wojtaszek, A; Yi, J G; Yoo, I K; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2009-01-01

    This document presents the proposal for secondary ion beams and the updated data taking schedule of the NA61 Collaboration. The modification of the original NA61 plans is necessary in order to reach compatibility between the current I-LHC and NA61 schedules. It assumes delivery of primary proton beam in 2009-2012 and of primary lead beam in 2011-2013. The primary lead beam will be fragmented into a secondary beam of lighter ions. The modified H2 beam line will serve as a fragment separator to produce the light ion species for NA61 data taking. The expected physics performance of the NA61 experiment with secondary ion beams will be sufficient to reach the primary NA61 physics goals.

  6. Update on NOx measuring methods and emission levels

    International Nuclear Information System (INIS)

    Yamada, N.; Desprets, M.

    1997-01-01

    The survey was carried out in 1995 to update the NOx report prepared for presentation at the 19th World Gas Conference held in Milan in 1994 and drawn out on the basis of the information obtained through the survey carried out in 1992. Over the past three years the work on standard developments and/or improvements on NOx emissions has progressed in several IGU member countries. For example, in Europe a report on 'Determination of emissions from appliances burning gaseous fuels during type-testing' was drafted by the European Committee for Standardization (CEN) in March 1994 as 'CR 1404 : 1994'. This report is based on the work so far carried out within MARCOGAZ, and it is expected that the NOx measuring methods specified in this report would be finally introduced into relevant gas appliance standards issued by CEN for type-testing of gas appliances covered by this report. (au)

  7. A Weighted Two-Level Bregman Method with Dictionary Updating for Nonconvex MR Image Reconstruction

    Directory of Open Access Journals (Sweden)

    Qiegen Liu

    2014-01-01

    Full Text Available Nonconvex optimization has been shown to need substantially fewer measurements than l1 minimization for exact recovery under a fixed transform/overcomplete dictionary. In this work, two efficient numerical algorithms, unified under the name weighted two-level Bregman method with dictionary updating (WTBMDU), are proposed for solving lp optimization under the dictionary learning model while subjecting the fidelity term to the partial measurements. By incorporating the iteratively reweighted norm into the two-level Bregman iteration method with dictionary updating scheme (TBMDU), the modified alternating direction method (ADM) solves the model of pursuing the approximated lp-norm penalty efficiently. Specifically, the algorithms converge after a relatively small number of iterations, under the formulation of iteratively reweighted l1 and l2 minimization. Experimental results on MR image simulations and real MR data, under a variety of sampling trajectories and acceleration factors, consistently demonstrate that the proposed method can efficiently reconstruct MR images from highly undersampled k-space data and presents advantages over current state-of-the-art reconstruction approaches in terms of higher PSNR and lower HFEN values.

  8. Effective updating process of seismic fragilities using Bayesian method and information entropy

    International Nuclear Information System (INIS)

    Kato, Masaaki; Takata, Takashi; Yamaguchi, Akira

    2008-01-01

    Seismic probabilistic safety assessment (SPSA) is an effective method for evaluating the overall seismic safety performance of a plant. Seismic fragilities are estimated to quantify the seismically induced accident sequences. It is a great concern that the SPSA results involve uncertainties, a part of which comes from the uncertainty in the seismic fragility of equipment and systems. A straightforward approach to reduce the uncertainty is to perform a seismic qualification test and to reflect the results in the seismic fragility estimate. In this paper, we propose a figure-of-merit to find the most cost-effective condition of the seismic qualification tests in terms of the acceleration level and the number of components tested. Then a mathematical method to reflect the test results in the fragility update is developed. A Bayesian method is used for the fragility update procedure. Since the lognormal distribution used for the fragility model does not have a conjugate prior, a parameterization method is proposed so that the posterior distribution expresses the characteristics of the fragility. The information entropy is used as the figure-of-merit to express the importance of the obtained evidence. It is found that the information entropy is strongly associated with the uncertainty of the fragility. (author)
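
    The abstract's core loop (lognormal fragility, Bayesian update from qualification test outcomes, information entropy as a figure-of-merit) can be illustrated with a plain grid posterior. The sketch below is not the paper's parameterization method: it keeps a discrete posterior over the median capacity Am and log-standard deviation beta, with a uniform prior and invented grids and test outcomes, simply to show how the evidence and the entropy measure fit together.

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(a, am, beta):
        """Lognormal fragility: failure probability at peak acceleration a."""
        return norm.cdf(np.log(a / am) / beta)

    def grid_posterior(tests, am_grid, beta_grid):
        """Discrete posterior over (Am, beta) from (acceleration, failed) test outcomes."""
        AM, BETA = np.meshgrid(am_grid, beta_grid, indexing="ij")
        log_like = np.zeros_like(AM)
        for a, failed in tests:
            pf = np.clip(fragility(a, AM, BETA), 1e-12, 1.0 - 1e-12)
            log_like += np.log(pf if failed else 1.0 - pf)
        post = np.exp(log_like - log_like.max())          # uniform prior on the grid
        return post / post.sum()

    def entropy(posterior):
        """Shannon entropy of the discretised posterior (one possible figure-of-merit)."""
        p = posterior[posterior > 0]
        return float(-np.sum(p * np.log(p)))

    # two hypothetical qualification tests: survival at 0.6 g, failure at 1.5 g
    post = grid_posterior([(0.6, False), (1.5, True)],
                          am_grid=np.linspace(0.3, 3.0, 60),
                          beta_grid=np.linspace(0.2, 0.8, 30))
    print(entropy(post))
    ```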

  9. A Proposal for Updated Standards of Photographic Documentation in Aesthetic Medicine.

    Science.gov (United States)

    Prantl, Lukas; Brandl, Dirk; Ceballos, Patricia

    2017-08-01

    In 1998, DiBernardo et al. published a very helpful standardization of comparative (before and after) photographic documentation. These standards prevail to this day. Although most of them are useful for objective documentation of aesthetic results, there are at least 3 reasons why an update is necessary at this time: First, DiBernardo et al. focused on the prevalent standards of medical photography at that time. From a modern perspective, these standards are antiquated and not always correct. Second, silver-based analog photography has mutated into digital photography. Digitalization offers virtually unlimited potential for image manipulation using a vast array of digital Apps and tools including, but not limited to, image editing software like Photoshop. Digitalization has given rise to new questions, particularly regarding appropriate use of editing techniques to maximize or increase objectivity. Third, we suggest changes to a very small number of their medical standards in the interest of obtaining a better or more objective documentation of aesthetic results. This article is structured into 3 sections and is intended as a new proposal for photographic and medical standards for the documentation of aesthetic interventions: 1. The photographic standards. 2. The medical standards. 3. Description of editing tools which should be used to increase objectivity.

  10. Smoking and plastic surgery, part I. Pathophysiological aspects: update and proposed recommendations.

    Science.gov (United States)

    Pluvy, I; Garrido, I; Pauchot, J; Saboye, J; Chavoin, J P; Tropet, Y; Grolleau, J L; Chaput, B

    2015-02-01

    Smoking patients undergoing a plastic surgery intervention are exposed to increased risk of perioperative and postoperative complications. It seemed useful to us to establish an update about the negative impact of smoking, especially on wound healing, and also about the indisputable benefits of quitting. We wish to propose a minimum time lapse of withdrawal in the preoperative and postoperative period in order to reduce the risks and maximize the results of the intervention. A literature review of documents from 1972 to 2014 was carried out by searching five different databases (Medline, PubMed Central, Cochrane library, Pascal and Web of Science). Cigarette smoke has a diffuse and multifactorial impact in the body. Hypoxia, tissue ischemia and immune disorders induced by tobacco consumption cause alterations of the healing process. Some of these effects are reversible by quitting. Data from the literature recommend a preoperative smoking cessation period lasting between 3 and 8 weeks and up until 4 weeks postoperatively. Use of nicotine replacement therapies doubles the abstinence rate in the short term. When a patient is heavily dependent, the surgeon should be helped by a tobacco specialist. Total smoking cessation of 4 weeks preoperatively and lasting until primary healing of the operative site (2 weeks) appears to optimize surgical conditions without heightening anesthetic risk. Tobacco withdrawal assistance, both human and drug-based, is highly recommended. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  11. A visual tracking method based on deep learning without online model updating

    Science.gov (United States)

    Tang, Cong; Wang, Yicheng; Feng, Yunsong; Zheng, Chao; Jin, Wei

    2018-02-01

    The paper proposes a visual tracking method based on deep learning without online model updating. In consideration of the advantages of deep learning in feature representation, the deep model SSD (Single Shot Multibox Detector) is used as the object extractor in the tracking model. Simultaneously, the color histogram feature and the HOG (Histogram of Oriented Gradient) feature are combined to select the tracking object. In the process of tracking, a multi-scale object searching map is built to improve the detection performance of the deep detection model and the tracking efficiency. In experiments on eight tracking video sequences from the baseline dataset, compared with six state-of-the-art methods, the proposed method shows better robustness to challenging tracking factors such as deformation, scale variation, rotation variation, illumination variation, and background clutter; moreover, its overall performance is better than that of the six other tracking methods.

  12. The history of female genital tract malformation classifications and proposal of an updated system.

    Science.gov (United States)

    Acién, Pedro; Acién, Maribel I

    2011-01-01

    A correct classification of malformations of the female genital tract is essential to prevent unnecessary and inadequate surgical operations and to compare reproductive results. An ideal classification system should be based on aetiopathogenesis and should suggest the appropriate therapeutic strategy. We conducted a systematic review of relevant articles found in PubMed, Scopus, Scirus and ISI Web of Knowledge, and an analysis of historical collections of 'female genital malformations' and 'classifications'. Of 124 full-text articles assessed for eligibility, 64 were included because they contained original general, partial or modified classifications. All the existing classifications were analysed and grouped. The unification of terms and concepts was also analysed. Traditionally, malformations of the female genital tract have been catalogued and classified as Müllerian malformations due to agenesis, lack of fusion, the absence of resorption and lack of posterior development of the Müllerian ducts. The American Fertility Society classification of the late 1980s included seven basic groups of malformations, also considering Müllerian development and the relationship of the malformations to fertility. Other classifications are based on different aspects: functional, defects in vertical fusion, embryological or anatomical (Vagina, Cervix, Uterus, Adnex and Associated Malformation: VCUAM classification). However, an embryological-clinical classification system seems to be the most appropriate. Accepting the need for a new classification system of genitourinary malformations that considers the experience gained from the application of the current classification systems and the aetiopathogenesis, and that also suggests the appropriate treatment, we proposed an update of our embryological-clinical classification as a new system with six groups of female genitourinary anomalies.

  13. Evaluation of two updating methods for dissipative models on a real structure

    International Nuclear Information System (INIS)

    Moine, P.; Billet, L.

    1996-01-01

    Finite Element Models are widely used to predict the dynamic behaviour of structures. Frequently, the model does not represent the structure with all the expected accuracy, i.e. the measurements realised on the structure differ from the data predicted by the model. It is therefore necessary to update the model. Although many modeling errors come from inadequate representation of the damping phenomena, most of the model updating techniques are up to now restricted to conservative models only. In this paper, we present two updating methods for dissipative models using Eigen mode shapes and Eigen values as behavioural information from the structure. The first method - the modal output error method - compares directly the experimental Eigen vectors and Eigen values to the model Eigen vectors and Eigen values, whereas the second method - the error in constitutive relation method - uses an energy error derived from the equilibrium relation. The error function, in both cases, is minimized by a conjugate gradient algorithm and the gradient is calculated analytically. These two methods behave differently, which can be evidenced by updating a real structure constituted of a piece of pipe mounted on two viscoelastic suspensions. The updating of the model validates an updating strategy consisting in first performing a preliminary updating with the error in constitutive relation method (a fast-converging but difficult-to-control method) and then pursuing the updating with the modal output error method (a slowly converging but reliable and easy-to-control method). Moreover, the problems encountered during the updating process and their corresponding solutions are given. (authors)

  14. Proposed method for regulating major materials licensees

    International Nuclear Information System (INIS)

    1992-02-01

    The Director, Office of Nuclear Material Safety and Safeguards, US Nuclear Regulatory Commission, appointed a Materials Regulatory Review Task Force to conduct a broad-based review of the Commission's current licensing and oversight programs for fuel cycle and large materials plants. The task force, as requested, defined the components and subcomponents of an ideal regulatory evaluation system for these types of licensed plants and compared them to the components and subcomponents of the existing regulatory evaluation system. This report discusses the findings from this comparison and the recommendations proposed on the basis of these findings.

  15. A Proposed Multimedia Cone of Abstraction: Updating a Classic Instructional Design Theory

    Science.gov (United States)

    Baukal, Charles E.; Ausburn, Floyd B.; Ausburn, Lynna J.

    2013-01-01

    Advanced multimedia techniques offer significant learning potential for students. Dale (1946, 1954, 1969) developed a Cone of Experience (CoE) which is a hierarchy of learning experiences ranging from direct participation to abstract symbolic expression. This paper updates the CoE for today's technology and learning context, specifically focused…

  16. Medicare: Comparison of Catastrophic Health Insurance Proposals--An Update. Briefing Report to the Chairman, Select Committee on Aging, House of Representatives.

    Science.gov (United States)

    General Accounting Office, Washington, DC. Div. of Human Resources.

    This document updates a recent report by the General Accounting Office (GAO) which compared Medicare catastrophic health insurance proposals. The update includes H.R. 2470, as passed by the House of Representatives and S. 1127, as reported by the Senate Committee on Finance. An introduction explains the roles of Medicare, Medicaid, the Veterans…

  17. A gradual update method for simulating the steady-state solution of stiff differential equations in metabolic circuits.

    Science.gov (United States)

    Shiraishi, Emi; Maeda, Kazuhiro; Kurata, Hiroyuki

    2009-02-01

    Numerical simulation of differential equation systems plays a major role in the understanding of how metabolic network models generate particular cellular functions. On the other hand, the classical and technical problems for stiff differential equations still remain to be solved, while many elegant algorithms have been presented. To relax the stiffness problem, we propose new practical methods: the gradual update of differential-algebraic equations based on gradual application of the steady-state approximation to stiff differential equations, and the gradual update of the initial values in differential-algebraic equations. These empirical methods show a high efficiency for simulating the steady-state solutions for the stiff differential equations that existing solvers alone cannot solve. They are effective in extending the applicability of dynamic simulation to biochemical network models.
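
    The combination of stiff integration and a steady-state (algebraic) treatment described above can be illustrated on a toy model. The sketch below is not the authors' gradual-update algorithm; it simply relaxes a hypothetical stiff two-pool system with a BDF integrator and then polishes the end point with a root finder, which is the kind of workflow the proposed methods aim to make more robust. The model and all rate constants are invented.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.optimize import fsolve

    def rhs(t, y, k_fast=1.0e4, k_slow=1.0):
        """Toy stiff two-pool model: fast equilibration of y0/y1, slow drain of y1."""
        v_fast = k_fast * (y[0] - y[1])
        v_slow = k_slow * y[1]
        inflow = 1.0
        return [inflow - v_fast, v_fast - v_slow]

    # step 1: relax the system with a stiff integrator instead of jumping straight
    # to the algebraic steady-state problem (a crude stand-in for a "gradual" approach)
    relax = solve_ivp(rhs, (0.0, 1.0e3), [0.0, 0.0], method="BDF", rtol=1e-8, atol=1e-10)

    # step 2: polish the end point with a root finder on dy/dt = 0
    steady = fsolve(lambda y: rhs(0.0, y), relax.y[:, -1])
    print(steady)   # expected near [1.0001, 1.0] for the default rate constants
    ```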

  18. Data Updating Methods for Spatial Data Infrastructure that Maintain Infrastructure Quality and Enable its Sustainable Operation

    Science.gov (United States)

    Murakami, S.; Takemoto, T.; Ito, Y.

    2012-07-01

    The Japanese government, local governments and businesses are working closely together to establish spatial data infrastructures in accordance with the Basic Act on the Advancement of Utilizing Geospatial Information (NSDI Act, established in August 2007). Spatial data infrastructures are urgently required not only to accelerate the computerization of public administration, but also to support the restoration and reconstruction of the areas struck by the Great East Japan Earthquake and future disaster prevention and reduction. For the construction of a spatial data infrastructure, various guidelines have been formulated. But after an infrastructure is constructed, there is the problem of maintaining it. In one case, an organization updates its spatial data only once every several years because of budget problems. Departments and sections update the data on their own without careful consideration. That upsets the quality control of the entire data system and the system loses integrity, which is crucial to a spatial data infrastructure. To ensure quality, ideally, it is desirable to update data of the entire area every year. But that is virtually impossible, considering the recent budget crunch. The method we suggest is to update spatial data items of higher importance only in order to maintain quality, not updating all the items across the board. We have explored a method of partially updating the data of two such features, roads and buildings, while ensuring the accuracy of locations. Using this method, data on roads and buildings that greatly change with time can be updated almost in real time or at least within a year. The method will help increase the availability of a spatial data infrastructure. We have conducted an experiment on the spatial data infrastructure of a municipality using those data. As a result, we have found that it is possible to update data of both features almost in real time.

  19. Notification: FY 2017 Update of Proposed Key Management Challenges and Internal Control Weaknesses Confronting the U.S. Chemical Safety and Hazard Investigation Board

    Science.gov (United States)

    Jan 5, 2017. The EPA OIG is beginning work to update for fiscal year 2017 its list of proposed key management challenges and internal control weaknesses confronting the U.S. Chemical Safety and Hazard Investigation Board (CSB).

  20. Update and Improve Subsection NH - Alternative Simplified Creep-Fatigue Design Methods

    International Nuclear Information System (INIS)

    Asayama, Tai

    2009-01-01

    This report describes the results of the investigation of Task 10 of the DOE/ASME Materials NGNP/Generation IV Project, based on a contract between ASME Standards Technology, LLC (ASME ST-LLC) and the Japan Atomic Energy Agency (JAEA). Task 10 is to Update and Improve Subsection NH -- Alternative Simplified Creep-Fatigue Design Methods. Five newly proposed promising creep-fatigue evaluation methods were investigated: (1) the modified ductility exhaustion method, (2) the strain range separation method, (3) the approach for pressure vessel application, (4) the hybrid method of time fraction and ductility exhaustion, and (5) the simplified model test approach. The outlines of these methods are presented first, and the predictability of experimental results by these methods is demonstrated using the creep-fatigue data collected in previous Tasks 3 and 5. All the methods (except the simplified model test approach, which is not ready for application) predicted the experimental results fairly accurately. On the other hand, the predicted creep-fatigue lives in long-term regions showed considerable differences among the methodologies. These differences come from the concepts each method is based on. All the new methods investigated in this report have advantages over the currently employed time fraction rule and offer technical insights that deserve serious consideration in the improvement of creep-fatigue evaluation procedures. The main points of the modified ductility exhaustion method, the strain range separation method, the approach for pressure vessel application and the hybrid method can be reflected in the improvement of the current time fraction rule. The simplified model test approach would offer entirely new advantages, including robustness and simplicity, which are definitely attractive, but this approach is yet to be validated for implementation at this point. Therefore, this report recommends the following two steps as a course of improvement of NH based on newly proposed creep-fatigue evaluation
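
    For context, the baseline that these proposals aim to improve is the linear time-fraction rule, in which fatigue and creep damage fractions are summed separately and checked against an interaction envelope. The sketch below shows only that baseline bookkeeping with invented numbers; it does not implement the modified ductility exhaustion, strain range separation, or other methods evaluated in the report.

    ```python
    def creep_fatigue_damage(fatigue_blocks, creep_intervals):
        """Linear damage summation used by the time-fraction rule (illustrative only).

        fatigue_blocks:  iterable of (applied cycles n, cycles to failure Nf) pairs
        creep_intervals: iterable of (hold time t in hours, rupture time tr in hours) pairs
        Returns (fatigue damage Df, creep damage Dc); the pair is then compared with
        the code's creep-fatigue interaction (damage envelope) diagram.
        """
        df = sum(n / nf for n, nf in fatigue_blocks)
        dc = sum(t / tr for t, tr in creep_intervals)
        return df, dc

    # hypothetical duty cycle: 300 cycles with Nf = 2.0e4, plus 1.0e4 h of hold at a
    # stress whose rupture life is 3.0e5 h
    print(creep_fatigue_damage([(300, 2.0e4)], [(1.0e4, 3.0e5)]))
    ```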

  1. National Security in the Nuclear Age: Public Library Proposal and Booklist. May 1987 Update.

    Science.gov (United States)

    Dane, Ernest B.

    To increase public understanding of national security issues, this document proposes that a balanced and up-to-date collection of books and other materials on national security in the nuclear age be included in all U.S. public libraries. The proposal suggests that the books be grouped together on an identified shelf. Selection criteria for the…

  2. Assessment of proposed electromagnetic quantum vacuum energy extraction methods

    OpenAIRE

    Moddel, Garret

    2009-01-01

    In research articles and patents several methods have been proposed for the extraction of zero-point energy from the vacuum. None has been reliably demonstrated, but the proposals remain largely unchallenged. In this paper the feasibility of these methods is assessed in terms of underlying thermodynamics principles of equilibrium, detailed balance, and conservation laws. The methods are separated into three classes: nonlinear processing of the zero-point field, mechanical extraction using Cas...

  3. A Proposed Method for Solving Fuzzy System of Linear Equations

    Directory of Open Access Journals (Sweden)

    Reza Kargar

    2014-01-01

    Full Text Available This paper proposes a new method for solving a fuzzy system of linear equations with a crisp coefficient matrix and a fuzzy or interval right-hand side. Some conditions for the existence of a fuzzy or interval solution of an m×n linear system are derived, and a practical algorithm is introduced in detail. The method is based on a linear programming problem. Finally, the applicability of the proposed method is illustrated by some numerical examples.
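
    The paper's algorithm is formulated as a linear program, which is not reproduced here. For orientation, the sketch below shows the classical embedding approach to the same problem class, restricted to a crisp square matrix with an interval right-hand side: split the matrix into nonnegative and nonpositive parts and solve a doubled block system for the lower and upper solution bounds. The function name is an assumption, and the result can be a weak solution whose bounds cross if the system is inconsistent.

    ```python
    import numpy as np

    def solve_interval_rhs(A, b_low, b_high):
        """Solve A x = [b_low, b_high] for a crisp square A and interval right-hand side.

        With A = P + N (P the nonnegative part, N the nonpositive part), the bounds
        satisfy P x_low + N x_high = b_low and P x_high + N x_low = b_high, i.e. the
        2n x 2n block system [P N; N P] [x_low; x_high] = [b_low; b_high].
        """
        A = np.asarray(A, dtype=float)
        P = np.where(A > 0, A, 0.0)
        N = A - P
        S = np.block([[P, N], [N, P]])
        sol = np.linalg.solve(S, np.concatenate([b_low, b_high]))
        n = A.shape[0]
        return sol[:n], sol[n:]

    x_low, x_high = solve_interval_rhs([[2.0, -1.0], [1.0, 3.0]], [1.0, 4.0], [2.0, 6.0])
    ```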

  4. A Progressive Buffering Method for Road Map Update Using OpenStreetMap Data

    Directory of Open Access Journals (Sweden)

    Changyong Liu

    2015-07-01

    Full Text Available Web 2.0 enables a two-way interaction between servers and clients. GPS receivers have become available to more citizens and are commonly found in vehicles and smartphones, enabling individuals to record and share their trajectory data on the Internet and edit them online. OpenStreetMap (OSM) makes it possible for citizens to contribute to the acquisition of geographic information. This paper studies the use of OSM data to find newly mapped or built roads that do not exist in a reference road map and to create an updated version of that map. For this purpose, we propose a progressive buffering method for determining an optimal buffer radius to detect the new roads in the OSM data. In the next step, the detected new roads are merged into the reference road maps geometrically, topologically, and semantically. Experiments with OSM data and reference road maps over an area of 8494 km2 in the city of Wuhan, China and five of its 5 km × 5 km areas are conducted to demonstrate the feasibility and effectiveness of the method. It is shown that the OSM data can add 11.96%, or a total of 2008.6 km, of new roads to the reference road maps with an average precision of 96.49% and an average recall of 97.63%.
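
    The buffering idea can be sketched with the Shapely library: build a corridor around the reference road centrelines and flag OSM segments that fall mostly outside it as candidate new roads. The geometries, the single 10 m radius and the 80 % overlap ratio below are invented for illustration; the cited method determines the buffer radius progressively rather than using one fixed value.

    ```python
    from shapely.geometry import LineString
    from shapely.ops import unary_union

    # hypothetical reference road centrelines and candidate OSM segments (map units = metres)
    reference_roads = [LineString([(0, 0), (100, 0)]), LineString([(100, 0), (100, 100)])]
    osm_segments = [LineString([(0, 5), (100, 5)]),     # matches an existing road
                    LineString([(0, 50), (80, 50)])]    # a genuinely new road

    def find_new_roads(osm_segments, reference_roads, buffer_radius=10.0, overlap_ratio=0.8):
        """Return OSM segments that lie mostly outside a buffer around the reference roads."""
        corridor = unary_union(reference_roads).buffer(buffer_radius)
        new = []
        for seg in osm_segments:
            inside = seg.intersection(corridor).length
            if inside < overlap_ratio * seg.length:
                new.append(seg)
        return new

    print(len(find_new_roads(osm_segments, reference_roads)))   # -> 1
    ```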

  5. Issue update: a regional settlement proposal to resolve the Washington Nuclear Plant No. 3 lawsuit

    International Nuclear Information System (INIS)

    1985-08-01

    The Bonneville Power Administration (BPA) announced on August 2, 1985, that a number of substantive changes suggested by public comment on the Washington Nuclear Plant No. 3 settlement had been agreed to in principle by BPA and four private utilities. Since that date the details of these changes have been resolved, and the proposed settlement is now being offered for public review and comment

  6. Hypersensitivity to local anaesthetics--update and proposal of evaluation algorithm

    DEFF Research Database (Denmark)

    Thyssen, Jacob Pontoppidan; Menné, Torkil; Elberling, Jesper

    2008-01-01

    of patients suspected with immediate- and delayed-type immune reactions. Literature was examined using PubMed-Medline, EMBASE, Biosis and Science Citation Index. Based on the literature, the proposed algorithm may safely and rapidly distinguish between immediate-type and delayed-type allergic immune reactions....

  7. The choice of leasing companies for automobile fleet updating on the basis of hierarchies analysis method

    OpenAIRE

    Dorohov, А.

    2007-01-01

    The basic criteria for the choice of leasing companies by transport enterprises for automobile fleet updating, such as the terms of financing, the size of the advance, the assortment, and the time of existence in the market, have been determined. The determination of the best leasing company according to these parameters on the basis of the hierarchies analysis method has been offered.
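
    The hierarchies analysis method referred to here is commonly known as the analytic hierarchy process (AHP), whose core computation is a priority vector derived from a pairwise comparison matrix. The sketch below obtains the weights from the principal eigenvector and reports Saaty's consistency index; the three criteria and the comparison values are invented to mirror the abstract, not taken from the paper.

    ```python
    import numpy as np

    def ahp_priorities(pairwise):
        """Priority weights from a pairwise comparison matrix (principal eigenvector),
        plus Saaty's consistency index CI = (lambda_max - n) / (n - 1)."""
        A = np.asarray(pairwise, dtype=float)
        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)
        w = np.abs(eigvecs[:, k].real)
        w = w / w.sum()
        n = A.shape[0]
        ci = (eigvals[k].real - n) / (n - 1)
        return w, ci

    # hypothetical comparison of three criteria: financing terms, size of advance,
    # time of existence in the market (values on Saaty's 1-9 scale)
    A = [[1.0, 3.0, 5.0],
         [1.0 / 3.0, 1.0, 2.0],
         [1.0 / 5.0, 1.0 / 2.0, 1.0]]
    weights, ci = ahp_priorities(A)
    print(weights.round(3), round(ci, 3))
    ```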

  8. Application of Computational Methods in Planaria Research: A Current Update

    Directory of Open Access Journals (Sweden)

    Ghosh Shyamasree

    2017-07-01

    Full Text Available Planaria is a member of the phylum Platyhelminthes, the flatworms. Planarians possess the unique ability of regeneration from adult stem cells, or neoblasts, and are important as a model organism for regeneration and developmental studies. Although research is being actively carried out globally through conventional methods to understand the process of regeneration from neoblasts, the biology of development, neurobiology and immunology of Planaria, there are many thought-provoking questions related to stem cell plasticity and the uniqueness of the regenerative potential of Planarians among other members of the phylum Platyhelminthes. The complexity of receptors and signalling mechanisms, the immune system network, the biology of repair, and responses to injury are yet to be understood in Planaria. Genomic and transcriptomic studies have generated a vast repository of data, but their availability and analysis is a challenging task. Data mining, computational approaches to gene curation, bioinformatics tools for the analysis of transcriptomic data, the design of databases, and the application of algorithms in deciphering changes of morphology by RNA interference (RNAi) approaches and in understanding regeneration experiments are a new venture in Planaria research that is helping researchers across the globe in understanding the biology. We highlight the applications of Hidden Markov models (HMMs) in the design of computational tools and their applications in Planaria for decoding its complex biology.

  9. Proposal of Evolutionary Simplex Method for Global Optimization Problem

    Science.gov (United States)

    Shimizu, Yoshiaki

    To make agile decisions in a rational manner, the role of optimization engineering has been increasingly recognized under diversified customer demand. From this point of view, in this paper we propose a new evolutionary method serving as an optimization technique in the paradigm of optimization engineering. The developed method has prospects of globally solving various complicated problems appearing in real-world applications. It evolved from the conventional Nelder and Mead Simplex method by virtue of ideas borrowed from recent meta-heuristic methods such as PSO. After presenting an algorithm to handle linear inequality constraints effectively, we validate the effectiveness of the proposed method through comparison with other methods on several benchmark problems.

  10. Updated method guidelines for cochrane musculoskeletal group systematic reviews and metaanalyses

    DEFF Research Database (Denmark)

    Ghogomu, Elizabeth A T; Maxwell, Lara J; Buchbinder, Rachelle

    2014-01-01

    The Cochrane Musculoskeletal Group (CMSG), one of 53 groups of the not-for-profit, international Cochrane Collaboration, prepares, maintains, and disseminates systematic reviews of treatments for musculoskeletal diseases. It is important that authors conducting CMSG reviews and the readers of our...... reviews be aware of and use updated, state-of-the-art systematic review methodology. One hundred sixty reviews have been published. Previous method guidelines for systematic reviews of interventions in the musculoskeletal field published in 2006 have been substantially updated to incorporate...... using network metaanalysis. Method guidelines specific to musculoskeletal disorders are provided by CMSG editors for various aspects of undertaking a systematic review. These method guidelines will help improve the quality of reporting and ensure high standards of conduct as well as consistency across...

  11. Updating and testing of a Finnish method for mixed municipal solid waste composition studies.

    Science.gov (United States)

    Liikanen, M; Sahimaa, O; Hupponen, M; Havukainen, J; Sorvari, J; Horttanainen, M

    2016-06-01

    More efficient recycling of municipal solid waste (MSW) is an essential precondition for turning Europe into a circular economy. Thus, the recycling of MSW must increase significantly in several member states, including Finland. This has increased the interest in the composition of mixed MSW. Due to increased information needs, a method for mixed MSW composition studies was introduced in Finland in order to improve the national comparability of composition study results. The aim of this study was to further develop the method so that it corresponds to the information needed about the composition of mixed MSW and still works in practice. A survey and two mixed MSW composition studies were carried out in the study. According to the responses of the survey, the intensification of recycling, the landfill ban on organic waste and the producer responsibility for packaging waste have particularly influenced the need for information about the composition of mixed MSW. The share of biowaste in mixed MSW interested the respondents most. Additionally, biowaste proved to be the largest waste fraction in mixed MSW in the composition studies. It constituted over 40% of mixed MSW in both composition studies. For these reasons, the classification system of the method was updated by further defining the classifications of biowaste. The classifications of paper as well as paperboard and cardboard were also updated. The updated classification system provides more information on the share of avoidable food waste and waste materials suitable for recycling in mixed MSW. The updated method and the information gained from the composition studies are important in ensuring that the method will be adopted by municipal waste management companies and thus used widely in Finland. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Intraoperative magnetic resonance imaging to update interactive navigation in neurosurgery: method and preliminary experience.

    Science.gov (United States)

    Wirtz, C R; Bonsanto, M M; Knauth, M; Tronnier, V M; Albert, F K; Staubert, A; Kunze, S

    1997-01-01

    We report on the first successful intraoperative update of interactive image guidance based on an intraoperatively acquired magnetic resonance imaging (MRI) data set. To date, intraoperative imaging methods such as ultrasound, computerized tomography (CT), or MRI have not been successfully used to update interactive navigation. We developed a method of imaging patients intraoperatively with the surgical field exposed in an MRI scanner (Magnetom Open; Siemens Corp., Erlangen, Germany). In 12 patients, intraoperatively acquired 3D data sets were used for successful recalibration of neuronavigation, accounting for any anatomical changes caused by surgical manipulations. The MKM Microscope (Zeiss Corp., Oberkochen, Germany) was used as the navigational system. With implantable fiducial markers, an accuracy of 0.84 +/- 0.4 mm for intraoperative reregistration was achieved. Residual tumor detected on MRI was consequently resected using navigation with the intraoperative data. No adverse effects were observed from intraoperative imaging or the use of navigation with intraoperative images, demonstrating the feasibility of recalibrating navigation with intraoperative MRI.
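
    Re-registration against implantable fiducial markers amounts to estimating a rigid transform between corresponding point sets. Below is a minimal, generic sketch of such a least-squares (Kabsch) fit on synthetic, hypothetical marker coordinates; it is not the MKM system's actual reregistration procedure.

```python
# Generic rigid point-set registration (Kabsch algorithm), as commonly used to
# re-register a navigation system against fiducial markers. Marker positions
# below are synthetic; this is an illustration, not the clinical workflow.
import numpy as np

def rigid_register(src, dst):
    """Return rotation R and translation t minimizing ||src @ R.T + t - dst||."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)                 # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ D @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical preoperative fiducials and their noisy intraoperative positions.
rng = np.random.default_rng(0)
pre = rng.uniform(-50, 50, size=(6, 3))                 # mm
theta = np.deg2rad(10.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
intra = pre @ R_true.T + np.array([3.0, -2.0, 5.0]) + rng.normal(0, 0.3, (6, 3))

R, t = rigid_register(pre, intra)
fre = np.linalg.norm(pre @ R.T + t - intra, axis=1).mean()
print("mean fiducial registration error [mm]:", round(fre, 2))
```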

  13. Proposals of counting method for bubble detectors and their intercomparisons

    International Nuclear Information System (INIS)

    Ramalho, Eduardo; Silva, Ademir X.; Bellido, Luis F.; Facure, Alessandro; Pereira, Mario

    2009-01-01

    The study of neutron spectrometry and dosimetry has become significantly easier thanks to relatively new devices called bubble detectors. Insensitive to gamma rays and composed of superheated emulsions, they are still the subject of much research in radiation physics and nuclear engineering. When bubble detectors are exposed to more intense neutron fields, or for longer times, more bubbles are produced and the statistical uncertainty of the dosimetric and spectrometric results is reduced. This work presents ways to perform the counting process for bubble detectors, together with an updated procedure for acquiring images of the irradiated detectors in order to make manual counting easier. Twelve BDS detectors were irradiated with the RDS111 cyclotron at IEN (Instituto de Engenharia Nuclear) and photographed using an assembly specially designed for this experiment. Counting was first performed manually; simultaneously, ImagePro was used to perform the counting automatically. The manual and automatic bubble counts were compared, as were the time required to obtain them and their level of difficulty. After the bubble counting, the detectors' standardized responses were calculated in both cases according to the BDS manual, and these were also compared. The results show that counting on these devices becomes very hard at large numbers of bubbles, and that the variation between counts also grows when many bubbles are present. Because of the good agreement between manual counting and the custom program, the latter proved to be a good practical and economical alternative. Despite the good results, the custom program needs further adjustment to achieve higher accuracy at high bubble counts for neutron measurement applications. (author)
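
    Automated bubble counting of this kind is essentially connected-component labelling of a thresholded detector photograph. The sketch below illustrates that idea on a synthetic image using scipy.ndimage; it does not reproduce the ImagePro workflow used in the study, and the threshold and minimum blob size are assumed values.

```python
# Generic sketch of automated bubble counting by connected-component labelling
# of a thresholded image; real detector photographs would need additional
# preprocessing (illumination correction, de-noising) before this step.
import numpy as np
from scipy import ndimage

def count_bubbles(image, threshold, min_pixels=5):
    """Count bright blobs larger than min_pixels in a grayscale image."""
    binary = image > threshold                       # segment candidate bubbles
    labels, n = ndimage.label(binary)                # connected components
    sizes = ndimage.sum(binary, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))          # reject tiny noise specks

# Synthetic example: dark background with three bright circular "bubbles".
img = np.zeros((200, 200))
yy, xx = np.mgrid[0:200, 0:200]
for cy, cx in [(40, 50), (120, 90), (160, 170)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 36] = 1.0
img += np.random.default_rng(1).normal(0, 0.05, img.shape)
print(count_bubbles(img, threshold=0.5))  # expected: 3
```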

  14. Method for updating pipelined, single port Z-buffer by segments on a scan line

    International Nuclear Information System (INIS)

    Hannah, M.R.

    1990-01-01

    This patent describes, in a raster-scan, computer-controlled video display system for presenting an image to an observer and having a Z-buffer for storing Z values and a frame buffer for storing pixel values, a method for updating the Z-buffer with new Z values to replace old Z values. It comprises: calculating a new pixel value and a new Z value for each pixel location in a plurality of pixel locations; performing a Z comparison for each new Z value by comparing the old Z value with the new Z value for each pixel location, the Z comparison being performed sequentially in one direction through the plurality of pixel locations; and updating the Z-buffer only after the Z comparison produces a combination of a fail condition for a current pixel location subsequent to producing a pass condition for the pixel location immediately preceding the current pixel location.
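
    The claim describes deferring Z-buffer writes until a run of passing pixels ends. The following is a simplified software sketch of that segment-wise idea; the patent concerns pipelined, single-port hardware, so this is an illustration of the principle only, with made-up depth and pixel values.

```python
# Simplified software sketch of segment-wise Z-buffer updating: new Z values
# that pass the depth test are written back only when the run of passing pixels
# ends (a fail following a pass) or the scan line ends.
import numpy as np

def update_scanline(zbuf, fbuf, new_z, new_pix, closer=np.less):
    """Depth-test one scan line and write passing segments back in blocks."""
    start = None                                    # start of the current passing run
    for i in range(len(zbuf) + 1):
        passed = i < len(zbuf) and closer(new_z[i], zbuf[i])
        if passed and start is None:
            start = i                               # a passing segment begins
        elif not passed and start is not None:
            zbuf[start:i] = new_z[start:i]          # flush the finished segment
            fbuf[start:i] = new_pix[start:i]
            start = None

# Example: old surface at depth 0.5; a new primitive is closer over pixels 2..5.
z = np.full(8, 0.5); f = np.zeros(8, dtype=int)
nz = np.array([0.9, 0.9, 0.2, 0.2, 0.3, 0.3, 0.9, 0.9]); npix = np.arange(8)
update_scanline(z, f, nz, npix)
print(z)  # depths 2..5 replaced by the closer values
print(f)  # corresponding pixel values updated
```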

  15. DEVELOPMENT OF AUTOMATIC EXTRACTION METHOD FOR ROAD UPDATE INFORMATION BASED ON PUBLIC WORK ORDER OUTLOOK

    Science.gov (United States)

    Sekimoto, Yoshihide; Nakajo, Satoru; Minami, Yoshitaka; Yamaguchi, Syohei; Yamada, Harutoshi; Fuse, Takashi

    Recently, the disclosure of statistical data representing the financial effects or burden of public works, through the web sites of national and local governments, has enabled discussion of macroscopic financial trends. However, it is still difficult to grasp, nationwide, how each location has been changed by public works. The purpose of this research is to efficiently collect the road update information provided by various road managers, in order to enable efficient updating of maps such as car navigation maps. In particular, we develop a system that automatically extracts the relevant public works from the public work order outlooks released by each local government and registers a summary, including position information, in a database, by combining several web mining technologies. Finally, we collect and register several tens of thousands of records from web sites all over Japan and confirm the feasibility of our method.

  16. Methodological proposal for environmental impact evaluation since different specific methods

    International Nuclear Information System (INIS)

    Leon Pelaez, Juan Diego; Lopera Arango Gabriel Jaime

    1999-01-01

    Some conceptual and practical elements related to environmental impact evaluation are described and related to the preparation of technical reports (environmental impact studies and environmental management plans) to be presented to environmental authorities for obtaining the environmental permits for development projects. The first part of the document summarizes the main regulatory aspects that support environmental impact studies in Colombia. We propose a scheme for approaching and preparing the environmental impact evaluation, which begins with a description of the project and of the environmental conditions in its area, then identifies the impacts through a matrix method, and continues with their quantitative evaluation, for which we propose the method developed by Arboleda (1994). We also propose to rate the project activities and the environmental components according to their relative importance, by means of a method here called agglomerate evaluation, which allows the most impacting activities and the most impacted components to be identified. Lastly, some models are presented for the elaboration and presentation of environmental management plans, follow-up programs and environmental supervision programs.

  17. Low-rank Quasi-Newton updates for Robust Jacobian lagging in Newton methods

    International Nuclear Information System (INIS)

    Brown, J.; Brune, P.

    2013-01-01

    Newton-Krylov methods are standard tools for solving nonlinear problems. A common approach is to 'lag' the Jacobian when assembly or preconditioner setup is computationally expensive, in exchange for some degradation in the convergence rate and robustness. We show that this degradation may be partially mitigated by using the lagged Jacobian as an initial operator in a quasi-Newton method, which applies unassembled low-rank updates to the Jacobian until the next full reassembly. We demonstrate the effectiveness of this technique on problems in glaciology and elasticity. (authors)
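
    The sketch below illustrates the idea in this abstract in its simplest form: start the iteration from a lagged (outdated) Jacobian and apply low-rank quasi-Newton corrections until the next full reassembly. A Broyden rank-one update on a tiny synthetic system stands in for the unassembled updates and PDE problems of the paper, so treat it as an assumption-laden toy, not the authors' implementation.

```python
# Toy illustration: use a "lagged" Jacobian (assembled at an earlier point) as
# the initial operator and improve it with Broyden rank-one updates instead of
# reusing the stale matrix unchanged.
import numpy as np

def F(x):
    # Small nonlinear test system; solution is (1, 2).
    return np.array([x[0] ** 2 + x[1] - 3.0, x[0] + x[1] ** 2 - 5.0])

def lagged_broyden_solve(x, J_lagged, tol=1e-10, maxit=50):
    J = J_lagged.copy()                        # initial operator: the lagged Jacobian
    f = F(x)
    for _ in range(maxit):
        s = np.linalg.solve(J, -f)             # quasi-Newton step
        x_new = x + s
        f_new = F(x_new)
        if np.linalg.norm(f_new) < tol:
            return x_new
        y = f_new - f
        J += np.outer(y - J @ s, s) / (s @ s)  # Broyden rank-one (low-rank) update
        x, f = x_new, f_new
    return x

# Jacobian assembled at an earlier, different point (hence "lagged").
x_old = np.array([0.5, 1.5])
J_lagged = np.array([[2 * x_old[0], 1.0], [1.0, 2 * x_old[1]]])
print(lagged_broyden_solve(np.array([1.0, 1.0]), J_lagged))  # approx [1. 2.]
```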

  18. An Experimental Study of Structural Identification of Bridges Using the Kinetic Energy Optimization Technique and the Direct Matrix Updating Method

    Directory of Open Access Journals (Sweden)

    Gwanghee Heo

    2016-01-01

    Full Text Available This paper aims to develop an SI (structural identification) technique using the KEOT and the DMUM to decide on the optimal locations of sensors and to update the FE model, respectively, which ultimately contributes to the composition of a more effective SHM. Owing to the characteristic structural flexing behavior of cable bridges (e.g., cable-stayed bridges and suspension bridges), which makes them vulnerable to any vibration, systematic and continuous structural health monitoring (SHM) is pivotal for them. Since it is necessary to select optimal measurement locations with the fewest possible measurements and also to accurately assess the structural state of a bridge for the development of an effective SHM, it is equally important to have an SI technique that accurately determines the modal parameters of the current structure from the optimally obtained data. In this study, the kinetic energy optimization technique (KEOT) was utilized to determine the optimal measurement locations, while the direct matrix updating method (DMUM) was utilized for FE model updating. As a result of the experiment, the required number of measurement locations derived from KEOT based on the target mode was reduced by approximately 80% compared to the initial number of measurement locations. Moreover, compared to the eigenvalues of the modal experiment, an improved FE model with a margin of error of less than 1% was derived from DMUM. Thus, the SI technique for cable-stayed bridges proposed in this study, which utilizes both KEOT and DMUM, is proven effective in minimizing the number of sensors while accurately determining the structural dynamic characteristics.

  19. A study on reducing update frequency of the forecast samples in the ensemble-based 4DVar data assimilation method

    Energy Technology Data Exchange (ETDEWEB)

    Shao, Aimei; Xu, Daosheng [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province; Chinese Academy of Meteorological Sciences, Beijing (China). State Key Lab. of Severe Weather; Qiu, Xiaobin [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province; Tianjin Institute of Meteorological Science (China); Qiu, Chongjian [Lanzhou Univ. (China). Key Lab. of Arid Climatic Changing and Reducing Disaster of Gansu Province

    2013-02-15

    In the ensemble-based four-dimensional variational assimilation method (SVD-En4DVar), a singular value decomposition (SVD) technique is used to select the leading eigenvectors, and the analysis variables are expressed as an expansion in the orthogonal basis formed by these eigenvectors. Experiments with a two-dimensional shallow-water equation model and simulated observations show that the truncation error and the rejection of observed signals due to the reduced-dimensional reconstruction of the analysis variable are the major factors that damage the analysis when the ensemble size is not large enough. However, a larger ensemble imposes a daunting computational burden. Experiments with a shallow-water equation model also show that the forecast error covariances remain relatively constant over time. For that reason, we propose an approach that increases the number of members in the forecast ensemble while reducing the update frequency of the forecast error covariance, in order to increase analysis accuracy and to reduce the computational cost. A series of experiments were conducted with the shallow-water equation model to test the efficiency of this approach. The experimental results indicate that this approach is promising. Further experiments with the WRF model show that this approach is also suitable for the real atmospheric data assimilation problem, but the update frequency of the forecast error covariances should not be too low. (orig.)
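
    The core idea, reduced to its simplest form, is to build the ensemble-based error subspace once and reuse it for several assimilation cycles rather than recomputing it every cycle. The numpy sketch below shows only that bookkeeping, with arbitrary state and ensemble sizes and random data standing in for model forecasts; it is not the SVD-En4DVar system itself.

```python
# Minimal sketch: compute the leading modes of the ensemble forecast error
# covariance once (via SVD of the perturbation matrix) and refresh them only
# every few assimilation cycles. Sizes and data are illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_state, n_members, n_modes = 100, 40, 10

def ensemble_modes(ensemble):
    """Leading modes (scaled singular vectors) of the forecast error covariance."""
    perturbations = ensemble - ensemble.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(perturbations / np.sqrt(n_members - 1), full_matrices=False)
    return U[:, :n_modes] * s[:n_modes]          # columns span the error subspace

ensemble = rng.normal(size=(n_state, n_members))  # stand-in for a forecast ensemble
modes = ensemble_modes(ensemble)

refresh_every = 5                                 # reduced covariance update frequency
for cycle in range(1, 11):
    if cycle % refresh_every == 0:
        ensemble = rng.normal(size=(n_state, n_members))   # new forecast ensemble
        modes = ensemble_modes(ensemble)
    # ... each analysis step would expand its increment in the columns of `modes` ...
print(modes.shape)  # (100, 10)
```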

  20. Proposed frustrated-total-reflection acoustic sensing method

    International Nuclear Information System (INIS)

    Hull, J.R.

    1981-01-01

    Modulation of electromagnetic energy transmission through a frustrated-total-reflection device by pressure-induced changes in the index of refraction is proposed for use as an acoustic detector. Maximum sensitivity occurs for angles of incidence near the critical angle. The minimum detectable pressure in air is limited by Brownian noise. Acoustic propagation losses and diffraction of the optical beam by the acoustic signal limit the minimum acoustic wavelength to lengths of the order of the spatial extent of the optical beam. The response time of the method is fast enough to follow individual acoustic waves

  1. A proposed assessment method for image of regional educational institutions

    Directory of Open Access Journals (Sweden)

    Kataeva Natalya

    2017-01-01

    Full Text Available In the current Russian economic conditions, the market for educational services comprises a huge variety of educational institutions. This market is already significantly influenced by the demographic situation in Russia, which means that higher education institutions are forced into tough competition for school leavers. Increased competition in the educational market forces universities to find new methods of non-price competition to attract potential students and to apply them throughout their educational and economic activities. The commercialization of education places universities on the same plane as commercial companies, which treat a positive perception of their image and reputation as a competitive advantage; the same approach is applicable to the strategic and day-to-day activities of higher education institutions seeking to ensure the competitiveness of their educational services and of the institution as a whole. Nevertheless, because of the lack of evidence-based proposals in this area, scientific research is needed to justify the organizational and methodological aspects of using image as a factor in the competitiveness of a higher education institution. In theory and in practice there are different methods and ways of evaluating a company's image. The article provides a comparative assessment of existing methods for valuing corporate image and presents the author's method for estimating the image of higher education institutions based on the key influencing factors. The method has been tested on the Vyatka State Agricultural Academy (Russia). The results also indicate the strengths and weaknesses of the institution, highlight ways of improving, and help direct the efforts toward image improvement.

  2. 78 FR 58985 - Proposed Amendments to the Water Quality Regulations, Water Code and Comprehensive Plan To Update...

    Science.gov (United States)

    2013-09-25

    ..., Water Code and Comprehensive Plan to update stream quality objectives (also called ``water quality..., to Commission Secretary at 609-883-9522; if by U.S. Mail, to Commission Secretary, DRBC, P.O. Box...-7203. SUPPLEMENTARY INFORMATION: Background. The Commission in 1967 assigned stream quality objectives...

  3. A qualitative method proposal to improve environmental impact assessment

    Energy Technology Data Exchange (ETDEWEB)

    Toro, Javier, E-mail: jjtoroca@unal.edu.co [Institute of Environmental Studies, National University of Colombia at Bogotá (Colombia); Requena, Ignacio, E-mail: requena@decsai.ugr.es [Department of Computer Science and Artificial Intelligence, University of Granada (Spain); Duarte, Oscar, E-mail: ogduartev@unal.edu.co [National University of Colombia at Bogotá, Department of Electrical Engineering and Electronics (Colombia); Zamorano, Montserrat, E-mail: zamorano@ugr.es [Department of Civil Engineering, University of Granada (Spain)

    2013-11-15

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts are eliminated that should be mitigated with corrective measures. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate impact importance in the assessment. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment and the project has been shown.

  4. A qualitative method proposal to improve environmental impact assessment

    International Nuclear Information System (INIS)

    Toro, Javier; Requena, Ignacio; Duarte, Oscar; Zamorano, Montserrat

    2013-01-01

    In environmental impact assessment, qualitative methods are used because they are versatile and easy to apply. This methodology is based on the evaluation of the strength of the impact by grading a series of qualitative attributes that can be manipulated by the evaluator. The results thus obtained are not objective, and all too often impacts are eliminated that should be mitigated with corrective measures. However, qualitative methodology can be improved if the calculation of Impact Importance is based on the characteristics of environmental factors and project activities instead of on indicators assessed by evaluators. In this sense, this paper proposes the inclusion of the vulnerability of environmental factors and the potential environmental impact of project activities. For this purpose, the study described in this paper defined Total Impact Importance and specified a quantification procedure. The results obtained in the case study of oil drilling in Colombia reflect greater objectivity in the evaluation of impacts as well as a positive correlation between impact values, the environmental characteristics at and near the project location, and the technical characteristics of project activities. -- Highlights: • The concept of vulnerability has been used to calculate impact importance in the assessment. • This paper defined Total Impact Importance and specified a quantification procedure. • The method includes the characteristics of environmental factors and project activities. • The application has shown greater objectivity in the evaluation of impacts. • A better correlation between impact values, the environment and the project has been shown.

  5. Principles, Methods of Participatory Research: Proposal for Draft Animal Power

    Directory of Open Access Journals (Sweden)

    E. Chia

    2004-03-01

    Full Text Available The meeting of researchers who question the efficiency of their actions when they accompany stakeholders through change processes provides an opportunity to reflect on the research methods to be developed when working together with stakeholders: participatory research, action research, research-intervention… The author proposes to present the action-research approach as a new one. While all three phases of action research are important, the negotiation phase is essential, because it enables the formalization of a contract among partners (the ethical aspect), the development of a common language, and the structuring of the effort between researchers of various specialties and the stakeholders. In the action-research approach, the steering bodies (scientific committees…) play a major role: they guarantee at the same time a solution to the problems, the production of results, and the legitimacy of the scientific knowledge produced. In conclusion, the author suggests ways of developing action research in the field of animal traction in order to conceive new socio-technical and organizational innovations that will make the use of this technique easier.

  6. A Time Domain Update Method for Reservoir History Matching of Electromagnetic Data

    KAUST Repository

    Katterbauer, Klemens

    2014-03-25

    The oil & gas industry has been the backbone of the world's economy in the last century and will continue to be in the decades to come. With increasing demand and conventional reservoirs depleting, new oil industry projects have become more complex and expensive, operating in areas that were previously considered impossible and uneconomical. Good reservoir management is therefore key to the economic success of complex projects, requiring the incorporation of reliable uncertainty estimates for trustworthy production forecasts and optimized reservoir exploitation. Reservoir history matching has played a key role here, incorporating production, seismic, electromagnetic and logging data to forecast the development and depletion of reservoirs. With the advances of the last decade, electromagnetic techniques such as crosswell electromagnetic tomography have enabled engineers to map reservoirs more precisely and to understand their evolution. Incorporating the large amount of data efficiently and reducing uncertainty in the forecasts has been one of the key challenges for reservoir management. Computing the conductivity distribution of the field in order to adjust parameters in the forecasting process by solving the inverse problem has been a challenge, due to the strong ill-posedness of the inversion problem and the extensive manual calibration required, making it impossible to include in an efficient reservoir history matching and forecasting algorithm. In the research presented here, we have developed a novel finite-difference time-domain (FDTD) based method for incorporating electromagnetic data directly into the reservoir simulator. Based on an extended Archie relationship, EM simulations are performed for conductivity parameters obtained both from the forecast and from the retrieved porosity-saturation values, and the results are incorporated directly into an update step for the reservoir parameters. This novel direct update method has significant advantages such as that it overcomes the expensive and ill
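
    The Archie-type relationship mentioned above is what links the reservoir state (porosity and water saturation) to the electrical conductivity seen by the EM simulation. A minimal sketch follows; the exponents, tortuosity factor and brine conductivity are illustrative assumed values, not those of the extended relationship used in the thesis.

```python
# Minimal sketch of an Archie-type relationship: bulk conductivity from
# porosity and water saturation, which drives the EM forward model. The
# parameters (sigma_water, a, m, n) are illustrative defaults.
import numpy as np

def archie_conductivity(porosity, water_saturation, sigma_water=5.0, a=1.0, m=2.0, n=2.0):
    """Bulk conductivity [S/m]: sigma = sigma_w * phi**m * Sw**n / a."""
    return sigma_water * porosity ** m * water_saturation ** n / a

phi = np.array([0.15, 0.20, 0.25])     # porosity
sw = np.array([0.30, 0.50, 0.80])      # water saturation (1 - hydrocarbon saturation)
print(archie_conductivity(phi, sw))    # conductivities fed to the EM simulation
```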

  7. Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    Online: 02 April (2018) ISSN 1017-1398 R&D Projects: GA ČR GA13-06684S Institutional support: RVO:67985807 Keywords : Unconstrained minimization * Block variable metric methods * Limited-memory methods * BFGS update * Global convergence * Numerical results Subject RIV: BA - General Mathematics OBOR OECD: Applied mathematics Impact factor: 1.241, year: 2016

  8. Simultaneous determination of some antiprotozoal drugs in different combined dosage forms by mean centering of ratio spectra and multivariate calibration with model updating methods

    Directory of Open Access Journals (Sweden)

    Abdelaleem Eglal A

    2012-04-01

    Full Text Available Abstract Background Metronidazole (MET) and Diloxanide Furoate (DF) act as antiprotozoal drugs in their ternary mixtures with Mebeverine HCl (MEH), an effective antispasmodic drug. This work concerns the development and validation of two simple, specific and cost-effective methods, mainly for simultaneous determination of the proposed ternary mixture. In addition, the developed multivariate calibration model has been updated to determine Metronidazole benzoate (METB) in its binary mixture with DF in Dimetrol® suspension. Results Method (I) is the mean centering of ratio spectra spectrophotometric method (MCR), which depends on using the mean centered ratio spectra in two successive steps; this eliminates the derivative steps and therefore enhances the signal-to-noise ratio. The developed MCR method has been successfully applied to the determination of MET, DF and MEH in different laboratory-prepared mixtures and in tablets. Method (II) is the partial least squares (PLS) multivariate calibration method, which has been optimized for the determination of MET, DF and MEH in Dimetrol® tablets; by updating the developed model, it has been successfully used for the prediction of binary mixtures of DF and Metronidazole Benzoate ester (METB) in Dimetrol® suspension with good accuracy and precision, without reconstruction of the calibration set. Conclusion The developed methods have been validated; accuracy, precision and specificity were found to be within acceptable limits. Moreover, results obtained by the suggested methods showed no significant difference when compared with those obtained by reported methods.
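
    For readers unfamiliar with PLS multivariate calibration, the sketch below shows the generic workflow of predicting component concentrations from absorbance spectra. The spectra are synthetic and the number of latent variables is arbitrary; it does not reproduce the published model or its model-updating step.

```python
# Generic PLS multivariate calibration: predict component concentrations from
# spectra. Synthetic Beer's-law data stand in for the measured spectra.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_samples, n_wavelengths, n_components = 25, 200, 3

# Synthetic pure-component spectra and random concentration mixtures.
pure_spectra = np.abs(rng.normal(size=(n_components, n_wavelengths)))
concentrations = rng.uniform(1, 10, size=(n_samples, n_components))
spectra = concentrations @ pure_spectra + rng.normal(0, 0.01, (n_samples, n_wavelengths))

pls = PLSRegression(n_components=5)     # latent variables, normally chosen by validation
pls.fit(spectra, concentrations)

# Predict an "unknown" mixture.
true_c = np.array([[4.0, 2.5, 7.0]])
unknown = true_c @ pure_spectra
print(pls.predict(unknown).round(2), "vs true", true_c)
```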

  9. Proposal for a Five-Step Method to Elicit Expert Judgment

    Directory of Open Access Journals (Sweden)

    Duco Veen

    2017-12-01

    Full Text Available Elicitation is a commonly used tool to extract viable information from experts. The information held by the expert is extracted and a probabilistic representation of this knowledge is constructed. A promising avenue in psychological research is to incorporate experts' prior knowledge into the statistical analysis. Systematic reviews of the elicitation literature, however, suggest that it might be inappropriate to obtain distributional representations from experts directly. The literature qualifies experts' performance at estimating elements of a distribution as unsatisfactory, so reliably specifying the essential elements of the parameters of interest in a single elicitation step seems implausible. Providing feedback within the elicitation process can enhance the quality of the elicitation, and interactive software can be used to facilitate that feedback. We therefore propose to decompose the elicitation procedure into smaller steps with adjustable outcomes. We represent the tacit knowledge of experts as a location parameter and their uncertainty concerning this knowledge by a scale and a shape parameter. Using a feedback procedure, experts can accept the representation of their beliefs or adjust their input. We propose a Five-Step Method which consists of (1) eliciting the location parameter using the trial roulette method; (2) providing feedback on the location parameter and asking for confirmation or adjustment; (3) eliciting the scale and shape parameters; (4) providing feedback on the scale and shape parameters and asking for confirmation or adjustment; and (5) using the elicited and calibrated probability distribution in a statistical analysis and updating it with data, or using it to compute a prior-data conflict within a Bayesian framework. User feasibility and internal validity of the Five-Step Method are investigated in three elicitation studies.
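
    A minimal sketch of the feedback idea, assuming the location/scale/shape triple is represented as a skew-normal distribution (one common choice, not necessarily the authors'): compute the quantiles implied by the expert's inputs and report them back for confirmation or adjustment. The elicited numbers and the distributional family below are hypothetical; the interactive trial-roulette interface is not reproduced.

```python
# Feedback step of an elicitation: turn elicited location, scale and shape
# parameters into quantiles the expert can inspect and accept or adjust.
from scipy import stats

def feedback(location, scale, shape, probs=(0.05, 0.25, 0.5, 0.75, 0.95)):
    """Quantiles implied by the expert's elicited parameters, for feedback."""
    prior = stats.skewnorm(a=shape, loc=location, scale=scale)
    return {p: round(prior.ppf(p), 2) for p in probs}

# Hypothetical elicited values for, say, an expected effect size.
elicited = {"location": 0.4, "scale": 0.15, "shape": 2.0}
print(feedback(**elicited))
# The expert accepts these quantiles or adjusts the inputs; the accepted
# distribution is then used as a prior in the subsequent Bayesian analysis.
```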

  10. Update to Proposal for an Experiment to Measure Mixing, CP Violation and Rare Decays in Charm and Beauty Particle Decays at the Fermilab Collider - BTeV

    Energy Technology Data Exchange (ETDEWEB)

    Butler, Joel [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Stone, Sheldon [Syracuse Univ., NY (United States)

    2002-03-01

    We have been requested to submit an update of the BTeV plan to the Fermilab Physics Advisory Committee in which, to save money, the detector has only one arm and no new interaction region magnet construction is planned; these are to come from a currently running collider experiment at the appropriate time. The "Physics Case" section is complete and updated, with the section on the "New Physics" capabilities of BTeV greatly expanded. We show that precise measurements of rare flavor-changing neutral current processes and of CP violation are and will be complementary to the Tevatron and LHC in unraveling the electroweak breaking puzzle. We include a revised summary of the physics sensitivities for the one-arm detector, which are not simply the proposal numbers divided by two, because of additional improvements. One important change resulted from an improved understanding of just how important the RICH detector is to muon and electron identification: we can indeed separate electrons from pions and muons from pions, especially at relatively large angles beyond the physical aperture of the EM calorimeter or the Muon Detector. This is documented in the "Physics Sensitivities" section. The section on the detector includes the motivation for doing b and c physics at a hadron collider and shows the changes in the detector since the proposal, based on our ongoing R&D program. We do not include here a detailed description of the entire detector; that is available in the May 2000 proposal. We include a summary of our R&D activities for the entire experiment. Finally, we also include a fully updated cost estimate for the one-arm system.

  11. FIFRA Peer Review: Proposed Risk Assessment Methods Process

    Science.gov (United States)

    From September 11-14, 2012, EPA participated in a Federal Insecticide, Fungicide and Rodenticide Act Scientific Advisory Panel (SAP) meeting on a proposed pollinator risk assessment framework for determining the potential risks of pesticides to honey bees.

  12. SWCD: a sliding window and self-regulated learning-based background updating method for change detection in videos

    Science.gov (United States)

    Işık, Şahin; Özkan, Kemal; Günal, Serkan; Gerek, Ömer Nezih

    2018-03-01

    Change detection by background subtraction remains an unresolved issue and attracts research interest due to the challenges encountered in static and dynamic scenes. The key challenge is how to update dynamically changing backgrounds from incoming frames with an adaptive and self-regulated feedback mechanism. To achieve this, we present an effective change detection algorithm for pixelwise changes. A sliding window approach combined with dynamic control of the update parameters is introduced for updating background frames, which we call sliding window-based change detection. Comprehensive experiments on related test videos show that the integrated algorithm yields good objective and subjective performance by overcoming illumination variations, camera jitter, and intermittent object motion. We argue that the obtained method is a fair alternative for most types of foreground extraction scenarios, unlike case-specific methods, which normally fail in scenarios they were not designed for.
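
    The sketch below shows only the generic sliding-window idea: keep the last N frames, model the background as their per-pixel median, and flag pixels that deviate beyond a threshold. The window size, threshold and synthetic frames are assumptions; the self-regulated, dynamic parameter control that distinguishes the SWCD method is not reproduced.

```python
# Generic sliding-window background model for pixelwise change detection.
from collections import deque
import numpy as np

class SlidingWindowBackground:
    def __init__(self, window=20, threshold=25):
        self.frames = deque(maxlen=window)     # sliding window of recent frames
        self.threshold = threshold

    def apply(self, frame):
        """Return a boolean foreground mask for `frame` and update the window."""
        if self.frames:
            background = np.median(np.stack(self.frames), axis=0)
            mask = np.abs(frame.astype(float) - background) > self.threshold
        else:
            mask = np.zeros(frame.shape, dtype=bool)
        self.frames.append(frame)
        return mask

# Tiny synthetic sequence: a static scene, then a bright object enters.
rng = np.random.default_rng(0)
detector = SlidingWindowBackground(window=10, threshold=25)
for t in range(30):
    frame = rng.normal(100, 3, size=(60, 80))
    if t >= 25:
        frame[20:30, 30:40] += 80              # intermittent/moving object
    fg = detector.apply(frame)
print(fg.sum())  # roughly the 100 pixels covered by the object
```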

  13. Update of Standard Practices for New Method Validation in Forensic Toxicology.

    Science.gov (United States)

    Wille, Sarah M R; Coucke, Wim; De Baere, Thierry; Peters, Frank T

    2017-01-01

    International agreement concerning validation guidelines is important for quality forensic bioanalytical research and routine applications, as it all starts with the reporting of reliable analytical data. Standards for fundamental validation parameters are provided in guidelines such as those from the US Food and Drug Administration (FDA), the European Medicines Agency (EMA), the German-speaking Gesellschaft für Toxikologie und Forensische Chemie (GTFCH) and the Scientific Working Group of Forensic Toxicology (SWGTOX). These validation parameters include selectivity, matrix effects, method limits, calibration, accuracy and stability, as well as other parameters such as carryover, dilution integrity and incurred sample reanalysis. It is, however, not easy for laboratories to put these guidelines into practice, as they remain nonbinding protocols that depend on the analytical technique applied and that need to be adapted to the analyst's method requirements and the application type. In this manuscript, a review of the current guidelines and literature concerning bioanalytical validation parameters in a forensic context is given and discussed. In addition, suggestions for the experimental set-up, the pros and cons of statistical approaches, and adequate acceptance criteria for the validation of bioanalytical applications are given.

  14. Updated Heat Atlas calculation method. Layout of flooded evaporators; Aktualisierte Waermeatlas-Rechenmethode. Auslegung ueberfluteter Verdampfer

    Energy Technology Data Exchange (ETDEWEB)

    Gorenflo, Dieter; Baumhoegger, Elmar; Herres, Gerhard [Paderborn Univ. (Germany). Thermodynamik und Energietechnik; Kotthoff, Stephan [Siemens AG, Goerlitz (Germany)

    2012-07-01

    For years, the most accurate possible prediction of the heat transfer performance of evaporators has been a topical issue with regard to efficient energy utilization. An established calculation method was updated for the new edition of the Heat Atlas with regard to flooded evaporators, which are used especially in air-conditioning and refrigeration systems. This contribution outlines the method and discusses the innovations in detail. The treatment of the influence of heat flux density and boiling pressure on heat transfer during pool boiling on a single horizontal evaporator tube has been modified on the basis of measurements; above all, the influence of the fluid can now be described more simply and more accurately. The authors compare the predictions with experimental results with regard to ribbed heating surfaces and the influence of the tube bundle. Furthermore, examples of close-boiling and near-azeotropic mixtures have been added to the Heat Atlas. The authors also consider the positive effect of the rising bubble swarm on mixture boiling in horizontal tube bundles.

  15. Proposed efficient method for ticket booking (PEMTB) | Ahmed ...

    African Journals Online (AJOL)

    Journal of Fundamental and Applied Sciences. We used AngularJS and Ionic for the front end, Node.js and Express.js for the back end, and MongoDB for the database. ... Our proposed system is entirely paperless and digitalized.

  16. [Contribution of the cervical vertebral maturation (CVM) method to dentofacial orthopedics: update].

    Science.gov (United States)

    Elhaddaoui, R; Benyahia, H; Azaroual, F; Zaoui, F

    2014-11-01

    The successful orthopedic treatment of skeletal Class II malocclusions is closely related to the reasoned determination of the optimal time to initiate treatment. This is why various methods have been proposed to assess skeletal maturation, such as the hand-wrist radiograph or the cervical vertebral maturation (CVM) method. The hand-wrist radiograph has until now been the most frequently used method to assess skeletal maturation. However, the clinical and biological limitations of this technique, as well as the need to perform an additional radiograph, were reasons to develop another method, exploring the maturation stages of the cervical vertebrae visible on a simple lateral cephalometric radiograph. The authors compare the two methods and demonstrate the greater contribution of the CVM method compared with the hand-wrist radiograph. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  17. Updating the 2001 National Land Cover Database land cover classification to 2006 by using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.; Fry, Joyce

    2009-01-01

    The recently released U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001, which represents the nation's land cover status for a nominal date of 2001, is widely used as a baseline for national land cover conditions. To enable the updating of this land cover information in a consistent and continuous manner, a prototype method was developed to update land cover by individual Landsat path and row. This method updates NLCD 2001 to a nominal date of 2006 by using both Landsat imagery and data from NLCD 2001 as the baseline. Pairs of Landsat scenes from the same season in 2001 and 2006 were acquired by satellite path and row and normalized to allow calculation of change vectors between the two dates. Conservative thresholds based on Anderson Level I land cover classes were used to segregate the change vectors and determine areas of change and no-change. Once change areas had been identified, land cover classifications at the full NLCD resolution for the 2006 areas of change were completed by sampling from NLCD 2001 in unchanged areas. Methods were developed and tested across five Landsat path/row study sites that contain several metropolitan areas, including Seattle, Washington; San Diego, California; Sioux Falls, South Dakota; Jackson, Mississippi; and Manchester, New Hampshire. Results from the five study areas show that the vast majority of land cover change was captured and updated, with overall land cover classification accuracies of 78.32%, 87.5%, 88.57%, 78.36%, and 83.33% for these areas. The method optimizes mapping efficiency and has the potential to provide users with a flexible way to generate updated land cover at national and regional scales by using NLCD 2001 as the baseline.
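
    The change-vector step described above reduces, at its simplest, to computing a per-pixel spectral change magnitude between the two dates and applying a conservative threshold. The sketch below shows only that generic step on synthetic reflectance stacks; the actual NLCD procedure uses normalized Landsat scenes and class-specific (Anderson Level I) thresholds, which are not reproduced.

```python
# Generic change-vector analysis: per-pixel spectral change magnitude between
# two dates, thresholded into a change / no-change mask.
import numpy as np

def change_mask(image_t1, image_t2, threshold):
    """Boolean change mask from two (bands, rows, cols) reflectance stacks."""
    diff = image_t2.astype(float) - image_t1.astype(float)
    magnitude = np.sqrt((diff ** 2).sum(axis=0))    # change-vector magnitude
    return magnitude > threshold

# Synthetic 6-band scenes; a small block "develops" between the two dates.
rng = np.random.default_rng(0)
t1 = rng.uniform(0.05, 0.3, size=(6, 100, 100))
t2 = t1 + rng.normal(0, 0.005, t1.shape)
t2[:, 40:50, 40:50] += 0.2                          # simulated land cover change
mask = change_mask(t1, t2, threshold=0.1)
print(mask.sum())  # about 100 changed pixels; only these would be re-classified
```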

  18. Proposed test method for determining discharge rates from water closets

    DEFF Research Database (Denmark)

    Nielsen, V.; Fjord Jensen, T.

    At present, the rates at which discharge takes place from sanitary appliances are mostly known only in the form of estimated average values. SBI has developed a measuring method enabling determination of the exact rate of discharge from a sanitary appliance as a function of time. The method depends

  19. The Complete and Updated "Rotifer Polyculture Method" for Rearing First Feeding Zebrafish

    Science.gov (United States)

    Lawrence, Christian; Best, Jason; Cockington, Jason; Henry, Eric C.; Hurley, Shane; James, Althea; Lapointe, Christopher; Maloney, Kara; Sanders, Erik

    2016-01-01

    The zebrafish (Danio rerio) is a model organism of increasing importance in many fields of science. One of the most demanding technical aspects of culture of this species in the laboratory is rearing first-feeding larvae to the juvenile stage with high rates of growth and survival. The central management challenge of this developmental period revolves around delivering highly nutritious feed items to the fish on a nearly continuous basis without compromising water quality. Because larval zebrafish are well-adapted to feed on small zooplankton in the water column, live prey items such as brachionid rotifers, Artemia, and Paramecium are widely recognized as the feeds of choice, at least until the fish reach the juvenile stage and are able to efficiently feed on processed diets. This protocol describes a method whereby newly hatched zebrafish larvae are cultured together with live saltwater rotifers (Brachionus plicatilis) in the same system. This polyculture approach provides fish with an "on-demand", nutrient-rich live food source without producing chemical waste at levels that would otherwise limit performance. Importantly, because the system harnesses both the natural high productivity of the rotifers and the behavioral preferences of the fish, the labor involved with maintenance is low. The following protocol details an updated, step-by-step procedure that incorporates rotifer production (scalable to any desired level) for use in a polyculture of zebrafish larvae and rotifers that promotes maximal performance during the first 5 days of exogenous feeding. PMID:26863035

  20. BE-2004: International meeting on updates in best estimate methods in nuclear installation safety analysis. Proceedings

    International Nuclear Information System (INIS)

    2004-01-01

    BE-2004 is the second in a series of embedded conferences that focus on generating and sustaining the dialogue regarding the use of best estimate plus uncertainty tools to license operational and advanced nuclear systems. The first conference in the series was held during the 2000 American Nuclear Society Winter Meeting in Washington. BE-2004 is international in scope, as evidenced by the multinational sources of the papers, and is intended to serve as an opportunity for information exchange between research scientists, practicing engineers, and regulators. However, as appropriate to a follow-on conference, the primary theme of BE-2004 is to provide updates reflecting the progress in best estimate methodologies in the last four years. Examples include research activities that evolved from the current Generation-IV initiative and other new designs [Nuclear Energy Research Initiative (NERI), etc.], core design and neutronic calculations that support best estimate analysis, use of advanced methodologies to produce plant licensing procedures competitive with best estimate methods, and of course current philosophical and technical issues that need to be considered in implementing best estimate codes as an established part of the international licensing framework

  1. Qualitative methods in radiography research: a proposed framework

    International Nuclear Information System (INIS)

    Adams, J.; Smith, T.

    2003-01-01

    Introduction: While radiography is currently developing a research base, which is important in terms of professional development and informing practice and policy issues in the field, the amount of research published by radiographers remains limited. However, a range of qualitative methods offer further opportunities for radiography research. Purpose: This paper briefly introduces a number of key qualitative methods (qualitative interviews, focus groups, observational methods, diary methods and document/text analysis) and sketches one possible framework for future qualitative work in radiography research. The framework focuses upon three areas for study: intra-professional issues; inter-professional issues; and clinical practice, patient and health delivery issues. While the paper outlines broad areas for future focus rather than providing a detailed protocol for how individual pieces of research should be conducted, a few research questions have been chosen and examples of possible qualitative methods required to answer such questions are outlined for each area. Conclusion: Given the challenges and opportunities currently facing the development of a research base within radiography, the outline of key qualitative methods and broad areas suitable for their application is offered as a useful tool for those within the profession looking to embark upon or enhance their research career

  2. Generalizations of the limited-memory BFGS method based on the quasi-product form of update

    Czech Academy of Sciences Publication Activity Database

    Vlček, Jan; Lukšan, Ladislav

    2013-01-01

    Roč. 241, 15 March (2013), s. 116-129 ISSN 0377-0427 R&D Projects: GA ČR GA201/09/1957 Institutional research plan: CEZ:AV0Z10300504 Keywords : unconstrained minimization * variable metric methods * limited-memory methods * Broyden class updates * global convergence * numerical results Subject RIV: BA - General Mathematics Impact factor: 1.077, year: 2013

  3. Proposed New Method of Interpretation of Infrared Ship Signature Requirements

    NARCIS (Netherlands)

    Neele, F.P.; Wilson, M.T.; Youern, K.

    2005-01-01

    A new method of deriving and defining requirements for the infrared signature of new ships is presented. The current approach is to specify the maximum allowed temperature or radiance contrast of the ship with respect to its background. At present, in most NATO countries, it is the contractor's

  4. Kidney function measured by clearance. Methods and indications. An update; Zur Messung der Nierenfunktion durch Clearancebestimmungen. Methoden und Indikationen. Ein Update

    Energy Technology Data Exchange (ETDEWEB)

    Durand, E. [Universitaetskrankenhaus Bicetre, Paris (France); Mueller-Suur, R. [Karolinska Inst., Danderyds Krankenhaus und Aleris Fysiologlab, Stockholm (Sweden)

    2010-09-15

    Renal function impairment can be monitored by many tests. Measurement of the plasma creatinine level is the most widely used method; 24-h creatinine clearance with urine collection is used by others, and further alternative but indirect methods also exist. However, the use of radionuclide clearances is by far the easiest method to perform, has the highest accuracy and precision, and gives a very low radiation dose. In the following we discuss the different radiopharmaceuticals in use, their advantages and disadvantages, the different clearance methods in use and their limitations, and give some clinically important indications for performing clearance investigations according to consensus reports. In summary, the plasma clearance of 51-Cr-EDTA after a single injection with one blood sample can generally be recommended, with some modifications in special clinical situations, which are pointed out. Please note that this paper is, to a significant extent, an update of a previous paper published in 2003 in ''Der Nuklearmediziner''. (orig.)

  5. Signal predictions for a proposed fast neutron interrogation method

    International Nuclear Information System (INIS)

    Sale, K.E.

    1992-12-01

    We have applied the Monte Carlo radiation transport code COG to assess the utility of a proposed explosives detection scheme based on neutron emission. In this scheme a pulsed neutron beam is generated by an approximately 7 MeV deuteron beam incident on a thick Be target. A scintillation detector operating in current mode measures the neutrons transmitted through the object as a function of time. The flight time of unscattered neutrons from the source to the detector is simply related to the neutron energy. This information, along with neutron cross-section excitation functions, is used to infer the densities of H, C, N and O in the volume sampled. The code we have chosen enables us to create very detailed and realistic models of the geometrical configuration of the system, the neutron source and the detector response. By calculating the signals that would be observed for several configurations and compositions of the interrogated object, we can investigate and begin to understand how a system that could actually be fielded will perform. Using this modeling capability, many questions can be addressed early on, with substantial savings in time and cost and with improvements in performance. We will present our signal predictions for simple single-element test cases and for explosive compositions. From these studies it is clear that the interpretation of the signals from such an explosives identification system will pose a substantial challenge.
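
    The stated relation between flight time and neutron energy is simple non-relativistic kinematics once a source-to-detector distance is fixed. The sketch below uses an assumed 3 m flight path and hypothetical arrival times purely for illustration.

```python
# Time-of-flight to neutron kinetic energy (non-relativistic), adequate for the
# few-MeV neutrons from a ~7 MeV deuteron beam on a thick Be target.
import numpy as np

NEUTRON_MASS_MEV = 939.565      # neutron rest-mass energy [MeV]
C_M_PER_NS = 0.299792458        # speed of light [m/ns]

def neutron_energy_mev(flight_time_ns, path_length_m=3.0):
    """Kinetic energy [MeV] from time of flight [ns] over path_length_m."""
    beta = path_length_m / (flight_time_ns * C_M_PER_NS)   # v/c
    return 0.5 * NEUTRON_MASS_MEV * beta ** 2               # (1/2) m v^2

times = np.array([80.0, 120.0, 200.0])   # ns, hypothetical arrival times
print(neutron_energy_mev(times).round(2))  # roughly 7.3, 3.3 and 1.2 MeV
```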

  6. Centrifugal compressor shape modification using a proposed inverse design method

    International Nuclear Information System (INIS)

    Niliahmadabadi, Mahdi; Poursadegh, Farzad

    2013-01-01

    This paper is concerned with a quasi-3D design method for the radial and axial diffusers of a centrifugal compressor on the meridional plane. The method integrates a novel inverse design algorithm, called the ball-spine algorithm (BSA), and a quasi-3D analysis code. The Euler equations are solved on the meridional plane for a numerical domain whose unknown boundaries (hub and shroud) are iteratively modified under the BSA until a prescribed pressure distribution is reached. In BSA, the unknown walls are composed of a set of virtual balls that move freely along specified directions called spines. The difference between the target and current pressure distributions causes the flexible boundary to deform at each modification step. In validating the quasi-3D analysis code, a full 3D Navier-Stokes code is used to analyze the existing and designed compressors numerically. Comparison of the quasi-3D analysis results with the full 3D analysis results shows reasonable agreement. The 3D numerical analysis of the current compressor shows a large total pressure loss in the 90° bend between the radial and axial diffusers. Geometric modification of the meridional plane improves the efficiency by about 10%.
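
    A highly schematic sketch of the ball-spine update rule follows: each wall node ("ball") moves along its prescribed spine direction by an amount proportional to the local mismatch between the target and computed pressures. The quasi-3D flow solver is replaced by a dummy relation (widening the passage lowers the local pressure), so all functions and values are illustrative assumptions, not the authors' implementation.

```python
# Schematic ball-spine style boundary correction with a dummy "flow solver".
import numpy as np

def deform_wall(wall_y, spine_dir, p_current, p_target, relaxation=0.05):
    """Move each ball along its spine in proportion to the pressure mismatch.
    Sign chosen so that widening the passage lowers pressure in the dummy model."""
    return wall_y + relaxation * (p_current - p_target) * spine_dir

n = 50
wall = np.linspace(0.0, 1.0, n)            # initial hub/shroud coordinate (arbitrary)
spines = np.ones(n)                        # spine directions (all +y here)
p_target = np.linspace(1.0, 0.4, n)        # prescribed pressure distribution

def dummy_solver(wall_y):
    return 1.2 - 0.8 * wall_y              # stand-in for the quasi-3D analysis

for iteration in range(200):
    p = dummy_solver(wall)
    wall = deform_wall(wall, spines, p, p_target)
print(np.abs(dummy_solver(wall) - p_target).max())  # residual shrinks toward zero
```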

  7. A proposed method for fast determination of plasma parameters

    International Nuclear Information System (INIS)

    Braams, B.J.; Lackner, K.

    1984-09-01

    The method of function parametrization, developed and applied by H. Wind for fast data evaluation in high energy physics, is presented in the context of controlled fusion research. This method relies on statistical analysis of a data base of simulated experiments in order to obtain a functional representation for the intrinsic physical parameters of a system in terms of the values of the measurements. Some variations on Wind's original procedure are suggested. A specific application for tokamak experiments would be the determination of certain global parameters of the plasma, characterizing the current profile, shape of the cross-section, plasma pressure, and the internal inductance. The relevant measurements for this application include values of the poloidal field and flux external to the plasma, and a diamagnetic measurement. These may be combined with other diagnostics, such as electron-cyclotron emission and laser interferometry, in order to obtain also density and temperature profiles. There appears to be a capability for on-line determination of basic physical parameters, in a millisecond timescale on a minicomputer instead of in seconds on a large mainframe. (orig.)
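
    A minimal numpy sketch of the function-parametrization workflow described above: build a database of simulated experiments, fit a regression that maps the measurements directly to the physical parameters, then evaluate that fitted map for fast online inference. The toy forward model, the quadratic design and the parameter ranges are all hypothetical stand-ins for the real equilibrium and diagnostic simulations.

```python
# Function parametrization in miniature: simulate, regress, then invert fast.
import numpy as np

rng = np.random.default_rng(0)

def forward_model(params):
    """Toy 'diagnostics': a nonlinear map from 2 physical parameters to 5 signals."""
    a, b = params
    return np.array([a + b, a * b, np.sin(a) + b ** 2, a ** 2 - b, np.exp(0.1 * a) * b])

# 1. Database of simulated experiments over the expected parameter range.
params_db = rng.uniform(0.5, 2.0, size=(2000, 2))
meas_db = np.array([forward_model(p) for p in params_db])
meas_db += rng.normal(0, 0.01, meas_db.shape)       # simulated measurement noise

# 2. Fit a quadratic-in-the-measurements regression for each physical parameter.
def design(m):
    quad = np.einsum("ni,nj->nij", m, m).reshape(len(m), -1)
    return np.hstack([np.ones((len(m), 1)), m, quad])

coeffs, *_ = np.linalg.lstsq(design(meas_db), params_db, rcond=None)

# 3. Fast online step: one small matrix product per new set of measurements.
true_params = np.array([1.3, 0.9])
new_meas = forward_model(true_params)
print(design(new_meas[None, :]) @ coeffs, "vs true", true_params)
```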

  8. Centrifugal compressor shape modification using a proposed inverse design method

    Energy Technology Data Exchange (ETDEWEB)

    Niliahmadabadi, Mahdi [Isfahan University of Technology, Isfahan (Iran, Islamic Republic of); Poursadegh, Farzad [Sharif University of Technology, Tehran (Iran, Islamic Republic of)

    2013-03-15

    This paper is concerned with a quasi-3D design method for the radial and axial diffusers of a centrifugal compressor on the meridional plane. The method integrates a novel inverse design algorithm, called the ball-spine algorithm (BSA), and a quasi-3D analysis code. The Euler equations are solved on the meridional plane for a numerical domain whose unknown boundaries (hub and shroud) are iteratively modified under the BSA until a prescribed pressure distribution is reached. In BSA, the unknown walls are composed of a set of virtual balls that move freely along specified directions called spines. The difference between the target and current pressure distributions causes the flexible boundary to deform at each modification step. In validating the quasi-3D analysis code, a full 3D Navier-Stokes code is used to analyze the existing and designed compressors numerically. Comparison of the quasi-3D analysis results with the full 3D analysis results shows reasonable agreement. The 3D numerical analysis of the current compressor shows a large total pressure loss in the 90° bend between the radial and axial diffusers. Geometric modification of the meridional plane improves the efficiency by about 10%.

  9. Updating the 2001 National Land Cover Database Impervious Surface Products to 2006 using Landsat imagery change detection methods

    Science.gov (United States)

    Xian, George; Homer, Collin G.

    2010-01-01

    A prototype method was developed to update the U.S. Geological Survey (USGS) National Land Cover Database (NLCD) 2001 to a nominal date of 2006. NLCD 2001 is widely used as a baseline for national land cover and impervious cover conditions. To enable the updating of this database in an optimal manner, methods are designed to be accomplished by individual Landsat scene. Using conservative change thresholds based on land cover classes, areas of change and no-change were segregated from change vectors calculated from normalized Landsat scenes from 2001 and 2006. By sampling from NLCD 2001 impervious surface in unchanged areas, impervious surface predictions were estimated for changed areas within an urban extent defined by a companion land cover classification. Methods were developed and tested for national application across six study sites containing a variety of urban impervious surface. Results show the vast majority of impervious surface change associated with urban development was captured, with overall RMSE from 6.86 to 13.12% for these areas. Changes of urban development density were also evaluated by characterizing the categories of change by percentile for impervious surface. This prototype method provides a relatively low cost, flexible approach to generate updated impervious surface using NLCD 2001 as the baseline.

  10. A comprehensive change detection method for updating the National Land Cover Database to circa 2011

    Science.gov (United States)

    Jin, Suming; Yang, Limin; Danielson, Patrick; Homer, Collin G.; Fry, Joyce; Xian, George

    2013-01-01

    The importance of characterizing, quantifying, and monitoring land cover, land use, and their changes has been widely recognized by global and environmental change studies. Since the early 1990s, three U.S. National Land Cover Database (NLCD) products (circa 1992, 2001, and 2006) have been released as free downloads for users. The NLCD 2006 also provides land cover change products between 2001 and 2006. To continue providing updated national land cover and change datasets, a new initiative in developing NLCD 2011 is currently underway. We present a new Comprehensive Change Detection Method (CCDM), designed as a key component of the development of NLCD 2011, and the research results from two exemplar studies. The CCDM integrates spectral-based change detection algorithms, including a Multi-Index Integrated Change Analysis (MIICA) model and a novel change model called Zone, which extracts change information from two Landsat image pairs. The MIICA model is the core module of the change detection strategy and uses four spectral indices (CV, RCVMAX, dNBR, and dNDVI) to obtain the changes that occurred between two image dates. The CCDM also includes a knowledge-based system, which uses critical information on historical and current land cover conditions and trends and the likelihood of land cover change, to combine the changes from MIICA and Zone. For NLCD 2011, the improved and enhanced change products obtained from the CCDM provide critical information on the location, magnitude, and direction of potential change areas and serve as a basis for further characterizing land cover changes for the nation. An accuracy assessment of the two study areas shows 100% agreement between the CCDM-mapped no-change class and the reference dataset, and 18% and 82% disagreement for the change class for WRS path/rows p22r39 and p33r33, respectively. The strength of the CCDM is that the method is simple, easy to operate, widely applicable, and capable of capturing a variety of natural and
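
    Two of the four MIICA indices named above, dNBR and dNDVI, have standard band-ratio definitions; the sketch below computes them on synthetic reflectance arrays with assumed band names. The CV and RCVMAX components and the knowledge-based combination step are not reproduced.

```python
# Differenced NBR and NDVI between two dates, two of the spectral indices used
# by the MIICA change model. Band arrays are synthetic placeholders.
import numpy as np

def ndvi(nir, red):
    return (nir - red) / (nir + red + 1e-9)

def nbr(nir, swir2):
    return (nir - swir2) / (nir + swir2 + 1e-9)

def differenced_indices(t1, t2):
    """t1, t2: dicts of reflectance arrays with keys 'red', 'nir', 'swir2'."""
    dndvi = ndvi(t1["nir"], t1["red"]) - ndvi(t2["nir"], t2["red"])
    dnbr = nbr(t1["nir"], t1["swir2"]) - nbr(t2["nir"], t2["swir2"])
    return dnbr, dndvi

# Tiny synthetic example: vegetation loss in one corner between the two dates.
shape = (50, 50)
t1 = {"red": np.full(shape, 0.05), "nir": np.full(shape, 0.40), "swir2": np.full(shape, 0.10)}
t2 = {k: v.copy() for k, v in t1.items()}
t2["nir"][:10, :10] = 0.12            # lower NIR where vegetation was removed
t2["red"][:10, :10] = 0.12            # higher red over the disturbed surface
t2["swir2"][:10, :10] = 0.25          # higher SWIR2 over bare/burned surface
dnbr, dndvi = differenced_indices(t1, t2)
print((dnbr > 0.3).sum(), (dndvi > 0.3).sum())   # both flag the 100 changed pixels
```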

  11. Human Expeditions to Near-Earth Asteroids: An Update on NASA's Status and Proposed Activities for Small Body Exploration

    Science.gov (United States)

    Abell, Paul; Mazanek, Dan; Barbee, Brent; Landis, Rob; Johnson, Lindley; Yeomans, Don; Reeves, David; Drake, Bret; Friedensen, Victoria

    2013-01-01

    Over the past several years, much attention has been focused on the human exploration of near-Earth asteroids (NEAs). Two independent NASA studies examined the feasibility of sending piloted missions to NEAs, and in 2009, the Augustine Commission identified NEAs as high profile destinations for human exploration missions beyond the Earth-Moon system as part of the Flexible Path. More recently the current U.S. presidential administration directed NASA to include NEAs as destinations for future human exploration with the goal of sending astronauts to a NEA in the mid to late 2020s. This directive became part of the official National Space Policy of the United States of America as of June 28, 2010. The scientific and hazard mitigation benefits, along with the programmatic and operational benefits of a human venture beyond the Earth-Moon system, make a mission to a NEA using NASA's proposed exploration systems a compelling endeavor.

  12. Description and status update on GELLO: a proposed standardized object-oriented expression language for clinical decision support.

    Science.gov (United States)

    Sordo, Margarita; Boxwala, Aziz A; Ogunyemi, Omolola; Greenes, Robert A

    2004-01-01

    A major obstacle to sharing computable clinical knowledge is the lack of a common language for specifying expressions and criteria. Such a language could be used to specify decision criteria, formulae, and constraints on data and action. Although the Arden Syntax addresses this problem for clinical rules, its generalization to HL7's object-oriented data model is limited. The GELLO Expression language is an object-oriented language used for expressing logical conditions and computations in the GLIF3 (GuideLine Interchange Format, v. 3) guideline modeling language. It has been further developed under the auspices of the HL7 Clinical Decision Support Technical Committee, as a proposed HL7 standard. GELLO is based on the Object Constraint Language (OCL), because it is vendor-independent, object-oriented, and side-effect-free. GELLO expects an object-oriented data model. Although choice of model is arbitrary, standardization is facilitated by ensuring that the data model is compatible with the HL7 Reference Information Model (RIM).

  13. A Review of the Extraction and Determination Methods of Thirteen Essential Vitamins to the Human Body: An Update from 2010.

    Science.gov (United States)

    Zhang, Yuan; Zhou, Wei-E; Yan, Jia-Qing; Liu, Min; Zhou, Yu; Shen, Xin; Ma, Ying-Lin; Feng, Xue-Song; Yang, Jun; Li, Guo-Hui

    2018-06-19

    Vitamins are a class of essential nutrients in the body; thus, they play important roles in human health. The chemicals are involved in many physiological functions and both their lack and excess can put health at risk. Therefore, the establishment of methods for monitoring vitamin concentrations in different matrices is necessary. In this review, an updated overview of the main pretreatments and determination methods that have been used since 2010 is given. Ultrasonic assisted extraction, liquid–liquid extraction, solid phase extraction and dispersive liquid–liquid microextraction are the most common pretreatment methods, while the determination methods involve chromatography methods, electrophoretic methods, microbiological assays, immunoassays, biosensors and several other methods. Different pretreatments and determination methods are discussed.

  14. A Review of the Extraction and Determination Methods of Thirteen Essential Vitamins to the Human Body: An Update from 2010

    Directory of Open Access Journals (Sweden)

    Yuan Zhang

    2018-06-01

    Full Text Available Vitamins are a class of essential nutrients in the body; thus, they play important roles in human health. The chemicals are involved in many physiological functions and both their lack and excess can put health at risk. Therefore, the establishment of methods for monitoring vitamin concentrations in different matrices is necessary. In this review, an updated overview of the main pretreatments and determination methods that have been used since 2010 is given. Ultrasonic assisted extraction, liquid–liquid extraction, solid phase extraction and dispersive liquid–liquid microextraction are the most common pretreatment methods, while the determination methods involve chromatography methods, electrophoretic methods, microbiological assays, immunoassays, biosensors and several other methods. Different pretreatments and determination methods are discussed.

  15. Update on CNSC's readiness to regulate projects proposing the use of Small Modular Reactors (SMR)

    International Nuclear Information System (INIS)

    De Vos, M.; Lee, K.

    2014-01-01

    Over the past few years, Canadian Nuclear Safety Commission (CNSC) staff have been working to identify and understand key regulatory and technical issues that may be encountered in Small Modular Reactor (SMR) deployment scenarios in Canada. This work is considered necessary not only to be ready to engage with vendors and utilities in technical and licensing discussions, but also to prepare to disseminate objective scientific, technical and regulatory information to the public. Beyond size differences, SMRs are reactor-based facilities. The main finding from CNSC's work to date is that most, if not all, of the regulatory issues to be addressed from a Canadian perspective are due to the alternate or novel approaches that proponents of SMRs are proposing and that present uncertainties from the perspective of proven technology or public acceptance. These uncertainties represent risks that need to be mitigated by proponents before the environmental assessment and licensing processes are initiated. Examples of alternate or novel approaches include, but are not limited to: non-traditional siting scenarios (remote regions, near industrial facilities, in urban areas); increased use of physical design measures to reduce the need for security personnel; fleet-based regional emergency planning and response; design-specific passive design features; and remote operation of the facility. This paper covers two main themes: 1. CNSC staff have made significant progress in the ongoing characterization of key regulatory and licensing issues that may emerge in the deployment of both large and/or small SMRs in Canada. This work informs CNSC's regulatory framework activities. 2. Many regulatory framework development activities (e.g. REGDOCs) have either already accounted for SMR concepts in the development of requirements and guidance or are planning to do so. This includes taking into account the ability to use a graded approach. (author)

  16. An empirical Bayes method for updating inferences in analysis of quantitative trait loci using information from related genome scans.

    Science.gov (United States)

    Zhang, Kui; Wiener, Howard; Beasley, Mark; George, Varghese; Amos, Christopher I; Allison, David B

    2006-08-01

    Individual genome scans for quantitative trait loci (QTL) mapping often suffer from low statistical power and imprecise estimates of QTL location and effect. This lack of precision yields large confidence intervals for QTL location, which are problematic for subsequent fine mapping and positional cloning. In prioritizing areas for follow-up after an initial genome scan and in evaluating the credibility of apparent linkage signals, investigators typically examine the results of other genome scans of the same phenotype and informally update their beliefs about which linkage signals in their scan most merit confidence and follow-up via a subjective-intuitive integration approach. A method that acknowledges the wisdom of this general paradigm but formally borrows information from other scans to increase confidence in objectivity would be a benefit. We developed an empirical Bayes analytic method to integrate information from multiple genome scans. The linkage statistic obtained from a single genome scan study is updated by incorporating statistics from other genome scans as prior information. This technique does not require that all studies have an identical marker map or a common estimated QTL effect. The updated linkage statistic can then be used for the estimation of QTL location and effect. We evaluate the performance of our method by using extensive simulations based on actual marker spacing and allele frequencies from available data. Results indicate that the empirical Bayes method can account for between-study heterogeneity, estimate the QTL location and effect more precisely, and provide narrower confidence intervals than results from any single individual study. We also compared the empirical Bayes method with a method originally developed for meta-analysis (a closely related but distinct purpose). In the face of marked heterogeneity among studies, the empirical Bayes method outperforms the comparator.
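
    The general idea of borrowing strength from other scans can be illustrated with a generic precision-weighted (normal-normal) empirical Bayes shrinkage. This is a sketch of the paradigm only, not the authors' estimator: the method-of-moments heterogeneity estimate and the assumption of aligned marker maps are simplifications introduced here.

        import numpy as np

        def eb_update(local_est, local_var, other_ests, other_vars):
            """Shrink one scan's QTL effect estimate toward a prior built from
            other genome scans at the same map position (generic EB sketch).

            local_est/local_var: estimate and sampling variance from this scan.
            other_ests/other_vars: estimates and variances from the other scans.
            """
            other_ests = np.asarray(other_ests, dtype=float)
            other_vars = np.asarray(other_vars, dtype=float)

            # Prior mean/variance from the other scans (method of moments,
            # allowing for between-study heterogeneity).
            w = 1.0 / other_vars
            prior_mean = np.sum(w * other_ests) / np.sum(w)
            between = max(np.var(other_ests, ddof=1) - np.mean(other_vars), 0.0)
            prior_var = between + 1.0 / np.sum(w)

            # Normal-normal posterior (updated) estimate for the local scan.
            shrink = prior_var / (prior_var + local_var)
            post_mean = shrink * local_est + (1.0 - shrink) * prior_mean
            post_var = shrink * local_var
            return post_mean, post_var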

  17. Construction Method of Display Proposal for Commodities in Sales Promotion by Genetic Algorithm

    Science.gov (United States)

    Yumoto, Masaki

    In a sales promotion task, a wholesaler prepares and presents a display proposal for commodities in order to negotiate with a retailer's buyers about which commodities they should sell. To automate sales promotion tasks, the proposal has to be constructed according to the target retailer's buyer. However, it is difficult to construct a proposal suitable for the target retail store because of the very large number of possible combinations of commodities. This paper proposes a construction method based on a genetic algorithm (GA). The proposed method represents initial display proposals as genes, improves them by GA using an evaluation value, and rearranges the proposal with the highest evaluation value according to the commodity classification. Through a practical experiment, we confirmed that the display proposal produced by the proposed method is similar to one constructed by a wholesaler.
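
    A minimal sketch of such a GA is given below. The chromosome encoding (an ordered list of commodity IDs filling display slots), the truncation selection scheme, and the fitness function are illustrative assumptions; in the paper the evaluation value encodes the wholesaler's knowledge of the target retail store.

        import random

        def build_display_proposal(commodities, fitness, n_slots=20,
                                   pop_size=30, generations=100, p_mut=0.1):
            """Evolve an ordered list of commodity IDs; `fitness` scores how well
            a proposal suits the target store (its definition is assumed here)."""
            def random_individual():
                return random.sample(commodities, n_slots)

            def crossover(a, b):
                cut = random.randint(1, n_slots - 1)
                head = a[:cut]
                child = head + [c for c in b if c not in head]
                return child[:n_slots]

            def mutate(ind):
                if random.random() < p_mut:
                    i, j = random.sample(range(n_slots), 2)
                    ind[i], ind[j] = ind[j], ind[i]
                return ind

            pop = [random_individual() for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)
                parents = pop[:pop_size // 2]          # truncation selection
                children = [mutate(crossover(random.choice(parents),
                                             random.choice(parents)))
                            for _ in range(pop_size - len(parents))]
                pop = parents + children
            # Best proposal; it would then be rearranged by commodity category.
            return max(pop, key=fitness)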

  18. Antiretroviral treatment cohort analysis using time-updated CD4 counts: assessment of bias with different analytic methods.

    Directory of Open Access Journals (Sweden)

    Katharina Kranzer

    Full Text Available Survival analysis using time-updated CD4+ counts during antiretroviral therapy is frequently employed to determine risk of clinical events. The time-point when the CD4+ count is assumed to change potentially biases effect estimates, but the methods used to estimate this are infrequently reported. This study examined the effect of three different estimation methods: assuming (i) a constant CD4+ count from the date of measurement until the date of the next measurement, (ii) a constant CD4+ count from the midpoint of the preceding interval until the midpoint of the subsequent interval, and (iii) a linear interpolation between consecutive CD4+ measurements to provide additional midpoint measurements. Person-time, tuberculosis rates and hazard ratios by CD4+ stratum were compared using all available CD4+ counts (measurement frequency 1-3 months) and 6-monthly measurements from a clinical cohort. Simulated data were used to compare the extent of bias introduced by these methods. The midpoint method gave the closest fit to person-time spent with low CD4+ counts and for hazard ratios for outcomes, both in the clinical dataset and the simulated data. The midpoint method presents a simple option to reduce bias in time-updated CD4+ analysis, particularly at low CD4 cell counts and rapidly increasing counts after ART initiation.
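
    The three time-updating schemes compared above can be sketched as follows; the function name, argument layout, and the treatment of times outside the measurement range are assumptions made for illustration.

        def cd4_on(day, measurements, method="midpoint"):
            """Return the CD4 count assumed to apply at time `day`.

            `measurements` is a time-sorted list of (time, cd4) pairs; times may
            be datetime.date objects or plain numbers.  The schemes mirror those
            compared in the abstract:
              'carry_forward' - value held constant until the next measurement
              'midpoint'      - value switches at the midpoint between measurements
              'interpolate'   - linear interpolation between measurements
            """
            times, values = zip(*measurements)
            if day <= times[0]:
                return values[0]
            if day >= times[-1]:
                return values[-1]
            for i in range(len(times) - 1):
                t0, t1 = times[i], times[i + 1]
                if t0 <= day < t1:
                    if method == "carry_forward":
                        return values[i]
                    if method == "midpoint":
                        return values[i] if (day - t0) < (t1 - day) else values[i + 1]
                    if method == "interpolate":
                        frac = (day - t0) / (t1 - t0)
                        return values[i] + frac * (values[i + 1] - values[i])
                    raise ValueError("unknown method: %s" % method)
            raise ValueError("day not covered by the measurement sequence")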

  19. Proposals for Updating Tai Algorithm

    Science.gov (United States)

    1997-12-01

    1997 meeting, the Comité International des Poids et Mesures (CIPM) decided to change the name of the Comité Consultatif pour la Définition de la ...Report of the BIPM Time Section, 1988, 1, D1-D22. [2] P. Tavella, C. Thomas, Comparative study of time scale algorithms, Metrologia, 1991, 28, 57...alternative choice for implementing an upper limit of clock weights, Metrologia, 1996, 33, 227-240. [5] C. Thomas, Impact of New Clock Technologies

  20. Sensitivity study of a method for updating a finite element model of a nuclear power station cooling tower

    International Nuclear Information System (INIS)

    Billet, L.

    1994-01-01

    The Research and Development Division of Electricité de France is developing a surveillance method for cooling towers based on on-site measurements of the wind-induced response. The method is intended to detect structural damage in the tower. The damage is identified by tuning a finite element model of the tower on experimental mode shapes and eigenfrequencies. The sensitivity of the method was evaluated through numerical tests. First, the dynamic response of a damaged tower was simulated by varying the stiffness of some area of the model shell (from 1% to 24% of the total shell area). Second, the structural parameters of the undamaged cooling tower model were updated in order to make the output of the undamaged model as close as possible to the synthetic experimental data. The updating method, based on the minimization of the differences between experimental modal energies and modal energies calculated by the model, did not detect a stiffness change affecting less than 3% of the shell area. Such a sensitivity is thought to be insufficient to detect tower cracks, which behave like highly localized defects. (author). 8 refs., 9 figs., 6 tabs

  1. Evaluation of a proposed optimization method for discrete-event simulation models

    Directory of Open Access Journals (Sweden)

    Alexandre Ferreira de Pinho

    2012-12-01

    Full Text Available Optimization methods combined with computer-based simulation have been utilized in a wide range of manufacturing applications. However, with current technology, these methods exhibit low performance and are able to manipulate only a single decision variable at a time. Thus, the objective of this article is to evaluate a proposed optimization method for discrete-event simulation models, based on genetic algorithms, which is more efficient in terms of computational time than software packages on the market. It should be emphasized that the quality of the response variables will not be altered; that is, the proposed method maintains the effectiveness of the solutions. The study therefore compares the proposed method with a simulation tool that is already available on the market and has been examined in the academic literature. Conclusions are presented, confirming the proposed optimization method's efficiency.

  2. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. Recent availability of data allows updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens’s data are used to demonstrate performance of this method in updating parameters of the chicken processing line model.
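
    A minimal conjugate (beta-binomial) sketch of Bayesian updating of an expert-elicited probability with observed counts is shown below. The actual model in Nauta et al. (2005) and the structure of the Berrang and Dickens data are more involved, so the parameterization and the notion of a single "transfer probability" here are purely illustrative.

        from scipy import stats

        def update_transfer_probability(prior_mean, prior_strength,
                                        k_positive, n_samples):
            """Combine an expert-elicited probability with plant data using a
            conjugate Beta-Binomial model (generic sketch).

            prior_mean:     expert's best estimate of the probability
            prior_strength: pseudo-sample size expressing confidence in the expert
            k_positive:     positive (contaminated) samples observed
            n_samples:      total samples observed
            """
            a0 = prior_mean * prior_strength
            b0 = (1.0 - prior_mean) * prior_strength
            a1, b1 = a0 + k_positive, b0 + (n_samples - k_positive)
            posterior = stats.beta(a1, b1)
            return posterior.mean(), posterior.interval(0.95)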

  3. A Proposal of Operational Risk Management Method Using FMEA for Drug Manufacturing Computerized System

    Science.gov (United States)

    Takahashi, Masakazu; Nanba, Reiji; Fukue, Yoshinori

    This paper proposes an operational Risk Management (RM) method using Failure Mode and Effects Analysis (FMEA) for a drug manufacturing computerized system (DMCS). The quality of the drug must not be influenced by failures and operational mistakes of the DMCS. To avoid such a situation, sufficient risk assessment has to be conducted for the DMCS and precautions have to be taken. We propose an operational RM method using FMEA for the DMCS. To develop the method, we gathered and compared FMEA results for DMCSs and developed a list that contains failure modes, failures, and countermeasures. By applying this list, we can conduct RM in the design phase, find failures, and implement countermeasures efficiently. Additionally, we can find failures that have not been identified before.
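
    The core FMEA bookkeeping behind such a list can be sketched as follows; the 1-10 rating scales and the review threshold are conventional FMEA assumptions, not values prescribed by the paper.

        from dataclasses import dataclass

        @dataclass
        class FailureMode:
            component: str
            mode: str
            severity: int      # 1 (negligible) .. 10 (hazardous); scale is an assumption
            occurrence: int    # 1 (remote) .. 10 (very frequent)
            detection: int     # 1 (almost certain detection) .. 10 (undetectable)
            countermeasure: str = ""

            @property
            def rpn(self) -> int:
                # Risk Priority Number, the conventional FMEA ranking metric
                return self.severity * self.occurrence * self.detection

        def prioritize(failure_modes, threshold=100):
            """Return failure modes whose RPN meets a review threshold,
            highest risk first (threshold value is illustrative)."""
            flagged = [fm for fm in failure_modes if fm.rpn >= threshold]
            return sorted(flagged, key=lambda fm: fm.rpn, reverse=True)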

  4. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    Directory of Open Access Journals (Sweden)

    Jiashang Jiang

    2018-01-01

    Full Text Available A new direct method for the finite element (FE matrix updating problem in a hysteretic (or material damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.
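
    For readers unfamiliar with this class of problems, a hedged sketch of the kind of constrained matrix-nearness formulation such direct updating methods solve is (the paper's exact norm, weighting, and constraints may differ):

        \min_{K,\,D}\ \|K - K_a\|_F^2 + \|D - D_a\|_F^2
        \quad \text{subject to} \quad (K + \mathrm{i}\,D)\,\Phi = M_a\,\Phi\,\Lambda,

    where $M_a$, $K_a$ and $D_a$ are the analytical mass, stiffness and hysteretic damping matrices, $(\Lambda, \Phi)$ collects the measured (incomplete) eigenvalues and mode shapes, and the minimizers are additionally required to stay symmetric and to preserve the connectivity (sparsity) pattern of the original model.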

  5. Updating Stiffness and Hysteretic Damping Matrices Using Measured Modal Data

    OpenAIRE

    Jiashang Jiang; Yongxin Yuan

    2018-01-01

    A new direct method for the finite element (FE) matrix updating problem in a hysteretic (or material) damping model based on measured incomplete vibration modal data is presented. With this method, the optimally approximated stiffness and hysteretic damping matrices can be easily constructed. The physical connectivity of the original model is preserved and the measured modal data are embedded in the updated model. The numerical results show that the proposed method works well.

  6. Applicability of the proposed evaluation method for social infrastructures to nuclear power plants

    International Nuclear Information System (INIS)

    Ichimura, Tomiyasu

    2015-01-01

    This study proposes an evaluation method for social infrastructures and verifies its applicability by applying it to nuclear power plants, which are a type of social infrastructure. In the proposed evaluation method, the authors chose four evaluation viewpoints and proposed common evaluation standards for the evaluation indexes obtained from each viewpoint. By applying this system to the evaluation of nuclear power plants, example evaluation indexes were obtained for each evaluation viewpoint. Furthermore, when the levels of the common evaluation standards of the proposed method were applied to the evaluation of the regulation-based activities of nuclear power plants, it was confirmed that these activities are at the highest level. Through this application validation, it was clarified that the proposed evaluation method for social infrastructures has a certain effectiveness. The four evaluation viewpoints are 'service,' 'environment,' 'action factor,' and 'operation and management.' Some of the application examples for a nuclear power plant are as follows: (1) from the viewpoint of service: the operation rate of the power plant and operation costs; and (2) from the viewpoint of environment: external influence related to nuclear waste and radioactivity, and external effects related to cooling water. (A.O.)

  7. Updates of CORESTA Recommended Methods after Further Collaborative Studies Carried Out under Both ISO and Health Canada Intense Smoking Regimes

    Directory of Open Access Journals (Sweden)

    Purkis SW

    2014-12-01

    Full Text Available During 2012, three CORESTA Recommended Methods (CRMs) (1-3) were updated to include smoke yield and variability data under both ISO (4) and the Canadian Intense (CI) (5) smoking regimes. At that time, repeatability and reproducibility data under the CI regime on smoke analytes other than “tar”, nicotine and carbon monoxide (6) and tobacco-specific nitrosamines (TSNAs) (7) were not available in the public literature. The subsequent work involved the determination of the mainstream smoke yields of benzo[a]pyrene, selected volatiles (benzene, toluene, 1,3-butadiene, isoprene, acrylonitrile), and selected carbonyls (acetaldehyde, formaldehyde, propionaldehyde, butyraldehyde, crotonaldehyde, acrolein, acetone and 2-butanone) in ten cigarette products, followed by statistical analyses according to the ISO protocol (8). This paper provides some additional perspective on the data variability under the ISO and CI smoking regimes not given in the CRMs.

  8. Newton’s method an updated approach of Kantorovich’s theory

    CERN Document Server

    Ezquerro Fernández, José Antonio

    2017-01-01

    This book shows the importance of studying semilocal convergence in iterative methods through Newton's method and addresses the most important aspects of Kantorovich's theory, including related studies. Kantorovich's theory for Newton's method used techniques of functional analysis to prove the semilocal convergence of the method by means of the well-known majorant principle. To gain a deeper understanding of these techniques, the authors return to the beginning and present a detailed, in-depth treatment of Kantorovich's theory for Newton's method, in which they include old results for a historical perspective and for comparison with new results, refine old results, and prove their most relevant results; alternative approaches leading to new sufficient semilocal convergence criteria for Newton's method are also given. The book contains many numerical examples involving nonlinear integral equations, two boundary value problems and systems of nonlinear equations related to numerous physical phenomena. The book i...
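
    For reference, the iteration whose semilocal convergence Kantorovich's theory addresses is the classical Newton step. A minimal scalar sketch is shown below; the Kantorovich conditions on f, f' and the starting point (which guarantee convergence) are not checked here, and the tolerance and iteration cap are arbitrary choices.

        def newton(f, df, x0, tol=1e-12, max_iter=50):
            """Classical Newton iteration x_{k+1} = x_k - f(x_k)/f'(x_k)
            for a scalar equation f(x) = 0."""
            x = x0
            for _ in range(max_iter):
                fx = f(x)
                if abs(fx) < tol:
                    return x
                x = x - fx / df(x)
            raise RuntimeError("Newton iteration did not converge")

        # Example: square root of 2 as the root of f(x) = x**2 - 2
        root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.5)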

  9. Optimal plot size in the evaluation of papaya scions: proposal and comparison of methods

    Directory of Open Access Journals (Sweden)

    Humberto Felipe Celanti

    Full Text Available Evaluating the quality of scions is extremely important, and it can be done through characteristics of the shoots and roots. This experiment evaluated the height of the aerial part, stem diameter, number of leaves, petiole length, and root length of papaya seedlings. Analyses were performed on a blank trial with 240 seedlings of "Golden Pecíolo Curto". The optimum plot size was determined by applying the maximum curvature method, the maximum curvature method of the coefficient of variation, and a newly proposed method, which incorporates bootstrap resampling simulation into the maximum curvature method. According to the results obtained, five is the optimal number of seedlings of papaya "Golden Pecíolo Curto" per plot. The proposed method of bootstrap simulation with replacement provides optimal plot sizes equal to or larger than those of the maximum curvature method, and the same plot size as the maximum curvature method of the coefficient of variation.
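
    The resampling idea behind the proposed method can be sketched as follows; the grouping scheme, the number of bootstrap replicates, and the use of the CV of plot means are illustrative assumptions, and the optimal plot size would then be read off at the point of maximum curvature of the resulting curve.

        import numpy as np

        def cv_by_plot_size(obs, plot_sizes, n_boot=2000, seed=None):
            """For each candidate plot size k, bootstrap-resample the trial
            observations (with replacement), average them in groups of k, and
            record the coefficient of variation of the plot means."""
            rng = np.random.default_rng(seed)
            obs = np.asarray(obs, dtype=float)
            cvs = {}
            for k in plot_sizes:
                boot_cv = []
                for _ in range(n_boot):
                    groups = rng.choice(obs, size=(len(obs) // k, k), replace=True)
                    means = groups.mean(axis=1)
                    boot_cv.append(100.0 * means.std(ddof=1) / means.mean())
                cvs[k] = float(np.mean(boot_cv))
            return cvs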

  10. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Je Hyun; Shim, Chang Ho [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Kim, Sung Hyun [Nuclear Fuel Cycle Waste Treatment Research Division, Research Reactor Institute, Kyoto University, Osaka (Japan); Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo [Ionizing Radiation Center, Nuclear Fuel Cycle Waste Treatment Research Division, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho [Ionizing Radiation Center, Korea Research Institute of Standards and Science, Daejeon (Korea, Republic of)

    2016-12-15

    For verification of newly developed neutron absorbers, one of the guidelines on their qualification and acceptance is the neutron attenuation test. However, this approach has a limitation for qualification: it cannot distinguish how the neutrons are attenuated by the material. In this study, an estimation method for the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For verification of the proposed method, MCNP simulations with the experimental system designed in this study were performed using polyethylene, iron, normal glass, and the vitrified form. The results show that the method can easily test neutron absorption ability using a single-absorber model. Also, from the simulation results of the single-absorber and double-absorber models, it is verified that the proposed method can evaluate not only the direct thermal neutrons passing through the materials but also the scattered neutrons reflected by the materials. Therefore, neutron absorption performance can be estimated more accurately using the proposed method than with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers.

  11. A proposal on evaluation method of neutron absorption performance to substitute conventional neutron attenuation test

    International Nuclear Information System (INIS)

    Kim, Je Hyun; Shim, Chang Ho; Kim, Sung Hyun; Choe, Jung Hun; Cho, In Hak; Park, Hwan Seo; Park, Hyun Seo; Kim, Jung Ho; Kim, Yoon Ho

    2016-01-01

    For verification of newly developed neutron absorbers, one of the guidelines on their qualification and acceptance is the neutron attenuation test. However, this approach has a limitation for qualification: it cannot distinguish how the neutrons are attenuated by the material. In this study, an estimation method for the neutron absorption performance of materials is proposed that detects both directly penetrating and back-scattered neutrons. For verification of the proposed method, MCNP simulations with the experimental system designed in this study were performed using polyethylene, iron, normal glass, and the vitrified form. The results show that the method can easily test neutron absorption ability using a single-absorber model. Also, from the simulation results of the single-absorber and double-absorber models, it is verified that the proposed method can evaluate not only the direct thermal neutrons passing through the materials but also the scattered neutrons reflected by the materials. Therefore, neutron absorption performance can be estimated more accurately using the proposed method than with the conventional neutron attenuation test. It is expected that the proposed method can contribute to increasing the reliability of the performance of neutron absorbers.

  12. Proposed Sandia frequency shift for anti-islanding detection method based on artificial immune system

    Directory of Open Access Journals (Sweden)

    A.Y. Hatata

    2018-03-01

    Full Text Available Sandia frequency shift (SFS) is one of the active anti-islanding detection methods that depend on frequency drift to detect an islanding condition for inverter-based distributed generation. The non-detection zone (NDZ) of the SFS method depends to a great extent on its parameters. Improper adjustment of these parameters may result in failure of the method. This paper presents a proposed artificial immune system (AIS)-based technique to obtain optimal parameters of the SFS anti-islanding detection method. The immune system is highly distributed, highly adaptive, and self-organizing in nature, maintains a memory of past encounters, and has the ability to continually learn about new encounters. The proposed method generates less total harmonic distortion (THD) than the conventional SFS, which results in faster island detection and a better non-detection zone. The performance of the proposed method is derived analytically and simulated using Matlab/Simulink. Two case studies are used to verify the proposed method. The first case includes a photovoltaic (PV) system connected to the grid and the second includes a wind turbine connected to the grid. The deduced optimized parameter setting helps to achieve the “non-islanding inverter” as well as the least potential adverse impact on power quality. Keywords: Anti-islanding detection, Sandia frequency shift (SFS), Non-detection zone (NDZ), Total harmonic distortion (THD), Artificial immune system (AIS), Clonal selection algorithm
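
    For context, the conventional SFS drift law that the AIS tunes can be sketched as below: the chopping fraction grows with the deviation of the measured frequency from nominal, pushing an islanded system further off-frequency until the protection trips. The parameters cf0 and the accelerator gain K are what the paper optimizes; the numerical values and relay limits shown are placeholders.

        def sfs_chopping_fraction(f_measured, f_nominal=60.0, cf0=0.05, k_gain=0.1):
            """Standard SFS drift law: cf = cf0 + K * (f - f_nominal).
            cf0 and k_gain are placeholder values, not the optimized settings."""
            return cf0 + k_gain * (f_measured - f_nominal)

        def trips(f_measured, f_min=59.3, f_max=60.5):
            # Illustrative over/under-frequency relay limits (IEEE 1547-style)
            return f_measured < f_min or f_measured > f_max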

  13. Computer based methods for measurement of joint space width: update of an ongoing OMERACT project

    NARCIS (Netherlands)

    Sharp, John T.; Angwin, Jane; Boers, Maarten; Duryea, Jeff; von Ingersleben, Gabriele; Hall, James R.; Kauffman, Joost A.; Landewé, Robert; Langs, Georg; Lukas, Cédric; Maillefert, Jean-Francis; Bernelot Moens, Hein J.; Peloschek, Philipp; Strand, Vibeke; van der Heijde, Désirée

    2007-01-01

    Computer-based methods of measuring joint space width (JSW) could potentially have advantages over scoring joint space narrowing, with regard to increased standardization, sensitivity, and reproducibility. In an early exercise, 4 different methods showed good agreement on measured change in JSW over

  14. Proposal and Evaluation of Management Method for College Mechatronics Education Applying the Project Management

    Science.gov (United States)

    Ando, Yoshinobu; Eguchi, Yuya; Mizukawa, Makoto

    In this research, we proposed and evaluated a management method for college mechatronics education by applying project management techniques. We applied our management method to the seminar “Microcomputer Seminar” for third-year students in the Department of Electrical Engineering, Shibaura Institute of Technology. We succeeded in managing the Microcomputer Seminar in 2006 and obtained a good evaluation of our management method by means of a questionnaire.

  15. A proposed impact assessment method for genetically modified plants (AS-GMP Method)

    International Nuclear Information System (INIS)

    Jesus-Hitzschky, Katia Regina Evaristo de; Silveira, Jose Maria F.J. da

    2009-01-01

    An essential step in the development of products based on biotechnology is an assessment of their potential economic impacts and safety, including an evaluation of the potential impact of transgenic crops and practices related to their cultivation on the environment and human or animal health. The purpose of this paper is to provide an assessment method to evaluate the impact of biotechnologies that uses quantifiable parameters and allows a comparative analysis between conventional technology and technologies using GMOs. This paper introduces a method to perform an impact analysis associated with the commercial release and use of genetically modified plants, the Assessment System GMP Method. The assessment is performed through indicators that are arranged according to their dimension: environmental, economic, social, capability, and institutional. To perform an accurate evaluation of the GMP, specific indicators related to genetic modification are grouped into common fields: genetic insert features, GM plant features, gene flow, food/feed field, introduction of the GMP, unexpected occurrences, and specific indicators. The novelty is the possibility of including parameters specific to the biotechnology under assessment. In this case-by-case analysis, the moderation factors and the indexes are parameterized to perform the assessment.

  16. Validation of a method for assessing resident physicians' quality improvement proposals.

    Science.gov (United States)

    Leenstra, James L; Beckman, Thomas J; Reed, Darcy A; Mundell, William C; Thomas, Kris G; Krajicek, Bryan J; Cha, Stephen S; Kolars, Joseph C; McDonald, Furman S

    2007-09-01

    Residency programs involve trainees in quality improvement (QI) projects to evaluate competency in systems-based practice and practice-based learning and improvement. Valid approaches to assess QI proposals are lacking. We developed an instrument for assessing resident QI proposals, the Quality Improvement Proposal Assessment Tool (QIPAT-7), and determined its validity and reliability. QIPAT-7 content was initially obtained from a national panel of QI experts. Through an iterative process, the instrument was refined, pilot-tested, and revised. Seven raters used the instrument to assess 45 resident QI proposals. Principal factor analysis was used to explore the dimensionality of instrument scores. Cronbach's alpha and intraclass correlations were calculated to determine internal consistency and interrater reliability, respectively. QIPAT-7 items comprised a single factor (eigenvalue = 3.4) suggesting a single assessment dimension. Interrater reliability for each item (range 0.79 to 0.93) and internal consistency reliability among the items (Cronbach's alpha = 0.87) were high. This method for assessing resident physician QI proposals is supported by content and internal structure validity evidence. QIPAT-7 is a useful tool for assessing resident QI proposals. Future research should determine the reliability of QIPAT-7 scores in other residency and fellowship training programs. Correlations should also be made between assessment scores and criteria for QI proposal success such as implementation of QI proposals, resident scholarly productivity, and improved patient outcomes.

  17. 78 FR 22540 - Notice of Public Meeting/Webinar: EPA Method Development Update on Drinking Water Testing Methods...

    Science.gov (United States)

    2013-04-16

    .... Environmental Protection Agency (EPA) Office of Ground Water and Drinking Water, Standards and Risk Management.../fem/agency_methods.htm . USEPA. 2009. Method Validation of U.S. Environmental Protection Agency... ENVIRONMENTAL PROTECTION AGENCY [EPA-HQ-OW-2013-0213; FRL-9803-5] Notice of Public Meeting/Webinar...

  18. Analytical methods for determination of mycotoxins: An update (2009-2014).

    Science.gov (United States)

    Turner, Nicholas W; Bramhmbhatt, Heli; Szabo-Vezse, Monika; Poma, Alessandro; Coker, Raymond; Piletsky, Sergey A

    2015-12-11

    Mycotoxins are a problematic and toxic group of small organic molecules that are produced as secondary metabolites by several fungal species that colonise crops. They lead to contamination at both the field and postharvest stages of food production, with a considerable range of foodstuffs affected, from coffee and cereals to dried fruit and spices. With the wide-ranging structural diversity of mycotoxins, the severe toxic effects caused by these molecules and their high chemical stability, the requirement for robust and effective detection methods is clear. This paper builds on our previous review and summarises the most recent advances in this field, in the years 2009-2014 inclusive. This review summarises traditional methods such as chromatographic and immunochemical techniques, as well as newer approaches such as biosensors and optical techniques, which are becoming more prevalent. A section on sampling and sample treatment has been prepared to highlight the importance of this step in the analytical methods. We close with a look at emerging technologies that will bring effective and rapid analysis out of the laboratory and into the field.

  19. Proposal of a method for evaluating tsunami risk using response-surface methodology

    Science.gov (United States)

    Fukutani, Y.

    2017-12-01

    Information on probabilistic tsunami inundation hazards is needed to define and evaluate tsunami risk. Several methods for calculating these hazards have been proposed (e.g. Løvholt et al. (2012), Thio (2012), Fukutani et al. (2014), Goda et al. (2015)). However, these methods are computationally expensive because they require multiple tsunami numerical simulations, and they therefore lack versatility. In this study, we proposed a simpler method for tsunami risk evaluation using response-surface methodology. Kotani et al. (2016) proposed an evaluation method for the probabilistic distribution of tsunami wave height using a response-surface methodology. We expanded their study and developed a probabilistic distribution of tsunami inundation depth. We set the depth (x1) and the slip (x2) of an earthquake fault as explanatory variables and tsunami inundation depth (y) as the objective variable. Subsequently, tsunami risk could be evaluated by conducting a Monte Carlo simulation, assuming that the generation probability of an earthquake follows a Poisson distribution, the probability distribution of tsunami inundation depth follows the distribution derived from the response surface, and the damage probability of a target follows a log-normal distribution. We applied the proposed method to a wood building located on the coast of Tokyo Bay. We implemented a regression analysis based on the results of 25 tsunami numerical calculations and developed a response surface, which was defined as y = a*x1 + b*x2 + c (a = 0.2615, b = 3.1763, c = -1.1802). We clarified that the expected damage probability of the studied wood building is 22.5%, assuming that an earthquake occurs. The proposed method is therefore a useful and simple way to evaluate tsunami risk using a response
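
    A compact sketch of the described Monte Carlo workflow, using the regression coefficients quoted in the abstract, is given below. The earthquake occurrence rate, the fault depth and slip distributions, and the fragility parameters are placeholder assumptions, not values from the study.

        import numpy as np
        from scipy.stats import norm

        def expected_damage_probability(n_years=1_000_000, rate_per_year=0.005,
                                        a=0.2615, b=3.1763, c=-1.1802,
                                        fragility_median=1.0, fragility_beta=0.5,
                                        seed=None):
            """Earthquake occurrence is Poisson, inundation depth comes from the
            fitted response surface y = a*x1 + b*x2 + c, and building damage
            follows a log-normal fragility curve."""
            rng = np.random.default_rng(seed)
            n_events = rng.poisson(rate_per_year * n_years)
            if n_events == 0:
                return 0.0
            x1 = rng.uniform(10.0, 30.0, n_events)   # fault depth [km], assumed range
            x2 = rng.uniform(1.0, 8.0, n_events)     # fault slip [m], assumed range
            depth = np.maximum(a * x1 + b * x2 + c, 1e-9)   # inundation depth [m]
            # Log-normal fragility: P(damage | inundation depth)
            p_damage = norm.cdf((np.log(depth) - np.log(fragility_median))
                                / fragility_beta)
            # Expected damage probability given that an earthquake occurs
            return float(p_damage.mean())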

  20. 77 FR 24684 - Proposed Information Collection; Comment Request; 2013-2015 American Community Survey Methods...

    Science.gov (United States)

    2012-04-25

    ... proposed content changes. Thus, we need to test an alternative questionnaire design to accommodate additional content on the ACS mail questionnaire. In the 2013 ACS Questionnaire Design Test, we will study... in Puerto Rico. II. Method of Collection Questionnaire Design Test--Data collection for this test...

  1. Improving Semantic Updating Method on 3d City Models Using Hybrid Semantic-Geometric 3d Segmentation Technique

    Science.gov (United States)

    Sharkawi, K.-H.; Abdul-Rahman, A.

    2013-09-01

    to LoD4. The accuracy and structural complexity of the 3D objects increase with the LoD level, where LoD0 is the simplest LoD (2.5D; Digital Terrain Model (DTM) + building or roof print) while LoD4 is the most complex LoD (architectural details with interior structures). Semantic information is one of the main components in CityGML and 3D City Models, and provides important information for any analyses. However, more often than not, the semantic information is not available for the 3D city model due to the unstandardized modelling process. One of the examples is where a building is normally generated as one object (without specific feature layers such as Roof, Ground floor, Level 1, Level 2, Block A, Block B, etc). This research attempts to develop a method to improve the semantic data updating process by segmenting the 3D building into simpler parts, which will make it easier for the users to select and update the semantic information. The methodology is implemented for 3D buildings in LoD2, where the buildings are generated without architectural details but with distinct roof structures. This paper also introduces a hybrid semantic-geometric 3D segmentation method that deals with hierarchical segmentation of a 3D building based on its semantic value and surface characteristics, fitted by one of the predefined primitives. For future work, the segmentation method will be implemented as part of the change detection module that can detect any changes on the 3D buildings, store and retrieve semantic information of the changed structure, automatically update the 3D models, and visualize the results in a user-friendly graphical user interface (GUI).

  2. [Sampling, storage and transport of biological materials collected from living and deceased subjects for determination of concentration levels of ethyl alcohol and similarly acting substances. A proposal of updating the blood and urine sampling protocol].

    Science.gov (United States)

    Wiergowski, Marek; Reguła, Krystyna; Pieśniak, Dorota; Galer-Tatarowicz, Katarzyna; Szpiech, Beata; Jankowski, Zbigniew

    2007-01-01

    The present paper emphasizes the most common mistakes committed at the beginning of an analytical procedure. To shorten the time and decrease the cost of determinations of substances with activity similar to that of alcohol, it is postulated to introduce mass-scale screening analysis of saliva collected from the living subject at the site of the event, with all positive results confirmed in blood or urine samples. If no saliva sample is collected for toxicology, a urine sample (allowing for a fast screening analysis) and a blood sample (to confirm the result) should be ensured. Inappropriate storage of a blood sample in a tube without a preservative can cause sample spilling and its irretrievable loss. The authors propose updating the "Blood/urine sampling protocol", with the updated version to be introduced into practice following consultations and revisions.

  3. AMCP Guide to Pharmaceutical Payment Methods, 2009 Update (Version 2.0).

    Science.gov (United States)

    2009-08-01

    The methods by which the U.S. health care system pays for prescription drugs have faced increasing scrutiny in recent years. Two key developments have emerged: (a) congressional enactment of important changes in the basis for payments for prescription drugs in the Medicare and Medicaid programs; and (b) a March 2009 decision in a federal class action lawsuit that alleged fraudulent manipulation of the dominant pricing benchmark (average wholesale price, AWP), used primarily as the basis for payment for brand-name prescription drugs. The debate about prescription drug payment methods centers on determining the most appropriate basis for calculating how payers, including patients, government agencies, employers, and health plans, should pay pharmacies and other providers for drugs. Historically, payment for prescription drugs has been based on published prices that do not necessarily reflect the actual acquisition costs paid by providers, primarily pharmacies, physicians, and hospitals. This has led policymakers to believe that Medicare and Medicaid programs have paid more than is necessary for prescription drugs. Thus, in an effort to reform the payment system and reduce drug expenditures, policymakers have made significant changes to the benchmarks used by public programs to pay for drugs, and in some instances have created new benchmarks. Private payers have followed the government's lead and begun to change their own payment methods and benchmarks. They can be expected to accelerate the change as a result of the settlement agreement approved in the March 2009 federal court decision. The settlement will result in the lowering of the AWP for more than 400 generic and brand-name drugs. In addition - and technically unrelated to the litigation and any appeals that may be taken - 2 major price data reporting companies, First DataBank and Medi-Span, announced their intent to discontinue publication of AWP within 2 years of September 26, 2009. (At the time this report

  4. Radionuclide Methods in the Diagnosis of Sacroiliitis in Patients with Spondyloarthritis: An Update

    Directory of Open Access Journals (Sweden)

    Karina Zilber

    2016-10-01

    Full Text Available Sacroiliitis, inflammation of the sacroiliac joint (SIJ), is the hallmark of ankylosing spondylitis and spondyloarthritis (SpA) in general. The arsenal of recommended diagnostic modalities for imaging of the SIJ is scanty and, in practice, includes only conventional X-rays and magnetic resonance imaging (MRI). This review suggests that bone scintigraphy, particularly single-photon emission computed tomography (SPECT) with calculation of indices, or SPECT in combination with low-dose computed tomography (CT), can be a sensitive and specific tool for the diagnosis of sacroiliitis and can be used as part of the individualized approach to the diagnosis of axial SpA. In addition, [18F]fluoride positron emission tomography (PET/CT) imaging and immunoscintigraphy, using labeled monoclonal anti-cytokine antibodies, are promising methods of current scientific interest in this field.

  5. Proposal for Requirement Validation Criteria and Method Based on Actor Interaction

    Science.gov (United States)

    Hattori, Noboru; Yamamoto, Shuichiro; Ajisaka, Tsuneo; Kitani, Tsuyoshi

    We propose requirement validation criteria and a method based on the interaction between actors in an information system. We focus on the cyclical transitions of one actor's situation against another and clarify observable stimuli and responses based on these transitions. Both actors' situations can be listed in a state transition table, which describes the observable stimuli or responses they send or receive. Examination of the interaction between both actors in the state transition tables enables us to detect missing or defective observable stimuli or responses. Typically, this method can be applied to the examination of the interaction between a resource managed by the information system and its user. As a case study, we analyzed 332 requirement defect reports of an actual system development project in Japan. We found that there were a certain amount of defects regarding missing or defective stimuli and responses, which can be detected using our proposed method if this method is used in the requirement definition phase. This means that we can reach a more complete requirement definition with our proposed method.
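
    The cross-checking of the two actors' state transition tables can be sketched as follows; the table layout (mapping a state-event pair to a next state and an emitted message) is an assumption made for illustration and is not the paper's notation.

        def missing_interactions(table_a, table_b):
            """Cross-check two actors' state transition tables (generic sketch).

            Each table maps (state, event) -> (next_state, emitted_message or None).
            Every message one actor emits should appear as an event the other actor
            can receive in some state; anything left over is a candidate missing or
            defective requirement.
            """
            def emitted(table):
                return {msg for (_, _), (_, msg) in table.items() if msg is not None}

            def receivable(table):
                return {event for (_, event) in table.keys()}

            return {
                "unhandled_by_b": emitted(table_a) - receivable(table_b),
                "unhandled_by_a": emitted(table_b) - receivable(table_a),
            }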

  6. Determination of the oxidizing property: proposal of an alternative method based on differential scanning calorimetry

    International Nuclear Information System (INIS)

    Gigante, L.; Dellavedova, M.; Pasturenzi, C.; Lunghi, A.; Mattarella, M.; Cardillo, P.

    2008-01-01

    Determination of the physico-chemical and hazardous properties of substances is a very important matter in the chemical industry, considering the growing public attention to the safety and eco-compatibility of products. In the present work, attention was focused on the characterization of oxidizing properties. In the case of solid compounds, the current method (Dir 84/449/CEE 6) compares the maximum combustion rate of the examined substance to that of a reference mixture. This method has several disadvantages and does not provide a quantitative result. In this work, an alternative method based on DSC measurements is proposed for the determination of oxidizing properties.

  7. Estimation of body fluids with bioimpedance spectroscopy: state of the art methods and proposal of novel methods

    International Nuclear Information System (INIS)

    Buendia, R; Seoane, F; Lindecrantz, K; Bosaeus, I; Gil-Pita, R; Johannsson, G; Ellegård, L; Ward, L C

    2015-01-01

    Determination of body fluids is a useful common practice in determination of disease mechanisms and treatments. Bioimpedance spectroscopy (BIS) methods are non-invasive, inexpensive and rapid alternatives to reference methods such as tracer dilution. However, they are indirect and their robustness and validity are unclear. In this article, state of the art methods are reviewed, their drawbacks identified and new methods are proposed. All methods were tested on a clinical database of patients receiving growth hormone replacement therapy. Results indicated that most BIS methods are similarly accurate (e.g. < 0.5 ± 3.0% mean percentage difference for total body water) for estimation of body fluids. A new model for calculation is proposed that performs equally well for all fluid compartments (total body water, extra- and intracellular water). It is suggested that the main source of error in extracellular water estimation is due to anisotropy, in total body water estimation to the uncertainty associated with intracellular resistivity and in determination of intracellular water a combination of both. (paper)

  8. A Novel 3D Viscoelastic Acoustic Wave Equation Based Update Method for Reservoir History Matching

    KAUST Repository

    Katterbauer, Klemens

    2014-12-10

    The oil and gas industry has been revolutionized within the last decade, with horizontal drilling and hydraulic fracturing enabling the extraction of huge amounts of shale gas in areas previously considered impossible and the recovery of hydrocarbons in harsh environments like the Arctic or at previously unimaginable depths like the off-shore exploration in the South China Sea and the Gulf of Mexico. With the development of 4D seismic, engineers and scientists have been able to map the evolution of fluid fronts within the reservoir and determine the displacement caused by the injected fluids. This in turn has led to enhanced production strategies, cost reduction and increased profits. Conventional approaches to incorporating seismic data into the history matching process have been to invert these data for constraints that are subsequently employed in the history matching process. This approach makes the incorporation computationally expensive and requires a lot of manual processing to obtain the correct interpretation, due to the potential artifacts that are generated by the generally ill-conditioned inversion problems. I have presented here a novel approach that includes the time-lapse cross-well seismic survey data directly in the history matching process. The generated time-lapse seismic data are obtained from the full-wave 3D viscoelastic acoustic wave equation. Furthermore, an extensive analysis has been performed showing the robustness of the method and the enhanced forecastability of the critical reservoir parameters, reducing uncertainties and exhibiting the benefits of a full-wave 3D seismic approach. Finally, the improved performance has been statistically confirmed; the results illustrate the significant improvements in forecasting that are obtained via readily available seismic data without the need for inversion. This further optimizes oil production in addition to increasing return-on-investment on oil & gas field development projects, especially

  9. An update on risk factors for cartilage loss in knee osteoarthritis assessed using MRI-based semiquantitative grading methods

    Energy Technology Data Exchange (ETDEWEB)

    Alizai, Hamza [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); University of Texas Health Science Center at San Antonio, Department of Radiology, San Antonio, TX (United States); Roemer, Frank W. [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); University of Erlangen-Nuremberg, Department of Radiology, Erlangen (Germany); Hayashi, Daichi [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Yale University School of Medicine, Department of Radiology, Bridgeport Hospital, Bridgeport, CT (United States); Crema, Michel D. [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Hospital do Coracao and Teleimagem, Department of Radiology, Sao Paulo (Brazil); Felson, David T. [Boston University School of Medicine, Clinical Epidemiology Research and Training Unit, Boston, MA (United States); Guermazi, Ali [Boston University School of Medicine, Quantitative Imaging Center, Department of Radiology, Boston, MA (United States); Aspetar Orthopaedic and Sports Medicine Hospital, Doha (Qatar); Boston Medical Center, Boston, MA (United States)

    2014-11-07

    Arthroscopy-based semiquantitative scoring systems such as Outerbridge and Noyes' scores were the first to be developed for the purpose of grading cartilage defects. As magnetic resonance imaging (MRI) became available for evaluation of the osteoarthritic knee joint, these systems were adapted for use with MRI. Later on, grading methods such as the Whole Organ Magnetic Resonance Score, the Boston-Leeds Osteoarthritis Knee Score and the MRI Osteoarthritis Knee Score were designed specifically for performing whole-organ assessment of the knee joint structures, including cartilage. Cartilage grades on MRI obtained with these scoring systems represent optimal outcome measures for longitudinal studies, and are designed to enhance understanding of the knee osteoarthritis disease process. The purpose of this narrative review is to describe cartilage assessment in knee osteoarthritis using currently available MRI-based semiquantitative whole-organ scoring systems, and to provide an update on the risk factors for cartilage loss in knee osteoarthritis as assessed with these scoring systems. (orig.)

  10. An update on risk factors for cartilage loss in knee osteoarthritis assessed using MRI-based semiquantitative grading methods

    International Nuclear Information System (INIS)

    Alizai, Hamza; Roemer, Frank W.; Hayashi, Daichi; Crema, Michel D.; Felson, David T.; Guermazi, Ali

    2015-01-01

    Arthroscopy-based semiquantitative scoring systems such as Outerbridge and Noyes' scores were the first to be developed for the purpose of grading cartilage defects. As magnetic resonance imaging (MRI) became available for evaluation of the osteoarthritic knee joint, these systems were adapted for use with MRI. Later on, grading methods such as the Whole Organ Magnetic Resonance Score, the Boston-Leeds Osteoarthritis Knee Score and the MRI Osteoarthritis Knee Score were designed specifically for performing whole-organ assessment of the knee joint structures, including cartilage. Cartilage grades on MRI obtained with these scoring systems represent optimal outcome measures for longitudinal studies, and are designed to enhance understanding of the knee osteoarthritis disease process. The purpose of this narrative review is to describe cartilage assessment in knee osteoarthritis using currently available MRI-based semiquantitative whole-organ scoring systems, and to provide an update on the risk factors for cartilage loss in knee osteoarthritis as assessed with these scoring systems. (orig.)

  11. AN UPDATED 6Li(p, α)3He REACTION RATE AT ASTROPHYSICAL ENERGIES WITH THE TROJAN HORSE METHOD

    International Nuclear Information System (INIS)

    Lamia, L.; Spitaleri, C.; Sergi, M. L.; Pizzone, R. G.; Tumino, A.; La Cognata, M.; Tognelli, E.; Degl'Innocenti, S.; Prada Moroni, P. G.; Pappalardo, L.

    2013-01-01

    The lithium problem influencing primordial and stellar nucleosynthesis is one of the most interesting unsolved issues in astrophysics. 6Li is the most fragile of lithium's stable isotopes and is largely destroyed in most stars during the pre-main-sequence (PMS) phase. For these stars, the convective envelope easily reaches, at least at its bottom, the relatively low 6Li ignition temperature. Thus, gaining an understanding of 6Li depletion also gives hints about the extent of convective regions. For this reason, charged-particle-induced reactions in lithium have been the subject of several studies. Low-energy extrapolations of these studies provide information about both the zero-energy astrophysical S(E) factor and the electron screening potential, Ue. Thanks to recent direct measurements, new estimates of the 6Li(p, α)3He bare-nucleus S(E) factor and the corresponding Ue value have been obtained by applying the Trojan Horse method to the 2H(6Li, α3He)n reaction in quasi-free kinematics. The calculated reaction rate covers the temperature window 0.01 to 2 T9, and its impact on the surface lithium depletion in PMS models with different masses and metallicities has been evaluated in detail by adopting an updated version of the FRANEC evolutionary code.
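
    For reference, the standard textbook relations underlying the quantities discussed above (not specific to this paper) are:

        \sigma(E) = \frac{S(E)}{E}\,\exp\bigl(-2\pi\eta(E)\bigr),
        \qquad
        f_{\mathrm{lab}}(E) \simeq \exp\!\left(\pi\eta(E)\,\frac{U_e}{E}\right),

    where $\eta(E)$ is the Sommerfeld parameter, $S(E)$ the bare-nucleus astrophysical factor and $U_e$ the electron screening potential; because the Trojan Horse method yields $S(E)$ free of screening, $U_e$ follows from comparison with direct (shielded) measurements.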

  12. Update and Improve Subsection NH - Simplified Elastic and Inelastic Design Analysis Methods

    Energy Technology Data Exchange (ETDEWEB)

    Jeries J. Abou-Hanna; Douglas L. Marriott; Timothy E. McGreevy

    2009-06-27

    The objective of this subtask is to develop a template for the 'Ideal' high temperature design Code, in which individual topics can be identified and worked on separately in order to provide the detail necessary to comprise a comprehensive Code. Like all ideals, this one may not be attainable as a practical matter. The purpose is to set a goal for what is believed the 'Ideal' design Code should address, recognizing that some elements are not mutually exclusive and that the same objectives can be achieved in different ways. Most, if not all existing Codes may therefore be found to be lacking in some respects, but this does not mean necessarily that they are not comprehensive. While this subtask does attempt to list the elements which individually or in combination are considered essential in such a Code, the authors do not presume to recommend how these elements should be implemented or even, that they should all be implemented at all. The scope of this subtask is limited to compiling the list of elements thought to be necessary or at minimum, useful in such an 'Ideal' Code; suggestions are provided as to their relationship to one another. Except for brief descriptions, where these are needed for clarification, neither this report, nor Task 9 as a whole, attempts to address details of the contents of all these elements. Some, namely primary load limits (elastic, limit load, reference stress), and ratcheting (elastic, e-p, reference stress) are dealt with specifically in other subtasks of Task 9. All others are merely listed; the expectation is that they will either be the focus of attention of other active DOE-ASME GenIV Materials Tasks, e.g. creep-fatigue, or to be considered in future DOE-ASME GenIV Materials Tasks. Since the focus of this Task is specifically approximate methods, the authors have deemed it necessary to include some discussion on what is meant by 'approximate'. However, the topic will be addressed in one or

  13. Update and Improve Subsection NH - Simplified Elastic and Inelastic Design Analysis Methods

    International Nuclear Information System (INIS)

    Abou-Hanna, Jeries J.; Marriott, Douglas L.; McGreevy, Timothy E.

    2009-01-01

The objective of this subtask is to develop a template for the 'Ideal' high temperature design Code, in which individual topics can be identified and worked on separately in order to provide the detail necessary to comprise a comprehensive Code. Like all ideals, this one may not be attainable as a practical matter. The purpose is to set a goal for what is believed the 'Ideal' design Code should address, recognizing that some elements are not mutually exclusive and that the same objectives can be achieved in different ways. Most, if not all, existing Codes may therefore be found to be lacking in some respects, but this does not necessarily mean that they are not comprehensive. While this subtask does attempt to list the elements which individually or in combination are considered essential in such a Code, the authors do not presume to recommend how these elements should be implemented or even that they should all be implemented at all. The scope of this subtask is limited to compiling the list of elements thought to be necessary or, at a minimum, useful in such an 'Ideal' Code; suggestions are provided as to their relationship to one another. Except for brief descriptions, where these are needed for clarification, neither this report nor Task 9 as a whole attempts to address details of the contents of all these elements. Some, namely primary load limits (elastic, limit load, reference stress) and ratcheting (elastic, e-p, reference stress), are dealt with specifically in other subtasks of Task 9. All others are merely listed; the expectation is that they will either be the focus of attention of other active DOE-ASME GenIV Materials Tasks, e.g. creep-fatigue, or be considered in future DOE-ASME GenIV Materials Tasks. Since the focus of this Task is specifically approximate methods, the authors have deemed it necessary to include some discussion on what is meant by 'approximate'. However, the topic will be addressed in one or more later subtasks. This report describes

  14. Proposed waste form performance criteria and testing methods for low-level mixed waste

    International Nuclear Information System (INIS)

    Franz, E.M.; Fuhrmann, M.; Bowerman, B.

    1995-01-01

Proposed waste form performance criteria and testing methods were developed as guidance in judging the suitability of solidified waste as a physico-chemical barrier to releases of radionuclides and RCRA regulated hazardous components. The criteria follow from the assumption that release of contaminants by leaching is the single most important property for judging the effectiveness of a waste form. A two-tier regimen is proposed. The first tier consists of a leach test designed to determine the net, forward leach rate of the solidified waste and a leach test required by the Environmental Protection Agency (EPA). The second tier of tests is to determine if a set of stresses (i.e., radiation, freeze-thaw, wet-dry cycling) on the waste form adversely impacts its ability to retain contaminants and remain physically intact. In the absence of site-specific performance assessments (PA), two generic modeling exercises are described which were used to calculate proposed acceptable leach rates

  15. Report on the consultants' meeting on preparation of the proposal for a coordinated research project to update X- and γ-ray decay data standards for detector calibration

    International Nuclear Information System (INIS)

    Nichols, A.; Herman, M.

    1998-05-01

The IAEA Nuclear Data Section has been charged by the International Nuclear Data Committee to consider the establishment of a Coordinated Research Project (CRP) to update the IAEA database of X-ray and γ-ray Standards for Detector Calibration. This CRP should re-define the radionuclides most suited for detector calibration, extending applications to safeguards, materials analysis, environmental monitoring, and medical use. This document is a report on the Consultants' Meeting held at the IAEA, Vienna, on 24-25 November 1997 to assess the current needs, re-define the most suitable radionuclides, and advise the IAEA Nuclear Data Section on the need for and form of such a CRP

  16. A method proposal for cumulative environmental impact assessment based on the landscape vulnerability evaluation

    International Nuclear Information System (INIS)

    Pavlickova, Katarina; Vyskupova, Monika

    2015-01-01

Cumulative environmental impact assessment is only occasionally used in the practical application of the environmental impact assessment (EIA) process. The main reasons are the difficulty of cumulative impact identification caused by lack of data, the inability to measure the intensity and spatial effect of all types of impacts, and the uncertainty of their future evolution. This work presents a method proposal to predict cumulative impacts on the basis of landscape vulnerability evaluation. For this purpose, a qualitative assessment of landscape ecological stability is conducted and major vulnerability indicators of environmental and socio-economic receptors are specified and valuated. Potential cumulative impacts and the overall impact significance are predicted quantitatively in modified Argonne multiple matrices while considering the vulnerability of affected landscape receptors and the significance of impacts identified individually. The method was employed in a concrete environmental impact assessment process conducted in Slovakia. The results obtained in this case study show that the methodology is simple to apply, valid for all types of impacts and projects, inexpensive and not time-consuming. The objectivity of the partial methods used in this procedure is improved by quantitative landscape ecological stability evaluation, assignment of weights to vulnerability indicators based on the detailed characteristics of affected factors, and grading of impact significance. - Highlights: • This paper suggests a method proposal for cumulative impact prediction. • The method includes landscape vulnerability evaluation. • The vulnerability of affected receptors is determined by their sensitivity. • This method can increase the objectivity of impact prediction in the EIA process
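A minimal numerical sketch of the vulnerability-weighted accumulation step is given below; the receptor names, vulnerability weights and significance grades are invented for illustration and are not values from the study.

```python
import numpy as np

# Receptors affected by the project and their vulnerability (0..1), as would
# come from the indicator valuation step (placeholder values).
receptors = ["soil", "water", "population"]
vulnerability = np.array([0.6, 0.8, 0.9])

# rows = individually identified impacts, columns = receptors,
# entries = impact significance graded on an illustrative 1..5 scale
significance = np.array([
    [2, 3, 1],   # impact of activity A
    [1, 4, 2],   # impact of activity B
    [0, 2, 3],   # impact of activity C
])

# cumulative impact per receptor: significance summed over all impacts,
# weighted by how vulnerable the affected receptor is
cumulative = significance.sum(axis=0) * vulnerability
for name, score in zip(receptors, cumulative):
    print(f"{name}: cumulative weighted impact = {score:.1f}")
```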

  17. Proposal of evaluation method of tsunami wave pressure using 2D depth-integrated flow simulation

    International Nuclear Information System (INIS)

    Arimitsu, Tsuyoshi; Ooe, Kazuya; Kawasaki, Koji

    2012-01-01

To design and construct land structures resistant to tsunami forces, it is essential to evaluate tsunami pressure quantitatively. The existing hydrostatic formula, in general, tends to underestimate tsunami wave pressure under the condition of inundation flow with a large Froude number. An estimation method for the tsunami pressure acting on a land structure is proposed using the inundation depth and horizontal velocity at the front of the structure, which are calculated employing a 2D depth-integrated flow model based on an unstructured grid system. The comparison between the numerical and experimental results revealed that the proposed method could reasonably reproduce the vertical distribution of the maximum tsunami pressure as well as the time variation of the tsunami pressure acting on the structure. (author)
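For orientation, the sketch below combines a hydrostatic term with a velocity-dependent dynamic term computed from the inundation depth and front velocity; the blending coefficient and vertical profile are assumptions of this sketch, not the authors' calibrated relation.

```python
# Illustrative tsunami pressure estimate from inundation depth h and
# depth-averaged velocity u in front of the structure (placeholder model).
RHO = 1025.0   # sea water density, kg/m^3
G = 9.81       # gravitational acceleration, m/s^2

def tsunami_pressure(z, h, u, alpha=1.0):
    """Pressure (Pa) at height z above ground for inundation depth h (m)
    and flow velocity u (m/s); alpha is an assumed blending coefficient."""
    if z >= h:
        return 0.0
    hydrostatic = RHO * G * (h - z)
    dynamic = alpha * 0.5 * RHO * u * u   # grows with the Froude number
    return hydrostatic + dynamic

# Example: 3 m inundation depth, 5 m/s flow, pressure near the base
print(tsunami_pressure(z=0.5, h=3.0, u=5.0))
```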

  18. A proposed safety assurance method and its application to the fusion experimental reactor

    International Nuclear Information System (INIS)

    Okazaki, T.; Seki, Y.; Inabe, T.; Aoki, I.

    1995-01-01

Importance categorization and hazard identification methods have been proposed for a fusion experimental reactor. A parameter, the system index, is introduced in the categorization method. The relative importance of systems with safety functions can be classified by the magnitude of the system index and by whether or not the system acts as a boundary for radioactive materials. This categorization can be used as the basic principle in determining structural design assessment, seismic design criteria, etc. For hazard identification, the system time energy matrix is proposed, in which the time and spatial distributions of hazard energies are used. This approach is formulated more systematically than an ad hoc identification of hazard events and is useful for selecting design basis events, which are employed in the assessment of safety designs. (orig.)

  19. Proposed Project Selection Method for Human Support Research and Technology Development (HSR&TD)

    Science.gov (United States)

    Jones, Harry

    2005-01-01

The purpose of HSR&TD is to deliver human support technologies to the Exploration Systems Mission Directorate (ESMD) that will be selected for future missions. This requires identifying promising candidate technologies and advancing them in technology readiness until they are acceptable. HSR&TD must select an array of technology development projects, guide them, and either terminate or continue them, so as to maximize the resulting number of usable advanced human support technologies. This paper proposes an effective project scoring methodology to support managing the HSR&TD project portfolio. Researchers strongly disagree as to what the best technology project selection methods are, or even whether there are any proven ones. Technology development is risky and outstanding achievements are rare and unpredictable. There is no simple formula for success. Organizations that are satisfied with their project selection approach typically use a mix of financial, strategic, and scoring methods in an open, established, explicit, formal process. This approach helps to build consensus and develop management insight. It encourages better project proposals by clarifying the desired project attributes. We propose a project scoring technique based on a method previously used in a federal laboratory and supported by recent research. Projects are ranked by their perceived relevance, risk, and return - a new 3 R's. Relevance is the degree to which the project objective supports the HSR&TD goal of developing usable advanced human support technologies. Risk is the estimated probability that the project will achieve its specific objective. Return is the reduction in mission life cycle cost obtained if the project is successful. If the project objective technology performs a new function with no current cost, its return is the estimated cash value of performing the new function. The proposed project selection scoring method includes definitions of the criteria, a project evaluation
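A minimal sketch of the 3 R's ranking idea follows; the scoring formula (relevance-weighted expected return) and all project names and numbers are assumptions for illustration, not the paper's scoring definitions.

```python
# Rank candidate projects by relevance, risk (success probability) and return.
projects = [
    # (name, relevance 0..1, success probability 0..1, return in $M if successful)
    ("Air revitalization upgrade", 0.9, 0.6, 40.0),
    ("Water recovery sensor",      0.7, 0.8, 15.0),
    ("Novel food system",          0.5, 0.3, 60.0),
]

def score(relevance, risk, ret):
    # expected, relevance-weighted return (one plausible way to combine the 3 R's)
    return relevance * risk * ret

ranked = sorted(projects, key=lambda p: score(*p[1:]), reverse=True)
for name, relevance, risk, ret in ranked:
    print(f"{name:30s} score = {score(relevance, risk, ret):6.1f}")
```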

  20. Proposal of Constraints Analysis Method Based on Network Model for Task Planning

    Science.gov (United States)

    Tomiyama, Tomoe; Sato, Tatsuhiro; Morita, Toyohisa; Sasaki, Toshiro

Deregulation has been accelerating several activities toward reengineering business processes, such as railway through service and modal shift in logistics. To make those activities successful, business entities have to define new business rules or know-how (we call them 'constraints'). According to the new constraints, they need to manage business resources such as instruments, materials, workers and so on. In this paper, we propose a constraint analysis method to define constraints for task planning of the new business processes. To visualize each constraint's influence on planning, we propose a network model which represents allocation relations between tasks and resources. The network can also represent task ordering relations and resource grouping relations. The proposed method formalizes the manual way of defining constraints as repeatedly checking the network structure and finding conflicts between constraints. Application to crew scheduling problems shows that the method can adequately represent and define constraints of some task planning problems with the following fundamental features: (1) specifying a work pattern for some resources, (2) restricting the number of resources for some works, (3) requiring multiple resources for some works, (4) prior allocation of some resources to some works and (5) considering the workload balance between resources.
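The sketch below illustrates the general idea of checking a task-resource allocation network against constraints; the tasks, resources and the two example constraint checks are assumptions, not the paper's crew-scheduling data or rules.

```python
# Minimal task/resource allocation network with two illustrative constraint checks.
tasks = {"T1": {"requires": 2}, "T2": {"requires": 1}, "T3": {"requires": 1}}
resources = {"crewA": {"pattern": "day"}, "crewB": {"pattern": "night"}}
allocation = {"T1": ["crewA", "crewB"], "T2": ["crewA"], "T3": ["crewA"]}

def find_conflicts(tasks, resources, allocation, max_tasks_per_resource=2):
    conflicts = []
    # constraint: each task gets the number of resources it requires
    for t, spec in tasks.items():
        if len(allocation.get(t, [])) < spec["requires"]:
            conflicts.append(f"{t}: needs {spec['requires']} resources")
    # constraint: workload balance / capacity per resource
    load = {r: 0 for r in resources}
    for t, rs in allocation.items():
        for r in rs:
            load[r] += 1
    for r, n in load.items():
        if n > max_tasks_per_resource:
            conflicts.append(f"{r}: overloaded with {n} tasks")
    return conflicts

print(find_conflicts(tasks, resources, allocation))
```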

  1. Visual assessment of BIPV retrofit design proposals for selected historical buildings using the saliency map method

    Directory of Open Access Journals (Sweden)

    Ran Xu

    2015-06-01

With the increasing awareness of energy efficiency, many old buildings have to undergo a massive facade energy retrofit. How to predict the visual impact which solar installations have on the aesthetic and cultural value of these buildings has been a heated debate in Switzerland (and throughout the world). The usual evaluation method to describe the visual impact of BIPV is based on semantic and qualitative descriptors, and is strongly dependent on personal preferences. The evaluation scale is therefore relative, flexible and imprecise. This paper proposes a new method to accurately measure the visual impact which BIPV installations have on a historical building by using the saliency map method. By imitating the working principles of the human eye, the method measures how much the BIPV design proposals differ from the original building facade in terms of attracting human visual attention. The result is directly presented in a quantitative manner, and can be used to compare the fitness of different BIPV design proposals. The measuring process is numeric, objective and more precise.
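The paper's exact saliency model is not reproduced here; the sketch below uses the spectral-residual saliency of Hou and Zhang as a stand-in and scores a proposal by the mean absolute change in saliency between the original facade image and the retrofit rendering.

```python
import numpy as np
from scipy.ndimage import uniform_filter, gaussian_filter

def saliency_map(gray):
    """Spectral-residual saliency on a 2-D grayscale array (stand-in model)."""
    spectrum = np.fft.fft2(gray)
    log_amplitude = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    residual = log_amplitude - uniform_filter(log_amplitude, size=3)
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return gaussian_filter(saliency, sigma=2.5)

def visual_impact(facade_before, facade_after):
    """Mean absolute change in normalized saliency caused by the BIPV proposal."""
    s0, s1 = saliency_map(facade_before), saliency_map(facade_after)
    s0, s1 = s0 / s0.max(), s1 / s1.max()
    return float(np.mean(np.abs(s1 - s0)))

# Synthetic arrays standing in for facade photographs / renderings
before = np.random.rand(128, 128)
after = before.copy(); after[40:80, 40:80] += 0.5   # hypothetical PV array region
print(visual_impact(before, after))
```

A lower score would indicate a proposal that attracts visual attention much as the original facade does.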

  2. Comparison among four proposed direct blood culture microbial identification methods using MALDI-TOF MS

    Directory of Open Access Journals (Sweden)

    Ali M. Bazzi

    2017-05-01

Summary: Matrix-assisted laser desorption-ionization time-of-flight (MALDI-TOF) mass spectrometry facilitates rapid and accurate identification of pathogens, which is critical for sepsis patients. In this study, we assessed the accuracy in identification of both Gram-negative and Gram-positive bacteria, except for Streptococcus viridans, using four rapid blood culture methods with Vitek MALDI-TOF-MS. We compared our proposed lysis centrifugation followed by washing and 30% acetic acid treatment method (method 2) with two other lysis centrifugation methods (washing and 30% formic acid treatment (method 1); 100% ethanol treatment (method 3)), and picking colonies from 90 to 180 min subculture plates (method 4). Methods 1 and 2 identified all organisms down to species level with 100% accuracy, except for Streptococcus viridans, Streptococcus pyogenes, Enterobacter cloacae and Proteus vulgaris. The latter two were identified to genus level with 100% accuracy. Each method exhibited excellent accuracy and precision in terms of identification to genus level with certain limitations. Keywords: MALDI-TOF, Gram-negative, Gram-positive, Sepsis, Blood culture

  3. Comparison among four proposed direct blood culture microbial identification methods using MALDI-TOF MS.

    Science.gov (United States)

    Bazzi, Ali M; Rabaan, Ali A; El Edaily, Zeyad; John, Susan; Fawarah, Mahmoud M; Al-Tawfiq, Jaffar A

    Matrix-assisted laser desorption-ionization time-of-flight (MALDI-TOF) mass spectrometry facilitates rapid and accurate identification of pathogens, which is critical for sepsis patients. In this study, we assessed the accuracy in identification of both Gram-negative and Gram-positive bacteria, except for Streptococcus viridans, using four rapid blood culture methods with Vitek MALDI-TOF-MS. We compared our proposed lysis centrifugation followed by washing and 30% acetic acid treatment method (method 2) with two other lysis centrifugation methods (washing and 30% formic acid treatment (method 1); 100% ethanol treatment (method 3)), and picking colonies from 90 to 180min subculture plates (method 4). Methods 1 and 2 identified all organisms down to species level with 100% accuracy, except for Streptococcus viridans, Streptococcus pyogenes, Enterobacter cloacae and Proteus vulgaris. The latter two were identified to genus level with 100% accuracy. Each method exhibited excellent accuracy and precision in terms of identification to genus level with certain limitations. Copyright © 2016 King Saud Bin Abdulaziz University for Health Sciences. Published by Elsevier Ltd. All rights reserved.

  4. FRMAC Updates

    International Nuclear Information System (INIS)

    Mueller, P.

    1995-01-01

This talk describes updates to the following FRMAC publications concerning radiation emergencies: Monitoring and Analysis Manual; Evaluation and Assessment Manual; Handshake Series (biannual), including exercises participated in; Environmental Data and Instrument Transmission System (EDITS); Plume in a Box, with all radiological data stored onto a hand-held computer; and courses given

  5. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho [Hanyang University, Seoul (Korea, Republic of); Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-05-15

A scanning electron microscope (SEM) is an instrument used to inspect the surface microstructure of materials. The SEM uses electron beams for imaging material surfaces at high magnifications; therefore, various chemical analyses can be performed from the SEM images. It is thus widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using the material in a nuclear system. In our previous study, the SEM was applied to the homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images.
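As a rough illustration of grayscale-based homogeneity scoring, the sketch below tiles an image and uses the coefficient of variation of the tile means; the tiling and the index definition are assumptions of this sketch, not the stochastic analysis developed in the paper.

```python
import numpy as np

def homogeneity_index(sem_image, tile=32):
    """Return 1 - coefficient of variation of the per-tile mean grey levels
    (1.0 = perfectly homogeneous).  Placeholder metric, not the paper's."""
    h, w = sem_image.shape
    means = [
        sem_image[i:i + tile, j:j + tile].mean()
        for i in range(0, h - tile + 1, tile)
        for j in range(0, w - tile + 1, tile)
    ]
    means = np.asarray(means, dtype=float)
    return 1.0 - means.std() / means.mean()

# Synthetic 8-bit-like image standing in for an SEM micrograph
rng = np.random.default_rng(0)
img = rng.normal(128, 10, size=(256, 256)).clip(0, 255)
print(f"homogeneity index: {homogeneity_index(img):.3f}")
```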

  6. Proposal for outline of training and evaluation method for non-technical skills

    International Nuclear Information System (INIS)

    Nagasaka, Akihiko; Shibue, Hisao

    2015-01-01

The purpose of this study is to systematize measures for the improvement of emergency response capability, focused on non-technical skills. As a result of investigating emergency training at nuclear power plants and referring to CRM training, the following two issues were identified: 1) lack of a practical training method for the improvement of non-technical skills; 2) lack of an evaluation method for non-technical skills. Then, on the basis that the seven non-technical skills 'situational awareness', 'decision making', 'communication', 'teamworking', 'leadership', 'managing stress' and 'coping with fatigue' are promotion factors for improving emergency response capability, we propose a practical training method for each non-technical skill. We also give examples of behavioral markers as evaluation factors, and indicate approaches to introduce the evaluation method for non-technical skills. (author)

  7. A Proposal on the Quantitative Homogeneity Analysis Method of SEM Images for Material Inspections

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Kim, Jong Woo; Shin, Chang Ho; Choi, Jung-Hoon; Cho, In-Hak; Park, Hwan Seo

    2015-01-01

A scanning electron microscope (SEM) is an instrument used to inspect the surface microstructure of materials. The SEM uses electron beams for imaging material surfaces at high magnifications; therefore, various chemical analyses can be performed from the SEM images. It is thus widely used for material inspection, chemical characteristic analysis, and biological analysis. In the nuclear criticality analysis field, the homogeneity of a compound material is an important parameter to check before using the material in a nuclear system. In our previous study, the SEM was applied to the homogeneity analysis of materials. In this study, a quantitative homogeneity analysis method for SEM images is proposed for material inspections. The method is based on a stochastic analysis of the grayscale information of the SEM images

  8. An update on MyoD evolution in teleosts and a proposed consensus nomenclature to accommodate the tetraploidization of different vertebrate genomes.

    Directory of Open Access Journals (Sweden)

    Daniel J Macqueen

BACKGROUND: MyoD is a muscle specific transcription factor that is essential for vertebrate myogenesis. In several teleost species, including representatives of the Salmonidae and Acanthopterygii, but not zebrafish, two or more MyoD paralogues are conserved that are thought to have arisen from distinct, possibly lineage-specific duplication events. Additionally, two MyoD paralogues have been characterised in the allotetraploid frog, Xenopus laevis. This has led to a confusing nomenclature, since MyoD paralogues have been named outside of an appropriate phylogenetic framework. METHODS AND PRINCIPAL FINDINGS: Here we initially show that directly depicting the evolutionary relationships of teleost MyoD orthologues and paralogues is hindered by the asymmetric evolutionary rate of Acanthopterygian MyoD2 relative to other MyoD proteins. Thus our aim was to confidently position the event from which teleost paralogues arose in different lineages by a comparative investigation of genes neighbouring myod across the vertebrates. To this end, we show that genes on the single myod-containing chromosome of mammals and birds are retained in both zebrafish and Acanthopterygian teleosts in a striking pattern of double conserved synteny. Further, phylogenetic reconstruction of these neighbouring genes using Bayesian and maximum likelihood methods supported a common origin for teleost paralogues following the split of the Actinopterygii and Sarcopterygii. CONCLUSION: Our results strongly suggest that myod was duplicated during the basal teleost whole genome duplication event, but was subsequently lost in the Ostariophysi (zebrafish) and Protacanthopterygii lineages. We propose a sensible consensus nomenclature for vertebrate myod genes that accommodates polyploidization events in teleost and tetrapod lineages and is justified from a phylogenetic perspective.

  9. European experiences of the proposed ASTM test method for crack arrest toughness of ferritic materials

    International Nuclear Information System (INIS)

    Jutla, T.; Lidbury, D.P.G.; Ziebs, J.; Zimmermann, C.

    1986-01-01

The proposed ASTM test method for measuring the crack arrest toughness of ferritic materials using wedge-loaded, side-grooved, compact specimens was applied to three steels: A514 bridge steel, A588 bridge steel, and A533B pressure vessel steel. Five sets of results from different laboratories are discussed here. Notches were prepared by spark erosion, although root radii varied from ≈0.1-1.5 mm. Although fast fractures were successfully initiated, arrest did not occur in a significant number of cases. The results showed no obvious dependence of crack arrest toughness, Ka (determined by a static analysis), on crack initiation toughness, K0. It was found that Ka decreases markedly with increasing crack jump distance. A limited amount of further work on smaller specimens of the A533B steel showed that lower Ka values tended to be recorded. It is concluded that a number of points relating to the proposed test method and notch preparation are worthy of further consideration. It is pointed out that the proposed validity criteria may screen out lower bound data. Nevertheless, for present practical purposes, Ka values may be regarded as useful in providing an estimate of arrest toughness - although not necessarily a conservative estimate. (orig./HP)

  10. Mining Sequential Update Summarization with Hierarchical Text Analysis

    Directory of Open Access Journals (Sweden)

    Chunyun Zhang

    2016-01-01

The outbreak of unexpected news events such as large human accidents or natural disasters brings about a new information access problem where traditional approaches fail. News of these events is typically sparse early on and redundant later. Hence, it is very important to get updates and provide individuals with timely and important information about these incidents during their development, especially when applied in the wireless and mobile Internet of Things (IoT). In this paper, we define the problem of sequential update summarization extraction and present a new hierarchical update mining system which can broadcast useful, new, and timely sentence-length updates about a developing event. The new system proposes a novel method which incorporates techniques from topic-level and sentence-level summarization. To evaluate the performance of the proposed system, we apply it to the sequential update summarization task of the temporal summarization (TS) track at the Text Retrieval Conference (TREC) 2013 and compute four measurements of the update mining system: the expected gain, expected latency gain, comprehensiveness, and latency comprehensiveness. Experimental results show that our proposed method has good performance.

  11. Invited Review Article: Tip modification methods for tip-enhanced Raman spectroscopy (TERS) and colloidal probe technique: A 10 year update (2006-2016) review

    Science.gov (United States)

    Yuan, C. C.; Zhang, D.; Gan, Y.

    2017-03-01

Engineering atomic force microscopy tips for reliable tip-enhanced Raman spectroscopy (TERS) and the colloidal probe technique is becoming routine practice in many labs. In this 10 year update review, various new tip modification methods developed over the past decade are briefly reviewed to help researchers select the appropriate method. The perspective is put in a larger context to discuss the opportunities and challenges in this area, including novel combinations of seemingly different methods, potential applications of some methods which were not originally intended for TERS tip fabrication, and the problems of high cost and poor reproducibility of tip fabrication.

  12. Contribution for an Urban Geomorphoheritage Assessment Method: Proposal from Three Geomorphosites in Rome (Italy

    Directory of Open Access Journals (Sweden)

    Pica Alessia

    2017-09-01

Urban geomorphology has important implications for the spatial planning of human activities, and it also has a geotouristic potential due to the relationship between cultural and geomorphological heritage. Despite the introduction of the term Anthropocene to describe the deep influence that human activities have had in recent times on Earth evolution, urban geomorphological heritage studies are relatively rare and limited, and urban geotourism development is recent. The analysis of the complex urban landscape often needs the integration of multidisciplinary data. This study aims to propose the first urban geomorphoheritage assessment method, which originates from long-standing geomorphological and geotouristic studies of the Rome city centre; it depicts rare examples of the geomorphological mapping of a metropolis and, at the same time, of an inventory of urban geomorphosites. The proposal is applied to geomorphosites in the Esquilino neighbourhood of Rome, whose analysis confirms the need for an ad hoc method for assessing urban geomorphosites, as already highlighted in the most recent literature on the topic. The urban geomorphoheritage assessment method is based on: (i) the urban geomorphological analysis by means of multitemporal and multidisciplinary data; (ii) the geomorphosite inventory; and (iii) the geomorphoheritage assessment and enhancement. One challenge is to assess invisible geomorphosites, which are widespread in urban contexts. To this aim, we reworked the attributes describing the Value of a site for Geotourism in order to build up a specific methodology for the analysis of the urban geomorphological heritage.

  13. Proposal for an Evaluation Method for the Performance of Work Procedures.

    Science.gov (United States)

    Mohammed, Mouda; Mébarek, Djebabra; Wafa, Boulagouas; Makhlouf, Chati

    2016-12-01

Noncompliance of operators with work procedures is a recurrent problem. This human behavior has been said to be situational and has been studied by many different approaches (ergonomic and others), which take the noncompliance with work procedures as given and seek to analyze its causes as well as its consequences. The objective of the proposed method is to address this problem by focusing on the performance of work procedures and ensuring improved performance on a continuous basis. This study has multiple results: (1) assessment of the work procedures' performance by a multicriteria approach; (2) the use of a continuous improvement approach as a framework for the sustainability of the assessment method of work procedures' performance; and (3) adaptation of the Stop-Card as a facilitating support for continuous improvement of work procedures. In contrast with conventional approaches, which accept noncompliance with work procedures as evident and seek to analyze the related cause-effect relationships, the proposed method emphasizes the value of the inputs of continuous improvement of the work procedures themselves, especially in strategic industries.

  14. Using a fuzzy comprehensive evaluation method to determine product usability: A proposed theoretical framework.

    Science.gov (United States)

    Zhou, Ronggang; Chan, Alan H S

    2017-01-01

In order to compare existing usability data to ideal goals or to that for other products, usability practitioners have tried to develop a framework for deriving an integrated metric. However, most current usability methods with this aim rely heavily on human judgment about the various attributes of a product, but often fail to take into account the inherent uncertainties in these judgments in the evaluation process. This paper presents a universal method of usability evaluation by combining the analytic hierarchy process (AHP) and the fuzzy evaluation method. By integrating multiple sources of uncertain information during product usability evaluation, the method proposed here aims to derive an index that is structured hierarchically in terms of the three usability components of effectiveness, efficiency, and user satisfaction of a product. With consideration of the theoretical basis of fuzzy evaluation, a two-layer comprehensive evaluation index was first constructed. After the membership functions were determined by an expert panel, the evaluation appraisals were computed by using the fuzzy comprehensive evaluation model to characterize fuzzy human judgments. Then, with the use of AHP, the weights of the usability components were elicited from these experts. Compared to traditional usability evaluation methods, the major strength of the fuzzy method is that it captures the fuzziness and uncertainties in human judgments and provides an integrated framework that combines the vague judgments from multiple stages of a product evaluation process.
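A minimal sketch of an AHP-weighted fuzzy comprehensive evaluation follows; the pairwise comparison matrix and membership degrees are invented placeholders for the expert inputs described above.

```python
import numpy as np

# AHP step: pairwise comparisons of effectiveness, efficiency, satisfaction
pairwise = np.array([
    [1.0, 2.0, 3.0],
    [1/2, 1.0, 2.0],
    [1/3, 1/2, 1.0],
])
eigvals, eigvecs = np.linalg.eig(pairwise)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
w = w / w.sum()                                   # criteria weights

# Fuzzy evaluation matrix R: membership of each criterion in the grades
# (poor, fair, good, excellent); each row sums to 1 (placeholder values)
R = np.array([
    [0.1, 0.2, 0.4, 0.3],   # effectiveness
    [0.0, 0.3, 0.5, 0.2],   # efficiency
    [0.2, 0.3, 0.3, 0.2],   # satisfaction
])

b = w @ R                                         # composite grade vector
grades = ["poor", "fair", "good", "excellent"]
print({g: round(float(v), 3) for g, v in zip(grades, b)})
print("overall usability grade:", grades[int(np.argmax(b))])
```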

  15. Novel approach to improve the attitude update rate of a star tracker.

    Science.gov (United States)

    Zhang, Shuo; Xing, Fei; Sun, Ting; You, Zheng; Wei, Minsong

    2018-03-05

The star tracker is widely used in attitude control systems of spacecraft for attitude measurement. The attitude update rate of a star tracker is important to guarantee the attitude control performance. In this paper, we propose a novel approach to improve the attitude update rate of a star tracker. The electronic Rolling Shutter (RS) imaging mode of the complementary metal-oxide semiconductor (CMOS) image sensor in the star tracker is applied to acquire star images in which the star spots are exposed with row-to-row time offsets, thereby reflecting the rotation of the star tracker at different times. An attitude estimation method using a single star spot is developed to realize multiple attitude updates from one star image, so as to reach a high update rate. Simulations and experiments are performed to verify the proposed approach. The test results demonstrate that the proposed approach is effective and that the attitude update rate of a star tracker is increased significantly.
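The core of the rolling-shutter idea is that each star spot carries its own timestamp determined by its image row. The sketch below shows only that bookkeeping step; the row period and exposure values are placeholders, and the single-spot attitude estimator itself is not reproduced.

```python
# Assign a per-star-spot timestamp from its image row under rolling-shutter
# readout, so each spot can drive its own attitude update (placeholder timing).
FRAME_START = 0.0        # s, start of readout of row 0
ROW_PERIOD = 20e-6       # s, assumed time offset between consecutive rows

def spot_timestamp(row_index, exposure=2e-3):
    """Mid-exposure time of a star spot centred on a given image row."""
    return FRAME_START + row_index * ROW_PERIOD + 0.5 * exposure

# Spots detected on different rows of the same frame get different timestamps,
# each of which can be paired with a single-spot attitude estimate.
for row in (120, 512, 900):
    print(f"row {row:4d}: t = {spot_timestamp(row) * 1e3:.3f} ms")
```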

  16. The clinically-integrated randomized trial: proposed novel method for conducting large trials at low cost

    Directory of Open Access Journals (Sweden)

    Scardino Peter T

    2009-03-01

Introduction: Randomized controlled trials provide the best method of determining which of two comparable treatments is preferable. Unfortunately, contemporary randomized trials have become increasingly expensive, complex and burdened by regulation, so much so that many trials are of doubtful feasibility. Discussion: Here we present a proposal for a novel, streamlined approach to randomized trials: the "clinically-integrated randomized trial". The key aspect of our methodology is that the clinical experience of the patient and doctor is virtually indistinguishable whether or not the patient is randomized, primarily because outcome data are obtained from routine clinical data, or from short, web-based questionnaires. Integration of a randomized trial into routine clinical practice also implies that there should be an attempt to randomize every patient, a corollary of which is that eligibility criteria are minimized. The similar clinical experience of patients on- and off-study also entails that the marginal cost of putting an additional patient on trial is negligible. We propose examples of how the clinically-integrated randomized trial might be applied in four distinct areas of medicine: comparisons of surgical techniques, "me too" drugs, rare diseases and lifestyle interventions. Barriers to implementing clinically-integrated randomized trials are discussed. Conclusion: The proposed clinically-integrated randomized trial may allow us to enlarge dramatically the number of clinical questions that can be addressed by randomization.

  17. Proposed waste form performance criteria and testing methods for low-level mixed waste

    International Nuclear Information System (INIS)

    Franz, E.M.; Fuhrmann, M.; Bowerman, B.; Bates, S.; Peters, R.

    1994-08-01

This document describes proposed waste form performance criteria and testing methods that could be used as guidance in judging the viability of a waste form as a physico-chemical barrier to releases of radionuclides and RCRA regulated hazardous components. It is assumed that release of contaminants by leaching is the single most important property by which the effectiveness of a waste form is judged. A two-tier regimen is proposed. The first tier includes a leach test required by the Environmental Protection Agency and a leach test designed to determine the net forward leach rate for a variety of materials. The second tier of tests is to determine if a set of stresses (i.e., radiation, freeze-thaw, wet-dry cycling) on the waste form adversely impacts its ability to retain contaminants and remain physically intact. It is recommended that the first tier tests be performed first to determine acceptability. Only on passing the given specifications for the leach tests should other tests be performed. In the absence of site-specific performance assessments (PA), two generic modeling exercises are described which were used to calculate proposed acceptable leach rates

  18. Circular Updates

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Circular Updates are periodic sequentially numbered instructions to debriefing staff and observers informing them of changes or additions to scientific and specimen...

  19. Email Updates

    Science.gov (United States)


  20. Gene set analysis: limitations in popular existing methods and proposed improvements.

    Science.gov (United States)

    Mishra, Pashupati; Törönen, Petri; Leino, Yrjö; Holm, Liisa

    2014-10-01

Gene set analysis is the analysis of a set of genes that collectively contribute to a biological process. Most popular gene set analysis methods are based on empirical P-values that require a large number of permutations. Despite the numerous gene set analysis methods developed in the past decade, the most popular methods still suffer from serious limitations. We present a gene set analysis method (mGSZ) based on the Gene Set Z-scoring function (GSZ) and asymptotic P-values. Asymptotic P-value calculation requires fewer permutations, and thus speeds up the gene set analysis process. We compare the GSZ-scoring function with seven popular gene set scoring functions and show that GSZ stands out as the best scoring function. In addition, we show improved performance of the GSA method when the max-mean statistic is replaced by the GSZ scoring function. We demonstrate the importance of both gene and sample permutations by showing the consequences in the absence of one or the other. A comparison of asymptotic and empirical methods of P-value estimation demonstrates a clear advantage of asymptotic P-values over empirical P-values. We show that mGSZ outperforms the state-of-the-art methods based on two different evaluations. We compared mGSZ results with permutation and rotation tests and show that rotation does not improve our asymptotic P-values. We also propose well-known asymptotic distribution models for three of the compared methods. mGSZ is available as an R package from cran.r-project.org. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
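For orientation, the sketch below computes a simplified standardized gene-set statistic with a normal (asymptotic) P-value; it is not the exact GSZ variance model used by mGSZ.

```python
import numpy as np
from scipy.stats import norm

def gene_set_z(gene_scores, in_set):
    """Standardized sum of the scores of the genes in the set, with a
    two-sided asymptotic (normal) P-value.  Simplified placeholder statistic."""
    scores = np.asarray(gene_scores, dtype=float)
    mask = np.asarray(in_set, dtype=bool)
    m = mask.sum()
    mu, sd = scores.mean(), scores.std(ddof=1)
    z = (scores[mask].sum() - m * mu) / (sd * np.sqrt(m))
    p = 2.0 * norm.sf(abs(z))
    return z, p

rng = np.random.default_rng(1)
scores = rng.normal(0, 1, 10000)        # e.g. differential-expression statistics
members = np.zeros(10000, bool); members[:50] = True
scores[:50] += 0.6                      # hypothetical enriched gene set
print(gene_set_z(scores, members))
```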

  1. Proposal of Screening Method of Sleep Disordered Breathing Using Fiber Grating Vision Sensor

    Science.gov (United States)

    Aoki, Hirooki; Nakamura, Hidetoshi; Nakajima, Masato

Every conventional respiration monitoring technique requires at least one sensor to be attached to the body of the subject during measurement, thereby imposing a sense of restraint that results in aversion to measurements lasting over consecutive days. To solve this problem, we developed a respiration monitoring system for sleepers that uses a fiber-grating vision sensor, a type of active image sensor, to achieve non-contact respiration monitoring. In this paper, we verify the effectiveness of the system and propose a screening method for sleep disordered breathing. It is shown that our system can measure respiration equivalently to a thermistor and an accelerograph. Furthermore, the respiratory condition of sleepers can be grasped at a glance with our screening method, which appears useful for supporting the screening of sleep disordered breathing.

  2. Proposal for a new detection method of substance abuse risk in Croatian adolescents

    Directory of Open Access Journals (Sweden)

    Sanja Tatalovic Vorkapic

    2011-01-01

One of the most important factors in successful substance abuse treatment is an early start of the treatment. The current selection method for the identification of Croatian adolescents at risk of substance abuse, which uses drug tests on urine samples, is simple and exact on the one hand, but on the other hand is applied very rarely and usually under pressure from parents or the court. Moreover, such a method is a source of legal and ethical questions. Therefore, the proposed application of standardized psychological tests during systematic medical examinations of Croatian adolescents in the age range of 15-22 years could help with the early detection of those adolescents who are at risk of substance abuse or who already have a developed addiction problem.

  3. Proposal and Implementation of a Robust Sensing Method for DVB-T Signal

    Science.gov (United States)

    Song, Chunyi; Rahman, Mohammad Azizur; Harada, Hiroshi

This paper proposes a sensing method for TV signals of the DVB-T standard to realize effective TV White Space (TVWS) communication. The TVWS technology trial organized by the Infocomm Development Authority (iDA) of Singapore requires, with regard to sensing level and sensing time, detecting a DVB-T signal at a level of -120 dBm over an 8 MHz channel with a sensing time below 1 second. To fulfill such a strict sensing requirement, we propose a smart sensing method which combines feature detection and energy detection (CFED) and is also characterized by using dynamic threshold selection (DTS) based on a threshold table to improve sensing robustness to noise uncertainty. The DTS-based CFED (DTS-CFED) is evaluated by computer simulations and is also implemented into a hardware sensing prototype. The results show that the DTS-CFED achieves a detection probability above 0.9 for a target false alarm probability of 0.1 for DVB-T signals at a level of -120 dBm over an 8 MHz channel with a sensing time of 0.1 second.
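The sketch below shows only the energy-detection half with a table-driven threshold in the spirit of DTS; the threshold table values are placeholders rather than thresholds derived for the -120 dBm / 0.1 s requirement, and the feature-detection branch of CFED is omitted.

```python
import numpy as np

# Placeholder threshold table: estimated noise power (dBm) -> threshold as a
# multiple of the noise power.  Real values would be set for the target
# false-alarm probability under each assumed noise uncertainty.
THRESHOLD_TABLE = {-100.0: 3.0, -105.0: 2.5, -110.0: 2.2}

def detect(samples, noise_dbm_estimate):
    noise_power = 10 ** (noise_dbm_estimate / 10.0) * 1e-3      # W
    # pick the table entry closest to the estimated noise level (DTS-like step)
    key = min(THRESHOLD_TABLE, key=lambda k: abs(k - noise_dbm_estimate))
    threshold = THRESHOLD_TABLE[key] * noise_power
    energy = np.mean(np.abs(samples) ** 2)
    return energy > threshold

rng = np.random.default_rng(0)
noise = (rng.normal(size=4096) + 1j * rng.normal(size=4096)) * np.sqrt(1e-13 / 2)
print(detect(noise, noise_dbm_estimate=-100.0))   # noise only -> expect False
```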

  4. A Proposal of New Spherical Particle Modeling Method Based on Stochastic Sampling of Particle Locations in Monte Carlo Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Song Hyun; Kim, Do Hyun; Kim, Jong Kyung [Hanyang Univ., Seoul (Korea, Republic of); Noh, Jea Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2013-10-15

Due to its high computational efficiency and user convenience, the implicit method has received attention; however, it is noted that the implicit method in previous studies has low accuracy at high packing fractions. In this study, a new implicit method, which can be used at any packing fraction with high accuracy, is proposed. An implicit modeling method for spherical-particle-distributed media in MC simulation is presented. A new concept for the spherical particle sampling was developed to solve the problems of the previous implicit methods. The sampling method was verified by simulating the sampling in infinite and finite media. The results show that the implicit particle modeling with the proposed method was accurately performed over all packing fraction ranges. It is expected that the proposed method can be efficiently utilized for spherical-particle-distributed media such as fusion reactor blankets, VHTR reactors, and shielding analyses.

  5. Disintegration of sublingual tablets: proposal for a validated test method and acceptance criterion.

    Science.gov (United States)

    Weda, M; van Riet-Nales, D A; van Aalst, P; de Kaste, D; Lekkerkerker, J F F

    2006-12-01

    In the Netherlands the market share of isosorbide dinitrate 5 mg sublingual tablets is dominated by 2 products (A and B). In the last few years complaints have been received from health care professionals on product B. During patient use the disintegration of the tablet was reported to be slow and/or incomplete, and ineffectiveness was experienced. In the European Pharmacopoeia (Ph. Eur.) no requirement is present for the disintegration time of sublingual tablets. The purpose of this study was to compare the in vitro disintegration time of products A and B, and to establish a suitable test method and acceptance criterion. A and B were tested with the Ph. Eur. method described in the monograph on disintegration of tablets and capsules as well as with 3 modified tests using the same Ph. Eur. apparatus, but without movement of the basket-rack assembly. In modified test 1 and modified test 2 water was used as medium (900 ml and 50 ml respectively), whereas in modified test 3 artificial saliva was used (50 ml). In addition, disintegration was tested in Nessler tubes with 0.5 and 2 ml of water. Finally, the Ph. Eur. method was also applied to other sublingual tablets with other drug substances on the Dutch market. With modified test 3 no disintegration could be achieved within 20 min. With the Ph. Eur. method and modified tests 1 and 2 product A and B differed significantly (p disintegration times. These 3 methods were capable of discriminating between products and between batches. The time measured with the Ph. Eur. method was significantly lower compared to modified tests 1 and 2 (p tablets the disintegration time should be tested. The Ph. Eur. method is considered suitable for this test. In view of the products currently on the market and taking into consideration requirements in the United States Pharmacopeia and Japanese Pharmacopoeia, an acceptance criterion of not more than 2 min is proposed.

  6. Proposed method to calculate FRMAC intervention levels for the assessment of radiologically contaminated food and comparison of the proposed method to the U.S. FDA's method to calculate derived intervention levels

    Energy Technology Data Exchange (ETDEWEB)

    Kraus, Terrence D.; Hunt, Brian D.

    2014-02-01

This report reviews the method recommended by the U.S. Food and Drug Administration (FDA) for calculating Derived Intervention Levels (DILs) and identifies potential improvements to the DIL calculation method to support more accurate ingestion pathway analyses and protective action decisions. Further, this report proposes an alternate method for use by the Federal Radiological Monitoring and Assessment Center (FRMAC) to calculate FRMAC Intervention Levels (FILs). The default approach of the FRMAC during an emergency response is to use the FDA recommended methods. However, FRMAC recommends implementing the FIL method because we believe it to be more technically accurate. FRMAC will only implement the FIL method when approved by the FDA representative on the Federal Advisory Team for Environment, Food, and Health.
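The FDA-style DIL calculation that the report reviews can be written compactly as DIL = PAG / (f × food intake × dose coefficient). The sketch below uses placeholder numbers; the real inputs come from FDA guidance and ICRP dose coefficients, and the FIL variant proposed in the report is not reproduced here.

```python
# Derived Intervention Level (Bq/kg) from the Protective Action Guide (PAG),
# the fraction of the diet assumed contaminated, the quantity of food consumed
# over the period of concern, and the ingestion dose coefficient.
def derived_intervention_level(pag_msv, fraction_contaminated, intake_kg,
                               dose_coeff_msv_per_bq):
    return pag_msv / (fraction_contaminated * intake_kg * dose_coeff_msv_per_bq)

# Example with illustrative placeholder inputs: 5 mSv PAG, 30% of the diet
# contaminated, 200 kg consumed, 1.3e-5 mSv/Bq dose coefficient
print(f"{derived_intervention_level(5.0, 0.3, 200.0, 1.3e-5):.0f} Bq/kg")
```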

  7. Update of European bioethics

    DEFF Research Database (Denmark)

    Rendtorff, Jacob Dahl

    2015-01-01

    This paper presents an update of the research on European bioethics undertaken by the author together with Professor Peter Kemp since the 1990s, on Basic ethical principles in European bioethics and biolaw. In this European approach to basic ethical principles in bioethics and biolaw......, the principles of autonomy, dignity, integrity and vulnerability are proposed as the most important ethical principles for respect for the human person in biomedical and biotechnological development. This approach to bioethics and biolaw is presented here in a short updated version that integrates the earlier...... research in a presentation of the present understanding of the basic ethical principles in bioethics and biolaw....

  8. Robot Visual Tracking via Incremental Self-Updating of Appearance Model

    Directory of Open Access Journals (Sweden)

    Danpei Zhao

    2013-09-01

This paper proposes a target tracking method called Incremental Self-Updating Visual Tracking for robot platforms. Our tracker treats tracking as a binary classification problem: the target versus the background. Greyscale, HOG and LBP features are used in this work to represent the target and are integrated into a particle filter framework. To track the target over long time sequences, the tracker has to update its model to follow the most recent appearance of the target. In order to deal with the problems of wasted computation and the lack of a model-updating strategy in traditional methods, an intelligent and effective online self-updating strategy is devised to choose the optimal update opportunity. The decision to update the appearance model is based on the change in discriminative capability between the current frame and the previously updated frame. By adjusting the update step adaptively, severe waste of calculation time on needless updates can be avoided while keeping the model stable. Moreover, the appearance model can be kept away from serious drift problems when the target undergoes temporary occlusion. The experimental results show that the proposed tracker achieves robust and efficient performance on several challenging benchmark video sequences with various complex environment changes in posture, scale, illumination and occlusion.

  9. A revised glossary of terms most commonly used by clinical electroencephalographers and updated proposal for the report format of the EEG findings : Revision 2017

    NARCIS (Netherlands)

    Kane, Nick; Acharya, Jayant; Benickzy, Sandor; Caboclo, Luis; Finnigan, Simon; Kaplan, Peter W.; Shibasaki, Hiroshi; Pressler, Ronit; van Putten, Michel J.A.M.

    2017-01-01

    This glossary includes the terms most commonly used in clinical EEG. It is based on the previous proposals (Chatrian et al., 1974; Noachtar et al., 1999) and includes terms necessary to describe the EEG and to generate the EEG report. All EEG phenomena should be described as precisely as possible in

  10. Proposed update to the taxonomy of the genera Hepacivirus and Pegivirus within the Flaviviridae family

    DEFF Research Database (Denmark)

    Smith, Donald B.; Becher, Paul; Bukh, Jens

    2016-01-01

Proposals are described for the assignment of recently reported viruses, infecting rodents, bats and other mammalian species, to new species within the Hepacivirus and Pegivirus genera (family Flaviviridae). Assignments into 14 Hepacivirus species (Hepacivirus A-N) and 11 Pegivirus species (Pegi...

  11. A Method for Proposing Valued-Adding Attributes in Customized Housing

    Directory of Open Access Journals (Sweden)

    Cynthia S. Hentschke

    2014-12-01

In most emerging economies, there have been many incentives and a high availability of funding for low-cost housing projects. This has encouraged product standardization and the application of mass production ideas, based on the assumption that this is the most effective strategy for reducing costs. However, the delivery of highly standardized housing units to customers with different needs, without considering their lifestyle and perception of value, often results in inadequate products. Mass customization has been pointed out as an effective strategy to improve value generation in low-cost housing projects, and to avoid waste caused by renovations done in dwellings soon after occupancy. However, one of the main challenges for the implementation of mass customization is the definition of a set of relevant options based on users' perceived value. The aim of this paper is to propose a method for defining value-adding attributes in customized housing projects, which can support decision-making in product development. The means-end chain theory was used as the theoretical framework to connect product attributes and customers' values, through the application of the laddering technique. The method was tested in two house-building projects delivered by a company from Brazil. The main contribution of this method is to indicate the customization units that are most important for users, along with the explanation of why those units are the most relevant ones.

  12. Proposal for a method to estimate nutrient shock effects in bacteria

    Directory of Open Access Journals (Sweden)

    Azevedo Nuno F

    2012-08-01

Background: Plating methods are still the gold standard in microbiology; however, some studies have shown that these techniques can underestimate microbial concentrations and diversity. A nutrient shock is one of the mechanisms proposed to explain this phenomenon. In this study, a tentative method to assess nutrient shock effects was tested. Findings: To estimate the extent of nutrient shock effects, two strains isolated from tap water (Sphingomonas capsulata and Methylobacterium sp.) and two culture collection strains (E. coli CECT 434 and Pseudomonas fluorescens ATCC 13525) were exposed both to low and high nutrient conditions for different times and then placed in a low nutrient medium (R2A) and a rich nutrient medium (TSA). The average improvement (A.I.) of recovery between R2A and TSA for the different times was calculated to more simply assess the difference in culturability obtained between the two media. As expected, the A.I. was higher when cells were plated after exposure to water than when they were recovered from the high-nutrient medium, showing the existence of a nutrient shock for the diverse bacteria used. S. capsulata was the species most affected by this phenomenon. Conclusions: This work provides a method to consistently determine the extent of nutrient shock effects on different microorganisms and hence quantify the ability of each species to deal with sudden increases in substrate concentration.

  13. How update schemes influence crowd simulations

    International Nuclear Information System (INIS)

    Seitz, Michael J; Köster, Gerta

    2014-01-01

    Time discretization is a key modeling aspect of dynamic computer simulations. In current pedestrian motion models based on discrete events, e.g. cellular automata and the Optimal Steps Model, fixed-order sequential updates and shuffle updates are prevalent. We propose to use event-driven updates that process events in the order they occur, and thus better match natural movement. In addition, we present a parallel update with collision detection and resolution for situations where computational speed is crucial. Two simulation studies serve to demonstrate the practical impact of the choice of update scheme. Not only do density-speed relations differ, but there is a statistically significant effect on evacuation times. Fixed-order sequential and random shuffle updates with a short update period come close to event-driven updates. The parallel update scheme overestimates evacuation times. All schemes can be employed for arbitrary simulation models with discrete events, such as car traffic or animal behavior. (paper)
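A minimal sketch of an event-driven scheme using a priority queue is shown below; the step-duration model and agent parameters are assumptions, and collision detection is only indicated by a comment.

```python
import heapq

# Event-driven update: each agent schedules its own next step and events are
# processed strictly in time order (illustrative step model: distance / speed).
def simulate(agents, t_end):
    """agents: dict name -> {'speed': m/s, 'step': m}."""
    events = [(0.0, name) for name in agents]      # (event time, agent)
    heapq.heapify(events)
    while events:
        t, name = heapq.heappop(events)
        if t > t_end:
            break
        # ... move `name` one step here (collision checks would go here) ...
        dt = agents[name]["step"] / agents[name]["speed"]
        heapq.heappush(events, (t + dt, name))     # schedule the next event
        print(f"t={t:6.2f}s  {name} steps")

simulate({"p1": {"speed": 1.3, "step": 0.4},
          "p2": {"speed": 0.9, "step": 0.4}}, t_end=2.0)
```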

  14. Proposal of adaptive human interface and study of interface evaluation method for plant operators

    International Nuclear Information System (INIS)

    Ujita, Hiroshi; Kubota, Ryuji.

    1994-01-01

In this report, a new concept of a human interface adaptive to plant operators' mental model, cognitive process and psychological state, which change with time, is proposed. It is composed of a function to determine the information which should be indicated to operators based on the plant situation, a function to estimate the operators' internal conditions, and a function to arrange the information amount, position, timing, form, etc. based on their conditions. A method to evaluate the fitness of the interface, using analysis results based on cognitive science, ergonomics, psychology and physiology, is developed to achieve such an interface. Fundamental physiological experiments have been performed. Stress and workload can be identified from the ratio of the average power of the α-wave fraction of the brain wave, and can be distinguished by the ratio of the standard deviation of the R-R interval during a task to that at rest, in cases of low stress such as mouse operation, calculation and walking. (author)

  15. Proposal of adaptive human interface and study of interface evaluation method for plant operators

    Energy Technology Data Exchange (ETDEWEB)

    Ujita, Hiroshi [Hitachi Ltd., Ibaraki (Japan). Energy Research Lab.; Kubota, Ryuji

    1994-07-01

In this report, a new concept of a human interface adaptive to plant operators' mental model, cognitive process and psychological state, which change with time, is proposed. It is composed of a function to determine the information which should be indicated to operators based on the plant situation, a function to estimate the operators' internal conditions, and a function to arrange the information amount, position, timing, form, etc. based on their conditions. A method to evaluate the fitness of the interface, using analysis results based on cognitive science, ergonomics, psychology and physiology, is developed to achieve such an interface. Fundamental physiological experiments have been performed. Stress and workload can be identified from the ratio of the average power of the α-wave fraction of the brain wave, and can be distinguished by the ratio of the standard deviation of the R-R interval during a task to that at rest, in cases of low stress such as mouse operation, calculation and walking. (author).

  16. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

Kim, Song Hyun; Lee, Jae Yong; Kim, Do Hyun; Kim, Jong Kyung [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of); Noh, Jae Man [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-08-15

Chord length sampling in Monte Carlo simulations is a method used to model spherical particles in stochastic media with a random sampling technique. It has received attention due to its high calculation efficiency as well as user convenience; however, a technical issue regarding the boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method for the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to increasing the modeling accuracy in stochastic media.
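
    For orientation only, the snippet below sketches the textbook chord-length-sampling step of drawing the distance to the next sphere surface from an exponential distribution whose mean is the matrix mean chord length; it is not the authors' alternative sampling scheme and it omits their boundary-effect correction. The mean-chord expression 4r(1 - f)/(3f) for mono-sized spheres with packing fraction f is the usual CLS assumption, stated here from general knowledge rather than from the abstract.

```python
import math
import random

# Standard chord-length-sampling step (illustrative, not the authors' method):
# sample the flight distance through the matrix before the next sphere is hit.

def sample_distance_to_next_sphere(r, f, rng=random.random):
    lam = 4.0 * r * (1.0 - f) / (3.0 * f)   # assumed mean chord length in the matrix
    return -lam * math.log(rng())           # exponential sampling

if __name__ == "__main__":
    random.seed(0)
    samples = [sample_distance_to_next_sphere(r=0.05, f=0.1) for _ in range(100000)]
    # sample mean should be close to 4 * 0.05 * 0.9 / (3 * 0.1) = 0.6
    print(sum(samples) / len(samples))
```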

  17. A proposal on alternative sampling-based modeling method of spherical particles in stochastic media for Monte Carlo simulation

    International Nuclear Information System (INIS)

Kim, Song Hyun; Lee, Jae Yong; Kim, Do Hyun; Kim, Jong Kyung; Noh, Jae Man

    2015-01-01

Chord length sampling in Monte Carlo simulations is a method used to model spherical particles in stochastic media with a random sampling technique. It has received attention due to its high calculation efficiency as well as user convenience; however, a technical issue regarding the boundary effect has been noted. In this study, after analyzing the distribution characteristics of spherical particles using an explicit method, an alternative chord length sampling method is proposed. In addition, for modeling in finite media, a correction method for the boundary effect is proposed. Using the proposed method, sample probability distributions and relative errors were estimated and compared with those calculated by the explicit method. The results show that the reconstruction ability and modeling accuracy of the particle probability distribution with the proposed method were considerably high. Also, from the local packing fraction results, the proposed method can successfully solve the boundary effect problem. It is expected that the proposed method can contribute to increasing the modeling accuracy in stochastic media.

  18. Proposed Suitable Methods to Detect Transient Regime Switching to Improve Power Quality with Wavelet Transform

    Directory of Open Access Journals (Sweden)

    Javad Safaee Kuchaksaraee

    2016-10-01

Full Text Available The increasing consumption of electrical energy and the growing use of non-linear loads that create transient states in distribution networks are the reasons why the analysis of power quality for energy sustainability in power networks has become more important. Transients are often created by energy injection through switching or lightning and change the nominal voltage and current. A sudden increase or decrease in voltage or current characterises a transient state. This paper sheds some light on capacitor bank switching, which is one of the main causes of oscillatory transient states in the distribution network, using the wavelet transform. A new smart method is proposed to identify the switching current of a capacitor bank and the internal fault current of the transformer, so as to prevent unnecessary outage of the differential relay. The accurate performance of this method is shown by simulation in EMTP and MATLAB (matrix laboratory) software.

  19. Proposed method to produce a highly polarized e+ beam for future linear colliders

    International Nuclear Information System (INIS)

    Okugi, Toshiyuki; Chiba, Masami; Kurihara, Yoshimasa

    1996-01-01

We propose a method to produce a spin-polarized e+ beam using e+e- pair creation by circularly polarized photons. Assuming Compton scattering of an unpolarized e- beam and circularly polarized laser light, the scattered γ-rays at the high end of the energy spectrum are also circularly polarized. If those γ-rays are utilized to create e± pairs on a thin target, the spin polarization is preserved for e+'s at the high end of their energy spectrum. By using the injector linac of the Accelerator Test Facility at KEK and a commercially available Nd:YAG pulse laser, we can expect about 10^5 polarized e+'s per second with a degree of polarization of 80% and a kinetic energy of 35-80 MeV. The apparatus for the creation and measurement of polarized e+'s is being constructed. We present a new idea for a possible application of our method to future linear colliders by utilizing a high-power CO2 laser. (author)

  20. A calibration method for proposed XRF measurements of arsenic and selenium in nail clippings

    International Nuclear Information System (INIS)

    Gherase, Mihai R; Fleming, David E B

    2011-01-01

A calibration method for proposed x-ray fluorescence (XRF) measurements of arsenic and selenium in nail clippings is demonstrated. Phantom nail clippings were produced from a whole nail phantom (0.7 mm thickness, 25 x 25 mm^2 area) and contained equal concentrations of arsenic and selenium ranging from 0 to 20 μg g^-1 in increments of 5 μg g^-1. The phantom nail clippings were then grouped in samples of five different masses: 20, 40, 60, 80 and 100 mg for each concentration. Experimental x-ray spectra were acquired for each of the sample masses using a portable x-ray tube and a detector unit. Calibration lines (XRF signal in a number of counts versus stoichiometric elemental concentration) were produced for each of the two elements. A semi-empirical relationship between the mass of the nail phantoms (m) and the slope of the calibration line (s) was determined separately for arsenic and selenium. Using this calibration method, one can estimate elemental concentrations and their uncertainties from the XRF spectra of human nail clippings. (note)
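
    The sketch below illustrates the calibration workflow the note describes: fit a calibration line (counts versus concentration) for each phantom mass, then relate the slope s to the mass m. The count data and the saturating form assumed for s(m) are synthetic placeholders; the note's actual semi-empirical relationship is not reproduced.

```python
import numpy as np

# Hedged sketch: build a calibration line (XRF counts vs concentration) for
# each phantom mass, then examine how the slope s varies with mass m.
# Counts are synthetic; the paper's semi-empirical s(m) form is not used.

concentrations = np.array([0.0, 5.0, 10.0, 15.0, 20.0])   # ug/g
masses = np.array([20.0, 40.0, 60.0, 80.0, 100.0])        # mg

rng = np.random.default_rng(1)
slopes = []
for m in masses:
    true_slope = 3.0 * m / (m + 50.0)          # assumed saturating response
    counts = true_slope * concentrations + rng.normal(0.0, 2.0, concentrations.size)
    s, intercept = np.polyfit(concentrations, counts, 1)
    slopes.append(s)

# Fit an assumed empirical relation s(m) = a*m/(m+b) by coarse grid search.
best = min(
    ((a, b) for a in np.linspace(1, 5, 41) for b in np.linspace(10, 100, 91)),
    key=lambda p: sum((p[0] * m / (m + p[1]) - s) ** 2 for m, s in zip(masses, slopes)),
)
print("slopes per mass:", np.round(slopes, 3), "fitted (a, b):", best)
```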

  1. Proposed method for reconstructing velocity profiles using a multi-electrode electromagnetic flow meter

    International Nuclear Information System (INIS)

    Kollár, László E; Lucas, Gary P; Zhang, Zhichao

    2014-01-01

    An analytical method is developed for the reconstruction of velocity profiles using measured potential distributions obtained around the boundary of a multi-electrode electromagnetic flow meter (EMFM). The method is based on the discrete Fourier transform (DFT), and is implemented in Matlab. The method assumes the velocity profile in a section of a pipe as a superposition of polynomials up to sixth order. Each polynomial component is defined along a specific direction in the plane of the pipe section. For a potential distribution obtained in a uniform magnetic field, this direction is not unique for quadratic and higher-order components; thus, multiple possible solutions exist for the reconstructed velocity profile. A procedure for choosing the optimum velocity profile is proposed. It is applicable for single-phase or two-phase flows, and requires measurement of the potential distribution in a non-uniform magnetic field. The potential distribution in this non-uniform magnetic field is also calculated for the possible solutions using weight values. Then, the velocity profile with the calculated potential distribution which is closest to the measured one provides the optimum solution. The reliability of the method is first demonstrated by reconstructing an artificial velocity profile defined by polynomial functions. Next, velocity profiles in different two-phase flows, based on results from the literature, are used to define the input velocity fields. In all cases, COMSOL Multiphysics is used to model the physical specifications of the EMFM and to simulate the measurements; thus, COMSOL simulations produce the potential distributions on the internal circumference of the flow pipe. These potential distributions serve as inputs for the analytical method. The reconstructed velocity profiles show satisfactory agreement with the input velocity profiles. The method described in this paper is most suitable for stratified flows and is not applicable to axisymmetric flows in
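
    As a rough illustration of the first step of such a reconstruction, the snippet below applies a discrete Fourier transform to potentials sampled at equally spaced boundary electrodes and reports the leading harmonics. The mapping from harmonics to polynomial velocity components, and the non-uniform-field step used to pick the optimum profile, follow the paper and are not reproduced; the electrode count and synthetic potential distribution are assumptions.

```python
import numpy as np

# Minimal sketch: DFT of potentials measured at equally spaced electrodes on
# the pipe wall.  Only the harmonic decomposition step is shown; associating
# harmonics with polynomial velocity components is left to the paper's method.

n_electrodes = 16
theta = 2.0 * np.pi * np.arange(n_electrodes) / n_electrodes

# Synthetic potential distribution: uniform-flow term (first harmonic)
# plus a weaker second harmonic standing in for a higher-order component.
potentials = 1.0 * np.sin(theta) + 0.2 * np.sin(2.0 * theta + 0.3)

coeffs = np.fft.rfft(potentials) / n_electrodes
for k in range(1, 4):
    amplitude = 2.0 * np.abs(coeffs[k])
    phase = np.angle(coeffs[k])
    print(f"harmonic {k}: amplitude {amplitude:.3f}, phase {phase:.3f} rad")
```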

  2. Review of Dercum’s disease and proposal of diagnostic criteria, diagnostic methods, classification and management

    Directory of Open Access Journals (Sweden)

    Hansson Emma

    2012-04-01

Full Text Available Abstract Definition and clinical picture We propose the minimal definition of Dercum’s disease to be generalised overweight or obesity in combination with painful adipose tissue. The associated symptoms in Dercum’s disease include fatty deposits, easy bruisability, sleep disturbances, impaired memory, depression, difficulty concentrating, anxiety, rapid heartbeat, shortness of breath, diabetes, bloating, constipation, fatigue, weakness and joint aches. Classification We suggest that Dercum’s disease is classified into: I. Generalised diffuse form - a form with diffusely widespread painful adipose tissue without clear lipomas; II. Generalised nodular form - a form with general pain in adipose tissue and intense pain in and around multiple lipomas; III. Localised nodular form - a form with pain in and around multiple lipomas; and IV. Juxtaarticular form - a form with solitary deposits of excess fat, for example at the medial aspect of the knee. Epidemiology Dercum’s disease most commonly appears between the ages of 35 and 50 years and is five to thirty times more common in women than in men. The prevalence of Dercum’s disease has not yet been exactly established. Aetiology Proposed, but unconfirmed aetiologies include: nervous system dysfunction, mechanical pressure on nerves, adipose tissue dysfunction and trauma. Diagnosis and diagnostic methods Diagnosis is based on clinical criteria and should be made by systematic physical examination and thorough exclusion of differential diagnoses. Advisably, the diagnosis should be made by a physician with a broad experience of patients with painful conditions and knowledge of family medicine, internal medicine or pain management. The diagnosis should only be made when the differential diagnoses have been excluded. Differential diagnosis Differential diagnoses include: fibromyalgia, lipoedema, panniculitis, endocrine disorders, primary psychiatric disorders, multiple symmetric lipomatosis, familial

  3. A proposed through-flow inverse method for the design of mixed-flow pumps

    Science.gov (United States)

    Borges, Joao Eduardo

    1991-01-01

    A through-flow (hub-to-shroud) truly inverse method is proposed and described. It uses an imposition of mean swirl, i.e., radius times mean tangential velocity, given throughout the meridional section of the turbomachine as an initial design specification. In the present implementation, it is assumed that the fluid is inviscid, incompressible, and irrotational at inlet and that the blades are supposed to have zero thickness. Only blade rows that impart to the fluid a constant work along the space are considered. An application of this procedure to design the rotor of a mixed-flow pump is described in detail. The strategy used to find a suitable mean swirl distribution and the other design inputs is also described. The final blade shape and pressure distributions on the blade surface are presented, showing that it is possible to obtain feasible designs using this technique. Another advantage of this technique is the fact that it does not require large amounts of CPU time.

  4. Restoring stream habitat connectivity: a proposed method for prioritizing the removal of resident fish passage barriers.

    Science.gov (United States)

    O'Hanley, Jesse R; Wright, Jed; Diebel, Matthew; Fedora, Mark A; Soucy, Charles L

    2013-08-15

    Systematic methods for prioritizing the repair and removal of fish passage barriers, while growing of late, have hitherto focused almost exclusively on meeting the needs of migratory fish species (e.g., anadromous salmonids). An important but as of yet unaddressed issue is the development of new modeling approaches which are applicable to resident fish species habitat restoration programs. In this paper, we develop a budget constrained optimization model for deciding which barriers to repair or remove in order to maximize habitat availability for stream resident fish. Habitat availability at the local stream reach is determined based on the recently proposed C metric, which accounts for the amount, quality, distance and level of connectivity to different stream habitat types. We assess the computational performance of our model using geospatial barrier and stream data collected from the Pine-Popple Watershed, located in northeast Wisconsin (USA). The optimization model is found to be an efficient and practical decision support tool. Optimal solutions, which are useful in informing basin-wide restoration planning efforts, can be generated on average in only a few minutes. Copyright © 2013 Elsevier Ltd. All rights reserved.
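
    A toy version of the budget-constrained selection problem is sketched below, solved by brute-force enumeration for a handful of candidate barriers. The costs and habitat-gain values are placeholders, and the paper's C-metric-based objective and full optimization model are not reproduced.

```python
from itertools import combinations

# Toy budget-constrained barrier selection.  Each candidate barrier has a
# removal cost and a placeholder habitat gain; the paper's C-metric objective
# and optimization model are not reproduced here.

barriers = {           # name: (cost in $k, habitat gain in arbitrary units)
    "culvert_A": (15, 4.0),
    "culvert_B": (40, 9.5),
    "dam_C":     (70, 14.0),
    "culvert_D": (25, 6.0),
    "weir_E":    (55, 11.0),
}
budget = 100

def best_plan(barriers, budget):
    names = list(barriers)
    best_gain, best_set = 0.0, ()
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            cost = sum(barriers[n][0] for n in subset)
            gain = sum(barriers[n][1] for n in subset)
            if cost <= budget and gain > best_gain:
                best_gain, best_set = gain, subset
    return best_set, best_gain

print(best_plan(barriers, budget))
```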

  5. How to identify partial exposures to ionizing radiation? Proposal for a cytogenetic method

    International Nuclear Information System (INIS)

    Fernandes, T.S.; Silva, E.B.; Pinto, M.M.P.L.; Amaral, A.; Lloyd, David

    2013-01-01

In cases of radiological incidents or occupational exposures to ionizing radiation, the majority of exposures are not whole-body but only partial. In this context, if cytogenetic dosimetry is performed, the absorbed dose will be underestimated due to the dilution of irradiated cells with non-irradiated cells. Considering the norms of NR 32 - Safety and Health in the Work of Health Service - which recommends cytogenetic dosimetry in the investigation of accidental exposures to ionizing radiation, it is necessary to develop a tool to provide a better identification of partial exposures. With this aim, a partial body exposure was simulated by mixing, in vitro, 70% of blood irradiated with 4 Gy of X-rays with 30% of unirradiated blood from the same healthy donor. Aliquots of this mixture were cultured for 48 and 72 hours. Prolonging the time of cell culture from 48 to 72 hours produced no significant change in the yield of dicentrics. However, when only M1 cells (first division cells) were analyzed, the frequency of dicentrics per cell was increased. Prolonging the culture time allowed cells delayed in mitosis by irradiation to reach metaphase, and thus provided enough time for the damage to be visualized. The results of this research present the proposed method as an important tool in the investigation of exposed individuals, allowing the cytogenetic analysis to be associated with the real percentage of irradiated cells and contributing significantly to decision making in terms of occupational health. (author)

  6. A Proposal of Client Application Architecture using Loosely Coupled Component Connection Method in Banking Branch System

    Science.gov (United States)

    Someya, Harushi; Mori, Yuichi; Abe, Masahiro; Machida, Isamu; Hasegawa, Atsushi; Yoshie, Osamu

Due to the deregulation of the financial industry, branches in the banking industry need to shift from operation-oriented bases to sales-oriented bases. To support this movement, new banking branch systems are being developed. The main characteristic of the new systems is that form operations that have traditionally been performed at each branch are brought into a centralized operation center, for the purpose of rationalization and efficiency of the form operations. The branches treat a wide variety of forms. The forms can in many cases be described by common items, but the items embed different business logic and each form has different relations among its items. There is also a need for users to develop the client applications themselves. Consequently, the challenge is to arrange a development environment that is highly reusable, easily customizable and user-developable. We propose a client application architecture that has a loosely coupled component connection method and allows applications to be developed by only describing the screen configurations and their transitions in XML documents. By adopting our architecture, we developed client applications of the centralized operation center for the latest banking branch system. Our experiments demonstrate good performance.
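
    In the spirit of the loosely coupled connection method described above, the sketch below drives screen transitions from a declarative XML description interpreted at run time. The XML vocabulary, class names and screens are invented for illustration and are not the system's actual format.

```python
import xml.etree.ElementTree as ET

# Illustrative sketch: screens and their transitions are declared in XML and
# interpreted at run time, so components stay loosely coupled.  The XML
# vocabulary below is invented for the example, not the paper's format.

SCREEN_FLOW = """
<flow start="login">
  <screen id="login">
    <transition on="ok" to="menu"/>
  </screen>
  <screen id="menu">
    <transition on="transfer" to="transfer_form"/>
    <transition on="logout" to="login"/>
  </screen>
  <screen id="transfer_form">
    <transition on="submit" to="menu"/>
  </screen>
</flow>
"""

class FlowEngine:
    def __init__(self, xml_text):
        root = ET.fromstring(xml_text)
        self.current = root.attrib["start"]
        self.table = {
            s.attrib["id"]: {t.attrib["on"]: t.attrib["to"] for t in s.findall("transition")}
            for s in root.findall("screen")
        }

    def fire(self, event):
        # Look up the next screen in the declarative table; unknown events are ignored.
        self.current = self.table[self.current].get(event, self.current)
        return self.current

engine = FlowEngine(SCREEN_FLOW)
print(engine.fire("ok"), engine.fire("transfer"), engine.fire("submit"))
```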

  7. Proposed algorithm to improve job shop production scheduling using ant colony optimization method

    Science.gov (United States)

    Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari

    2017-12-01

This paper deals with the determination of a job shop production schedule in an automated environment. In this particular environment, machines and the material handling system are integrated and controlled by a computer center, where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to allow an unmanned production process for a specified time interval. We consider here parts with various operation requirements. Each operation requires specific cutting tools. These parts are to be scheduled on machines with identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining lifetime of its cutting tools. We propose an algorithm based on the ant colony optimization method, embedded in MATLAB software, to generate a production schedule which minimizes the total processing time of the parts (makespan). We tested the algorithm on data provided by a real industry, and the process shows a very short computation time. This contributes greatly to the flexibility and timeliness targeted in an automated environment.
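
    A compact, generic ant-colony sketch for this kind of sequencing problem is given below: ants build job sequences guided by a position-by-job pheromone table, jobs are assigned greedily to the least-loaded of several identical machines, and short-makespan sequences are reinforced. Processing times and parameters are illustrative, and the cutting-tool-life constraint from the paper is omitted.

```python
import random

# Generic ACO sketch (not the paper's algorithm): minimize makespan of jobs
# on identical machines by reinforcing job sequences that schedule well.

proc_times = [4, 7, 3, 9, 5, 6, 2, 8]
n_jobs, n_machines = len(proc_times), 3
ants, iterations, rho, q = 10, 50, 0.1, 10.0
pheromone = [[1.0] * n_jobs for _ in range(n_jobs)]   # position x job

def makespan(sequence):
    loads = [0] * n_machines
    for job in sequence:
        loads[loads.index(min(loads))] += proc_times[job]   # greedy assignment
    return max(loads)

def build_sequence():
    remaining = list(range(n_jobs))
    seq = []
    for pos in range(n_jobs):
        weights = [pheromone[pos][j] for j in remaining]
        job = random.choices(remaining, weights=weights)[0]
        seq.append(job)
        remaining.remove(job)
    return seq

random.seed(0)
best_ms, best_seq = float("inf"), None
for _ in range(iterations):
    solutions = [(makespan(s), s) for s in (build_sequence() for _ in range(ants))]
    it_best_ms, it_best_seq = min(solutions)
    if it_best_ms < best_ms:
        best_ms, best_seq = it_best_ms, it_best_seq
    for pos in range(n_jobs):                 # evaporation
        for j in range(n_jobs):
            pheromone[pos][j] *= (1.0 - rho)
    for pos, j in enumerate(it_best_seq):     # reinforcement of the iteration best
        pheromone[pos][j] += q / it_best_ms

print("best makespan:", best_ms, "sequence:", best_seq)
```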

  8. Proposal of how to update the standard information requirements in REACH, PPPR and BPR – a testing strategy for identification of endocrine disruptors

    DEFF Research Database (Denmark)

    Holbech, Henrik; Bjerregaard, Poul; Hass, Ulla

to these, new test methods that include endocrine-sensitive endpoints have been included with regard to human health and the environment. Similar data requirements and new test methods that include endocrine-sensitive endpoints are included in the guidance on Regulation (EU) No 528/2012 on how to fulfil … review on EDs and the revised strategy for the future work on endocrine disruptors, focusing on adequate detection of substances with endocrine disrupting properties under various legislative frameworks, including REACH (EC No 1907/2006), the Plant Protection Products Regulation (PPPR) (EC No 1107/2009) and the Biocidal Products Regulation (BPR) (EC No 528/2012). There are currently no specific information requirements or testing strategies with regard to endocrine disruption in REACH and other relevant legislation. However, in relation to biocides and recently also to plant protection products, indications

  9. WIMS-D library update

    International Nuclear Information System (INIS)

    2007-05-01

    WIMS-D (Winfrith Improved Multigroup Scheme-D) is the name of a family of software packages for reactor lattice calculations and is one of the few reactor lattice codes in the public domain and available on noncommercial terms. WIMSD-5B has recently been released from the OECD Nuclear Energy Agency Data Bank, and features major improvements in machine portability, as well as incorporating a few minor corrections. This version supersedes WIMS-D/4, which was released by the Winfrith Technology Centre in the United Kingdom for IBM machines and has been adapted for various other computer platforms in different laboratories. The main weakness of the WIMS-D package is the multigroup constants library, which is based on very old data. The relatively good performance of WIMS-D is attributed to a series of empirical adjustments to the multigroup data. However, the adjustments are not always justified on the basis of more accurate and recent experimental measurements. Following the release of new and revised evaluated nuclear data files, it was felt that the performance of WIMS-D could be improved by updating the associated library. The WIMS-D Library Update Project (WLUP) was initiated in the early 1990s with the support of the IAEA. This project consisted of voluntary contributions from a large number of participants. Several benchmarks for testing the library were identified and analysed, the WIMSR module of the NJOY code system was upgraded and the author of NJOY accepted the proposed updates for the official code system distribution. A detailed parametric study was performed to investigate the effects of various data processing input options on the integral results. In addition, the data processing methods for the main reactor materials were optimized. Several partially updated libraries were produced for testing purposes. The final stage of the WLUP was organized as a coordinated research project (CRP) in order to speed up completion of the fully updated library

  10. Finite-element-model updating using computational intelligence techniques applications to structural dynamics

    CERN Document Server

    Marwala, Tshilidzi

    2010-01-01

    Finite element models (FEMs) are widely used to understand the dynamic behaviour of various systems. FEM updating allows FEMs to be tuned better to reflect measured data and may be conducted using two different statistical frameworks: the maximum likelihood approach and Bayesian approaches. Finite Element Model Updating Using Computational Intelligence Techniques applies both strategies to the field of structural mechanics, an area vital for aerospace, civil and mechanical engineering. Vibration data is used for the updating process. Following an introduction a number of computational intelligence techniques to facilitate the updating process are proposed; they include: • multi-layer perceptron neural networks for real-time FEM updating; • particle swarm and genetic-algorithm-based optimization methods to accommodate the demands of global versus local optimization models; • simulated annealing to put the methodologies into a sound statistical basis; and • response surface methods and expectation m...

  11. Further comments on the sequential probability ratio testing methods

    Energy Technology Data Exchange (ETDEWEB)

    Kulacsy, K. [Hungarian Academy of Sciences, Budapest (Hungary). Central Research Inst. for Physics

    1997-05-23

    The Bayesian method for belief updating proposed in Racz (1996) is examined. The interpretation of the belief function introduced therein is found, and the method is compared to the classical binary Sequential Probability Ratio Testing method (SPRT). (author).

  12. On preconditioner updates for sequences of saddle-point linear systems

    Directory of Open Access Journals (Sweden)

    Simone Valentina De

    2018-02-01

Full Text Available Updating preconditioners for the solution of sequences of large and sparse saddle-point linear systems via Krylov methods has received increasing attention in the last few years, because it allows the cost of preconditioning to be reduced while keeping the efficiency of the overall solution process. This paper provides a short survey of the two approaches proposed in the literature for this problem: updating the factors of a preconditioner available in a block LDL^T form, and updating a preconditioner via a limited-memory technique inspired by quasi-Newton methods.

  13. Proposing Some New Ecliptics in New Testament Studies Enabled by Digital Humanities-Based Methods

    Directory of Open Access Journals (Sweden)

    James Libby

    2016-04-01

Full Text Available “Fragmentation” is a well-worn watchword in contemporary biblical studies. But is endless fragmentation across the traditional domains of epistemology, methodology and hermeneutics the inevitable future for the postmodern exercise of biblical scholarship? In our view, multiple factors militate against such a future, but two command our attention here. First, digital humanities itself, through its principled use of corpora, databases and computer-based methods, seems to be remarkably capable of producing findings with high levels of face validity (interpretive agreement across multiple hermeneutical perspectives and communities). Second, and perhaps more subversively, there is a substantial body of practitioners that, per Kearney, actively question postmodernity’s impress as the final port of call for philosophy. For these practitioners deconstruction has become both indispensable — by delegitimizing hegemonies — but, in its own way, metanarratival by stultifying all other iterative, dialectical and critical processes that have historically motivated scholarship. Sensing this impasse, Kearney (1987, pp. 43-45) proposes a reimagining that is not only critical but that also embraces ποίησις, the possibility of optimistic, creative work. Such a stance within digital humanities would affirm that poietic events emerge not only through frictions and fragmentation (e.g. Kinder and McPherson 2014, pp. xiii-xviii) but also through commonalities and convergence. Our approach here will be to demonstrate such a reimagining, rather than to argue for it, using two worked examples in the Greek New Testament (GNT). Those examples – digital humanities-enabled papyrology and digital humanities-enabled statistical linguistics – demonstrate ways in which the data of the text itself can be used to interrogate our perspectives and suggest that our perspectives must remain ever open to such inquiries. We conclude with a call for digital humanities to

  14. A "conservative" method of thoracic wall dissection: a proposal for teaching human anatomy.

    Science.gov (United States)

    Barberini, Fabrizio; Brunone, Francesca

    2008-01-01

    The common methods of dissection exposing the thoracic organs include crossing of the wall together with wide resection of its muscular planes. In order to preserve these structures, a little demolishing technique of the thoracic wall is proposed, entering the thoracic cavity without extensive resection of the pectoral muscles. This method is based on the fact that these muscles rise up from the wall, like a bridge connecting the costal plane with the upper limb, and that the pectoralis major shows a segmental constitution. SUPERIOR LIMIT: Resect the sternal manubrium transversely between the 1st and the 2nd rib. The incision is prolonged along the 1st intercostal space, separating the first sterno-costal segment of the pectoralis major from the second one, and involving the intercostal muscles as far as the medial margin of the pectoralis minor. This muscle must be raised up, and the transverse resection continued below its medial margin latero-medially along the 1st intercostal space, to rejoin the cut performed before. Then, the incision of the 1st intercostal space is prolonged below the lateral margin of the pectoralis minor, which must be kept raised up, medio-laterally as far as the anterior axillary line. INFERIOR LIMIT: It corresponds to the inferior border of the thoracic cage, resected from the xiphoid process to the anterior axillary line, together with the sterno-costal insertions of the diaphragm. Then, an incision of the sterno-pericardial ligaments and a median sternotomy from the xiphoid process to the transverse resection of the manubrium should be performed. LATERAL LIMIT: From the point of crossing of the anterior axillary line with the inferior limit, resect the ribs from the 10th to the 2nd one. The lateral part of the pectoralis major must be raised up, so that the costal resection may be continued below it. Then, at the lateral extremity of the superior incision, the first and the second sternocostal segment of the pectoralis major must be

  15. A proposed new method for the determination of the solar irradiance at EUV wavelength range

    Science.gov (United States)

    Feldman, Uri; Doschek, G. A.; Seely, J. F.; Landi, E.; Dammasch, I.

, 1.4x10^6 and 3x10^6 K. In the transition region (2x10^4-2x10^5 K), where the structures are not isothermal, the slopes of the emission measure vs. temperature stay the same independently of the solar activity. In our talk we will propose a variation of the EM method for the determination of the solar irradiance described above. The modified method will be based on line intensity calculations from the actual solar EM values at the above-specified discrete temperatures. The EM at those temperatures could in principle be derived from solar observations spanning a fairly limited wavelength range.

  16. The benefits of paired-agent imaging in molecular-guided surgery: an update on methods and applications (Conference Presentation)

    Science.gov (United States)

    Tichauer, Kenneth M.

    2016-03-01

One of the major complications with conventional imaging-agent-based molecular imaging, particularly for cancer imaging, is variability in agent delivery and nonspecific retention in biological tissue. Such factors can act to "swamp" the signal arising from specifically bound imaging agent, which is presumably indicative of the concentration of the targeted biomolecule. In the 1950s, Pressman et al. proposed a method of accounting for these delivery and retention effects by normalizing targeted antibody retention to the retention of a co-administered "untargeted"/control imaging agent [1]. Our group resurrected the approach within the last 5 years, finding ways to utilize this so-called "paired-agent" imaging approach to directly quantify biomolecule concentration in tissue (in vitro, ex vivo, and in vivo) [2]. These novel paired-agent imaging approaches, capable of quantifying biomolecule concentration, provide enormous potential for being adapted to and optimizing molecular-guided surgery, which has the principal goal of identifying distinct biological tissues (tumor, nerves, etc.) based on their distinct molecular environment. This presentation will cover the principles and nuances of paired-agent imaging, as well as the current status of the field and future applications. [1] D. Pressman, E. D. Day, and M. Blau, "The use of paired labeling in the determination of tumor-localizing antibodies," Cancer Res, 17(9), 845-50 (1957). [2] K. M. Tichauer, Y. Wang, B. W. Pogue et al., "Quantitative in vivo cell-surface receptor imaging in oncology: kinetic modeling and paired-agent principles from nuclear medicine and optical imaging," Phys Med Biol, 60(14), R239-69 (2015).
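
    The normalization idea can be caricatured as below: uptake of the targeted agent is referenced to a co-administered untargeted agent so that delivery and nonspecific retention approximately cancel. The late-time ratio estimate used here (targeted/untargeted - 1) is a simplification chosen for illustration; the kinetic models developed in the cited references are not reproduced.

```python
import numpy as np

# Hedged caricature of paired-agent normalization: perfusion/permeability
# differences affect both agents alike, so their ratio exposes specific binding.
# The simple ratio-minus-one estimate is an illustrative simplification only.

rng = np.random.default_rng(3)
pixels = 5
delivery = rng.uniform(0.5, 1.5, pixels)            # varies with perfusion/permeability
specific_binding = np.array([0.0, 0.2, 0.5, 1.0, 2.0])

untargeted_uptake = delivery * 1.0
targeted_uptake = delivery * (1.0 + specific_binding)

binding_potential = targeted_uptake / untargeted_uptake - 1.0
print(np.round(binding_potential, 2))   # recovers specific_binding despite delivery differences
```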

  17. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

In using the perturbation theory, the uncertainty of a response can be estimated with a single transport simulation, and therefore it requires a small computational load. However, it has the disadvantage that the computational methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling based method which uses cross sections randomly sampled from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. The cross sections are only modified by the sampling based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to obtain the uncertainty distribution of the result, the code simulation must be repeated many times with randomly sampled cross sections, and this inefficiency is known as a disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling based S/U method, an advanced sampling and estimation method is proposed and verified. The main feature of the proposed method is that the cross section averaged from each single sampled cross section is used. The proposed method was validated against the perturbation theory.
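
    For contrast with the proposed averaged-sample refinement, the snippet below sketches the plain statistical approach: correlated cross-section samples are drawn from an assumed covariance matrix, a toy response is evaluated for each sample, and the spread of the results is taken as the uncertainty. The response model and covariance values are invented for illustration.

```python
import numpy as np

# Plain sampling-based uncertainty sketch: perturb two toy cross sections
# according to an assumed covariance matrix, evaluate a toy response for each
# sample, and report the spread.  Not the paper's averaged-sample method.

rng = np.random.default_rng(42)
mean = np.array([1.50, 0.90])                 # nominal cross sections (arbitrary units)
cov = np.array([[0.0004, 0.0001],
                [0.0001, 0.0009]])            # assumed covariance

def response(sigma):
    # Toy multiplication-factor-like response: production over absorption.
    return sigma[0] / (sigma[0] * 0.4 + sigma[1] * 0.7)

samples = rng.multivariate_normal(mean, cov, size=5000)
k_values = np.apply_along_axis(response, 1, samples)
print(f"k mean = {k_values.mean():.5f}, k std = {k_values.std(ddof=1):.5f}")
```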

  18. Age correction in monitoring audiometry: method to update OSHA age-correction tables to include older workers.

    Science.gov (United States)

    Dobie, Robert A; Wojcik, Nancy C

    2015-07-13

    The US Occupational Safety and Health Administration (OSHA) Noise Standard provides the option for employers to apply age corrections to employee audiograms to consider the contribution of ageing when determining whether a standard threshold shift has occurred. Current OSHA age-correction tables are based on 40-year-old data, with small samples and an upper age limit of 60 years. By comparison, recent data (1999-2006) show that hearing thresholds in the US population have improved. Because hearing thresholds have improved, and because older people are increasingly represented in noisy occupations, the OSHA tables no longer represent the current US workforce. This paper presents 2 options for updating the age-correction tables and extending values to age 75 years using recent population-based hearing survey data from the US National Health and Nutrition Examination Survey (NHANES). Both options provide scientifically derived age-correction values that can be easily adopted by OSHA to expand their regulatory guidance to include older workers. Regression analysis was used to derive new age-correction values using audiometric data from the 1999-2006 US NHANES. Using the NHANES median, better-ear thresholds fit to simple polynomial equations, new age-correction values were generated for both men and women for ages 20-75 years. The new age-correction values are presented as 2 options. The preferred option is to replace the current OSHA tables with the values derived from the NHANES median better-ear thresholds for ages 20-75 years. The alternative option is to retain the current OSHA age-correction values up to age 60 years and use the NHANES-based values for ages 61-75 years. Recent NHANES data offer a simple solution to the need for updated, population-based, age-correction tables for OSHA. The options presented here provide scientifically valid and relevant age-correction values which can be easily adopted by OSHA to expand their regulatory guidance to
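
    The fitting step the paper describes can be mimicked as below: median thresholds by age are fit with a simple polynomial and age-correction values are tabulated relative to age 20. The threshold numbers are synthetic and are not NHANES data; only the workflow is illustrated.

```python
import numpy as np

# Illustrative only: fit a polynomial to median hearing thresholds by age and
# tabulate age-correction values relative to age 20.  Thresholds are made up.

ages = np.arange(20, 76, 5)
median_thresholds_4khz = np.array(
    [3, 4, 5, 7, 9, 12, 16, 21, 27, 33, 40, 47])   # dB HL, synthetic

coeffs = np.polyfit(ages, median_thresholds_4khz, deg=3)
fit = np.poly1d(coeffs)

age_correction = {age: round(float(fit(age) - fit(20)), 1) for age in range(20, 76)}
for age in (30, 45, 60, 75):
    print(f"age {age}: correction {age_correction[age]} dB")
```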

  19. Standardized patients in audiology: a proposal for a new method of evaluating clinical competence.

    Science.gov (United States)

    Dinsmore, Brooke Freeman; Bohnert, Carrie; Preminger, Jill E

    2013-05-01

    While accrediting organizations require AuD programs to provide evidence that their students are able to demonstrate knowledge and competencies in specific content areas, there are no generally accepted mechanisms for the assessment and the measurement of these proficiencies. We propose that AuD programs consider developing standardized patient (SP) cases in order to develop consistent summative assessment programs within and across universities. The purpose of this article is to provide a framework for establishing SP programs to evaluate competencies in AuD students by detailing the history of SP cases and their use, developing a rationale for this method of assessment, and outlining the steps for writing and implementing SP cases. Literature review. SPs have been used to assess clinical competence in medical students for over 50 yr. The prevalence of SP assessment in allied health professions (e.g., dentistry, psychology, pharmacy) has increased over the last two decades but has only gained a limited following in audiology. SP assessment has been implemented in medical education using the Objective Structured Clinical Examination, a multistation, timed exam that uses fictional cases to assess students' clinical abilities. To date, only one published report has been completed that evaluates the use of SPs to assess clinical abilities in audiology students. This article expands upon the work of English et al (2007) and their efforts to use SPs to evaluate counseling abilities. To this end, we describe the steps necessary to write a case, procedures to determine performance requirements, and the need to develop remediation plans. As an example, we include a case that we have developed in order to evaluate vestibular assessment and patient communication skills. Utilizing SP assessment in audiology education would provide useful means to evaluate competence in a uniform way. Future research is necessary to develop reliable and valid cases that may be implemented

  20. Dynamic model updating based on strain mode shape and natural frequency using hybrid pattern search technique

    Science.gov (United States)

    Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping

    2018-05-01

Aiming at providing a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode shape to enhance model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate between the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information on the structure, are used to guarantee the accuracy of the modal properties of the global model. Then, the weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is used as the objective function in the proposed dynamic FE model updating procedure. A hybrid genetic/pattern-search optimization algorithm is adopted to perform the dynamic FE model updating procedure. Numerical simulation and a model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can be used to update the uncertain parameters with good robustness, and the updated dynamic FE model of the beam structure, which can correctly predict both the natural frequencies and the local dynamic strains, is reliable for the subsequent dynamic analysis and dynamic strength evaluation.
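
    A miniature version of the weighted objective is sketched below: a relative natural-frequency residual is combined with one minus a modal-assurance-criterion value computed on strain shapes. The two-parameter analytic "model" is a stand-in for an FE model, the strain-MAC expression is the ordinary MAC applied to strain shapes, and Nelder-Mead stands in for the hybrid genetic/pattern-search optimizer used in the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Toy weighted objective for model updating: frequency residual plus
# (1 - strain MAC).  The analytic "model" and all values are illustrative.

def model(theta):
    k, m = np.maximum(theta, 1e-6)                         # keep toy parameters positive
    freqs = np.sqrt(k / m) * np.array([1.0, 2.8, 5.1])     # toy natural frequencies
    strain_mode = np.array([1.0, 2.0 * k / m, 0.5 * m])    # toy strain mode shape
    return freqs, strain_mode

true_freqs, true_strain = model(np.array([2.0, 1.2]))      # synthetic "measured" data

def objective(theta, w_freq=0.5, w_strain=0.5):
    freqs, strain = model(theta)
    freq_res = np.sum(((freqs - true_freqs) / true_freqs) ** 2)
    mac = (strain @ true_strain) ** 2 / ((strain @ strain) * (true_strain @ true_strain))
    return w_freq * freq_res + w_strain * (1.0 - mac)

result = minimize(objective, x0=np.array([1.0, 1.0]), method="Nelder-Mead")
print("updated parameters:", np.round(result.x, 3))
```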

  1. Identification of material parameters for plasticity models: A comparative study on the finite element model updating and the virtual fields method

    Science.gov (United States)

    Martins, J. M. P.; Thuillier, S.; Andrade-Campos, A.

    2018-05-01

    The identification of material parameters, for a given constitutive model, can be seen as the first step before any practical application. In the last years, the field of material parameters identification received an important boost with the development of full-field measurement techniques, such as Digital Image Correlation. These techniques enable the use of heterogeneous displacement/strain fields, which contain more information than the classical homogeneous tests. Consequently, different techniques have been developed to extract material parameters from full-field measurements. In this study, two of these techniques are addressed, the Finite Element Model Updating (FEMU) and the Virtual Fields Method (VFM). The main idea behind FEMU is to update the parameters of a constitutive model implemented in a finite element model until both numerical and experimental results match, whereas VFM makes use of the Principle of Virtual Work and does not require any finite element simulation. Though both techniques proved their feasibility in linear and non-linear constitutive models, it is rather difficult to rank their robustness in plasticity. The purpose of this work is to perform a comparative study in the case of elasto-plastic models. Details concerning the implementation of each strategy are presented. Moreover, a dedicated code for VFM within a large strain framework is developed. The reconstruction of the stress field is performed through a user subroutine. A heterogeneous tensile test is considered to compare FEMU and VFM strategies.

  2. Updating optical pseudoinverse associative memories.

    Science.gov (United States)

    Telfer, B; Casasent, D

    1989-07-01

    Selected algorithms for adding to and deleting from optical pseudoinverse associative memories are presented and compared. New realizations of pseudoinverse updating methods using vector inner product matrix bordering and reduced-dimensionality Karhunen-Loeve approximations (which have been used for updating optical filters) are described in the context of associative memories. Greville's theorem is reviewed and compared with the Widrow-Hoff algorithm. Kohonen's gradient projection method is expressed in a different form suitable for optical implementation. The data matrix memory is also discussed for comparison purposes. Memory size, speed and ease of updating, and key vector requirements are the comparison criteria used.
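
    One of the update rules compared in the paper, the Widrow-Hoff (LMS) correction, can be sketched in a few lines: the memory matrix is nudged toward reproducing a new key-recall pair by repeated error-driven outer-product corrections. The dimensions and data are arbitrary, and the Greville pseudoinverse recursion is not reproduced.

```python
import numpy as np

# Hedged sketch of a Widrow-Hoff (LMS) style incremental update of a linear
# associative memory M, one of the rules the paper compares; the Greville
# pseudoinverse recursion is not reproduced here.

rng = np.random.default_rng(0)
dim_key, dim_recall = 8, 4
M = np.zeros((dim_recall, dim_key))

def add_pair(M, key, recall, lr=0.1, sweeps=200):
    key = key / np.linalg.norm(key)
    for _ in range(sweeps):
        error = recall - M @ key           # residual for this association
        M = M + lr * np.outer(error, key)  # LMS outer-product correction
    return M

keys = [rng.standard_normal(dim_key) for _ in range(3)]
recalls = [rng.standard_normal(dim_recall) for _ in range(3)]
for k, r in zip(keys, recalls):
    M = add_pair(M, k, r)

# Recall error for the last stored pair (earlier pairs may be partially overwritten).
print(np.round(M @ (keys[-1] / np.linalg.norm(keys[-1])) - recalls[-1], 3))
```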

  3. Ontology Update in the Cognitive Model of Ontology Learning

    Directory of Open Access Journals (Sweden)

    Zhang De-Hai

    2016-01-01

Full Text Available Ontology has been used in many hot-spot fields, but most ontology construction methods are semi-automatic, and the construction process of an ontology is still a tedious and painstaking task. In this paper, a kind of cognitive model is presented for ontology learning which can simulate how human beings learn from the world. In this model, the cognitive strategies are applied with the constrained axioms. Ontology update is a key step when new knowledge is added into the existing ontology and conflicts with old knowledge in the process of ontology learning. This proposal designs and validates a method of ontology update based on the axiomatic cognitive model, which includes the ontology update postulates, axioms and operations of the learning model. It is proved that these operators conform to the established axiom system.

  4. A Proposal of Product Development Collaboration Method Using User Support Information and its Experimental Evaluation

    Science.gov (United States)

    Tanaka, Mitsuru; Kataoka, Masatoshi; Koizumi, Hisao

    As the market changes more rapidly and new products continue to get more complex and multifunctional, product development collaboration with competent partners and leading users is getting more important to come up with new products that are successful in the market in a timely manner. ECM (engineering chain management) and SCM (supply chain management) are supply-side approaches toward this collaboration. In this paper, we propose a demand-side approach toward product development collaboration with users based on the information gathered through user support interactions. The approach and methodology proposed here was applied to a real data set, and its effectiveness was verified.

  5. Quantifying Update Effects in Citizen-Oriented Software

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2009-02-01

    Full Text Available Defining citizen-oriented software. Detailing technical issues regarding update process in this kind of software. Presenting different effects triggered by types of update. Building model for update costs estimation, including producer-side and consumer-side effects. Analyzing model applicability on INVMAT – large scale matrix inversion software. Proposing a model for update effects estimation. Specifying ways for softening effects of inaccurate updates.

  6. A Proposed Method for Improving the Performance of P-Type GaAs IMPATTs

    Directory of Open Access Journals (Sweden)

    H. A. El-Motaafy

    2012-07-01

Full Text Available A special waveform is proposed and assumed to be the optimum waveform for p-type GaAs IMPATTs. This waveform is deduced after careful and extensive study of the performance of these devices. The results presented here indicate the superiority of the performance of the IMPATTs driven by the proposed waveform over that obtained when the same IMPATTs are driven by the conventional sinusoidal waveform. These results are obtained using a full-scale computer simulation program that takes fully into account all the physical effects pertinent to IMPATT operation. In this paper, it is indicated that the superiority of the proposed waveform is attributed to its ability to reduce the bad effects that usually degrade the IMPATT performance, such as the space-charge effect and the drift-velocity dropping below saturation effect. The superiority is also attributed to the ability of the proposed waveform to improve the phase relationship between the terminal voltage and the induced current. Key Words: Computer-Aided Design, GaAs IMPATT, Microwave Engineering

  7. A Proposed Model of Retransformed Qualitative Data within a Mixed Methods Research Design

    Science.gov (United States)

    Palladino, John M.

    2009-01-01

    Most models of mixed methods research design provide equal emphasis of qualitative and quantitative data analyses and interpretation. Other models stress one method more than the other. The present article is a discourse about the investigator's decision to employ a mixed method design to examine special education teachers' advocacy and…

  8. Updating Recursive XML Views of Relations

    DEFF Research Database (Denmark)

    Choi, Byron; Cong, Gao; Fan, Wenfei

    2009-01-01

This paper investigates the view update problem for XML views published from relational data. We consider XML views defined in terms of mappings directed by possibly recursive DTDs compressed into DAGs and stored in relations. We provide new techniques to efficiently support XML view updates specified in terms of XPath expressions with recursion and complex filters. The interaction between XPath recursion and DAG compression of XML views makes the analysis of the XML view update problem rather intriguing. Furthermore, many issues are still open even for relational view updates, and need to be explored. In response to these, on the XML side, we revise the notion of side effects and update semantics based on the semantics of XML views, and present efficient algorithms to translate XML updates to relational view updates. On the relational side, we propose a mild condition on SPJ views, and show

  9. Proposal on the mitigation methods of thermal stress near the sodium

    International Nuclear Information System (INIS)

    Ando, Masanori; Kasahara, Naoto

    2003-09-01

A reactor vessel of a fast reactor plant contains high temperature liquid sodium inside, and its upper end is supported by a low temperature structure. Therefore, a significant temperature gradient arises in the vessel wall near the sodium surface, and a large thermal stress is generated around this part. To lower this stress and to protect the vessel, a number of mitigation methods have been applied in plants. Generally, these mitigation methods based on protective equipment also have some problems, such as an increased amount of material, complicated control, and difficult maintenance. In this research, the authors suggested other simple methods for mitigating the thermal stress and evaluated their effects by computer analysis. The results obtained in this research are as follows. The authors suggested one method, circulating high temperature gas around the outside of the vessel, and evaluated its effects by analysis. In the case of this method, the Sn value (one of the design indices) may be reduced by about 45%. The authors also suggested another method, setting up a heat transfer plate outside the vessel, and evaluated its effects by analysis. The effect of this method depends on the material of the plate. When carbon is used as the plate material, the Sn value may be about 27% lower, and when 12Cr steel is used, about 15% lower. The authors also suggested a further method, changing the material of the guard vessel to one with good heat transfer ability, and evaluated its effects by analysis. When the guard vessel material is changed to 12Cr steel, the Sn value may be reduced by about 12%. (author)

  10. Proposing water balance method for water availability estimation in Indonesian regional spatial planning

    Science.gov (United States)

    Juniati, A. T.; Sutjiningsih, D.; Soeryantono, H.; Kusratmoko, E.

    2018-01-01

The water availability (WA) of a region is one of the important considerations both in the formulation of spatial plans and in the evaluation of the effectiveness of actual land use in providing sustainable water resources. Information on land-water needs vis-a-vis their availability in a region determines the state of surplus or deficit and thereby informs effective land use. How to calculate water availability has been described in the Guideline in Determining the Carrying Capacity of the Environment in Regional Spatial Planning. However, the method of determining the supply and demand of water in this guideline is debatable, since the determination of WA uses a rational method. The rational method was developed as the basis for storm drain design practice and is essentially a peak discharge calculation method. This paper reviews the literature on methods of water availability estimation descriptively and presents arguments to claim that the water balance method is a more fundamental and appropriate tool for water availability estimation. A better water availability estimation method would serve to improve the practice of preparing Regional Spatial Plan (RSP) formulations as well as evaluating land use capacity in providing sustainable water resources.
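
    A minimal monthly water-balance bookkeeping of the kind argued for above is sketched below: precipitation minus evapotranspiration feeds a simple soil-moisture store, and the surplus spilling from the store is counted as available water. All numbers are synthetic, and the sketch is not the guideline's (or the authors') formulation.

```python
# Minimal monthly water balance sketch: availability is taken as the surplus
# left after precipitation minus evapotranspiration passes through a simple
# soil-moisture store.  All values are synthetic placeholders.

precipitation = [210, 180, 160, 120, 90, 60, 50, 55, 80, 140, 190, 220]        # mm/month
evapotranspiration = [110, 115, 120, 125, 120, 110, 105, 110, 115, 120, 115, 110]
soil_capacity, storage = 100.0, 50.0

monthly_availability = []
for p, et in zip(precipitation, evapotranspiration):
    storage += p - et
    surplus = max(0.0, storage - soil_capacity)   # water available as runoff/recharge
    storage = min(max(storage, 0.0), soil_capacity)
    monthly_availability.append(surplus)

print("annual water availability (mm):", round(sum(monthly_availability), 1))
```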

  11. New proposal of moderator temperature coefficient estimation method using gray-box model in NPP, (1)

    International Nuclear Information System (INIS)

    Mori, Michitsugu; Kagami, Yuichi; Kanemoto, Shigeru; Enomoto, Mitsuhiro; Tamaoki, Tetsuo; Kawamura, Shinichiro

    2004-01-01

The purpose of the present paper is to establish a new void reactivity coefficient (VRC) estimation method based on a gray-box modeling concept. The gray-box model consists of a point kinetics model as the first-principles model and a fitting model of the moderator temperature kinetics. By applying Kalman filter and maximum likelihood estimation algorithms to the gray-box model, the MTC can be estimated. The verification test is done by Monte Carlo simulation, and it is shown that the present method gives the best estimation results compared with the conventional methods from the viewpoint of unbiased, minimum-scatter estimation performance. Furthermore, the method is verified via real plant data analysis. The good performance of the present method is explained by the proper definition of the likelihood function based on the explicit expression of observation and system noise in the gray-box model. (author)

  12. A proposal of parameter determination method in the residual strength degradation model for the prediction of fatigue life (I)

    International Nuclear Information System (INIS)

    Kim, Sang Tae; Jang, Seong Soo

    2001-01-01

The static and fatigue tests have been carried out to verify the validity of a generalized residual strength degradation model. A new method of parameter determination in the model is verified experimentally to account for the effect of tension-compression fatigue loading of spheroidal graphite cast iron. It is shown that the correlation between the experimental results and the theoretical prediction of the statistical distribution of fatigue life using the proposed method is very reasonable. Furthermore, it is found that the correlation between the theoretical prediction and the experimental results of fatigue life for tension-tension fatigue data of a composite material also appears reasonable. Therefore, the proposed method is more adaptable for determining the parameters than the maximum likelihood method and the minimization technique.

  13. Assessment of four calculation methods proposed by the EC for waste hazardous property HP 14 'Ecotoxic'.

    Science.gov (United States)

    Hennebert, Pierre; Humez, Nicolas; Conche, Isabelle; Bishop, Ian; Rebischung, Flore

    2016-02-01

    Legislation published in December 2014 revised both the List of Waste (LoW) and amended Appendix III of the revised Waste Framework Directive 2008/98/EC; the latter redefined hazardous properties HP 1 to HP 13 and HP 15 but left the assessment of HP 14 unchanged to allow time for the Directorate General of the Environment of the European Commission to complete a study that is examining the impacts of four different calculation methods for the assessment of HP 14. This paper is a contribution to the assessment of the four calculation methods. It also includes the results of a fifth calculation method; referred to as "Method 2 with extended M-factors". Two sets of data were utilised in the assessment; the first (Data Set #1) comprised analytical data for 32 different waste streams (16 hazardous (H), 9 non-hazardous (NH) and 7 mirror entries, as classified by the LoW) while the second data set (Data Set #2), supplied by the eco industries, comprised analytical data for 88 waste streams, all classified as hazardous (H) by the LoW. Two approaches were used to assess the five calculation methods. The first approach assessed the relative ranking of the five calculation methods by the frequency of their classification of waste streams as H. The relative ranking of the five methods (from most severe to less severe) is: Method 3>Method 1>Method 2 with extended M-factors>Method 2>Method 4. This reflects the arithmetic ranking of the concentration limits of each method when assuming M=10, and is independent of the waste streams, or the H/NH/Mirror status of the waste streams. A second approach is the absolute matching or concordance with the LoW. The LoW is taken as a reference method and the H wastes are all supposed to be HP 14. This point is discussed in the paper. The concordance for one calculation method is established by the number of wastes with identical classification by the considered calculation method and the LoW (i.e. H to H, NH to NH). The discordance is

  14. A proposed method of measuring the electric-dipole moment of the neutron by ultracold neutron interferometry

    International Nuclear Information System (INIS)

    Freedman, M.S.; Peshkin, M.; Ringo, G.R.; Dombeck, T.W.

    1989-08-01

The use of an ultracold neutron interferometer incorporating an electrostatic accelerator having a strong electric field gradient to accelerate neutrons by their possible electric dipole moment is proposed as a method of measuring the neutron electric dipole moment. The method appears to have the possibility of extending the sensitivity of the measurement by several orders of magnitude, perhaps to 10^-30 e-cm. 9 refs., 3 figs.

  15. Small Private Online Research: A Proposal for A Numerical Methods Course Based on Technology Use and Blended Learning

    Science.gov (United States)

    Cepeda, Francisco Javier Delgado

    2017-01-01

    This work presents a proposed model in blended learning for a numerical methods course evolved from traditional teaching into a research lab in scientific visualization. The blended learning approach sets a differentiated and flexible scheme based on a mobile setup and face to face sessions centered on a net of research challenges. Model is…

  16. Enhancing the Social Network Dimension of Lifelong Competence Development and Management Systems: A Proposal of Methods and Tools

    NARCIS (Netherlands)

    Cheak, Alicia; Angehrn, Albert; Sloep, Peter

    2006-01-01

    Cheak, A. M., Angehrn, A. A., & Sloep, P. (2006). Enhancing the social network dimension of lifelong competence development and management systems: A proposal of methods and tools. In R. Koper & K. Stefanov (Eds.). Proceedings of International Workshop in Learning Networks for Lifelong Competence

  17. Enhancing the Social Network Dimension of Lifelong Competence Development and Management Systems: A proposal of methods and tools

    NARCIS (Netherlands)

    Cheak, Alicia; Angehrn, Albert; Sloep, Peter

    2006-01-01

    Cheak, A. M., Angehrn, A. A., & Sloep, P. B. (2006). Enhancing the social network dimension of lifelong competence development and management systems: A proposal of methods and tools. In E. J. R. Koper & K. Stefanov (Eds.), Proceedings of International Workshop on Learning Networks for Lifelong

  18. Proposal for a new method of reactor neutron flux distribution determination

    Energy Technology Data Exchange (ETDEWEB)

    Popic, V R [Institute of nuclear sciences Boris Kidric, Vinca, Beograd (Serbia and Montenegro)]

    1964-01-15

    A method for determining the neutron flux distribution inside a reactor, based on measurements of the activity produced in a medium flowing with variable velocity through the reactor, is considered theoretically (author)

  19. Conceptual design of covering method for the proposed LILW near-surface repository at Cernavoda

    International Nuclear Information System (INIS)

    Diaconu, Daniela

    2003-01-01

    The disposal concept for the low and intermediate level (LIL) wastes resulting from NPP operation combines natural and engineered barriers in order to ensure the safety of the environment and population. The Saligny site has been proposed for LIL waste disposal. Preliminary performance assessments indicate that the loess and clay layers are efficient natural barriers against water flow and radionuclide migration through the vadose zone to the local aquifers. At present, the site characterization studies are concentrated on investigating the potential factors affecting the long-term integrity of the disposal facility. This analysis showed that surface erosion by wind and water and bio-intrusion by plant roots and burrowing animals could affect the long-term disposal safety. Based on the preliminary erosion results, as well as on the high probability of bio-intrusion by plant roots and burrowing animals (i.e. moles, mice), different covering systems able to ensure the long-term safety of the repository have been proposed and analyzed. FEHM and HYDRUS 2D water flow simulations have been performed in order to compare their efficiency in reducing the infiltration rate into the repository. From this point of view, the covering system combining a capillary barrier and a resistive layer proved to have the best behavior

  20. International survey of methods used in health technology assessment (HTA): does practice meet the principles proposed for good research?

    Directory of Open Access Journals (Sweden)

    Stephens JM

    2012-08-01

    , and data availability for emerging technologies. Conclusion: This is the first international survey to specifically assess the state of HTA research methods. Future efforts should expand the respondent sample to include more emerging markets and update the results of this survey to specifically address additional aspects of research methods in HTA. Keywords: survey, technology assessment, payers, research methods, reimbursement

  1. Improve the functional status of students using the proposed method recovery

    Directory of Open Access Journals (Sweden)

    Evtukh M.I.

    2012-12-01

    Full Text Available Purpose - to improve the organizational and methodological foundations of physical education aimed at the health improvement of high school students during training. The study involved 152 second-year students of the International Economics and Humanities University named after Stepan Demyanchuk. The students were divided into control (n = 76) and primary (n = 76) groups, which were similar in age and physical development. By the end of the study, the application of the proposed recovery technique enabled the students of the primary group to restore the function of their respiratory and cardiovascular systems to the level of healthy untrained people. A similar increase in the functional capacity of the primary group was registered using the Skibinski index - a combined evaluation of the respiratory and cardiovascular functions of the students - which improved from a satisfactory to a good level.

  2. Proposing co-design of personas as a method to heighten validity and engage users

    DEFF Research Database (Denmark)

    Albrechtsen, Charlotte; Pedersen, Majbrit; Pedersen, Nicholai Friis

    2016-01-01

    This paper proposes co-designing personas with users as a strategy to overcome a challenge inherent in the design of personas or fictitious users: on one hand, personas should appear realistic and believable as individuals, and on the other hand, personas should represent a broader range of users. By involving empirical users in all parts of the process of persona design, the risk of creating personas that are too stereotypical is minimized, as the participating users enrich the data on which the personas are based with up-to-date and firsthand contextual knowledge. The advantages of co-designing personas with users are illustrated by a case from higher education in which personas were co-designed with students as part of a project aiming at designing a smartphone application for Master's thesis students. © 2016, IGI Global.

  3. R and D proposals to improve outages operation. Methods, practices and tools

    International Nuclear Information System (INIS)

    Dionis, Francois

    2014-01-01

    This paper deals with outage operation improvement. It offers a number of avenues concerning the interactions between operation activities and maintenance, with a methodological perspective and proposals concerning the Information System. From the methodological point of view, careful modeling of plant systems can represent the characteristics needed to optimize tagouts, alignment procedures and the schedule. Tools must be taken into account for new tagout practices such as tag sharing. It is possible to take advantage of 2D drawings integrated into the information system in order to improve data controls and to visualize operation activities. An integrated set of mobile applications should allow field operators to connect to the information system for better and safer performance. (author)

  4. Proposed method for determining the thickness of glass in solar collector panels

    Science.gov (United States)

    Moore, D. M.

    1980-01-01

    An analytical method was developed for determining the minimum thickness for simply supported, rectangular glass plates subjected to uniform normal pressure environmental loads such as wind, earthquake, snow, and deadweight. The method consists of comparing an analytical prediction of the stress in the glass panel to a glass breakage stress determined from fracture mechanics considerations. Based on extensive analysis using the nonlinear finite element structural analysis program ARGUS, design curves for the structural analysis of simply supported rectangular plates were developed. These curves yield the center deflection, center stress and corner stress as a function of a dimensionless parameter describing the load intensity. A method of estimating the glass breakage stress as a function of a specified failure rate, degree of glass temper, design life, load duration time, and panel size is also presented.
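
    The comparison at the core of the method can be sketched as follows, assuming the standard small-deflection estimate sigma = beta * q * b^2 / t^2 for the center stress of a simply supported rectangular plate; beta stands in for a value read from design curves such as those described above, and all numbers are hypothetical, so this illustrates the workflow rather than the nonlinear ARGUS-based curves themselves.

      # Check candidate glass thicknesses against an allowable breakage stress.
      # beta and sigma_allow are placeholders for values taken from design curves
      # and from the fracture-mechanics estimate described in the abstract.
      def center_stress(q_pa, b_m, t_m, beta):
          """Small-deflection estimate of center stress (Pa) in a simply supported
          rectangular plate under uniform pressure q."""
          return beta * q_pa * b_m ** 2 / t_m ** 2

      q = 2400.0          # design wind pressure, Pa (hypothetical)
      b = 1.2             # shorter plate dimension, m (hypothetical)
      beta = 0.29         # curve coefficient for this aspect ratio (hypothetical)
      sigma_allow = 25e6  # allowable breakage stress, Pa (hypothetical)

      for t_mm in (3, 4, 5, 6, 8):
          sigma = center_stress(q, b, t_mm / 1000.0, beta)
          verdict = "OK" if sigma <= sigma_allow else "too thin"
          print(f"t = {t_mm} mm: center stress = {sigma / 1e6:.1f} MPa -> {verdict}")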

  5. Brief communication: a proposed osteological method for the estimation of pubertal stage in human skeletal remains.

    Science.gov (United States)

    Shapland, Fiona; Lewis, Mary E

    2013-06-01

    Puberty forms an important threshold between childhood and adulthood, but this subject has received little attention in bioarchaeology. The new application of clinical methods to assess pubertal stage in adolescent skeletal remains is explored, concentrating on the development of the mandibular canine, hamate, hand phalanges, iliac crest and distal radius. Initial results from the medieval cemetery of St. Peter's Church, Barton-upon-Humber, England suggest that application of these methods may provide insights into aspects of adolescent development. This analysis indicates that adolescents from this medieval site were entering the pubertal growth spurt at a similar age to their modern counterparts, but that the later stages of pubertal maturation were being significantly delayed, perhaps due to environmental stress. Continued testing and refinement of these methods on living adolescents is still necessary to improve our understanding of their significance and accuracy in predicting pubertal stages. Copyright © 2013 Wiley Periodicals, Inc.

  6. Proposed Model for Integrating RAMS Method in the Design Process in Construction

    Directory of Open Access Journals (Sweden)

    Saad Al-Jibouri

    2010-05-01

    Full Text Available There is a growing trend in the Netherlands for outsourcing public construction activities to the private sector through the use of integrated contracts. There is also an increasing emphasis from public clients on the use of RAMS and life cycle costing (LCC) in the design process of infrastructural projects to improve the performance of designed systems and optimize the project cost. RAMS is an acronym for 'reliability, availability, maintainability and safety' and represents a collection of techniques to provide predictions of the performance targets of the required system. Increasingly, RAMS targets are being specified in invitations to tender or contract documents and the parties responsible for the design are required to provide evidence of its application in their design. Recent evidence from practice, complemented with a literature study, has shown that the knowledge and application of RAMS in infrastructural designs are in their infancy compared with other industrial sectors, and many designers in construction do not have the necessary knowledge and experience to apply it. This paper describes a proposed model for the integration of RAMS and LCC into the design process in construction. A variation of the model for the application of RAMS in 'design, build, finance and maintain' (DBFM) contracts that include maintenance requirements is also proposed. The two models involve providing guidelines to simplify the application of RAMS by the designers. The model has been validated for its practicality and usefulness during a workshop with experienced designers. DOI: 10.3763/aedm.2008.0100. Published in the journal AEDM, Volume 5, Number 4, 2009, pp. 179-192(14)

  7. Hypothesis: primary antiangiogenic method proposed to treat early stage breast cancer

    International Nuclear Information System (INIS)

    Retsky, Michael W; Hrushesky, William JM; Gukas, Isaac D

    2009-01-01

    Women with Down syndrome very rarely develop breast cancer even though they now live to an age when it normally occurs. This may be related to the fact that Down syndrome persons have an additional copy of chromosome 21 where the gene that codes for the antiangiogenic protein Endostatin is located. Can this information lead to a primary antiangiogenic therapy for early stage breast cancer that indefinitely prolongs remission? A key question that arises is when is the initial angiogenic switch thrown in micrometastases? We have conjectured that avascular micrometastases are dormant and relatively stable if undisturbed but that for some patients angiogenesis is precipitated by surgery. We also proposed that angiogenesis of micrometastases very rarely occurs before surgical removal of the primary tumor. If that is so, it seems possible that we could suggest a primary antiangiogenic therapy but the problem then arises that starting a therapy before surgery would interfere with wound healing. The therapy must be initiated at least one day prior to surgical removal of the primary tumor and kept at a Down syndrome level perhaps indefinitely. That means the drug must have virtually no toxicity and not interfere meaningfully with wound healing. This specifically excludes drugs that significantly inhibit the VEGF pathway since that is important for wound healing and because these agents have some toxicity. Endostatin is apparently non-toxic and does not significantly interfere with wound healing since Down syndrome patients have no abnormal wound healing problems. We propose a therapy for early stage breast cancer consisting of Endostatin at or above Down syndrome levels starting at least one day before surgery and continuing at that level. This should prevent micrometastatic angiogenesis resulting from surgery or at any time later. Adjuvant chemotherapy or hormone therapy should not be necessary. This can be continued indefinitely since there is no acquired resistance that

  8. Proposed method of assembly for the BCD silicon strip vertex detector modules

    International Nuclear Information System (INIS)

    Lindenmeyer, C.

    1989-01-01

    The BCD Silicon Strip Vertex Detector is constructed of 10 identical central region modules and 18 similar forward region modules. This memo describes a method of assembling these modules from individual silicon wafers. Each wafer is fitted with associated front end electronics and cables and has been tested to ensure that only good wafers reach the final assembly stage. 5 figs

  9. Interpreting Statistical Significance Test Results: A Proposed New "What If" Method.

    Science.gov (United States)

    Kieffer, Kevin M.; Thompson, Bruce

    As the 1994 publication manual of the American Psychological Association emphasized, "p" values are affected by sample size. As a result, it can be helpful to interpret the results of statistical significance tests in a sample size context by conducting so-called "what if" analyses. However, these methods can be inaccurate…
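
    One common form of such a "what if" analysis is to hold the observed effect size fixed and recompute the p value at hypothetical sample sizes. The sketch below does this for a two-sample t test; the effect size is hypothetical and the calculation illustrates the general idea rather than the authors' exact procedure.

      # "What if" analysis: p value of a two-sample t test for the same standardized
      # effect size (Cohen's d) at hypothetical per-group sample sizes.
      from scipy import stats

      d = 0.40  # observed standardized mean difference (hypothetical)

      for n_per_group in (10, 25, 50, 100, 200):
          t = d * (n_per_group / 2) ** 0.5     # t statistic for two equal groups
          df = 2 * n_per_group - 2
          p = 2 * stats.t.sf(abs(t), df)       # two-tailed p value
          print(f"n per group = {n_per_group:3d}: t = {t:.2f}, p = {p:.4f}")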

  10. Texting to increase physical activity among teenagers (TXT Me!): Rationale, design, and methods proposal

    Science.gov (United States)

    Physical activity decreases from childhood through adulthood. Among youth, teenagers (teens) achieve the lowest levels of physical activity, and high school age youth are particularly at risk of inactivity. Effective methods are needed to increase youth physical activity in a way that can be maintai...

  11. Proposal of a method for formulating strategy in small and medium enterprises

    Directory of Open Access Journals (Sweden)

    Luís Henrique Piovezan

    2008-07-01

    Full Text Available Strategy models found in the literature are usually more suitable for big companies. However, small and medium enterprises (SME) also need to plan their strategies, but in a way that considers their peculiarities. In this context, this paper presents a simple method for strategy formulation and deployment in SME. This method was developed through a sequence of case studies carried out in small companies (10 to 500 employees). The final version of this method is a seven-step framework that considers both the business environment and the firm's core competencies. The final aim is the alignment of business and manufacturing strategies. This framework can be considered suitable for SME, since it is simple and saves time and the scarce resources available for strategy formulation, both important issues in this kind of enterprise. Finally, a case study is presented, encompassing the analysis of the application of the final version of the method in a small Brazilian company. Key-words: Competitive Strategy, Small Business Strategy, Manufacturing Strategy.

  12. Elliptical broken line method for calculating capillary density in nailfold capillaroscopy: Proposal and evaluation.

    Science.gov (United States)

    Karbalaie, Abdolamir; Abtahi, Farhad; Fatemi, Alimohammad; Etehadtavakol, Mahnaz; Emrani, Zahra; Erlandsson, Björn-Erik

    2017-09-01

    Nailfold capillaroscopy is a practical method for identifying and obtaining morphological changes in capillaries which might reveal relevant information about diseases and health. Capillaroscopy is harmless, and seems simple and repeatable. However, there is a lack of established guidelines and instructions for acquisition as well as for the interpretation of the obtained images, which might lead to various ambiguities. In addition, assessment and interpretation of the acquired images are very subjective. In an attempt to overcome some of these problems, in this study a new modified technique for assessment of nailfold capillary density is introduced. The new method is named the elliptic broken line (EBL) method, an extension of two previously known methods that defines clear criteria for finding the apex of capillaries in different scenarios by using a fitted ellipse. A graphical user interface (GUI) was developed for pre-processing, manual assessment of capillary apexes and automatic correction of selected apexes based on the 90° rule. Intra- and inter-observer reliability of EBL and corrected EBL is evaluated in this study. Four independent observers familiar with capillaroscopy performed the assessment for 200 nailfold videocapillaroscopy images, from healthy subjects and systemic lupus erythematosus patients, in two different sessions. The results show an elevation from moderate (ICC=0.691) and good (ICC=0.753) agreement to good (ICC=0.750) and good (ICC=0.801) agreement for intra- and inter-observer reliability after automatic correction of EBL. This clearly shows the potential of this method to improve the reliability and repeatability of assessment, which motivates further development of an automatic tool for the EBL method. Copyright © 2017 Elsevier Inc. All rights reserved.

  13. New clinical validation method for automated sphygmomanometer: a proposal by Japan ISO-WG for sphygmomanometer standard.

    Science.gov (United States)

    Shirasaki, Osamu; Asou, Yosuke; Takahashi, Yukio

    2007-12-01

    Owing to fast or stepwise cuff deflation, or measuring at places other than the upper arm, the clinical accuracy of most recent automated sphygmomanometers (auto-BPMs) cannot be validated by one-arm simultaneous comparison, which would be the only accurate validation method based on auscultation. Two main alternative methods are provided by current standards, that is, two-arm simultaneous comparison (method 1) and one-arm sequential comparison (method 2); however, the accuracy of these validation methods might not be sufficient to compensate for lateral blood pressure (BP) differences (LD) and/or BP variations (BPV) between the device and reference readings. Thus, the Japan ISO-WG for sphygmomanometer standards has been studying a new method that might improve validation accuracy (method 3). The purpose of this study is to determine the appropriateness of method 3 by comparing its immunity to LD and BPV with those of the current validation methods (methods 1 and 2). The validation accuracy of the above three methods was assessed in human participants [N=120, 45 ± 15.3 years (mean ± SD)]. An oscillometric automated monitor, Omron HEM-762, was used as the tested device. When compared with the others, methods 1 and 3 showed a smaller intra-individual standard deviation of device error (SD1), suggesting their higher reproducibility of validation. The SD1 by method 2 significantly correlated with the participant's BP (P=0.004), supporting our hypothesis that the increased SD of device error by method 2 is at least partially caused by essential BPV. Method 3 showed a significantly smaller (P=0.0044) interparticipant SD of device error (SD2), suggesting its higher interparticipant consistency of validation. Among the methods of validating the clinical accuracy of auto-BPMs, method 3, which showed the highest reproducibility and highest interparticipant consistency, can be proposed as the most appropriate.
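
    The two dispersion measures being compared can be illustrated on synthetic data: SD1 is computed here as the mean intra-individual standard deviation of device error and SD2 as the standard deviation of the per-participant mean errors. The data and these simple formulas are illustrative assumptions, not the standard's exact definitions.

      # Sketch of SD1 (intra-individual) and SD2 (inter-participant) on synthetic errors.
      import numpy as np

      rng = np.random.default_rng(0)
      # device-minus-reference BP errors (mmHg): rows = participants, columns = repeated readings
      errors = rng.normal(loc=2.0, scale=4.0, size=(120, 3))

      sd1 = errors.std(axis=1, ddof=1).mean()   # mean intra-individual SD of device error
      sd2 = errors.mean(axis=1).std(ddof=1)     # SD of per-participant mean device error
      print(f"SD1 = {sd1:.2f} mmHg, SD2 = {sd2:.2f} mmHg")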

  14. Optimal update with multiple out-of-sequence measurements

    Science.gov (United States)

    Zhang, Shuo; Bar-Shalom, Yaakov

    2011-06-01

    In multisensor target tracking systems, receiving out-of-sequence measurements (OOSMs) from local sensors is a common situation. In the last decade many algorithms have been proposed to update a target state with an OOSM optimally or suboptimally. However, what one faces in the real world is multiple OOSMs, which arrive at the fusion center in, generally, arbitrary orders, e.g., in succession or interleaved with in-sequence measurements. A straightforward approach to deal with this multi-OOSM problem is to sequentially apply a given OOSM algorithm; however, this simple solution does not guarantee an optimal update under the multi-OOSM scenario. The present paper discusses the differences between single-OOSM processing and multi-OOSM processing, and presents the general solution to the multi-OOSM problem, called the complete in-sequence information (CISI) approach. Given an OOSM, in addition to updating the target state at the most recent time, the CISI approach also updates the states between the OOSM time and the most recent time, including the state at the OOSM time. Three novel CISI methods are developed in this paper: the information filter-equivalent measurement (IF-EqM) method, the CISI fixed-point smoothing (CISI-FPS) method and the CISI fixed-interval smoothing (CISI-FIS) method. Numerical examples are given to show the optimality of these CISI methods under various multi-OOSM scenarios.
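
    As a point of reference for what "complete in-sequence information" means, the sketch below simply buffers all measurements in time order and re-runs a one-dimensional constant-velocity Kalman filter whenever an out-of-sequence measurement arrives. This brute-force reprocessing is only a naive baseline that reproduces the in-sequence result; it is not the IF-EqM or CISI smoothing methods developed in the paper, and all numbers are made up.

      # Naive "complete in-sequence" baseline: reprocess a time-ordered buffer with a
      # 1-D constant-velocity Kalman filter each time an OOSM arrives.
      import numpy as np

      def kf_run(measurements, q=0.1, r=1.0):
          """measurements: list of (time, position) pairs in time order."""
          x = np.zeros(2)              # state: [position, velocity]
          P = np.eye(2) * 100.0        # diffuse prior covariance
          H = np.array([[1.0, 0.0]])   # position-only measurement
          t_prev = measurements[0][0]
          for t, z in measurements:
              dt = t - t_prev
              F = np.array([[1.0, dt], [0.0, 1.0]])
              Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                                [dt**2 / 2, dt]]) if dt > 0 else np.zeros((2, 2))
              x, P = F @ x, F @ P @ F.T + Q            # predict
              S = H @ P @ H.T + r                      # innovation covariance
              K = P @ H.T / S                          # Kalman gain
              x = x + (K * (z - H @ x)).ravel()        # update
              P = (np.eye(2) - K @ H) @ P
              t_prev = t
          return x, P

      buffer = []   # time-ordered (t, z) pairs held at the fusion center
      for t, z in [(1.0, 1.1), (2.0, 2.0), (4.0, 4.2), (3.0, 2.9)]:   # last one is an OOSM
          buffer.append((t, z))
          buffer.sort(key=lambda tz: tz[0])            # restore time order
          x, _ = kf_run(buffer)
          print(f"after measurement at t={t}: position estimate = {x[0]:.2f}")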

  15. An Innovative Oil Pollution Containment Method for Ship Wrecks Proposed for Offshore Well Blow-outs

    OpenAIRE

    ANDRITSOS Fivos; COJINS Hans

    2011-01-01

    In the aftermath of the PRESTIGE disaster, an innovative system for the prompt intervention on oil pollution sources (primarily ship wrecks) at great depths was conceived at the Joint Research Center of the European Commission. This system, with some re-engineering, could also serve for collecting oil and gas leaking after an offshore well blow-out and could constitute a reference method for prompt intervention on deep water oil pollution sources like ship wrecks and blown-out offshore wells....

  16. [Proposal of a method for collective analysis of work-related accidents in the hospital setting].

    Science.gov (United States)

    Osório, Claudia; Machado, Jorge Mesquita Huet; Minayo-Gomez, Carlos

    2005-01-01

    The article presents a method for the analysis of work-related accidents in hospitals, with the double aim of analyzing accidents in light of actual work activity and enhancing the vitality of the various professions that comprise hospital work. This process involves both research and intervention, combining knowledge output with training of health professionals, fostering expanded participation by workers in managing their daily work. The method consists of stimulating workers to recreate the situation in which a given accident occurred, shifting themselves to the position of observers of their own work. In the first stage of analysis, workers are asked to show the work analyst how the accident occurred; in the second stage, the work accident victim and analyst jointly record the described series of events in a diagram; in the third, the resulting record is re-discussed and further elaborated; in the fourth, the work accident victim and analyst evaluate and implement measures aimed to prevent the accident from recurring. The article concludes by discussing the method's possibilities and limitations in the hospital setting.

  17. [Proposal of a conceptual method of supportive care for co-active patients].

    Science.gov (United States)

    Abidli, Yamine; Piette, Danielle; Casini, Annalisa

    2015-01-01

    There is a broad consensus on the importance for health professionals of supporting co-active patients. However, in practice, very few "patient care partnership" approaches have been developed. We hypothesized that the lack of investment in supporting patient care partnerships is due to the lack of interest in the skills needed by caregivers to provide such support. This paper intends to address this gap. The patient care partnership method is studied, adapted and developed from existing models. It complements, harmonizes and integrates various schools of thought arising from the need to place the patient at the center of care and of life in general. The patient care partnership method includes 7 stages during which the professional accompanies the patient through the process of care. The methodological approach for training professionals is designed to ensure that professionals experience the change they expect from the patient in the care relationship, as well as its difficulties. This method now needs to be validated by the experience of other professionals in order to define the limits of its application and to allow further development.

  18. Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel Part II: Plate bending test and proposal of a simplified evaluation method

    Energy Technology Data Exchange (ETDEWEB)

    Ando, Masanori, E-mail: ando.masanori@jaea.go.jp; Takaya, Shigeru, E-mail: takaya.shigeru@jaea.go.jp

    2016-12-15

    Highlights: • Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel is proposed. • A simplified evaluation method is also proposed for the codification. • Both proposed evaluation methods were validated by the plate bending test. • For codification, the local stress and strain behavior was analyzed. - Abstract: In the present study, to develop an evaluation procedure and design rules for Mod.9Cr-1Mo steel weld joints, a method for evaluating the creep-fatigue life of Mod.9Cr-1Mo steel weld joints was proposed based on finite element analysis (FEA) and a series of cyclic plate bending tests of longitudinal and horizontal seamed plates. The strain concentration and redistribution behaviors were evaluated and the failure cycles were estimated using FEA by considering the test conditions and metallurgical discontinuities in the weld joints. Inelastic FEA models consisting of the base metal, heat-affected zone and weld metal were employed to estimate the elastic follow-up behavior caused by the metallurgical discontinuities. The elastic follow-up factors, determined by comparing the elastic and inelastic FEA results, were found to be less than 1.5. Based on the estimated elastic follow-up factors obtained via inelastic FEA, a simplified technique using elastic FEA was proposed for evaluating the creep-fatigue life in Mod.9Cr-1Mo steel weld joints. The creep-fatigue life obtained using the plate bending test was compared to those estimated from the results of inelastic FEA and by the simplified evaluation method.

  19. Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel Part II: Plate bending test and proposal of a simplified evaluation method

    International Nuclear Information System (INIS)

    Ando, Masanori; Takaya, Shigeru

    2016-01-01

    Highlights: • Creep-fatigue evaluation method for weld joint of Mod.9Cr-1Mo steel is proposed. • A simplified evaluation method is also proposed for the codification. • Both proposed evaluation methods were validated by the plate bending test. • For codification, the local stress and strain behavior was analyzed. - Abstract: In the present study, to develop an evaluation procedure and design rules for Mod.9Cr-1Mo steel weld joints, a method for evaluating the creep-fatigue life of Mod.9Cr-1Mo steel weld joints was proposed based on finite element analysis (FEA) and a series of cyclic plate bending tests of longitudinal and horizontal seamed plates. The strain concentration and redistribution behaviors were evaluated and the failure cycles were estimated using FEA by considering the test conditions and metallurgical discontinuities in the weld joints. Inelastic FEA models consisting of the base metal, heat-affected zone and weld metal were employed to estimate the elastic follow-up behavior caused by the metallurgical discontinuities. The elastic follow-up factors, determined by comparing the elastic and inelastic FEA results, were found to be less than 1.5. Based on the estimated elastic follow-up factors obtained via inelastic FEA, a simplified technique using elastic FEA was proposed for evaluating the creep-fatigue life in Mod.9Cr-1Mo steel weld joints. The creep-fatigue life obtained using the plate bending test was compared to those estimated from the results of inelastic FEA and by the simplified evaluation method.

  20. A proposal of a three-dimensional CT measurement method of maxillofacial structure

    International Nuclear Information System (INIS)

    Tanaka, Ray; Hayashi, Takafumi

    2007-01-01

    Three-dimensional CT measurement is used in practice to grasp the pathological condition of diseases such as temporomandibular joint disorder, maxillofacial anomaly, jaw deformity, or fracture, which cause morphologic changes of the maxillofacial bones. For the 3D measurement, a unique system is employed in which measurement points are defined on volume-rendered 3D images with simultaneous reference to axial images combined with coronal and sagittal multi-planar reconstruction (MPR) images (we call this the MPR referential method). Our purpose in this report is to indicate the usefulness of this unique method by comparing it with the common approach of defining the measurement points on 3D reconstruction images only, without consulting MPR images. Clinical CT data obtained from a male patient with skeletal malocclusion were used. Contiguous axial images were reconstructed at 4 times magnification, with a reconstruction interval of 0.5 mm, focused on the temporomandibular joint region on his left side. After these images were converted to Digital Imaging and Communications in Medicine (DICOM) format and sent to a personal computer (PC), a 3D reconstruction image was created using a free 3D DICOM medical image viewer. The coordinates of 3 measurement points (the lateral and medial poles of the mandibular condyle, and the left foramen ovale) defined with reference to MPR images (MPR coordinates) served as reference coordinates, and the coordinates defined on the 3D reconstruction image only, without consulting MPR images (3D coordinates), were compared to the MPR coordinates. Three examiners each performed the measurement independently 10 times for every measurement point. In our results, there was no correspondence between the 3D coordinates and the MPR coordinates, and the distribution of the 3D coordinates varied for every measurement point and every observer. We concluded that the "MPR referential method" is useful to assess the location of the target point of anatomical

  1. Proposed thermodynamic method to determine the vortex mass in layered superconductors

    International Nuclear Information System (INIS)

    Moler, K.A.; Fetter, A.L.; Kapitulnik, A.

    1995-01-01

    The authors describe a simple method to study vortex dynamics that can determine or set an upper limit on the vortex mass. The specific heat of the vortex lattice in layered superconductors has a classical limit of 1 k_B per pancake vortex if the vortex mass is zero. If the vortex mass m_v is finite, a new Einstein branch of normal modes will appear with a crossover temperature Θ_E ∝ m_v^-1, and the specific heat will saturate at a new classical limit of 2 k_B per pancake vortex
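
    For reference, the Einstein branch described above presumably contributes the textbook Einstein heat capacity per vibrational mode, which vanishes exponentially for T << Θ_E and saturates at k_B per mode well above Θ_E; the expression below is that standard form (an assumption for illustration, not an equation quoted from the paper):

      C_{\mathrm{Einstein}}(T) \;=\; k_{B}\left(\frac{\Theta_{E}}{T}\right)^{2}
      \frac{e^{\Theta_{E}/T}}{\left(e^{\Theta_{E}/T}-1\right)^{2}},
      \qquad \Theta_{E} \propto m_{v}^{-1}.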

  2. A proposal for a determination method of element division on an analytical model for finite element elastic waves propagation analysis

    International Nuclear Information System (INIS)

    Ishida, Hitoshi; Meshii, Toshiyuki

    2010-01-01

    This study proposes an element size selection method named the 'Impact-Meshing (IM) method' for finite element wave propagation analysis models, which is characterized by (1) determination of the element division of the model from the strain energy in the whole model, and (2) a static analysis (a dynamic analysis in a single time step) with boundary conditions that give the maximum change of displacement in the time increment and the inertial (impact) force caused by that displacement change. In this paper, an example of application of the IM method to a 3D ultrasonic wave propagation problem in an elastic solid is described. The example showed that the analysis results with a model determined by the IM method converged, and that the calculation time for determining the element subdivision was reduced to about 1/6 by the IM method, which does not require determining the element subdivision through a dynamic transient analysis with 100 time steps. (author)

  3. Fair and efficient tariffs for wind energy: principles, method, proposal, data and potential consequences in France

    International Nuclear Information System (INIS)

    Chabot, B.

    2001-01-01

    In 2000, the government of France announced a national energy plan that included the installation of 5,000 to 10,000 MW of wind power by 2010. It also announced a new system based on fixed tariffs that would replace the EOLE 2005 calls for tenders for projects under 12 MW. This paper describes the principles and methods used to develop this fair and efficient tariff system for wind energy in France. The Agence de l'Environnement et de la Maitrise de l'Energie (ADEME) uses the Profitability Index Method to help define a wind energy tariff system for wind power plants under 12 MW. This paper presents some figures on the over-cost incurred with the new tariff system, which makes it possible for energy developers in France to develop a huge wind potential at a pace equal to other countries with fixed premium prices. The over-cost of the new tariff system is not too high, and it could be spread equally across all consumers of electricity. The tariff system will help France comply with its national, European and international commitments regarding climate change and with the future European directive on electricity generated from renewable energy sources. 8 refs., 1 tab., 4 figs

  4. [Proposal of a costing method for the provision of sterilization in a public hospital].

    Science.gov (United States)

    Bauler, S; Combe, C; Piallat, M; Laurencin, C; Hida, H

    2011-07-01

    To refine the billing of institutions whose sterilization operations are outsourced, a sterilization cost approach was developed. The aim of the study is to determine the value of a sterilization unit (one point "S"), which evolves according to investments, quantities processed, and types of instrumentation or packaging. The preparation time was selected from all sub-processes of sterilization to determine the value of one point S. The preparation times of sterilized large and small containers and of pouches were recorded. The reference time corresponds to one pouch (equal to one point S). Simultaneously, the annual operating cost of sterilization was defined and divided into several areas of expenditure: employees, equipment and building depreciation, supplies, and maintenance. A total of 136 container preparation times were measured. The time to prepare a pouch was estimated at one minute (one S). A small container represents four S and a large container represents 10 S. By dividing the operating cost of sterilization by the total number of sterilization points over a given period, the cost of one S can be determined. This method differs from traditional costing methods in sterilization services by considering each item of expenditure. This point S will be the basis for billing of subcontracts to other institutions. Copyright © 2011 Elsevier Masson SAS. All rights reserved.
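
    The point-S arithmetic described above can be written out directly; the S weights of 1, 4 and 10 come from the abstract, while the annual cost and the quantities processed are hypothetical.

      # Point-S costing: value of one S = annual operating cost / total S produced.
      annual_operating_cost = 450_000.0   # currency units per year (hypothetical)

      # item: (S weight from the abstract, annual quantity processed (hypothetical))
      production = {"pouch": (1, 60_000), "small container": (4, 8_000), "large container": (10, 5_000)}

      total_points = sum(weight * qty for weight, qty in production.values())
      cost_per_point = annual_operating_cost / total_points
      print(f"total: {total_points} S, cost of one S: {cost_per_point:.3f}")

      # billing example for a subcontracting institution (hypothetical monthly volumes)
      monthly = {"pouch": 1_200, "small container": 150, "large container": 90}
      invoice = sum(production[item][0] * qty * cost_per_point for item, qty in monthly.items())
      print(f"monthly invoice: {invoice:.2f}")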

  5. Board Game in Physics Classes—a Proposal for a New Method of Student Assessment

    Science.gov (United States)

    Dziob, Daniel

    2018-03-01

    The aim of this study was to examine the impact of assessing students' achievements in a physics course in the form of a group board game. Research was conducted in two schools with 131 high school students in Poland. In each school, the research sample was divided into experimental and control groups. Each group was taught by the same teacher and participated in the same courses and tests before the game. Just after finishing the course on waves and vibrations (school 1) and optics (school 2), the experimental groups took part in a group board game to assess their knowledge. One week after the game, the experimental and control groups (not involved in the game) took part in the post-tests. Students from the experimental groups performed better in the game than in the tests given before the game, and their results in the post-tests were statistically significantly higher than those of the students from the control groups. Simultaneously, students' opinions in the experimental groups about the board game as an assessment method were collected in an open-descriptive form and in a short questionnaire, and analyzed. The results showed that students experienced a positive attitude toward the assessment method, a reduction in test anxiety and an increase in their motivation for learning.

  6. [The strategic research areas of a University Hospital: proposal of a quali-quantitative method].

    Science.gov (United States)

    Iezzi, Elisa; Ardissino, Diego; Ferrari, Carlo; Vitale, Marco; Caminiti, Caterina

    2018-02-01

    This work aimed to objectively identify the main research areas at the University Hospital of Parma. To this end, a multidisciplinary working group, comprising clinicians, researchers, and hospital management, was formed to develop a shared quali-quantitative method. Easily retrievable performance indicators were selected from the literature (concerning bibliometric data and grant acquisition), and a scoring system was developed to assign weights to each indicator. Subsequently, Research Team Leaders were identified from the hospital's "Research Plan", a document produced every three years which contains information on the main research themes carried out at each Department, the staff involved and the available resources, provided by the health care professionals themselves. The selected performance indicators were measured for each Team Leader, and scores were assigned, thus creating a ranking list. Through analysis of the research themes of the top Team Leaders, the Working Group identified the following five strategic research areas: (a) personalized treatment in oncology and hematology; (b) chronicization mechanisms in immunomediate diseases; (c) old and new risk factors for cardiovascular diseases; (d) nutritional disorders, metabolic and chronic-degenerative diseases; (e) molecular diagnostic and predictive markers. We have developed an objective method to identify a hospital's main research areas. Its application can guide resource allocation and can offer ways to value the work of professionals involved in research.
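
    The weighted-scoring step can be sketched as follows; the indicator names, weights and values are hypothetical stand-ins for the bibliometric and grant-acquisition indicators mentioned above.

      # Weighted scoring and ranking of Research Team Leaders (all values hypothetical).
      weights = {"publications": 1.0, "normalized_impact": 2.0, "grant_funding_keur": 0.01}

      team_leaders = {
          "Leader A": {"publications": 35, "normalized_impact": 60.2, "grant_funding_keur": 800},
          "Leader B": {"publications": 22, "normalized_impact": 48.5, "grant_funding_keur": 1500},
          "Leader C": {"publications": 41, "normalized_impact": 30.1, "grant_funding_keur": 200},
      }

      scores = {name: sum(weights[k] * v for k, v in indicators.items())
                for name, indicators in team_leaders.items()}
      for rank, (name, score) in enumerate(sorted(scores.items(), key=lambda kv: -kv[1]), 1):
          print(f"{rank}. {name}: {score:.1f} points")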

  7. Alpins and Thibos vectorial astigmatism analyses: proposal of a linear regression model between methods

    Directory of Open Access Journals (Sweden)

    Giuliano de Oliveira Freitas

    2013-10-01

    Full Text Available PURPOSE: To determine linear regression models between Alpins descriptive indices and Thibos astigmatic power vectors (APV), assessing the validity and strength of such correlations. METHODS: This case series prospectively assessed 62 eyes of 31 consecutive cataract patients with preoperative corneal astigmatism between 0.75 and 2.50 diopters in both eyes. Patients were randomly assigned to two phacoemulsification groups: one assigned to receive an AcrySof® Toric intraocular lens (IOL) in both eyes and another assigned to have an AcrySof Natural IOL associated with limbal relaxing incisions, also in both eyes. All patients were reevaluated postoperatively at 6 months, when refractive astigmatism analysis was performed using both the Alpins and Thibos methods. The ratio between the Thibos postoperative APV and preoperative APV (APVratio) and its linear regression to the Alpins percentage of success of astigmatic surgery, percentage of astigmatism corrected and percentage of astigmatism reduction at the intended axis were assessed. RESULTS: A significant negative correlation between the ratio of post- to preoperative Thibos APV (APVratio) and the Alpins percentage of success (%Success) was found (Spearman's ρ = -0.93); the linear regression is given by the following equation: %Success = (-APVratio + 1.00) x 100. CONCLUSION: The linear regression we found between the APVratio and %Success permits a validated mathematical inference concerning the overall success of astigmatic surgery.
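
    The reported regression can be applied directly, since %Success = (1 - APVratio) x 100; the sketch below evaluates it for a hypothetical eye.

      # %Success of astigmatic surgery from the Thibos APV ratio, per the regression above.
      def percent_success(apv_pre, apv_post):
          """Alpins %Success estimated from pre- and postoperative Thibos APVs (diopters)."""
          apv_ratio = apv_post / apv_pre
          return (1.0 - apv_ratio) * 100.0

      print(percent_success(apv_pre=1.50, apv_post=0.30))   # hypothetical eye -> 80.0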

  8. Automating Construction of Machine Learning Models With Clinical Big Data: Proposal Rationale and Methods.

    Science.gov (United States)

    Luo, Gang; Stone, Bryan L; Johnson, Michael D; Tarczy-Hornoch, Peter; Wilcox, Adam B; Mooney, Sean D; Sheng, Xiaoming; Haug, Peter J; Nkoy, Flory L

    2017-08-29

    To improve health outcomes and cut health care costs, we often need to conduct prediction/classification using large clinical datasets (aka, clinical big data), for example, to identify high-risk patients for preventive interventions. Machine learning has been proposed as a key technology for doing this. Machine learning has won most data science competitions and could support many clinical activities, yet only 15% of hospitals use it for even limited purposes. Despite familiarity with data, health care researchers often lack machine learning expertise to directly use clinical big data, creating a hurdle in realizing value from their data. Health care researchers can work with data scientists with deep machine learning knowledge, but it takes time and effort for both parties to communicate effectively. Facing a shortage in the United States of data scientists and hiring competition from companies with deep pockets, health care systems have difficulty recruiting data scientists. Building and generalizing a machine learning model often requires hundreds to thousands of manual iterations by data scientists to select the following: (1) hyper-parameter values and complex algorithms that greatly affect model accuracy and (2) operators and periods for temporally aggregating clinical attributes (eg, whether a patient's weight kept rising in the past year). This process becomes infeasible with limited budgets. This study's goal is to enable health care researchers to directly use clinical big data, make machine learning feasible with limited budgets and data scientist resources, and realize value from data. This study will allow us to achieve the following: (1) finish developing the new software, Automated Machine Learning (Auto-ML), to automate model selection for machine learning with clinical big data and validate Auto-ML on seven benchmark modeling problems of clinical importance; (2) apply Auto-ML and novel methodology to two new modeling problems crucial for care
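
    To make the hyper-parameter-selection burden concrete, here is a generic sketch of automated model selection using a randomized search in scikit-learn. It only illustrates the kind of search the proposal wants to automate at scale; it is not the Auto-ML software described above, and the dataset and parameter grid are arbitrary.

      # Generic automated hyper-parameter search (illustration only, not Auto-ML).
      from sklearn.datasets import make_classification
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import RandomizedSearchCV

      X, y = make_classification(n_samples=500, n_features=20, random_state=0)  # stand-in data

      search = RandomizedSearchCV(
          RandomForestClassifier(random_state=0),
          param_distributions={
              "n_estimators": [50, 100, 200, 400],
              "max_depth": [None, 4, 8, 16],
              "min_samples_leaf": [1, 2, 5, 10],
          },
          n_iter=20, cv=5, scoring="roc_auc", random_state=0,
      )
      search.fit(X, y)
      print(search.best_params_, round(search.best_score_, 3))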

  9. Soft sensor modelling by time difference, recursive partial least squares and adaptive model updating

    International Nuclear Information System (INIS)

    Fu, Y; Xu, O; Yang, W; Zhou, L; Wang, J

    2017-01-01

    To investigate time-variant and nonlinear characteristics in industrial processes, a soft sensor modelling method based on time difference, moving-window recursive partial least squares (PLS) and adaptive model updating is proposed. In this method, time difference values of the input and output variables are used as training samples to construct the model, which can reduce the effect of the nonlinear characteristics on modelling accuracy while retaining the advantages of the recursive PLS algorithm. To avoid an excessively high updating frequency of the model, a confidence value is introduced, which can be updated adaptively according to the results of the model performance assessment. Once the confidence value is updated, the model can be updated. The proposed method has been used to predict the 4-carboxy-benz-aldehyde (CBA) content in the purified terephthalic acid (PTA) oxidation reaction process. The results show that the proposed soft sensor modelling method can reduce computation effectively, improve prediction accuracy by making use of process information and reflect the process characteristics accurately. (paper)
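
    The main ingredients can be pieced together in a simplified sketch: time-difference samples, a PLS model refit over a moving window, and an update triggered only when the prediction error exceeds a threshold. The data, window size and threshold are hypothetical, and a plain refit stands in for the recursive PLS update of the method.

      # Moving-window PLS soft sensor on time-difference samples with conditional updating.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(1)
      X = rng.normal(size=(400, 5))                                  # process inputs
      y = X @ np.array([0.5, -0.2, 0.1, 0.0, 0.3]) + 0.05 * rng.normal(size=400)

      dX, dy = np.diff(X, axis=0), np.diff(y)                        # time-difference samples

      window, threshold = 100, 0.2                                   # hypothetical settings
      model = PLSRegression(n_components=3).fit(dX[:window], dy[:window])

      updates = 0
      for k in range(window, len(dy)):
          pred = model.predict(dX[k:k + 1]).ravel()[0]
          if abs(pred - dy[k]) > threshold:                          # performance check failed
              model = PLSRegression(n_components=3).fit(dX[k - window:k], dy[k - window:k])
              updates += 1
      print(f"model refits triggered: {updates}")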

  10. EPICOR-II resin characterization and proposed methods for degradation analysis. Rev. 1

    International Nuclear Information System (INIS)

    Doyle, J.D.; McConnell, J.W. Jr.; Sanders, R.D. Sr.

    1984-06-01

    One goal of the EPICOR-II Research and Disposition Program is the examination of the EPICOR-II organic ion-exchange resins for physical and chemical degradation. This report summarizes preliminary information necessary for the evaluation of the resins for degradation. Degradation of the synthetic organic ion-exchange resins should be efficiently and accurately measurable by using the baseline data provided by the nonirradiated resin characterization. The degradation threshold is about 10^8 rads, approximately the dose the resins will have received by the examination date. If degradation has not occurred at the first examination point, later examinations will detect resin degradation using the same analytical methods. The results from the characterization tests will yield practical and useful data on the actual effects of radiation on commercial synthetic organic ion-exchange resins. 10 references, 12 figures

  11. Proposal for a Method for Business Model Performance Assessment: Toward an Experimentation Tool for Business Model Innovation

    Directory of Open Access Journals (Sweden)

    Antonio Batocchio

    2017-04-01

    Full Text Available The representation of business models has recently become widespread, especially in the pursuit of innovation. However, defining a company's business model is sometimes limited to discussion and debate. This study observes the need for performance measurement so that business models can be data-driven. To meet this goal, the work proposed, as a hypothesis, the creation of a method that combines the practices of the Balanced Scorecard with a method of business model representation - the Business Model Canvas. Such a combination was based on a study of conceptual adaptation, resulting in an application roadmap. A case study application was performed to check the functionality of the proposition, focusing on startup organizations. It was concluded that, based on the performance assessment of the business model, it is possible to propose the search for change through experimentation, a path that can lead to business model innovation.

  12. Proposal for element size and time increment selection guideline by 3-D finite element method for elastic waves propagation analysis

    International Nuclear Information System (INIS)

    Ishida, Hitoshi; Meshii, Toshiyuki

    2008-01-01

    This paper proposes a guideline for the selection of element size and time increment in the 3-D finite element method, applied to elastic wave propagation analysis over a long distance in a large structure. The element size and time increment are determined by quantitative evaluation of the strain caused by spatial and time discretization, which must be zero in an analysis model undergoing uniform motion. (author)
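
    The abstract gives no numbers, but the quantities involved can be illustrated with the conventional rules of thumb for explicit elastic-wave FEA (roughly ten elements per shortest wavelength and a Courant-limited time step). This is the standard practice that the proposed strain-based guideline is meant to refine, not the guideline itself, and the material and frequency values are hypothetical.

      # Conventional starting point for element size and time increment in elastic wave FEA.
      import math

      E, rho, nu = 200e9, 7800.0, 0.3   # steel-like material: Pa, kg/m^3, Poisson ratio
      f_max = 2.0e6                     # highest frequency of interest, Hz (hypothetical)

      # longitudinal (constrained) wave speed
      c = math.sqrt(E * (1 - nu) / (rho * (1 + nu) * (1 - 2 * nu)))
      wavelength_min = c / f_max

      element_size = wavelength_min / 10.0   # ~10 elements per shortest wavelength
      dt = element_size / c                  # Courant-type limit for explicit integration

      print(f"c = {c:.0f} m/s, element size ~ {element_size * 1e3:.2f} mm, dt ~ {dt * 1e9:.1f} ns")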

  13. PROPOSED SIMPLE METHOD FOR ELECTROCARDIOGRAM RECORDING IN FREE-RANGING ASIAN ELEPHANTS (ELEPHAS MAXIMUS).

    Science.gov (United States)

    Chai, Norin; Pouchelon, Jean Louis; Bouvard, Jonathan; Sillero, Leonor Camacho; Huynh, Minh; Segalini, Vincent; Point, Lisa; Croce, Veronica; Rigaux, Goulven; Highwood, Jack; Chetboul, Valérie

    2016-03-01

    Electrocardiography represents a relevant diagnostic tool for detecting cardiac disease in animals. Elephants can present various congenital and acquired cardiovascular diseases. However, few electrophysiologic studies have been reported in captive elephants, mainly due to challenging technical difficulties in obtaining good-quality electrocardiogram (ECG) tracings, and no data are currently available for free-ranging Asian elephants (Elephas maximus). The purpose of this pilot prospective study was to evaluate the feasibility of using a simple method for recording ECG tracings in wild, apparently healthy, unsedated Asian elephants (n = 7) in the standing position. Successful six-lead recordings (I, II, III, aVR, aVL, and aVF) were obtained, with the aVL lead providing the best-quality tracings in most animals. Variables measured in the aVL lead included heart rate, amplitudes and duration of the P waves, QRS complexes, T and U waves, and duration of the PR, QT, and QU intervals. A negative deflection following positive P waves, representative of an atrial repolarization wave (Ta wave), was observed for five out of the seven elephants.

  14. A proposed method to detect kinematic differences between and within individuals.

    Science.gov (United States)

    Frost, David M; Beach, Tyson A C; McGill, Stuart M; Callaghan, Jack P

    2015-06-01

    The primary objective was to examine the utility of a novel method of detecting "actual" kinematic changes using the within-subject variation. Twenty firefighters were assigned to one of two groups (lifting or firefighting). Participants performed 25 repetitions of two lifting or firefighting tasks, in three sessions. The magnitude and within-subject variation of several discrete kinematic measures were computed. Sequential averages of each variable were used to derive cubic, quadratic and linear regression equations. The efficacy of each equation was examined by contrasting participants' sequential means to their 25-trial mean±1SD and 2SD. The magnitude and within-subject variation of each dependent measure were repeatable for all tasks; however, not every participant exhibited the same movement patterns as the group. The number of instances across all variables, tasks and testing sessions in which the 25-trial mean±1SD was contained within the boundaries established by the regression equations increased as the aggregate scores included more trials. Each equation achieved success in at least 88% of all instances when three trials were included in the sequential mean (95% with five trials). The within-subject variation may offer a means to examine participant-specific changes without having to collect a large number of trials. Copyright © 2015 Elsevier Ltd. All rights reserved.
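
    The containment check at the heart of the approach can be sketched as follows (synthetic joint angles; only the sequential-mean versus mean ± 1SD comparison is shown, not the regression-equation part of the method).

      # Does the sequential (cumulative) mean of the first k trials fall inside the
      # 25-trial mean +/- 1 SD band? Synthetic peak joint angles, degrees.
      import numpy as np

      rng = np.random.default_rng(2)
      trials = rng.normal(loc=42.0, scale=3.0, size=25)   # one participant, one variable

      mean25, sd25 = trials.mean(), trials.std(ddof=1)
      for k in (3, 5, 10):
          seq_mean = trials[:k].mean()
          inside = abs(seq_mean - mean25) <= sd25
          print(f"first {k:2d} trials: mean = {seq_mean:.1f} deg, within mean +/- 1SD: {inside}")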

  15. Proposal of inspection method of radiation protection applied to nuclear medicine establishments

    International Nuclear Information System (INIS)

    Mendes, Leopoldino da Cruz Gouveia

    2003-01-01

    The principal objective of this paper is to implement an impartial and efficient inspection method aimed at the correct and safe use of ionizing radiation in the field of Nuclear Medicine. The Radiological Protection Model was tested in 113 Nuclear Medicine Services (NMS) all over the country, on a biennial analysis schedule (1996, 1998, 2000 and 2002). The data sheet comprised general information about the structure of the NMS and a technical approach. In the analytical process, a methodology of assigning different importance levels to each of the 82 features was adopted, based on the risk factors stated in the CNEN NE standards and in the IAEA recommendations. From this point of view, whenever a feature does not comply with one of those requirements, it corresponds to a radioprotection fault and is assigned a grade. The sum of these grades classifies the NMS into one of three ranges: operating without restriction (100 points and below); operating with restriction (between 100 and 300 points); temporary shutdown (300 points and above). Permission for the second group to continue operating should be tied to a defined and restricted period of time (six to twelve months), assumed long enough for the NMS to solve the problems, after which a new evaluation is carried out. The NMSs classified in the third group are supposed to return to operation only when they fulfil all the pending radioprotection requirements. Meanwhile, until the next regular evaluation, a multiplication factor of 2^n is applied to recalcitrant NMSs, where n is the number of repeated non-compliances. The prior establishment of these radioprotection items, each with its respective grade, excludes subjective and personal values from the judgement and technical evaluation of the institutions. (author)
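
    The classification and penalty arithmetic described above can be written compactly; the grades passed in are hypothetical, and applying the 2^n factor to the total grade is an assumption about where the multiplier enters.

      # Classify a Nuclear Medicine Service from its radioprotection grade total and
      # apply the 2**n multiplier for repeated non-compliance (n = repeat_offences).
      def classify_nms(total_grade, repeat_offences=0):
          total = total_grade * (2 ** repeat_offences)
          if total <= 100:
              return total, "operating without restriction"
          if total < 300:
              return total, "operating with restriction (6-12 month deadline)"
          return total, "temporary shutdown"

      print(classify_nms(80))                        # (80, 'operating without restriction')
      print(classify_nms(120, repeat_offences=1))    # (240, 'operating with restriction ...')
      print(classify_nms(120, repeat_offences=2))    # (480, 'temporary shutdown')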

  16. Updated methods for assessing the impacts of nearby gas drilling and production on neighborhood air quality and human health.

    Science.gov (United States)

    Olaguer, Eduardo P; Erickson, Matthew; Wijesinghe, Asanga; Neish, Brad; Williams, Jeff; Colvin, John

    2016-02-01

    An explosive growth in natural gas production within the last decade has fueled concern over the public health impacts of air pollutant emissions from oil and gas sites in the Barnett and Eagle Ford shale regions of Texas. Commonly acknowledged sources of uncertainty are the lack of sustained monitoring of ambient concentrations of pollutants associated with gas mining, poor quantification of their emissions, and inability to correlate health symptoms with specific emission events. These uncertainties are best addressed not by conventional monitoring and modeling technology, but by increasingly available advanced techniques for real-time mobile monitoring, microscale modeling and source attribution, and real-time broadcasting of air quality and human health data over the World Wide Web. The combination of contemporary scientific and social media approaches can be used to develop a strategy to detect and quantify emission events from oil and gas facilities, alert nearby residents of these events, and collect associated human health data, all in real time or near-real time. The various technical elements of this strategy are demonstrated based on the results of past, current, and planned future monitoring studies in the Barnett and Eagle Ford shale regions. Resources should not be invested in expanding the conventional air quality monitoring network in the vicinity of oil and gas exploration and production sites. Rather, more contemporary monitoring and data analysis techniques should take the place of older methods to better protect the health of nearby residents and maintain the integrity of the surrounding environment.

  17. A proposed architecture and method of operation for improving the protection of privacy and confidentiality in disease registers

    Directory of Open Access Journals (Sweden)

    Churches Tim

    2003-01-01

    Full Text Available Abstract Background Disease registers aim to collect information about all instances of a disease or condition in a defined population of individuals. Traditionally, methods of operating disease registers have required that notifications of cases be identified by unique identifiers such as social security number or national identification number, or by ensembles of non-unique identifying data items, such as name, sex and date of birth. However, growing concern over the privacy and confidentiality aspects of disease registers may hinder their future operation. Technical solutions to these legitimate concerns are needed. Discussion An alternative method of operation is proposed which involves splitting the personal identifiers from the medical details at the source of notification, and separately encrypting each part using asymmetric (public key) cryptographic methods. The identifying information is sent to a single Population Register, and the medical details to the relevant disease register. The Population Register uses probabilistic record linkage to assign a unique personal identification (UPI) number to each person notified to it, although not necessarily to everyone in the entire population. This UPI is shared only with a single trusted third party whose sole function is to translate between this UPI and separate series of personal identification numbers which are specific to each disease register. Summary The system proposed would significantly improve the protection of privacy and confidentiality, while still allowing the efficient linkage of records between disease registers, under the control and supervision of the trusted third party and independent ethics committees. The proposed architecture could accommodate genetic databases and tissue banks as well as a wide range of other health and social data collections. It is important that proposals such as this are subject to widespread scrutiny by information security experts, researchers and
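
    The split-and-encrypt step of the proposed architecture can be sketched with RSA-OAEP from the Python cryptography package: the identifiers are encrypted for the Population Register and the clinical details for the disease register, so neither recipient can read the other's part. Key handling, payloads and field names are simplified placeholders, and a production system would use hybrid encryption for larger records.

      # Split a notification into identifying and medical parts, each encrypted with a
      # different recipient's public key, in the spirit of the proposal above.
      import json
      from cryptography.hazmat.primitives import hashes
      from cryptography.hazmat.primitives.asymmetric import rsa, padding

      oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                          algorithm=hashes.SHA256(), label=None)

      # In reality each register publishes its own public key; generated here for the demo.
      population_register_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
      disease_register_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

      notification = {
          "identifiers": {"name": "Jane Doe", "sex": "F", "dob": "1970-01-01"},
          "medical": {"diagnosis": "C50.9", "date_of_diagnosis": "2003-05-12"},
      }

      id_blob = population_register_key.public_key().encrypt(
          json.dumps(notification["identifiers"]).encode(), oaep)
      med_blob = disease_register_key.public_key().encrypt(
          json.dumps(notification["medical"]).encode(), oaep)

      # Only the intended recipient can decrypt its own part.
      print(json.loads(population_register_key.decrypt(id_blob, oaep)))
      print(json.loads(disease_register_key.decrypt(med_blob, oaep)))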

  18. Comments and Remarks over Classic Linear Loop-Gain Method for Oscillator Design and Analysis. New Proposed Method Based on NDF/RRT

    Directory of Open Access Journals (Sweden)

    J. L. Jimenez-Martin

    2012-04-01

    Full Text Available This paper describes a new method for designing oscillators based on the Normalized Determinant Function (NDF) and Return Relations (RRT). First, the classic loop-gain method is reviewed, showing its pros and cons and including examples of the wrong solutions this method can provide; the solutions are wrong because certain conditions must be fulfilled beforehand in order to obtain correct ones. These conditions are described, and it is then demonstrated that NDF analysis is necessary, including the usefulness of the Return Relations (RRT), which are in fact related to the true loop gain. The paper concludes with steps for oscillator design and analysis using the proposed NDF/RRT method, compared with the previous wrong solutions, pointing out the new accuracy achieved in predicting the oscillation frequency and QL. Additional new examples of reference-plane oscillators (Z/Y/rho), for which application of the loop-gain method is clearly difficult or even impossible, are also included and solved with the new proposed NDF/RRT method.

  19. UPDATING AN EXPERT ELICITATION IN THE LIGHT OF NEW DATA: TEN YEARS OF PROBABILISTIC VOLCANIC HAZARD ANALYSIS FOR THE PROPOSED HIGH-LEVEL RADIOACTIVE WASTE REPOSITORY AT YUCCA MOUNTAIN, NEVADA

    International Nuclear Information System (INIS)

    F.V. Perry; A. Cogbill; R. Kelley

    2005-01-01

    The U.S. Department of Energy (DOE) considers volcanism to be a potentially disruptive class of events that could affect the safety of the proposed high-level waste repository at Yucca Mountain. Volcanic hazard assessment in monogenetic volcanic fields depends on an adequate understanding of the temporal and spatial pattern of past eruptions. At Yucca Mountain, the hazard is due to an 11 Ma history of basaltic volcanism with the latest eruptions occurring in three Pleistocene episodes to the west and south of Yucca Mountain. An expert elicitation convened in 1995-1996 by the DOE estimated the mean hazard of volcanic disruption of the repository as slightly greater than 10^-8 dike intersections per year with an uncertainty of about two orders of magnitude. Several boreholes in the region have encountered buried basalt in alluvial-filled basins; the youngest of these basalts is dated at 3.8 Ma. The possibility of additional buried basalt centers is indicated by a previous regional aeromagnetic survey conducted by the USGS that detected approximately 20 magnetic anomalies that could represent buried basalt volcanoes. Sensitivity studies indicate that the postulated presence of buried post-Miocene volcanoes to the east of Yucca Mountain could increase the hazard by an order of magnitude, and potentially significantly impact the results of the earlier expert elicitation. Our interpretation of the aeromagnetic data indicates that post-Miocene basalts are not present east of Yucca Mountain, but that magnetic anomalies instead represent faulted and buried Miocene basalt that correlates with nearby surface exposures. This interpretation is being tested by drilling. The possibility of uncharacterized buried volcanoes that could significantly change hazard estimates led DOE to support an update of the expert elicitation in 2004-2006. In support of the expert elicitation data needs, the DOE is sponsoring (1) a new higher-resolution, helicopter-borne aeromagnetic survey

  20. Sensitivity analysis of a complex, proposed geologic waste disposal system using the Fourier Amplitude Sensitivity Test method

    International Nuclear Information System (INIS)

    Lu Yichi; Mohanty, Sitakanta

    2001-01-01

    The Fourier Amplitude Sensitivity Test (FAST) method has been used to perform a sensitivity analysis of a computer model developed for conducting total system performance assessment of the proposed high-level nuclear waste repository at Yucca Mountain, Nevada, USA. The computer model has a large number of random input parameters with assigned probability density functions, which may or may not be uniform, for representing data uncertainty. The FAST method, which was previously applied to models with parameters represented by the uniform probability distribution function only, has been modified to be applied to models with nonuniform probability distribution functions. Using an example problem with a small input parameter set, several aspects of the FAST method, such as the effects of integer frequency sets and random phase shifts in the functional transformations, and the number of discrete sampling points (equivalent to the number of model executions) on the ranking of the input parameters have been investigated. Because the number of input parameters of the computer model under investigation is too large to be handled by the FAST method, less important input parameters were first screened out using the Morris method. The FAST method was then used to rank the remaining parameters. The validity of the parameter ranking by the FAST method was verified using the conditional complementary cumulative distribution function (CCDF) of the output. The CCDF results revealed that the introduction of random phase shifts into the functional transformations, proposed by previous investigators to disrupt the repetitiveness of search curves, does not necessarily improve the sensitivity analysis results because it destroys the orthogonality of the trigonometric functions, which is required for Fourier analysis
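    The workflow described above (screen the parameters with the Morris method, then rank the survivors by their first-order FAST indices) can be sketched as follows. The SALib package and the toy three-parameter model are assumptions made for illustration; the study used its own modified FAST implementation that also handles non-uniform input distributions.

```python
# Sketch: first-order sensitivity ranking with the FAST method, using the SALib
# package (an assumption; not the code used in the paper).
import numpy as np
from SALib.sample import fast_sampler
from SALib.analyze import fast

# Hypothetical reduced parameter set, e.g. after Morris screening.
problem = {
    "num_vars": 3,
    "names": ["infiltration", "solubility", "porosity"],
    "bounds": [[0.1, 10.0], [1e-6, 1e-3], [0.05, 0.3]],
}

X = fast_sampler.sample(problem, 1000)          # N samples per parameter
Y = np.array([x[0] * x[1] / x[2] for x in X])   # stand-in for the performance model

Si = fast.analyze(problem, Y)
for name, s1 in sorted(zip(problem["names"], Si["S1"]), key=lambda t: -t[1]):
    print(f"{name}: first-order index {s1:.3f}")
```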

  1. AN UPDATED ⁶Li(p, α)³He REACTION RATE AT ASTROPHYSICAL ENERGIES WITH THE TROJAN HORSE METHOD

    Energy Technology Data Exchange (ETDEWEB)

    Lamia, L.; Spitaleri, C.; Sergi, M. L. [Dipartimento di Fisica e Astronomia, Universita di Catania, I-95123 Catania (Italy); Pizzone, R. G.; Tumino, A.; La Cognata, M. [INFN-Laboratori Nazionali del Sud, I-95125 Catania (Italy); Tognelli, E.; Degl' Innocenti, S.; Prada Moroni, P. G. [Dipartimento di Fisica, Universita di Pisa, I-56127 Pisa (Italy); Pappalardo, L. [Dipartimento di Fisica e Scienze della Terra, Universita di Ferrara, I-44100 Ferrara (Italy)

    2013-05-01

    The lithium problem influencing primordial and stellar nucleosynthesis is one of the most interesting unsolved issues in astrophysics. ⁶Li is the most fragile of lithium's stable isotopes and is largely destroyed in most stars during the pre-main-sequence (PMS) phase. For these stars, the convective envelope easily reaches, at least at its bottom, the relatively low ⁶Li ignition temperature. Thus, gaining an understanding of ⁶Li depletion also gives hints about the extent of convective regions. For this reason, charged-particle-induced reactions in lithium have been the subject of several studies. Low-energy extrapolations of these studies provide information about both the zero-energy astrophysical S(E) factor and the electron screening potential, U_e. Thanks to recent direct measurements, new estimates of the ⁶Li(p, α)³He bare-nucleus S(E) factor and the corresponding U_e value have been obtained by applying the Trojan Horse method to the ²H(⁶Li, α³He)n reaction in quasi-free kinematics. The calculated reaction rate covers the temperature window 0.01 to 2 T₉ and its impact on the surface lithium depletion in PMS models with different masses and metallicities has been evaluated in detail by adopting an updated version of the FRANEC evolutionary code.

  2. Model Updating Nonlinear System Identification Toolbox, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology (ZONA) proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology that utilizes flight data with...

  3. UPDATING UNDER RISK CONDITION

    Directory of Open Access Journals (Sweden)

    VĂDUVA CECILIA ELENA

    2018-02-01

    Full Text Available Investment is the foundation for future firm development. Agents are risk averse and require higher returns as the risks associated with a project increase. The investment decision determines the firm's position in the market: increasing market share and, potentially, dominating the market. Making an investment at a given point in time determines the cash flows over the life of the project, and a residual value may be obtained when it is taken out of service. The flows and payments of an investment project are more easily tracked if a constant update (discount) rate is assumed. Based on various factors, we analyze three techniques for determining the discount rate for investment projects: the opportunity cost; the risk-free rate plus a series of risk premiums; and the weighted average cost of capital. People without financial training make value judgments about investment projects by reference to other market opportunities, comparing the returns any investment offers with other payoff options. An investor with a sum of money to invest will, if he does not invest in one project, invest in another that brings him a certain amount of money, choosing the most advantageous project by comparison. In this framework all projects are characterized by identical risks, and agents are considered indifferent to risk. The answer given by financial theory and practice to the shortcomings of rates in the opportunity-cost category is a discount rate calculated as the sum of the risk-free rate and a risk premium, where risk is defined as a factor whose action may cause a decrease in the available cash flows. Opportunity-cost update rates offer greater objectivity because they refer to known variables, but they cannot be perfectly matched to the performance of the investment process.
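    The discount-rate construction discussed above (risk-free rate plus a risk premium, applied as a constant update rate over the project life) reduces to a short net-present-value comparison. The cash flows and rates below are hypothetical.

```python
# Sketch: discounting project cash flows at a constant rate built as
# risk-free rate + risk premium (all figures are hypothetical).
def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the initial outlay (negative)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

risk_free = 0.04       # e.g. government bond yield
risk_premium = 0.06    # compensation demanded for project risk
rate = risk_free + risk_premium

project_a = [-1000, 300, 400, 500, 200]   # initial outlay then yearly flows
project_b = [-1000, 100, 200, 600, 700]

# The investor compares projects by NPV at the same update (discount) rate.
print(f"NPV A: {npv(rate, project_a):.1f}")
print(f"NPV B: {npv(rate, project_b):.1f}")
```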

  4. Indoor Spatial Updating with Reduced Visual Information

    OpenAIRE

    Legge, Gordon E.; Gage, Rachel; Baek, Yihwa; Bochsler, Tiana M.

    2016-01-01

    Purpose Spatial updating refers to the ability to keep track of position and orientation while moving through an environment. People with impaired vision may be less accurate in spatial updating with adverse consequences for indoor navigation. In this study, we asked how artificial restrictions on visual acuity and field size affect spatial updating, and also judgments of the size of rooms. Methods Normally sighted young adults were tested with artificial restriction of acuity in Mild Blur (S...

  5. Astrophysics Update 2

    CERN Document Server

    Mason, John W

    2006-01-01

    "Astrophysics Updates" is intended to serve the information needs of professional astronomers and postgraduate students about areas of astronomy, astrophysics and cosmology that are rich and active research spheres. Observational methods and the latest results of astronomical research are presented as well as their theoretical foundations and interrelations. The contributed commissioned articles are written by leading exponents in a format that will appeal to professional astronomers and astrophysicists who are interested in topics outside their own specific areas of research. This collection of timely reviews may also attract the interest of advanced amateur astronomers seeking scientifically rigorous coverage.

  6. Medicare program; update of ratesetting methodology, payment rates, payment policies, and the list of covered procedures for ambulatory surgical centers effective October 1, 1998; reopening of comment period and delay in adoption of the proposed rule as final--HCFA. Notice of reopening of comment period for proposed rule and delay in adoption of provisions of the proposed rule as final.

    Science.gov (United States)

    1998-10-01

    This notice reopens the comment period for a proposed rule affecting Medicare payments to ambulatory surgical centers (ASCs) that was originally published in the Federal Register on June 12, 1998 (63 FR 32290). This document gives notice of a delay in the adoption of the provisions of the June 12, 1998 ASC proposed rule as a final rule to be concurrent with the adoption as final of the hospital outpatient prospective payment system (PPS) that is the subject of a proposed rule published in the Federal Register on September 8, 1998 (63 FR 47551). In addition this document confirms that the current ASC payment rates that are effective for services furnished on or after October 1, 1998, will remain in effect until rebased ASC rates and the provisions of the June 12, 1998 ASC proposed rule are adopted as final to be concurrent with the adoption as final of the Medicare hospital PPS.

  7. The Proposal to “Snapshot” RAIM Method for GNSS Vessel Receivers Working in Poor Space Segment Geometry

    Directory of Open Access Journals (Sweden)

    Nowak Aleksander

    2015-12-01

    Full Text Available Nowadays, we can observe an increase in research on the use of small unmanned autonomous vessels (SUAV) to patrol and guard critical areas, including harbours. This paper presents a proposal for a “snapshot” RAIM (Receiver Autonomous Integrity Monitoring) method for GNSS receivers mounted on SUAVs operating in poor space-segment geometry. Existing “snapshot” RAIM methods and algorithms used in practical applications have been developed for airborne receivers, and thus rest on two main assumptions. The first is that the geometry of the visible satellites is strong, meaning that the exclusion of any satellite from the positioning solution does not cause significant deterioration of the Dilution of Precision (DOP) coefficients. The second is that only one outlier can appear in the pseudorange measurements. In the case of a SUAV operating in a harbour these two assumptions cannot be accepted: because of the vessel's small dimensions the GNSS antenna is only a few decimetres above sea level, and regular ships, buildings and harbour facilities block and reflect satellite signals. A different approach to “snapshot” RAIM is therefore necessary. The proposed method, based on analysing the maximal allowable separation of positioning sub-solutions and using some information from EGNOS messages, is described in the paper, together with the theoretical assumptions and the results of numerical experiments.
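    The separation-of-sub-solutions idea at the heart of the proposal can be sketched as follows: solve for position with all satellites, re-solve leaving each satellite out in turn, and compare the maximum separation against a threshold. The geometry, noise levels and injected fault below are invented for illustration, and the EGNOS-based threshold selection of the paper is not reproduced.

```python
# Sketch: "maximum separation of sub-solutions" check, the core idea behind a
# snapshot RAIM test (toy geometry and noise; not the authors' algorithm).
import numpy as np

def solve_position(sats, pr, x0=np.zeros(4), iters=5):
    """Iterative least squares for [x, y, z, clock bias] from pseudoranges."""
    x = x0.astype(float)
    for _ in range(iters):
        rho = np.linalg.norm(sats - x[:3], axis=1)
        H = np.hstack([(x[:3] - sats) / rho[:, None], np.ones((len(pr), 1))])
        dx, *_ = np.linalg.lstsq(H, pr - (rho + x[3]), rcond=None)
        x += dx
    return x

rng = np.random.default_rng(1)
truth = np.array([1000.0, -2000.0, 500.0, 10.0])        # position + clock bias [m]
sats = rng.uniform(-2e7, 2e7, size=(7, 3))
sats[:, 2] = np.abs(sats[:, 2]) + 2e7                   # keep satellites overhead
pr = np.linalg.norm(sats - truth[:3], axis=1) + truth[3] + rng.normal(0, 3, 7)
pr[2] += 80.0                                           # inject one faulty pseudorange

full = solve_position(sats, pr)
subs = [solve_position(np.delete(sats, i, 0), np.delete(pr, i)) for i in range(len(pr))]
sep = max(np.linalg.norm(s[:3] - full[:3]) for s in subs)
print(f"max sub-solution separation: {sep:.1f} m")      # compare against a threshold
```

    A separation exceeding the chosen threshold would flag the epoch as unreliable and trigger exclusion of the suspect measurement or an integrity alert.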

  8. Single-Shell Tank (SST) Retrieval Sequence Fiscal Year 2000 Update

    International Nuclear Information System (INIS)

    GARFIELD, J.S.

    2000-01-01

    This document describes the baseline single-shell tank (SST) waste retrieval sequence for the River Protection Project (RPP) updated for Fiscal Year 2000. The SST retrieval sequence identifies the proposed retrieval order (sequence), the tank selection and prioritization rationale, and planned retrieval dates for Hanford SSTs. In addition, the tank selection criteria and reference retrieval method for this sequence are discussed

  9. The 2014 updated version of the Confusion Assessment Method for the Intensive Care Unit compared to the 5th version of the Diagnostic and Statistical Manual of Mental Disorders and other current methods used by intensivists.

    Science.gov (United States)

    Chanques, Gérald; Ely, E Wesley; Garnier, Océane; Perrigault, Fanny; Eloi, Anaïs; Carr, Julie; Rowan, Christine M; Prades, Albert; de Jong, Audrey; Moritz-Gasser, Sylvie; Molinari, Nicolas; Jaber, Samir

    2018-03-01

    One third of patients admitted to an intensive care unit (ICU) will develop delirium. However, delirium is under-recognized by bedside clinicians without the use of delirium screening tools, such as the Intensive Care Delirium Screening Checklist (ICDSC) or the Confusion Assessment Method for the ICU (CAM-ICU). The CAM-ICU was updated in 2014 to improve its use by clinicians throughout the world. It has never been validated against the new reference standard, the Diagnostic and Statistical Manual of Mental Disorders 5th version (DSM-5). We conducted a prospective psychometric study in a 16-bed medical-surgical ICU of a French academic hospital, to measure the diagnostic performance of the 2014 updated CAM-ICU compared to the DSM-5 as the reference standard. We included consecutive adult patients with a Richmond Agitation Sedation Scale (RASS) ≥ -3, without preexisting cognitive disorders, psychosis or cerebral injury. Delirium was independently assessed by neuropsychological experts using an operationalized approach to DSM-5, by investigators using the CAM-ICU and the ICDSC, by bedside clinicians and by ICU patients. The sensitivity, specificity, positive and negative predictive values were calculated considering neuropsychologist DSM-5 assessments as the reference standard (primary endpoint). CAM-ICU inter-observer agreement, as well as that between delirium diagnosis methods and the reference standard, was summarized using κ coefficients, which were subsequently compared using the Z-test. Delirium was diagnosed by experts in 38% of the 108 patients included for analysis. The CAM-ICU had a sensitivity of 83%, a specificity of 100%, a positive predictive value of 100% and a negative predictive value of 91%. Compared to the reference standard, the CAM-ICU had a significantly (p DSM-5 criteria and reliable regarding inter-observer agreement in a research setting. Delirium remains under-recognized by bedside clinicians.
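    The diagnostic indices quoted above follow directly from a 2x2 table against the DSM-5 reference standard. The counts in this sketch are back-calculated approximations from the reported prevalence, sensitivity and specificity, not the study's raw data.

```python
# Sketch: deriving sensitivity, specificity, PPV and NPV from a 2x2 table.
# Counts are approximations consistent with 108 patients, 38% prevalence,
# sensitivity 83% and specificity 100% (not the study's raw data).
tp, fn, fp, tn = 34, 7, 0, 67   # CAM-ICU result vs. DSM-5 expert assessment

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)
ppv = tp / (tp + fp)
npv = tn / (tn + fn)

print(f"sensitivity {sensitivity:.0%}, specificity {specificity:.0%}, "
      f"PPV {ppv:.0%}, NPV {npv:.0%}")
```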

  10. An assessment of the long term suitability of present and proposed methods for the management of uranium mill tailings

    International Nuclear Information System (INIS)

    1979-07-01

    Proposals for safe, long-term containment of conventional tailings include 1) storage under water, 2) storage in active, abandoned or specially created underground mines and, 3) storage in open pits, with subsequent flooding or covering with overburden. The underwater proposal can meet most of the requirements of long term containment; however, extensive study of existing tailings deposits in deep water locations will be needed. Underground mines cannot provide sufficient storage capacity, since the tailings bulk during mill operation can occupy twice the volume of the original ore. It is possible to reduce the hazard by reducing the radium and thorium content of the tailings. Proposals for such an undertaking include ore beneficiation with rejection of the relatively innocuous fraction, radium-thorium removal in the mill, and significant changes in both ore processing and treatment of tailings. It is concluded that surface-stored tailings are vulnerable over the long term to dispersion by leaching and water erosion, and that access to a tailings site cannot be prevented, while only a major climatic or seismic event could disturb tailings stored in suitable underwater or underground mine sites. The criteria for determining suitability for each method, however, will need to be identified, tested, and accepted through the normal process of modeling, pilot plant evaluation, monitoring and evaluation. (author)

  11. Proposal of a New Method for Neutron Dosimetry Based on Spectral Information Obtained by Application of Artificial Neural Networks

    International Nuclear Information System (INIS)

    Fehrenbacher, G.; Schuetz, R.; Hahn, K.; Sprunck, M.; Cordes, E.; Biersack, J.P.; Wahl, W.

    1999-01-01

    A new method for the monitoring of neutron radiation is proposed. It is based on the determination of spectral information on the neutron field in order to derive dose quantities like the ambient dose equivalent, the dose equivalent, or other dose quantities which depend on the neutron energy. The method uses a multi-element system consisting of converter type silicon detectors. The unfolding procedure is based on an artificial neural network (ANN). The response function of each element is determined by a computational model considering the neutron interaction with the dosemeter layers and the subsequent transport of produced ions. An example is given for a multi-element system. The ANN is trained by a given set of neutron spectra and then applied to count responses obtained in neutron fields. Four examples of spectra unfolded using the ANN are presented. (author)
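    A minimal sketch of the unfolding idea is shown below: a small network is trained on simulated pairs of multi-element detector counts and the corresponding dose quantity, then applied to new count vectors. The 8x4 response matrix, the fluence-to-dose weights and the use of scikit-learn's MLPRegressor are placeholders, not the converter-type silicon detector model of the paper.

```python
# Sketch: ANN-based unfolding, trained on synthetic (multi-element counts ->
# ambient dose equivalent) pairs. Response matrix and dose weights are made up.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_elements, n_groups = 8, 4                          # detector elements, energy groups
R = rng.uniform(0.1, 1.0, (n_elements, n_groups))    # response matrix (counts per fluence)
w = np.array([0.3, 1.0, 2.5, 1.2])                   # fluence-to-dose weights per group

# Training set: random group fluences -> noisy counts, with dose as the target.
phi = rng.uniform(0, 1, (5000, n_groups))
counts = phi @ R.T + rng.normal(0, 0.02, (5000, n_elements))
dose = phi @ w

net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
net.fit(counts, dose)

test_phi = rng.uniform(0, 1, (5, n_groups))
pred = net.predict(test_phi @ R.T)
print(np.c_[test_phi @ w, pred])                     # true vs. unfolded dose
```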

  12. Proposal for an alignment method of the CLIC linear accelerator - From geodesic networks to the active pre-alignment

    International Nuclear Information System (INIS)

    Touze, T.

    2011-01-01

    The Compact Linear Collider (CLIC) is the particle accelerator project proposed by the European Organization for Nuclear Research (CERN) for high-energy physics after the Large Hadron Collider (LHC). Because of the nanometric scale of the CLIC lepton beams, the emittance growth budget is very tight. This imposes alignment tolerances on the positions of the CLIC components that have never been achieved before. The last step of the CLIC alignment will be done with respect to the beam itself and falls within the competence of the physicists. However, in order to implement this beam-based feedback, a challenging pre-alignment is required: 10 μm at 3σ along a 200 m sliding window. For such a precision, the proposed solution must be compatible with a feedback loop between the measurement and repositioning systems; the CLIC pre-alignment will have to be active. This thesis does not demonstrate the feasibility of the CLIC active pre-alignment but points the way to the final developments needed for that purpose. A method is proposed. Based on the management of Helmert transformations between Euclidean coordinate systems, from the geodetic networks down to the metrological measurements, this method is likely to solve the CLIC pre-alignment problem. Large-scale facilities have been built and Monte Carlo simulations carried out in order to validate the mathematical modeling of the measurement systems and of the alignment references. Once this is done, it will be possible to extrapolate the modeling to the entire CLIC length, the last step towards demonstrating the feasibility of the CLIC pre-alignment. (author)
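    The Helmert (7-parameter similarity) transformation mentioned above can be estimated from common points with a standard SVD-based fit. The sketch below is a generic illustration under invented coordinates and noise, not the thesis' actual network adjustment chain.

```python
# Sketch: estimating a Helmert (similarity) transformation between two frames
# from common points via SVD (Umeyama/Kabsch). Coordinates and noise are made up.
import numpy as np

def helmert_fit(src, dst):
    """Return scale s, rotation R, translation t with dst ~ s * R @ src + t."""
    mu_s, mu_d = src.mean(0), dst.mean(0)
    A, B = src - mu_s, dst - mu_d
    U, S, Vt = np.linalg.svd(B.T @ A / len(src))
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:           # keep a proper rotation
        D[2, 2] = -1
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / A.var(0).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

rng = np.random.default_rng(0)
src = rng.uniform(0, 200, (10, 3))                        # e.g. metrology frame [m]
true_R, _ = np.linalg.qr(rng.normal(size=(3, 3)))
true_R *= np.sign(np.linalg.det(true_R))                  # force a proper rotation
dst = 1.0000005 * src @ true_R.T + np.array([3.0, -1.0, 0.5])
dst += rng.normal(0, 1e-5, dst.shape)                     # 10 micrometre noise

s, R, t = helmert_fit(src, dst)
residuals = dst - (s * src @ R.T + t)
print("rms residual [m]:", np.sqrt((residuals ** 2).mean()))
```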

  13. Precision of glucose measurements in control sera by isotope dilution/mass spectrometry: proposed definitive method compared with a reference method

    International Nuclear Information System (INIS)

    Pelletier, O.; Arratoon, C.

    1987-01-01

    This improved isotope-dilution gas chromatographic/mass spectrometric (GC/MS) method, in which [13C]glucose is the internal standard, meets the requirements of a Definitive Method. In a first study with five reconstituted lyophilized sera, a nested analysis of variance of GC/MS values indicated considerable among-vial variation. The CV for 32 measurements per serum ranged from 0.5 to 0.9%. However, concentration and uncertainty values (mmol/L per gram of serum) assigned to one serum by the NBS Definitive Method (7.56 +/- 0.28) were practically identical to those obtained with the proposed method (7.57 +/- 0.20). In the second study, we used twice as much [13C]glucose diluent to assay four serum pools and two lyophilized sera. The CV ranged from 0.26 to 0.5% for the serum pools and from 0.28 to 0.59% for the lyophilized sera. In comparison, results by the hexokinase/glucose-6-phosphate dehydrogenase reference method agreed within acceptable limits with those by the Definitive Method but tended to be slightly higher (up to 3%) for lyophilized serum samples or slightly lower (up to 2.5%) for serum pools.

  14. The fourth research co-ordination meeting (RCM) on 'Updated codes and methods to reduce the calculational uncertainties of liquid metal fast reactors reactivity effects'. Working material

    International Nuclear Information System (INIS)

    2003-01-01

    The fourth Research Co-ordination Meeting (RCM) of the Co-ordinated Research Project (CRP) on 'Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effect' was held during 19-23 May 2003 in Obninsk, Russian Federation. The general objective of the CRP is to validate, verify and improve methodologies and computer codes used for the calculation of reactivity coefficients in fast reactors, aiming at enhancing the utilization of plutonium and minor actinides. The first RCM took place in Vienna on 24-26 November 1999. The meeting was attended by 19 participants from 7 Member States and one from an international organization (France, Germany, India, Japan, Rep. of Korea, Russian Federation, the United Kingdom, and IAEA). The participants from two Member States (China and the U.S.A.) provided their results and presentation materials despite being absent from the meeting. The results for several relevant reactivity parameters, obtained by the participants with their own state-of-the-art basic data and codes, were compared in terms of calculational uncertainty, and their effects on the ULOF transient behavior of the hybrid BN-600 core were evaluated. The contributions of the participants to the benchmark analyses are shown. This report first addresses the benchmark definitions and specifications given for each Phase and briefly introduces the basic data, computer codes, and methodologies applied to the benchmark analyses by the various participants. Then, the results obtained by the participants in terms of calculational uncertainty and their effect on the core transient behavior are intercompared. Finally, it addresses some conclusions drawn from the benchmarks.

  15. Verbal Auditory Cueing of Improvisational Dance: A Proposed Method for Training Agency in Parkinson’s Disease

    Science.gov (United States)

    Batson, Glenna; Hugenschmidt, Christina E.; Soriano, Christina T.

    2016-01-01

    Dance is a non-pharmacological intervention that helps maintain functional independence and quality of life in people with Parkinson’s disease (PPD). Results from controlled studies on group-delivered dance for people with mild-to-moderate stage Parkinson’s have shown statistically and clinically significant improvements in gait, balance, and psychosocial factors. Tested interventions include non-partnered dance forms (ballet and modern dance) and partnered (tango). In all of these dance forms, specific movement patterns initially are learned through repetition and performed in time-to-music. Once the basic steps are mastered, students may be encouraged to improvise on the learned steps as they perform them in rhythm with the music. Here, we summarize a method of teaching improvisational dance that advances previous reported benefits of dance for people with Parkinson’s disease (PD). The method relies primarily on improvisational verbal auditory cueing with less emphasis on directed movement instruction. This method builds on the idea that daily living requires flexible, adaptive responses to real-life challenges. In PD, movement disorders not only limit mobility but also impair spontaneity of thought and action. Dance improvisation demands open and immediate interpretation of verbally delivered movement cues, potentially fostering the formation of spontaneous movement strategies. Here, we present an introduction to a proposed method, detailing its methodological specifics, and pointing to future directions. The viewpoint advances an embodied cognitive approach that has eco-validity in helping PPD meet the changing demands of daily living. PMID:26925029

  16. Proposal for evaluation methodology on impact resistant performance and construction method of tornado missile protection net structure

    International Nuclear Information System (INIS)

    Namba, Kosuke; Shirai, Koji

    2014-01-01

    In nuclear power plants, the necessity of tornado missile protection structures is becoming a key technical issue. Utilization of a net structure appears to be one of the realistic countermeasures from the point of view of mitigating wind and seismic loads. However, the methodology for selecting suitable net materials, the energy absorption design method and the construction method are not sufficiently established. In this report, three candidate materials (high-strength metal mesh, super-strong polyethylene fiber net and steel grating) were selected and subjected to material screening tests, energy absorption tests by free drop of a heavy weight, and impact tests with a small-diameter missile. As a result, high-strength metal mesh was selected as a suitable material for a tornado missile protection net structure. Moreover, a construction method to obtain good energy absorption performance of the material and a practical design method to estimate the energy absorption of the high-strength metal mesh under tornado missile impact load were proposed. (author)

  17. Fulbright update

    Science.gov (United States)

    Opportunities to teach or perform postdoctoral research in the earth and atmospheric sciences under the Senior Scholar Fulbright awards program for 1984-1985 (Eos, March 1, 1983, p. 81) are available in 14 countries, according to the Council for International Exchange of Scholars.The countries and the specialization opportunities are Algeria, any specialization; Australia, mineral processing research; India, any specialization in geology or geophysics; Israel, environmental studies; Korea, any specialization; Lebanon, geophysics, geotectonics, and structural geology; Morocco, research methods in science education; Pakistan, geology, marine biology, and mineralogy; Poland, mining technology; Sudan, geology and remote sensing; Thailand, planning and environmental change; USSR, any specialization; Yugoslavia, any research specialization; and Zimbabwe, exploration geophysics and solid earth geophysics.

  18. Proposal of Environmental Impact Assessment Method for Concrete in South Korea: An Application in LCA (Life Cycle Assessment)

    Directory of Open Access Journals (Sweden)

    Tae Hyoung Kim

    2016-11-01

    Full Text Available This study aims to develop a system for assessing the impact of the substances discharged from the concrete production process on six environmental impact categories, i.e., global warming (GWP), acidification (AP), eutrophication (EP), abiotic depletion (ADP), ozone depletion (ODP), and photochemical oxidant creation (POCP), using the life cycle assessment (LCA) method. To achieve this, this study proposed an LCA method specifically applicable to the Korean concrete industry by adapting the ISO standards to suit Korean conditions. The proposed LCA method involves a system that performs environmental impact assessment on the basis of input information on concrete mix design, transport distance, and energy consumption in a batch plant. The Concrete Lifecycle Assessment System (CLAS) thus developed provides user-friendly support for environmental impact assessment with a specialized database for concrete mix materials and energy sources. In the case analysis using CLAS, among the substances discharged from the production of 24 MPa concrete, those contributing to GWP, AP, EP, ADP, ODP, and POCP were assessed to amount to 309 kg-CO2 eq/m3, 28.7 kg-SO2 eq/m3, 5.21 kg-PO4^3- eq/m3, 0.000049 kg-CFC11 eq/m3, 34 kg/m3, and 21 kg-Ethylene eq/m3, respectively. Of the six environmental impact categories selected for the LCA in this study, ordinary Portland cement (OPC) was found to contribute most intensely to GWP and POCP, and aggregates to AP, EP, ODP, and ADP. It was also found that the mix design with an increased proportion of recycled aggregate contributes to reducing the impact in all other categories.
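    The CLAS-style calculation amounts to multiplying each input of the mix design and batch-plant energy consumption by a per-unit characterization factor and summing per impact category. The factors and quantities below are illustrative placeholders, not values from the CLAS database.

```python
# Sketch: category impact = sum over inputs of (amount in the mix design x
# characterization factor). Factors below are hypothetical placeholders.
mix_design = {"OPC": 350.0, "aggregate": 1800.0, "water": 165.0}   # kg per m3 concrete
energy = {"electricity_kWh": 4.5, "diesel_L": 0.8}                 # batch-plant inputs

factors = {   # characterization factors per unit of input (hypothetical)
    "GWP_kgCO2eq": {"OPC": 0.85, "aggregate": 0.004, "water": 0.0003,
                    "electricity_kWh": 0.49, "diesel_L": 2.7},
    "AP_kgSO2eq":  {"OPC": 0.0016, "aggregate": 0.00002, "water": 0.0,
                    "electricity_kWh": 0.0011, "diesel_L": 0.004},
}

inputs = {**mix_design, **energy}
for category, cf in factors.items():
    total = sum(amount * cf.get(name, 0.0) for name, amount in inputs.items())
    print(f"{category}: {total:.1f} per m3")
```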

  19. Design and development for updating national 1:50,000 topographic databases in China

    Directory of Open Access Journals (Sweden)

    CHEN Jun

    2010-02-01

    data quality controlling, a series of technical production specifications, and a network of updating production units in different geographic places in the country. 1.3 Results A group of updating models and technical methods was proposed after a systematic investigation of the key problems arising from the continuous updating of national 1:50,000 map databases. A set of specific software tools and packages was further developed to support large-area updating. With these innovative methodologies and tools, a total of 19,150 map sheets at 1:50,000 scale were updated, and this massive task was completed in an acceptable time frame, i.e., from 2006 to 2010. The data currency of the national 1:50,000 map databases has been raised from 20-30 years to 5 years! 1.4 Conclusion A modern state requires accurate, up-to-date maps, and keeping them up to date on a regular basis is a massive task for a country the size of China. The National Geomatics Center of China (NGCC) has solved this problem by using the latest data sources and developing new techniques. The methodologies developed in this paper are suited to regular updating in other rapidly developing nations and establish a model which can be followed in similar circumstances throughout the world.

  20. CSTT Update: Fuel Quality Analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Brosha, Eric L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Lujan, Roger W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Mukundan, Rangachary [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Rockward, Tommy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Romero, Christopher J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Williams, Stefan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Wilson, Mahlon S. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-02-06

    These are slides from a presentation. The following topics are covered: project background (scope and approach), developing the prototype (timeline), update on intellectual property, analyzer comparisons (improving humidification, stabilizing the baseline, applying clean-up strategy, impact of ionomer content and improving clean-up), proposed operating mode, considerations for testing in real-world conditions (Gen 1 analyzer electronics development, testing partner identified, field trial planning), summary, and future work.

  1. Method for developing arrangements for response to a nuclear or radiological emergency. Updating IAEA-TECDOC-953. Emergency preparedness and response. Publication date: October 2003

    International Nuclear Information System (INIS)

    2003-09-01

    This publication in the Emergency Preparedness and Response (EPR) series is an update to IAEA-TECDOC-953. It aims to fulfil in part the IAEA's function under article 5.a(ii) of the Assistance Convention, and to provide a compendium of best practice for planners aiming both to comply with the Requirements and to improve their own capabilities for responding to radiation emergencies, while the Secretariat facilitates consensus on formal guidance for meeting the Safety Requirements. The publication incorporates material from existing IAEA emergency preparedness Safety Guides, updating it to be consistent with the Requirements, to incorporate best practice, the results of research and the latest lessons identified in past emergencies, and to reflect relevant issues of international law. It provides a practical source of information relevant to the development of an integrated national, local and operator capability for emergency response based on the potential nature and magnitude of the risk. In order to apply the method described in this publication, emergency planners should have a good understanding of the basic principles for response to a nuclear or radiological emergency. They should review the relevant international guidance beforehand. This publication provides information concerning methodologies, techniques and available results of research relating to response to nuclear or radiological emergencies. It also provides a practical, step-by-step method for developing integrated operator, local and national capabilities for emergency response. It does not provide IAEA endorsed guidance or recommendations because this material has not undergone the process of peer reviews needed to become part of the IAEA Safety Standards Series. This publication concerns preparations for radiation emergencies. The range of potential radiation emergencies of concern is enormous, extending from a major reactor emergency to emergencies involving lost or stolen radioactive material. This method covers planning for the entire range. The

  2. Updating Road Networks by Local Renewal from GPS Trajectories

    Directory of Open Access Journals (Sweden)

    Tao Wu

    2016-09-01

    Full Text Available The long production cycle and huge cost of collecting road network data often leave the data lagging behind the latest real conditions. However, this situation is rapidly changing as the positioning techniques ubiquitously used in mobile devices are gradually being implemented in road network research and applications. Currently, the predominant approaches infer road networks from mobile location information (e.g., GPS trajectory data) directly, using various extraction algorithms, which leads to expensive consumption of computational resources in the case of large-scale areas. For this reason, we propose an alternative that renews road networks with a novel spiral strategy, including a hidden Markov model (HMM) for detecting potential problems in existing road network data and a method to update the data, on the local scale, by generating new road segments from trajectory data. The proposed approach reduces computation costs on roads with complete or updated information by updating problem road segments within the minimum range of the road network. We evaluated the performance of our proposals using GPS traces collected from taxis and OpenStreetMap (OSM) road networks covering urban areas of Wuhan City.
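    The problem-detection step can be illustrated with a deliberately simplified two-state hidden Markov model in which the observation is the distance from each GPS fix to the nearest mapped road: runs decoded as the off-road state mark candidate stretches for local renewal. The states, parameters and toy distances are assumptions made for illustration; the paper's HMM uses its own features and structure.

```python
# Sketch: flag "problem" stretches of a trajectory with a two-state HMM
# (ON_ROAD vs OFF_ROAD), using the distance from each GPS fix to the nearest
# existing road segment as the observation. All parameters are illustrative.
import numpy as np

def viterbi(log_em, log_trans, log_init):
    n, k = log_em.shape
    dp, back = np.zeros((n, k)), np.zeros((n, k), dtype=int)
    dp[0] = log_init + log_em[0]
    for t in range(1, n):
        scores = dp[t - 1][:, None] + log_trans        # k x k: from-state x to-state
        back[t] = scores.argmax(0)
        dp[t] = scores.max(0) + log_em[t]
    path = [int(dp[-1].argmax())]
    for t in range(n - 1, 0, -1):
        path.append(int(back[t][path[-1]]))
    return path[::-1]

# Distances (m) from GPS fixes to the nearest mapped road; a new, unmapped road
# shows up as a run of large distances in the middle.
dist = np.array([3, 5, 2, 4, 40, 55, 60, 48, 52, 6, 3, 4], dtype=float)

# Emission: half-normal with sigma 10 m when ON_ROAD, 50 m when OFF_ROAD.
sigmas = np.array([10.0, 50.0])
log_em = -0.5 * (dist[:, None] / sigmas) ** 2 - np.log(sigmas)
log_trans = np.log(np.array([[0.95, 0.05], [0.05, 0.95]]))  # sticky states
log_init = np.log(np.array([0.9, 0.1]))

states = viterbi(log_em, log_trans, log_init)               # 0=ON_ROAD, 1=OFF_ROAD
print(states)   # runs of 1s mark candidate locations for local renewal
```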

  3. Mapping subsurface pathways for contaminant migration at a proposed low level waste disposal site using electromagnetic methods

    International Nuclear Information System (INIS)

    Pin, F.G.; Ketelle, R.H.

    1984-01-01

    Electromagnetic methods have been used to measure apparent terrain conductivity in the downstream portion of a watershed in which a waste disposal site is proposed. At that site, the pathways for waste migration in ground water are controlled by subsurface channels. The channels are identified using isocurves of measured apparent conductivity. Two upstream channel branches are found to merge into a single downstream channel which constitutes the main drainage path out of the watershed. The identification and mapping of the ground water pathways is an important contribution to the site characterization study and the pathways analysis. The direct applications of terrain conductivity mapping to the planning of the monitoring program, the hydrogeological testing, and the modeling study are demonstrated. 7 references, 4 figures

  4. Proposal of quality indicators for cardiac rehabilitation after acute coronary syndrome in Japan: a modified Delphi method and practice test.

    Science.gov (United States)

    Ohtera, Shosuke; Kanazawa, Natsuko; Ozasa, Neiko; Ueshima, Kenji; Nakayama, Takeo

    2017-01-27

    Cardiac rehabilitation is underused and its quality in practice is unclear. A quality indicator is a measurable element of clinical practice performance. This study aimed to propose a set of quality indicators for cardiac rehabilitation following an acute coronary event in the Japanese population and conduct a small-size practice test to confirm feasibility and applicability of the indicators in real-world clinical practice. This study used a modified Delphi technique (the RAND/UCLA appropriateness method), a consensus method which involves an evidence review, a face-to-face multidisciplinary panel meeting and repeated anonymous rating. Evidence to be reviewed included clinical practice guidelines available in English or Japanese and existing quality indicators. Performance of each indicator was assessed retrospectively using medical records at a university hospital in Japan. 10 professionals in cardiac rehabilitation for the consensus panel. In the literature review, 23 clinical practice guidelines and 16 existing indicators were identified to generate potential indicators. Through the consensus-building process, a total of 30 indicators were assessed and finally 13 indicators were accepted. The practice test (n=39) revealed that 74% of patients underwent cardiac rehabilitation. Median performance of process measures was 93% (IQR 46-100%). 'Communication with the doctor who referred the patient to cardiac rehabilitation' and 'continuous participation in cardiac rehabilitation' had low performance (32% and 38%, respectively). A modified Delphi technique identified a comprehensive set of quality indicators for cardiac rehabilitation. The single-site, small-size practice test confirmed that most of the proposed indicators were measurable in real-world clinical practice. However, some clinical processes which are not covered by national health insurance in Japan had low performance. Further studies will be needed to clarify and improve the quality of care in cardiac

  5. THE PROPOSED METHODOLOGIES FOR THE SIX SIGMA METHOD AND TQM STRATEGY AS WELL AS THEIR APPLICATION IN PRACTICE IN MACEDONIA

    Directory of Open Access Journals (Sweden)

    Elizabeta Mitreva

    2014-05-01

    Full Text Available This paper presents the proposed methodologies for the Six Sigma method and the TQM strategy, as well as their application in practice in Macedonia. Although the philosophy of total quality management (TQM) is deeply embedded in many industries and business areas of European and other countries, it is insufficiently known and practised in our country and other developing countries. The same applies to the Six Sigma approach to reducing the dispersion of a process, which is present in only a small fraction of Macedonian companies. The results of the implementation show that the value of the Six Sigma approach lies not in the count of defects per million opportunities as such, but in the systematic and systemic lowering of process dispersion. The operation and effect of implementing the Six Sigma method engage experts whose remuneration depends on the success of the Six Sigma program. On the other hand, the results of applying the TQM methodology within Macedonian companies will depend on the commitment of all employees and their motivation.

  6. A proposed method for accurate 3D analysis of cochlear implant migration using fusion of cone beam CT

    Directory of Open Access Journals (Sweden)

    Guido eDees

    2016-01-01

    Full Text Available Introduction: The goal of this investigation was to compare fusion of sequential cone beam CT volumes to the gold standard (fiducial registration) in order to be able to analyze clinical CI migration with high accuracy in three dimensions. Materials and Methods: Paired time-lapsed cone beam CT volumes were acquired on five human cadaver temporal bones and one human subject. These volumes were fused using 3D Slicer 4 and BRAINSFit software. Using a gold standard fiducial technique, the accuracy, robustness and performance time of the fusion process were assessed. Results: The proposed fusion protocol achieves a sub-voxel mean Euclidean distance of 0.05 millimeter in human cadaver temporal bones and 0.16 millimeter when applied to the described in vivo human synthetic data set in over 95% of all fusions. Performance times are less than two minutes. Conclusion: Here a new and validated method based on existing techniques is described which could be used to accurately quantify migration of cochlear implant electrodes.

  7. Real Time Updating Genetic Network Programming for Adapting to the Change of Stock Prices

    Science.gov (United States)

    Chen, Yan; Mabu, Shingo; Shimada, Kaoru; Hirasawa, Kotaro

    The key in a stock trading model is to take the right trading actions at the right time, primarily based on an accurate forecast of future stock trends. Since effective trading with given stock price information needs an intelligent decision-making strategy, we applied Genetic Network Programming (GNP) to creating a stock trading model. In this paper, we propose a new method called Real Time Updating Genetic Network Programming (RTU-GNP) for adapting to changes in stock prices. There are three important points in this paper: First, the RTU-GNP method makes a stock trading decision considering both the recommendable information of technical indices and the candlestick charts according to the real-time stock prices. Second, we combine RTU-GNP with a Sarsa learning algorithm to create the programs efficiently. Also, sub-nodes are introduced in each judgment and processing node to determine appropriate actions (buying/selling) and to select appropriate stock price information depending on the situation. Third, a Real Time Updating system is introduced for the first time in this paper, taking account of changes in the trend of stock prices. The experimental results on the Japanese stock market show that the trading model with the proposed RTU-GNP method outperforms models without real time updating. We also compared the experimental results of the proposed method with the Buy&Hold method to confirm its effectiveness, and it is clarified that the proposed trading model can obtain much higher profits than the Buy&Hold method.
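    The Sarsa component can be sketched in isolation with a plain state-action value table for buy/hold/sell decisions; the paper applies the same update to GNP judgment and processing sub-nodes rather than to a flat table. The states, rewards and parameters below are invented.

```python
# Sketch: the Sarsa update Q(s,a) <- Q(s,a) + alpha*(r + gamma*Q(s',a') - Q(s,a)),
# shown on a toy buy/hold/sell table (not the GNP node structure of the paper).
import random
from collections import defaultdict

actions = ["buy", "hold", "sell"]
Q = defaultdict(float)                 # Q[(state, action)]
alpha, gamma, eps = 0.1, 0.95, 0.1

def choose(state):
    if random.random() < eps:          # epsilon-greedy exploration
        return random.choice(actions)
    return max(actions, key=lambda a: Q[(state, a)])

def sarsa_step(state, action, reward, next_state):
    next_action = choose(next_state)
    td_target = reward + gamma * Q[(next_state, next_action)]
    Q[(state, action)] += alpha * (td_target - Q[(state, action)])
    return next_action

# One toy transition: a technical-indicator state, a trade, its profit, next state.
state, action = ("RSI_low", "above_MA"), "buy"
next_state = ("RSI_mid", "above_MA")
action = sarsa_step(state, action, reward=1.2, next_state=next_state)
print(dict(Q))
```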

  8. Proposing New Methods to Enhance the Low-Resolution Simulated GPR Responses in the Frequency and Wavelet Domains

    Directory of Open Access Journals (Sweden)

    Reza Ahmadi

    2014-12-01

    Full Text Available To date, a number of numerical methods, including the popular Finite-Difference Time Domain (FDTD) technique, have been proposed to simulate Ground-Penetrating Radar (GPR) responses. Despite having a number of advantages, the finite-difference method also has pitfalls, such as being very time consuming in simulating the most common case of media with high dielectric permittivity, causing the forward modelling process to be very long lasting, even with modern high-speed computers. In the present study the well-known hyperbolic pattern response of horizontal cylinders, usually found in GPR B-Scan images, is used as a basic model to examine the possibility of reducing the forward modelling execution time. In general, the simulated GPR traces of common reflected objects are time shifted, as with the Normal Moveout (NMO) traces encountered in seismic reflection responses. This suggests applying the Fourier transform to the GPR traces and employing the time-shifting property of the transformation to interpolate traces between the adjacent traces in the frequency domain (FD). Therefore, in the present study two post-processing algorithms have been adopted to increase the speed of forward modelling while maintaining the required precision. The first approach is based on linear interpolation in the Fourier domain, resulting in an increased lateral trace-to-trace interval at an appropriate sampling frequency of the signal, preventing any aliasing. In the second approach, a super-resolution algorithm based on the 2D wavelet transform is developed to increase both the vertical and horizontal resolution of the GPR B-Scan images while preserving the scale and shape of hidden hyperbola features. Through comparing outputs from both methods with the corresponding actual high-resolution forward response, it is shown that both approaches can perform satisfactorily, although the wavelet-based approach outperforms the frequency-domain approach noticeably, both in amplitude and
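    The first (frequency-domain) approach rests on the Fourier time-shift property: multiplying a trace's spectrum by a linear phase ramp delays it, so an intermediate trace can be synthesized between two simulated neighbours. The sketch below uses a Ricker wavelet, a cross-correlation lag estimate and a simple half-shift-and-average rule as stand-ins for the paper's interpolation scheme.

```python
# Sketch: inserting an intermediate GPR trace between two simulated traces using
# the Fourier time-shift property. Wavelet, moveout and interpolation rule are
# simplified placeholders, not the paper's algorithm.
import numpy as np

def ricker(t, f0):
    a = (np.pi * f0 * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

def fft_shift(trace, dt_shift, dt):
    """Delay a trace by dt_shift seconds using a linear phase ramp."""
    freqs = np.fft.rfftfreq(len(trace), dt)
    return np.fft.irfft(np.fft.rfft(trace) * np.exp(-2j * np.pi * freqs * dt_shift),
                        n=len(trace))

dt, f0 = 0.1e-9, 500e6                       # 0.1 ns sampling, 500 MHz wavelet
t = np.arange(1024) * dt
t1, t2 = 12e-9, 13e-9                        # arrival times of two adjacent traces
trace1, trace2 = ricker(t - t1, f0), ricker(t - t2, f0)

# Estimate the trace-to-trace delay by cross-correlation, then synthesize the
# in-between trace by shifting each neighbour halfway and averaging.
lag = (np.argmax(np.correlate(trace2, trace1, "full")) - (len(t) - 1)) * dt
mid = 0.5 * (fft_shift(trace1, lag / 2, dt) + fft_shift(trace2, -lag / 2, dt))

print(f"estimated lag: {lag*1e9:.2f} ns, mid-trace peak at "
      f"{t[np.argmax(mid)]*1e9:.2f} ns")     # expect a peak near 12.5 ns
```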

  9. Endobronchial Ultrasound (EBUS) - Update 2017.

    Science.gov (United States)

    Darwiche, Kaid; Özkan, Filiz; Wolters, Celina; Eisenmann, Stephan

    2018-02-01

    Endobronchial ultrasound (EBUS) has revolutionized the diagnosis of lung cancer over the last decade. This minimally invasive diagnostic method has also become increasingly important in the case of other diseases such as sarcoidosis, thereby helping to avoid unnecessary diagnostic interventions. This review article provides an update regarding EBUS and discusses current and future developments of this method. © Georg Thieme Verlag KG Stuttgart · New York.

  10. Profile updating for information systems

    International Nuclear Information System (INIS)

    Abrantes, J.F.

    1983-02-01

    Profile updating methods were analysed. A method suited to the characteristics of the system used in the research (SDI/CIN/CNEN), which uses the threshold-and-weights criterion for selection, was determined. Relevance weighting theory was described and experiments to verify precision were carried out. The improvements obtained were good; nevertheless, more significant tests are required to attain more reliable results. (Author) [pt]

  11. A Proposal of a Method to Measure and Evaluate the Effect to Apply External Support Measures for Owners by Construction Management Method, etc

    Science.gov (United States)

    Tada, Hiroshi; Miyatake, Ichiro; Mouri, Junji; Ajiki, Norihiko; Fueta, Toshiharu

    In Japan, various approaches have been taken to ensure the quality of public works and to support the procurement regime of governmental agencies by utilizing external resources, including procurement support services and the construction management (CM) method. Although these measures to utilize external resources (hereinafter referred to as external support measures) have been discussed, and follow-up surveys showing their positive effects have been conducted, the surveys address only the overall effects of the external support measures as a whole; the effect of each task item has not been examined, and the extent to which the measures met the client's expectations is unknown. However, the effective use of external support measures in the future cannot be achieved without knowing the purpose for which they were introduced, what effect was expected for each task item, and to what extent those expectations were fulfilled. Furthermore, it is important to clarify not only the effect relative to the client's expectations (performance), but also the public benefit of the measure (value improvement). At present there is no established method for assessing the effect of a client's measures to utilize external resources. Against this background, this study takes the CM method as an example of an external support measure, proposes a method to measure and evaluate its effect for each task item, and suggests future issues and possible responses, with the aim of contributing to the promotion, improvement, and proper implementation of external support measures in the future.

  12. The Updating of Geospatial Base Data

    Science.gov (United States)

    Alrajhi, Muhamad N.; Konecny, Gottfried

    2018-04-01

    Topographic mapping issues concern the area coverage at different scales and its age. The age of the map is determined by the system of updating. The United Nations (UN-GGIM) has attempted to track global map coverage at various scale ranges, which has greatly improved in recent decades. However, the poor state of updating of base maps is still a global problem. In Saudi Arabia, large-scale mapping is carried out for all urban, suburban and rural areas by aerial surveys. Updating is carried out by remapping every 5 to 10 years. Due to rapid urban development this is not satisfactory, but faster update methods are foreseen through the use of high-resolution satellite imagery and improved object-oriented geodatabase structures, which will permit various survey technologies to be used to update the photogrammetrically established geodatabases. The long-term goal is to create a geodata infrastructure such as exists in Great Britain or Germany.

  13. A rapid learning and dynamic stepwise updating algorithm for flat neural networks and the application to time-series prediction.

    Science.gov (United States)

    Chen, C P; Wan, J Z

    1999-01-01

    A fast learning algorithm is proposed to find the optimal weights of flat neural networks (especially the functional-link network). Although flat networks are used for nonlinear function approximation, they can be formulated as linear systems. Thus, the weights of the networks can be solved easily using a linear least-squares method. This formulation makes it easier to update the weights instantly both for a newly added pattern and for a newly added enhancement node. A dynamic stepwise updating algorithm is proposed to update the weights of the system on the fly. The model is tested on several time-series data sets including an infrared laser data set, a chaotic time series, a monthly flour price data set, and a nonlinear system identification problem. The simulation results are compared to existing models that require more complex architectures and more costly training. The results indicate that the proposed model is very attractive for real-time processes.
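    A minimal sketch of the flat-network idea follows: random enhancement features are appended to the inputs and the output weights are obtained from a single linear least-squares solve, then obtained again after adding one enhancement node. The paper updates the pseudoinverse incrementally rather than re-solving; the re-solve here is only for brevity and yields the same weights.

```python
# Sketch: a functional-link ("flat") network fitted by linear least squares on
# random enhancement features, then refitted after adding one enhancement node.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, (200, 1))
y = np.sin(3 * x[:, 0]) + 0.05 * rng.normal(size=200)    # target function

def enhancement(x, n_nodes, rng):
    W = rng.normal(size=(x.shape[1], n_nodes))
    b = rng.normal(size=n_nodes)
    return np.tanh(x @ W + b)                            # random nonlinear features

H = np.hstack([x, enhancement(x, 20, np.random.default_rng(1))])
w, *_ = np.linalg.lstsq(H, y, rcond=None)                # linear solve for weights
print("rmse (20 nodes):", np.sqrt(np.mean((H @ w - y) ** 2)))

# Add one more enhancement node and solve again.
H2 = np.hstack([H, enhancement(x, 1, np.random.default_rng(2))])
w2, *_ = np.linalg.lstsq(H2, y, rcond=None)
print("rmse (21 nodes):", np.sqrt(np.mean((H2 @ w2 - y) ** 2)))
```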

  14. A Proposal of Estimation Methodology to Improve Calculation Efficiency of Sampling-based Method in Nuclear Data Sensitivity and Uncertainty Analysis

    International Nuclear Information System (INIS)

    Song, Myung Sub; Kim, Song Hyun; Kim, Jong Kyung; Noh, Jae Man

    2014-01-01

    The uncertainty in the sampling-based method is evaluated by repeating transport calculations with a number of cross-section data sets sampled from the covariance uncertainty data. In a transport calculation with the sampling-based method, the transport equation is not modified; therefore, all uncertainties of the responses such as k_eff, reaction rates, flux and power distribution can be obtained directly, all at one time, without code modification. However, a major drawback of the sampling-based method is that it requires an expensive computational load to reach statistically reliable results (within the 0.95 confidence level) in the uncertainty analysis. The purpose of this study is to develop a method for improving the computational efficiency and obtaining highly reliable uncertainty results when using the sampling-based method with Monte Carlo simulation. The proposed method reduces the convergence time of the response uncertainty by using multiple sets of sampled group cross sections in a single Monte Carlo simulation. The proposed method was verified by estimating the GODIVA benchmark problem and comparing the results with those of the conventional sampling-based method. In this study, a sampling-based method based on the central limit theorem is proposed to improve calculation efficiency by reducing the number of repetitive Monte Carlo transport calculations required to obtain reliable uncertainty analysis results. Each set of sampled group cross sections is assigned to an active cycle group in a single Monte Carlo simulation. The criticality uncertainty for the GODIVA problem is evaluated with the proposed and the previous method. The results show that the proposed sampling-based method can efficiently decrease the number of Monte Carlo simulations required to evaluate the uncertainty of k_eff. It is expected that the proposed method will improve the computational efficiency of uncertainty analysis with the sampling-based method.
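    For contrast, the conventional sampling-based procedure that the proposed method accelerates can be sketched in a few lines: draw cross-section sets from their covariance, evaluate the model once per set, and read the spread of the response. The covariance matrix and the algebraic k_eff surrogate below are stand-ins for the real transport calculation, not data from the study.

```python
# Sketch: conventional sampling-based uncertainty propagation -- sample cross
# sections from their covariance, run the model per sample, read the spread of
# k_eff. The proposed method would instead assign the sampled sets to the
# active-cycle groups of a single Monte Carlo run.
import numpy as np

rng = np.random.default_rng(0)
mean_xs = np.array([1.30, 0.04, 0.10])          # nu-fission, capture, scatter (made up)
rel_cov = np.array([[4e-4, 1e-4, 0.0],
                    [1e-4, 9e-4, 0.0],
                    [0.0,  0.0,  1e-4]])
cov = rel_cov * np.outer(mean_xs, mean_xs)      # absolute covariance

def k_eff(xs):                                  # cheap surrogate for the transport run
    nu_fission, capture, scatter = xs
    return nu_fission / (capture + 0.9 * nu_fission + 0.01 * scatter)

samples = rng.multivariate_normal(mean_xs, cov, size=500)
k = np.array([k_eff(s) for s in samples])
print(f"k_eff = {k.mean():.5f} +/- {k.std(ddof=1):.5f}")
```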

  15. Feasibility study and technical proposal for the use of microseismic methods in the long-term observation of bedrock stability

    International Nuclear Information System (INIS)

    Saari, J.

    1995-04-01

    Recent geodetic and seismological studies have paid attention to the slow deformation occurring in the Fennoscandian Shield. On the basis of these studies, together with in-situ stress measurements, the idea has been put forth that horizontal movement can be even greater than vertical movement. Local seismotectonics has importance in relation to predictions of the long-term stability of the bedrock at the final disposal site. Potential direct and - what in Finland is more likely - indirect effects on the vault are due to local earthquakes or creep. The direct effects on the repository include rock vibration and displacement on an increasing fault. The indirect effects are changes in the surrounding structure, in the stress field, and in the groundwater table, pressure, flux and chemistry. The block movements are controlled mainly by the network of fracture zones. The report deals with the possibilities of monitoring, by seismic methods, slow movements occurring in the bedrock at the local level. The report includes descriptions of instrumentation for recording microearthquakes, the seismic network and an interpretation of the observations. The potential sites for disposal (Kuhmo, Aeaenekoski, Eurajoki) are compared in relation to seismic monitoring. Experience from other investigations, a proposal for microearthquake investigations, and prospective developments in monitoring are also presented. (28 refs., 17 figs.)

  16. Proposed Food and Drug Administration protective action guides for human food and animal feed: methods and implementation

    International Nuclear Information System (INIS)

    Schmidt, G.D.; Shleien, B.; Chiacchierini, R.P.

    1978-01-01

    The Food and Drug Administration's proposed recommendations to State and local agencies provide guidance on appropriate planning actions necessary for evaluating and preventing radioactive contamination of foods and animal feeds and the control and use of such products should they become contaminated. This presentation will cover the recommendations on implementation of the Preventive and Emergency PAGs. These recommendations include (1) the use of 'Dietary Factors' to obtain PAGs for specific food items from the general guidance, (2) procedures to be used for radionuclide mixtures and other radionuclides, (3) field and laboratory methods for the measurement of the level of contamination in the event of an incident and (4) protective actions to be implemented by State and local agencies to limit the radiation dose to the public. Specific protective actions which should be considered for implementation when the projected dose exceeds the Preventive PAG are given for application to pasture, milk, fruits and vegetables, and grains. At the Emergency PAG level, the protective action decision is whether condemnation or other disposition is appropriate. (author)

  17. Relating two proposed methods for speedup of algorithms for fitting two- and three-way principal component and related multilinear models

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Harshman, Richard A.

    Multilinear analysis methods such as component (and three-way component) analysis of very large data sets can become very computationally demanding and even infeasible unless some method is used to compress the data and/or speed up the algorithms. We discuss two previously proposed speedup methods.

  18. Are Forecast Updates Progressive?

    NARCIS (Netherlands)

    C-L. Chang (Chia-Lin); Ph.H.B.F. Franses (Philip Hans); M.J. McAleer (Michael)

    2010-01-01

    textabstractMacro-economic forecasts typically involve both a model component, which is replicable, as well as intuition, which is non-replicable. Intuition is expert knowledge possessed by a forecaster. If forecast updates are progressive, forecast updates should become more accurate, on average,

  19. Proposal of flexible atomic and molecular process management for Monte Carlo impurity transport code based on object oriented method

    International Nuclear Information System (INIS)

    Asano, K.; Ohno, N.; Takamura, S.

    2001-01-01

    Monte Carlo simulation codes for impurity transport have been developed by several groups, mainly for fusion-related edge plasmas. The state of an impurity particle is determined by atomic and molecular processes in the plasma, such as ionization and charge exchange. Many atomic and molecular processes have to be considered because the edge plasma contains not only impurity atoms but also impurity molecules, mainly related to chemical erosion of carbon materials, and their cross sections have been given experimentally and theoretically. We need to reveal which process is essential in a given edge plasma condition. A Monte Carlo simulation code that takes such various atomic and molecular processes into account is necessary to investigate the behavior of impurity particles in plasmas. Usually, an impurity transport simulation code is written for a specific set of atomic and molecular processes, so that the introduction of a new process forces complicated programming work. In order to evaluate various proposed atomic and molecular processes, a flexible management of atomic and molecular reactions should be established. We have developed an impurity transport simulation code based on an object-oriented method. By employing object-oriented programming, we can handle each particle as an 'object', which encapsulates both its data and its procedures. A user (note: not a programmer) can define the properties of each particle species and the related atomic and molecular processes, and each 'object' is then defined by analyzing this information. According to the relations among plasma particle species, objects are connected with each other and change their state by themselves. Dynamic allocation of these objects in program memory is employed to accommodate an arbitrary number of species and atomic/molecular reactions. Thus we can treat arbitrary species and processes starting from, for instance, methane and acetylene. Such a software procedure would also be useful for industrial application plasmas.
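
    A minimal sketch of the object-oriented idea described above, written in Python rather than the code's actual implementation language and using hypothetical class and species names: each species object carries its own list of reactions, so a new atomic or molecular process can be registered without touching the transport loop.

```python
# Species-as-objects with user-defined reaction lists (illustrative rates and names).
import random
from dataclasses import dataclass, field

@dataclass
class Reaction:
    name: str
    rate: float          # reaction rate coefficient [1/s], illustrative constant
    product: str         # species produced by the reaction

@dataclass
class Species:
    name: str
    reactions: list = field(default_factory=list)

    def step(self, dt):
        """Apply at most one reaction during a time step dt (simplified kinetics)."""
        for r in self.reactions:
            if random.random() < r.rate * dt:
                return r.product
        return self.name

# User-defined species and processes, e.g. a methane break-up chain.
species = {
    "CH4": Species("CH4", [Reaction("dissociation", 1e3, "CH3")]),
    "CH3": Species("CH3", [Reaction("ionization", 5e2, "CH3+")]),
    "CH3+": Species("CH3+"),
}

state = "CH4"
for _ in range(100):
    state = species[state].step(dt=1e-4)
print("final state:", state)
```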

  20. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is limited by the professional department's projects and is usually too long for the end-user; taking the data from collection to publication costs the professional department too much time and energy; the geospatial information does not provide sufficiently detailed attributes; etc. Finding an effective way to deal with these problems has therefore become important. Emerging Internet technology, 3S techniques and the geographic information knowledge that is now widespread among the public have promoted the booming development of volunteered geographic information in geoscience. Volunteered geospatial information is a current "hotspot" that attracts many researchers to study its data quality and credibility, accuracy, sustainability, social benefit, application and so on. In addition, a few scholars pay attention to the value of VGI in supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating and publication of the new data product to end-users. The feasibility of the proposed updating cycle is then discussed in depth: it can detect changed elements in time and shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.

  1. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical for their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data, such as newly updated larger-scale data. The former is fundamental, because the update data in both approaches ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating, and this is recognized as one of the most promising methods, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is given. Based on the requirements analysis and the review, we analyze the key factors for implementing updating of geospatial data from large scale, including technical

  2. Automatic Rapid Updating of ATR Target Knowledge Bases

    National Research Council Canada - National Science Library

    Wells, Barton

    1999-01-01

    .... Methods of comparing infrared images with CAD model renderings, including object detection, feature extraction, object alignment, match quality evaluation, and CAD model updating are researched and analyzed...

  3. Object Tracking Using Adaptive Covariance Descriptor and Clustering-Based Model Updating for Visual Surveillance

    Directory of Open Access Journals (Sweden)

    Lei Qin

    2014-05-01

    Full Text Available We propose a novel approach for tracking an arbitrary object in video sequences for visual surveillance. The first contribution of this work is an automatic feature extraction method that is able to extract compact discriminative features from a feature pool before computing the region covariance descriptor. As the feature extraction method is adaptive to a specific object of interest, we refer to the region covariance descriptor computed using the extracted features as the adaptive covariance descriptor. The second contribution is to propose a weakly supervised method for updating the object appearance model during tracking. The method performs a mean-shift clustering procedure among the tracking result samples accumulated during a period of time and selects a group of reliable samples for updating the object appearance model. As such, the object appearance model is kept up-to-date and is prevented from contamination even in case of tracking mistakes. We conducted comparative experiments on real-world video sequences, which confirmed the effectiveness of the proposed approaches. The tracking system that integrates the adaptive covariance descriptor and the clustering-based model updating method accomplished stable object tracking on challenging video sequences.
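
    A minimal sketch of the clustering-based update idea, assuming hypothetical feature vectors in place of region covariance descriptors: tracking-result samples accumulated over a period are clustered with mean shift, and only the dominant cluster is treated as reliable for refreshing the appearance model.

```python
# Select reliable samples via mean-shift clustering before updating the model.
import numpy as np
from sklearn.cluster import MeanShift

rng = np.random.default_rng(1)
# 40 consistent samples near the true appearance plus 5 outliers (tracking mistakes).
samples = np.vstack([rng.normal(0.0, 0.1, size=(40, 8)),
                     rng.normal(3.0, 0.5, size=(5, 8))])

labels = MeanShift().fit_predict(samples)
dominant = np.bincount(labels).argmax()        # largest cluster = reliable samples
reliable = samples[labels == dominant]

appearance_model = reliable.mean(axis=0)       # simplistic stand-in for the descriptor update
print("reliable samples:", len(reliable))
print("updated model:", np.round(appearance_model, 3))
```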

  4. Updated clinical guidelines experience major reporting limitations

    Directory of Open Access Journals (Sweden)

    Robin W.M. Vernooij

    2017-10-01

    Full Text Available Abstract Background: The Checklist for the Reporting of Updated Guidelines (CheckUp) was recently developed. However, so far, no systematic assessment of the reporting of updated clinical guidelines (CGs) exists. We aimed to examine (1) the completeness of reporting of the updating process in CGs and (2) the inter-observer reliability of CheckUp. Methods: We conducted a systematic assessment of the reporting of the updating process in a sample of updated CGs using CheckUp. We performed a systematic search to identify updated CGs published in 2015, developed by a professional society, reporting a systematic review of the evidence, and containing at least one recommendation. Three reviewers independently assessed the CGs with CheckUp (16 items). We calculated the median score per item, per domain, and overall, converting scores to a 10-point scale. Multiple linear regression analyses were used to identify differences according to country, type of organisation, scope, and health topic of updated CGs. We calculated the intraclass correlation coefficient (ICC) and 95% confidence interval (95% CI) for domains and the overall score. Results: We included in total 60 updated CGs. The median domain score on a 10-point scale was 5.8 for presentation (range 1.7 to 10), 8.3 for editorial independence (range 3.3 to 10), and 5.7 for methodology (range 0 to 10). The median overall score on a 10-point scale was 6.3 (range 3.1 to 10). Presentation and justification items at recommendation level (reported by 27 and 38% of the CGs, respectively) and the methods used for the external review and for implementing changes in practice (both reported by 38% of the CGs) were particularly poorly reported. CGs developed by a European or international institution obtained a statistically significantly higher overall score compared to North American or Asian institutions (p = 0.014). Finally, the agreement among the reviewers on the overall score was excellent (ICC 0.88, 95% CI 0.75 to 0.95). Conclusions: The

  5. Using Multi-Viewpoint Contracts for Negotiation of Embedded Software Updates

    Directory of Open Access Journals (Sweden)

    Sönke Holthusen

    2016-05-01

    Full Text Available In this paper we address the issue of change after deployment in safety-critical embedded system applications. Our goal is to substitute lab-based verification with in-field formal analysis to determine whether an update may be safely applied. This is challenging because it requires an automated process able to handle multiple viewpoints such as functional correctness, timing, etc. For this purpose, we propose an original methodology for contract-based negotiation of software updates. The use of contracts allows us to cleanly split the verification effort between the lab and the field. In addition, we show how to rely on existing viewpoint-specific methods for update negotiation. We illustrate our approach on a concrete example inspired by the automotive domain.

  6. A PSO Driven Intelligent Model Updating and Parameter Identification Scheme for Cable-Damper System

    Directory of Open Access Journals (Sweden)

    Danhui Dan

    2015-01-01

    Full Text Available The precise measurement of the cable force is very important for monitoring and evaluating the operation status of cable structures such as cable-stayed bridges. The cable system should be installed with lateral dampers to reduce the vibration, which affects the precise measurement of the cable force and other cable parameters. This paper suggests a cable model updating calculation scheme driven by the particle swarm optimization (PSO) algorithm. By first establishing a finite element model that considers static geometric nonlinearity and the stress-stiffening effect, an automatic finite element model updating procedure powered by the PSO algorithm is proposed, with the aim of precisely identifying the cable force and the relevant parameters of the cable-damper system. Both numerical case studies and full-scale cable tests indicated that, after two rounds of the updating process, the algorithm can accurately identify the cable force, moment of inertia, and damping coefficient of the cable-damper system.
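
    A minimal sketch of PSO-driven model updating, assuming a simplified taut-string cable model f_n = n/(2L) * sqrt(T/m) in place of the paper's nonlinear finite element model: the swarm searches for the tension T that best reproduces a set of "measured" natural frequencies.

```python
# Global-best PSO identifying cable tension from natural frequencies (illustrative values).
import numpy as np

L_c, m = 100.0, 50.0                     # cable length [m], mass per unit length [kg/m]
true_T = 4.0e6                           # "unknown" tension to identify [N]
modes = np.arange(1, 6)
f_meas = modes / (2 * L_c) * np.sqrt(true_T / m)   # synthetic measured frequencies

def cost(T):
    f_model = modes / (2 * L_c) * np.sqrt(T / m)
    return np.sum((f_model - f_meas) ** 2)

rng = np.random.default_rng(0)
n_particles, n_iter, w, c1, c2 = 20, 100, 0.7, 1.5, 1.5
x = rng.uniform(1e6, 1e7, n_particles)   # particle positions (candidate tensions)
v = np.zeros(n_particles)
pbest = x.copy()
pbest_cost = np.array([cost(t) for t in x])
gbest = pbest[pbest_cost.argmin()]

for _ in range(n_iter):
    r1, r2 = rng.random(n_particles), rng.random(n_particles)
    v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    x = np.clip(x + v, 1e6, 1e7)
    costs = np.array([cost(t) for t in x])
    better = costs < pbest_cost
    pbest[better], pbest_cost[better] = x[better], costs[better]
    gbest = pbest[pbest_cost.argmin()]

print(f"identified tension: {gbest:.3e} N (true {true_T:.3e} N)")
```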

  7. A State Space Model for Spatial Updating of Remembered Visual Targets during Eye Movements.

    Science.gov (United States)

    Mohsenzadeh, Yalda; Dash, Suryadeep; Crawford, J Douglas

    2016-01-01

    In the oculomotor system, spatial updating is the ability to aim a saccade toward a remembered visual target position despite intervening eye movements. Although this has been the subject of extensive experimental investigation, there is still no unifying theoretical framework to explain the neural mechanism for this phenomenon, and how it influences visual signals in the brain. Here, we propose a unified state-space model (SSM) to account for the dynamics of spatial updating during two types of eye movement: saccades and smooth pursuit. Our proposed model is a non-linear SSM and is implemented through a recurrent radial-basis-function neural network in a dual Extended Kalman filter (EKF) structure. The model parameters and internal states (remembered target position) are estimated sequentially using the EKF method. The proposed model replicates two fundamental experimental observations: continuous gaze-centered updating of visual memory-related activity during smooth pursuit, and predictive remapping of visual memory activity before and during saccades. Moreover, our model makes the new prediction that, when uncertainty of input signals is incorporated in the model, neural population activity and receptive fields expand just before and during saccades. These results suggest that visual remapping and motor updating are part of a common visuomotor mechanism, and that subjective perceptual constancy arises in part from training the visual system on motor tasks.
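
    A minimal sketch of gaze-centered spatial updating, assuming a linear Kalman filter as a stand-in for the paper's non-linear dual-EKF/recurrent-network model: the remembered target position (in eye-centered coordinates) is shifted by the eye displacement at every step and corrected whenever a noisy visual re-observation is available.

```python
# Linear Kalman-filter toy model of spatial updating during smooth pursuit.
import numpy as np

rng = np.random.default_rng(2)
dt, q, r = 0.01, 1e-4, 0.25        # time step [s], process noise, observation noise variance
eye_velocity = 5.0                 # smooth pursuit velocity [deg/s]

true_x = 10.0                      # true gaze-centered target position [deg]
x_hat, P = 8.0, 4.0                # remembered position estimate and its variance

for k in range(200):
    # The eye moves, so the gaze-centered target position shifts in the opposite direction.
    true_x -= eye_velocity * dt
    x_hat -= eye_velocity * dt     # predict: internally shift the remembered position
    P += q
    if k % 50 == 0:                # occasional noisy visual re-observation of the target
        z = true_x + rng.normal(0.0, np.sqrt(r))
        K = P / (P + r)            # Kalman gain
        x_hat += K * (z - x_hat)   # correct the remembered position
        P *= (1.0 - K)

print(f"true position: {true_x:.2f} deg, remembered: {x_hat:.2f} deg (variance {P:.3f})")
```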

  8. General Purpose Fortran Program for Discrete-Ordinate-Method Radiative Transfer in Scattering and Emitting Layered Media: An Update of DISORT

    Science.gov (United States)

    Tsay, Si-Chee; Stamnes, Knut; Wiscombe, Warren; Laszlo, Istvan; Einaudi, Franco (Technical Monitor)

    2000-01-01

    This update reports a state-of-the-art discrete ordinate algorithm for monochromatic unpolarized radiative transfer in non-isothermal, vertically inhomogeneous, but horizontally homogeneous media. The physical processes included are Planckian thermal emission, scattering with arbitrary phase function, absorption, and surface bidirectional reflection. The system may be driven by parallel or isotropic diffuse radiation incident at the top boundary, as well as by internal thermal sources and thermal emission from the boundaries. Radiances, fluxes, and mean intensities are returned at user-specified angles and levels. DISORT has enjoyed considerable popularity in the atmospheric science and other communities since its introduction in 1988. Several new DISORT features are described in this update: intensity correction algorithms designed to compensate for the delta-M forward-peak scaling and obtain accurate intensities even in low orders of approximation; a more general surface bidirectional reflection option; and an exponential-linear approximation of the Planck function allowing more accurate solutions in the presence of large temperature gradients. DISORT has been designed to be an exemplar of good scientific software as well as a program of intrinsic utility. An extraordinary effort has been made to make it numerically well-conditioned, error-resistant, and user-friendly, and to take advantage of robust existing software tools. A thorough test suite is provided to verify the program both against published results, and for consistency where there are no published results. This careful attention to software design has been just as important in DISORT's popularity as its powerful algorithmic content.

  9. Numerical Differentiation Methods for Computing Error Covariance Matrices in Item Response Theory Modeling: An Evaluation and a New Proposal

    Science.gov (United States)

    Tian, Wei; Cai, Li; Thissen, David; Xin, Tao

    2013-01-01

    In item response theory (IRT) modeling, the item parameter error covariance matrix plays a critical role in statistical inference procedures. When item parameters are estimated using the EM algorithm, the parameter error covariance matrix is not an automatic by-product of item calibration. Cai proposed the use of Supplemented EM algorithm for…

  10. National Pediatric Program Update

    International Nuclear Information System (INIS)

    2008-01-01

    The book of the National Pediatric Program Update, issued by the Argentine Society of Pediatrics, describes important issues, including: effective treatment of addictions (drugs); defects of the neural tube; and the use of radiation imaging in diagnosis. [es]

  11. Update-in-Place Analysis for True Multidimensional Arrays

    Directory of Open Access Journals (Sweden)

    Steven M. Fitzgerald

    1996-01-01

    Full Text Available Applicative languages have been proposed for defining algorithms for parallel architectures because they are implicitly parallel and lack side effects. However, straightforward implementations of applicative-language compilers may induce large amounts of copying to preserve program semantics. The unnecessary copying of data can increase both the execution time and the memory requirements of an application. To eliminate the unnecessary copying of data, the Sisal compiler uses both build-in-place and update-in-place analyses. These optimizations remove unnecessary array copy operations through compile-time analysis. Both build-in-place and update-in-place are based on hierarchical ragged arrays, i.e., the vector-of-vectors array model. Although this array model is convenient for certain applications, many optimizations are precluded, e.g., vectorization. To compensate for this deficiency, new languages, such as Sisal 2.0, have extended array models that allow for both high-level array operations to be performed and efficient implementations to be devised. In this article, we introduce a new method to perform update-in-place analysis that is applicable to arrays stored either in hierarchical or in contiguous storage. Consequently, the array model that is appropriate for an application can be selected without the loss of performance. Moreover, our analysis is more amenable for distributed memory and large software systems.

  12. Proposed method for assigning metric tons of heavy metal values to defense high-level waste forms to be disposed of in a geologic repository

    International Nuclear Information System (INIS)

    1987-08-01

    A proposed method is described for assigning an equivalent metric ton heavy metal (eMTHM) value to defense high-level waste forms to be disposed of in a geologic repository. This method for establishing a curie equivalency between defense high-level waste and irradiated commercial fuel is based on the ratio of defense fuel exposure to the typical commercial fuel exposure, MWd/MTHM. Application of this technique to defense high-level wastes is described. Additionally, this proposed technique is compared to several alternate calculations for eMTHM. 15 refs., 2 figs., 10 tabs

  13. A Continuously Updated, Global Land Classification Map, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate a fully automatic capability for generating a global, high resolution (30 m) land classification map, with continuous updates from...

  14. Model Updating Nonlinear System Identification Toolbox, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology proposes to develop an enhanced model updating nonlinear system identification (MUNSID) methodology by adopting the flight data with state-of-the-art...

  15. Mammogram segmentation using maximal cell strength updation in cellular automata.

    Science.gov (United States)

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. The mammogram is one of the most effective tools for early detection of breast cancer, and various computer-aided systems have been introduced to detect breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of a mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs adaptive global thresholding based on histogram peak analysis to obtain a rough region of interest. An automatic seed point selection is proposed using a gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated on a dataset of 70 mammograms with masses from the mini-MIAS database. Experimental results show that the proposed approach yields promising results for segmenting the mass region in mammograms, with a sensitivity of 92.25% and an accuracy of 93.48%.
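
    A minimal GrowCut-style cellular-automaton sketch of the seeded segmentation idea, assuming a synthetic image and hand-placed foreground/background seeds; the paper's maximal-cell-strength transition rule, adaptive thresholding and GLCM-based seed selection are replaced here by a simplified strength-competition rule for illustration.

```python
# Seeded segmentation with a strength-competition cellular automaton (GrowCut-like).
import numpy as np

rng = np.random.default_rng(3)
img = rng.normal(0.2, 0.05, (64, 64))
img[24:40, 24:40] = rng.normal(0.8, 0.05, (16, 16))   # bright "mass" region

label = np.zeros(img.shape, dtype=int)      # 0 = unlabelled, 1 = mass, 2 = background
strength = np.zeros(img.shape)
label[32, 32], strength[32, 32] = 1, 1.0    # foreground seed (GLCM-based in the paper)
label[2, 2], strength[2, 2] = 2, 1.0        # background seed

def g(d):
    # Attack force decays with the intensity difference between neighbouring cells.
    return np.exp(-25.0 * d ** 2)

for _ in range(200):                        # iterate the automaton
    new_label, new_strength = label.copy(), strength.copy()
    for di, dj in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        nb_label = np.roll(label, (di, dj), axis=(0, 1))
        nb_strength = np.roll(strength, (di, dj), axis=(0, 1))
        nb_img = np.roll(img, (di, dj), axis=(0, 1))
        attack = g(np.abs(img - nb_img)) * nb_strength
        win = attack > new_strength          # the strongest neighbour conquers the cell
        new_label[win], new_strength[win] = nb_label[win], attack[win]
    label, strength = new_label, new_strength

print("segmented mass pixels:", int((label == 1).sum()))
```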

  16. Multiple computer-based methods of measuring joint space width can discriminate between treatment arms in the COBRA trial -- Update of an ongoing OMERACT project.

    Science.gov (United States)

    Sharp, John T; Angwin, Jane; Boers, Maarten; Duryea, Jeff; Finckh, Axel; Hall, James R; Kauffman, Joost A; Landewé, Robert; Langs, Georg; Lukas, Cédric; Moens, H J Bernelot; Peloschek, Philipp; Strand, C Vibeke; van der Heijde, Désirée

    2009-08-01

    Previously reported data on 5 computer-based programs for measurement of joint space width, focusing on discriminating ability and reproducibility, are updated here with new data. Four of 5 different programs for measuring joint space width were more discriminating than observer scoring for change in narrowing in the 0-12 month interval. Three of 4 programs were more discriminating than observer scoring for the 0-18 month interval. The program that failed to discriminate in the 0-12 month interval was not the same program that failed in the 0-18 month interval. The committee agreed at an interim meeting in November 2007 that an important goal for computer-based measurement programs is a 90% success rate in making measurements of joint pairs in followup studies. This means that the same joint must be measured in images of both timepoints in order to assess change over time in serial radiographs. None of the programs met this 90% threshold, but 3 programs achieved an 85%-90% success rate. Intraclass correlation coefficients for assessing change in joint space width in individual joints were 0.98 or 0.99 for 4 programs. The smallest detectable change was < 0.2 mm for 4 of the 5 programs, representing 29%-36% of the change within the 99th percentile of measurements.

  17. [A study for testing the antifungal susceptibility of yeast by the Japanese Society for Medical Mycology (JSMM) method. The proposal of the modified JSMM method 2009].

    Science.gov (United States)

    Nishiyama, Yayoi; Abe, Michiko; Ikeda, Reiko; Uno, Jun; Oguri, Toyoko; Shibuya, Kazutoshi; Maesaki, Shigefumi; Mohri, Shinobu; Yamada, Tsuyoshi; Ishibashi, Hiroko; Hasumi, Yayoi; Abe, Shigeru

    2010-01-01

    In the Japanese Society for Medical Mycology (JSMM) method used for testing the antifungal susceptibility of yeast, the MIC endpoint for azole antifungal agents is currently set at IC(80). It was recently shown, however, that there is an inconsistency in the MIC value between the JSMM method and the CLSI M27-A2 (CLSI) method, in which the endpoint is read as IC(50). To resolve this discrepancy and reassess the JSMM method, the MICs of three azoles, fluconazole, itraconazole and voriconazole, were compared for 5 strains of each of the following Candida species: C. albicans, C. glabrata, C. tropicalis, C. parapsilosis and C. krusei, for a total of 25 comparisons, using the JSMM method, a modified JSMM method, and the CLSI method. The results showed that when the MIC endpoint criterion of the JSMM method was changed from IC(80) to IC(50) (the modified JSMM method), the MIC values were consistent and compatible with the CLSI method. Finally, it should be emphasized that the JSMM method, using a spectrophotometer for MIC measurement, was superior in both stability and reproducibility compared to the CLSI method, in which growth is assessed by visual observation.

  18. BIPM Time Activities Update

    Science.gov (United States)

    2009-11-01

    VNIIFTRI and the PTB [7]. GPS time transfer represents today about 85% of the time links for TAI; in this technique, we make use of different types...campaign visited the PTB, the VNIIFTRI, and the AOS [8]. Already in 1996, the use of GLONASS in standard CGGTTS Common-View mode was proposed, but...are compared on a regular basis to GPS and TWSTFT methods [16]. Also, with the agreement of the CCTF (2009), the link between the PTB and VNIIFTRI

  19. Proposal of an Appropriate Decalcification Method of Bone Marrow Biopsy Specimens in the Era of Expanding Genetic Molecular Study

    Directory of Open Access Journals (Sweden)

    Sung-Eun Choi

    2015-05-01

    Full Text Available Background: The conventional method for decalcification of bone specimens uses hydrochloric acid (HCl) and is notorious for damaging cellular RNA, DNA, and proteins, thus complicating molecular and immunohistochemical analyses. A method that can effectively decalcify while preserving genetic material is necessary. Methods: Pairs of bilateral bone marrow biopsies sampled from 53 patients were decalcified according to the protocols of two comparison groups: EDTA versus HCl and RDO GOLD (RDO) versus HCl. Pairs of right and left bone marrow biopsy samples harvested from 28 cases were allocated to the EDTA versus HCl comparison group, and 25 cases to the RDO versus HCl comparison group. The decalcification protocols were compared with regard to histomorphology, immunohistochemistry, and molecular analysis. For molecular analysis, we randomly selected 5 cases from the EDTA versus HCl and RDO versus HCl groups. Results: The decalcification time for appropriate histomorphologic analysis was the longest with the EDTA method and the shortest with the RDO method. EDTA was superior to RDO or HCl in DNA yield and integrity, assessed via DNA extraction, polymerase chain reaction, and silver in situ hybridization using DNA probes. The EDTA method maintained intact nuclear protein staining on immunohistochemistry, while the HCl method produced poor quality images; staining after the RDO method gave equivocal results. RNA in situ hybridization using kappa and lambda RNA probes measured RNA integrity; the EDTA and RDO methods had the best quality, followed by HCl. Conclusions: The EDTA protocol would be the best at preserving genetic material. RDO may be an acceptable alternative when rapid decalcification is necessary.

  20. Lagrangian relaxation technique in power systems operation planning: Multipliers updating problem

    Energy Technology Data Exchange (ETDEWEB)

    Ruzic, S. [Electric Power Utility of Serbia, Belgrade (Yugoslavia)

    1995-11-01

    All Lagrangian-relaxation-based approaches to power system operation planning have an important common part: the Lagrangian multiplier correction procedure, which is the subject of this paper. Different approaches presented in the literature are discussed and an original method for updating the Lagrangian multipliers is proposed. The basic idea of this new method is to update the Lagrangian multipliers so as to satisfy the Kuhn-Tucker optimality conditions. Instead of maximizing the dual function, a 'distance of optimality' function is defined and minimized. If the Kuhn-Tucker optimality conditions are satisfied, the value of this function lies in the range (-1,0); otherwise the function takes a large positive value. This method, called 'the distance of optimality method', takes into account future changes in planned generation due to the updating of the Lagrangian multipliers. The influence of changes in the multiplier associated with one system constraint on the satisfaction of other system requirements is also considered. The numerical efficiency of the proposed method is analyzed and compared with results obtained using the sub-gradient technique. 20 refs, 2 tabs
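
    A minimal sketch of the baseline sub-gradient multiplier update that the paper compares against, on a toy economic-dispatch problem with a single relaxed demand constraint; the paper's own 'distance of optimality' update is not reproduced here, and all numbers are illustrative.

```python
# Sub-gradient update of the Lagrangian multiplier for a relaxed demand constraint.
import numpy as np

a = np.array([0.10, 0.05])       # quadratic cost coefficients of two units
b = np.array([2.0, 3.0])         # linear cost coefficients
p_max = np.array([50.0, 80.0])   # unit capacities [MW]
demand = 100.0                   # relaxed system constraint: sum(p) >= demand

lam, step = 0.0, 0.05
for _ in range(100):
    # Each unit solves its own decomposed subproblem: min a*p^2 + b*p - lam*p.
    p = np.clip((lam - b) / (2 * a), 0.0, p_max)
    subgrad = demand - p.sum()                 # violation of the relaxed constraint
    lam = max(0.0, lam + step * subgrad)       # multiplier moves along the sub-gradient

print(f"multiplier: {lam:.2f}, dispatch: {np.round(p, 2)}, unserved: {demand - p.sum():.2f}")
```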

  1. Proposal and Its Evaluation of a Shoulder-Surfing Attack Resistant Authentication Method:Secret Tap with Double Shift

    OpenAIRE

    Yoshihiro Kita; Fumio Sugai; MiRang Park; Naonobu Okazaki

    2015-01-01

    Recently, mobile terminals such as smartphones have come into widespread use. Most of such mobile terminals store several types of important data, such as personal information. Therefore, it is necessary to lock and unlock terminals using a personal authentication method such as personal identification numbers (PINs) in order to prevent data theft. However, most existing authentication methods have a common problem referred to here as “shoulder-surfing”, in which authentication information is...

  2. Testing the ability of a proposed geotechnical based method to evaluate the liquefaction potential analysis subjected to earthquake vibrations

    Science.gov (United States)

    Abbaszadeh Shahri, A.; Behzadafshar, K.; Esfandiyari, B.; Rajablou, R.

    2010-12-01

    During earthquakes a number of earth dams have suffered severe damage or major displacements as a result of liquefaction; modeling with computer codes can therefore provide a reliable tool for predicting the response of a dam foundation to earthquakes. Such models can be used in the design of new dams or in safety assessments of existing ones. In this paper, on the basis of field and laboratory tests and by combining several software packages, a seismic geotechnical analysis procedure is proposed and verified by comparison with computer model tests and field and laboratory experience. Verification and validation of the analyses rely on the ability of the applied computer codes. Using the Silakhor earthquake (2006, Ms 6.1), and in order to check the efficiency of the proposed framework, the procedure is applied to the Korzan earth dam in Hamedan Province, Iran, to analyze and estimate the liquefaction potential and safety factor. The critical point of this study is the design and development by the authors of a computer code named “Abbas Converter”, with a graphical user interface, which operates as a logical connector that computes and models the soil profiles; the results confirm the ability of the generated computer code to evaluate soil behavior under earthquake excitations. The code also facilitates this kind of study more than previous tools have done and overcomes the problems encountered.

  3. Isotope dilution/mass spectrometry of serum cholesterol with [3,4-13C]cholesterol: proposed definitive method

    International Nuclear Information System (INIS)

    Pelletier, O.; Wright, L.A.; Breckenridge, W.C.

    1987-01-01

    We describe a new gas-chromatographic/mass-spectrometric (GC/MS) isotope-dilution method for determination of serum cholesterol. The method has been fully optimized and documented to provide the high accuracy and precision expected for a Definitive Method. In the presence of [3,4-13C]cholesterol, cholesteryl esters in serum are hydrolyzed under optimum conditions and the entire cholesterol pool is extracted and derivatized to silyl ethers. The cholesterol derivatives are resolved from other sterols by gas-liquid chromatography on a fused silica column, and selected ions characteristic of cholesterol and the [3,4-13C]cholesterol are monitored with a GC/MS quadrupole system. We estimated the cholesterol content of samples by bracketing each sample with standards of comparable cholesterol concentration that also contained the [3,4-13C]cholesterol. The procedure was highly reproducible (CV less than 0.5%), better accuracy and precision being obtained with [3,4-13C]cholesterol than with heptadeuterated cholesterol. Mean values per gram of dry serum for one serum pool assayed by this method and that of the National Bureau of Standards differed by 0.5%. We conclude that the method satisfies the criteria for a Definitive Method

  4. Empirical testing of forecast update procedure for seasonal products

    DEFF Research Database (Denmark)

    Wong, Chee Yew; Johansen, John

    2008-01-01

    Updating of forecasts is essential for successful collaborative forecasting, especially for seasonal products. This paper discusses the results of a theoretical simulation and an empirical test of a proposed time-series forecast updating procedure. It involves a two-stage longitudinal case study...... of a toy supply chain. The theoretical simulation involves historical weekly consumer demand data for 122 toy products. The empirical test is then carried out in real-time with 291 toy products. The results show that the proposed forecast updating procedure: 1) reduced forecast errors of the annual...... provided less forecast accuracy improvement and it needed a longer time to achieve relatively acceptable forecast uncertainty....

  5. MERCURY QUANTIFICATION IN SOILS USING THERMAL DESORPTION AND ATOMIC ABSORPTION SPECTROMETRY: PROPOSAL FOR AN ALTERNATIVE METHOD OF ANALYSIS

    Directory of Open Access Journals (Sweden)

    Liliane Catone Soares

    2015-08-01

    Full Text Available Despite the considerable environmental importance of mercury (Hg), given its high toxicity and ability to contaminate large areas via atmospheric deposition, little is known about its activity in soils, especially tropical soils, in comparison with other heavy metals. This lack of information about Hg arises because analytical methods for determination of Hg are more laborious and expensive compared to methods for other heavy metals. The situation is even more precarious regarding speciation of Hg in soils since sequential extraction methods are also inefficient for this metal. The aim of this paper is to present a technique of thermal desorption associated with atomic absorption spectrometry, TDAAS, as an efficient tool for quantitative determination of Hg in soils. The method consists of the release of Hg by heating, followed by its quantification by atomic absorption spectrometry. It was developed by constructing calibration curves in different soil samples based on increasing volumes of standard Hg2+ solutions. Performance, accuracy, precision, and quantification and detection limit parameters were evaluated. No matrix interference was detected. Certified reference samples and comparison with a Direct Mercury Analyzer, DMA (another highly recognized technique), were used in validation of the method, which proved to be accurate and precise.

  6. Updating of a dynamic finite element model from the Hualien scale model reactor building

    International Nuclear Information System (INIS)

    Billet, L.; Moine, P.; Lebailly, P.

    1996-08-01

    The forces occurring at the soil-structure interface of a building generally have a large influence on the way the building reacts to an earthquake. One can be tempted to characterise these forces more accurately by updating a model of the structure. However, this procedure requires an updating method suitable for dissipative models, since significant damping can be observed at the soil-structure interface of buildings. Such a method is presented here. It is based on the minimization of a mechanical energy built from the difference between eigendata calculated by the model and eigendata obtained from experimental tests on the real structure. An experimental validation of this method is then proposed on a model of the HUALIEN scale-model reactor building. This scale model, built on the HUALIEN site in TAIWAN, is devoted to the study of soil-structure interaction. The updating concerned the soil impedances, modelled by a layer of springs and viscous dampers attached to the building foundation. A good agreement was found between the eigenmodes and dynamic responses calculated by the updated model and the corresponding experimental data. (authors). 12 refs., 3 figs., 4 tabs

  7. Proposed method for agglutinating antibody titer analysis and its use as indicator of acquired immunity in pacu, Piaractus mesopotamicus

    Directory of Open Access Journals (Sweden)

    JD Biller-Takahashi

    Full Text Available Antibody response can be assessed by the agglutinating antibody titer, which is a quantitative measure of circulating antibodies in serum from previously immunized fish. Antibody evaluation has been performed with different fish species and is considered a reliable method that can be applied to confirm several hypotheses regarding acquired immunity, even in conjunction with precise methods to describe immune mechanisms. In order to provide appropriate analytical methods for future studies on the specific immune system of native fish, the present study standardized an assay to measure the serum agglutinating antibody titer produced after immunization with inactivated A. hydrophila and levamisole administration in pacu. It was possible to determine the agglutinating antibody titer satisfactorily in pacu immunized with inactivated A. hydrophila, and the highest titers were observed in fish fed with levamisole.

  8. Sequential updating of a new dynamic pharmacokinetic model for caffeine in premature neonates.

    Science.gov (United States)

    Micallef, Sandrine; Amzal, Billy; Bach, Véronique; Chardon, Karen; Tourneux, Pierre; Bois, Frédéric Y

    2007-01-01

    Caffeine treatment is widely used in nursing care to reduce the risk of apnoea in premature neonates. To check the therapeutic efficacy of the treatment against apnoea, caffeine concentration in blood is an important indicator. The present study was aimed at building a pharmacokinetic model as a basis for a medical decision support tool. In the proposed model, time dependence of physiological parameters is introduced to describe the rapid growth of neonates. To take into account the large variability in the population, the pharmacokinetic model is embedded in a population structure. The whole model is inferred within a Bayesian framework. To update caffeine concentration predictions as data of an incoming patient are collected, we propose a fast method that can be used in a medical context. This involves the sequential updating of model parameters (at individual and population levels) via a stochastic particle algorithm. Our model provides better predictions than the ones obtained with models previously published. We show, through an example, that sequential updating improves predictions of caffeine concentration in blood (reduced bias and shorter credibility intervals). The update of the pharmacokinetic model using body mass and caffeine concentration data is studied. It shows how informative caffeine concentration data are in contrast to body mass data. This study provides the methodological basis to predict caffeine concentration in blood after a given treatment, if data are collected on the treated neonate.
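
    A minimal sketch of sequential particle updating of an individual's parameters, assuming a simplified one-compartment caffeine model (fixed volume V, unknown clearance CL) in place of the paper's growth-dependent population model: each new concentration measurement re-weights and resamples the particle cloud, sharpening the individual prediction.

```python
# Sequential particle updating of clearance from sparse concentration measurements.
import numpy as np

rng = np.random.default_rng(4)
V, dose, sigma = 0.8, 10.0, 1.0                  # L/kg, mg/kg, measurement sd (illustrative)
true_cl = 0.012                                  # L/kg/h, "true" clearance of this neonate

def concentration(cl, t):
    return dose / V * np.exp(-cl / V * t)        # one-compartment bolus kinetics

n = 5000
cl = np.exp(rng.normal(np.log(0.015), 0.4, n))   # prior particle cloud on clearance
w = np.full(n, 1.0 / n)

for t_obs in [24.0, 48.0, 72.0]:                 # hours after the dose
    y = concentration(true_cl, t_obs) + rng.normal(0.0, sigma)   # synthetic measurement
    w *= np.exp(-0.5 * ((y - concentration(cl, t_obs)) / sigma) ** 2)
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)             # multinomial resampling
    cl = cl[idx] * np.exp(rng.normal(0.0, 0.02, n))  # small jitter against degeneracy
    w = np.full(n, 1.0 / n)
    pred = concentration(cl, t_obs + 24.0)
    print(f"after t={t_obs:4.0f} h: predicted conc. at +24 h = "
          f"{pred.mean():.2f} +/- {pred.std():.2f} mg/L")
```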

  9. A novel method of TVTS in the TS-3 device and the proposal of its application to a large device

    International Nuclear Information System (INIS)

    Tokimatsu, K.; Hayashi, N.; Ueda, Y.; Ono, Y.; Katsurai, M.

    1997-01-01

    A novel method of television Thomson scattering (TVTS) has been developed on the TS-3 device. In this system, a framing camera is located between a spectroscope and a charge-coupled device (CCD) camera. The framing camera can take two successive frames from the exit of the spectroscope, with a time interval of 500 ns between the two frames. Because of the shortness of this time interval, the background light is negligible; moreover, TVTS is applicable to the TS-3 device, whose plasma lifetime is about 150 μs. This method indicates the possibility of not only high spatial resolution but also time repetition in a simple system. (orig.)

  10. Study of a proposed method of uranium concentration determination using low-energy γ-ray spectroscopy

    International Nuclear Information System (INIS)

    Rossiter, K.G.; Tang, J.C.N.

    1980-01-01

    The problems associated with in-situ uranium assaying are discussed, especially in relation to the secular disequilibrium between the parent uranium and its radioactive daughters. A detailed study of the gamma-spectra of some natural uranium bearing ore and mineral samples was performed using a high resolution Ge(Li) detector. A method of spectroscopic analysis of the low energy gamma-rays of U-238 and its daughter Th-234, using a proportional counter and a series of Ross filters, was found to be feasible. The application of such a method to uranium assaying in natural ore bodies is discussed

  11. Update History of This Database - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods | LSDB Archive [Life Science Database Archive metadata

    Lifescience Database Archive (English)

    Update history of this database (LSDB Archive): 2012/08/08 - PGDBj Registered plant list, Marker list, QTL list, Plant DB link & Genome analysis methods is opened; the English archive site is also opened.

  12. PROPOSED ASTM METHOD FOR THE DETERMINATION OF ASBESTOS IN AIR BY TEM AND INFORMATION ON INTERFERING FIBERS

    Science.gov (United States)

    The draft of the ASTM Test Method for air entitled: "Airborne Asbestos Concentration in Ambient and Indoor Atmospheres as Determined by Transmission Electron Microscopy Direct Transfer (TEM)" (ASTM Z7077Z) is an adaptation of the International Standard, ISO 10312. It is currently...

  13. A proposed method to assess the damage risk of future climate change to museum objects in historic buildings

    NARCIS (Netherlands)

    Huijbregts, Z.; Kramer, R.P.; Martens, M.H.J.; Schijndel, van A.W.M.; Schellen, H.L.

    2012-01-01

    Future climate change is expected to have a critical effect on valuable museum collections that are housed in historic buildings. Changes of the indoor environment in the building affect the microclimate around the museum objects and may cause damage to the collection. In this study, a method is

  14. Scientific Opinion on a composting method proposed by Portugal as a heat treatment to eliminate pine wood nematode from the bark of pine trees

    DEFF Research Database (Denmark)

    Baker, R.; Candresse, T.; Dormannsné Simon, E.

    2010-01-01

    Following a request from the European Commission, the Panel on Plant Health was asked to deliver a scientific opinion on the appropriateness of a composting method proposed by Portugal as a heat treatment to eliminate pine wood nematode (PWN), Bursaphelenchus xylophilus (Steiner and Buhrer) Nickle......) insufficient evidence on the sampling methodology is provided to determine the reliability of the testing method provided by the Portuguese document to determine freedom from PWN. Although there is potential for development of a composting method as a heat treatment to eliminate PWN from bark of pine trees...

  15. DUAL STATE-PARAMETER UPDATING SCHEME ON A CONCEPTUAL HYDROLOGIC MODEL USING SEQUENTIAL MONTE CARLO FILTERS

    Science.gov (United States)

    Noh, Seong Jin; Tachikawa, Yasuto; Shiiba, Michiharu; Kim, Sunmin

    Data assimilation techniques have been widely used to improve the predictability of hydrologic modeling. Among various data assimilation techniques, sequential Monte Carlo (SMC) filters, known as "particle filters", provide the capability to handle non-linear and non-Gaussian state-space models. This paper proposes a dual state-parameter updating scheme (DUS) based on SMC methods to estimate both state and parameter variables of a hydrologic model. We introduce a kernel smoothing method for the robust estimation of uncertain model parameters in the DUS. The applicability of the dual updating scheme is illustrated using the implementation of the storage function model on a middle-sized Japanese catchment. We also compare performance results of DUS combined with various SMC methods, such as SIR, ASIR and RPF.
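
    A minimal sketch of a dual state-parameter particle update with kernel smoothing, assuming a linear-reservoir stand-in (dS/dt = r - kS, q = kS) for the storage function model: both the storage state S and the parameter k are carried by the particles, and k is perturbed with a shrinkage kernel at every assimilation step.

```python
# Dual state-parameter particle filter with Liu-West-style kernel smoothing of k.
import numpy as np

rng = np.random.default_rng(5)
n, dt, sigma_q = 2000, 1.0, 0.2
true_k, rain = 0.05, 2.0
S_true = 10.0

S = rng.uniform(5.0, 15.0, n)                    # state particles (storage)
k = rng.uniform(0.01, 0.10, n)                   # parameter particles (recession coeff.)
a = 0.98                                         # shrinkage factor of the kernel

for step in range(30):
    S_true += dt * (rain - true_k * S_true)
    q_obs = true_k * S_true + rng.normal(0.0, sigma_q)   # synthetic discharge observation

    # Kernel smoothing of the parameter particles: shrink toward the mean, then jitter.
    k = a * k + (1 - a) * k.mean() + rng.normal(0.0, np.sqrt(1 - a**2) * k.std(), n)
    S = S + dt * (rain - k * S) + rng.normal(0.0, 0.1, n)  # propagate state particles

    w = np.exp(-0.5 * ((q_obs - k * S) / sigma_q) ** 2)
    w /= w.sum()
    idx = rng.choice(n, size=n, p=w)             # resample states and parameters jointly
    S, k = S[idx], k[idx]

print(f"estimated k = {k.mean():.4f} (true {true_k}), storage = {S.mean():.2f}")
```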

  16. Estimation of beam material random field properties via sensitivity-based model updating using experimental frequency response functions

    Science.gov (United States)

    Machado, M. R.; Adhikari, S.; Dos Santos, J. M. C.; Arruda, J. R. F.

    2018-03-01

    Structural parameter estimation is affected not only by measurement noise but also by unknown uncertainties which are present in the system. Deterministic structural model updating methods minimise the difference between experimentally measured data and computational prediction. Sensitivity-based methods are very efficient in solving structural model updating problems. Material and geometrical parameters of the structure such as Poisson's ratio, Young's modulus, mass density, modal damping, etc. are usually considered deterministic and homogeneous. In this paper, the distributed and non-homogeneous characteristics of these parameters are considered in the model updating. The parameters are taken as spatially correlated random fields and are expanded in a spectral Karhunen-Loève (KL) decomposition. Using the KL expansion, the spectral dynamic stiffness matrix of the beam is expanded as a series in terms of discretized parameters, which can be estimated using sensitivity-based model updating techniques. Numerical and experimental tests involving a beam with distributed bending rigidity and mass density are used to verify the proposed method. This extension of standard model updating procedures can enhance the dynamic description of structural dynamic models.
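
    A minimal sketch of a truncated Karhunen-Loève expansion for a spatially correlated bending-rigidity field along a beam, assuming an exponential covariance kernel and illustrative values; in the updating procedure described above, the KL coefficients would be the parameters estimated from measured frequency response functions.

```python
# Discrete Karhunen-Loeve expansion of a random bending-rigidity field EI(x).
import numpy as np

L_b, n_pts, corr_len, sigma = 1.0, 200, 0.3, 0.1     # beam length, grid, correlation length, CoV
x = np.linspace(0.0, L_b, n_pts)
C = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)   # covariance matrix

eigval, eigvec = np.linalg.eigh(C)                   # spectral decomposition
order = np.argsort(eigval)[::-1]
eigval, eigvec = eigval[order], eigvec[:, order]

n_kl = 8                                             # truncation order
rng = np.random.default_rng(6)
xi = rng.standard_normal(n_kl)                       # KL coefficients (to be estimated/updated)

EI_mean = 2.0e4                                      # nominal bending rigidity [N m^2]
field = 1.0 + eigvec[:, :n_kl] @ (np.sqrt(eigval[:n_kl]) * xi)
EI = EI_mean * field                                 # one realisation of EI(x)

print(f"variance captured by {n_kl} modes: {eigval[:n_kl].sum() / eigval.sum():.3f}")
print(f"EI range: {EI.min():.1f} - {EI.max():.1f} N m^2")
```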

  17. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    Kirwan, John R; Boers, Maarten; Hewlett, Sarah

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides guidelines for the development and validation of outcome measures for use in clinical research. The "Truth" section of the OMERACT Filter presupposes an explicit framework for identifying the relevant core outcomes...... for defining core areas of measurement ("Filter 2.0 Core Areas of Measurement") was presented at OMERACT 11 to explore areas of consensus and to consider whether already endorsed core outcome sets fit into this newly proposed framework. METHODS: Discussion groups critically reviewed the extent to which case......, presentation, and clarity of the framework were questioned. The discussion groups and subsequent feedback highlighted 20 such issues. CONCLUSION: These issues will require resolution to reach consensus on accepting the proposed Filter 2.0 framework of Core Areas as the basis for the selection of Core Outcome...

  18. Effect of asynchronous updating on the stability of cellular automata

    International Nuclear Information System (INIS)

    Baetens, J.M.; Van der Weeën, P.; De Baets, B.

    2012-01-01

    Highlights: ► An upper bound on the Lyapunov exponent of asynchronously updated CA is established. ► The employed update method has repercussions on the stability of CAs. ► A decision on the employed update method should be taken with care. ► Substantial discrepancies arise between synchronously and asynchronously updated CA. ► Discrepancies between different asynchronous update schemes are less pronounced. - Abstract: Although cellular automata (CAs) were conceptualized as utterly discrete mathematical models in which the states of all their spatial entities are updated simultaneously at every consecutive time step, i.e. synchronously, various CA-based models that rely on so-called asynchronous update methods have been constructed in order to overcome the limitations that are tied up with the classical way of evolving CAs. So far, only a few researchers have addressed the consequences of this way of updating on the evolved spatio-temporal patterns, and the reachable stationary states. In this paper, we exploit Lyapunov exponents to determine to what extent the stability of the rules within a family of totalistic CAs is affected by the underlying update method. For that purpose, we derive an upper bound on the maximum Lyapunov exponent of asynchronously iterated CAs, and show its validity, after which we present a comparative study between the Lyapunov exponents obtained for five different update methods, namely one synchronous method and four well-established asynchronous methods. It is found that the stability of CAs is seriously affected if one of the latter methods is employed, whereas the discrepancies arising between the different asynchronous methods are far less pronounced and, finally, we discuss the repercussions of our findings on the development of CA-based models.
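
    A minimal sketch contrasting synchronous and random-order asynchronous updating of a one-dimensional two-state totalistic CA; the damage spreading between two configurations that differ in a single cell is used here as a crude, illustrative stand-in for the Lyapunov-exponent analysis described above, and the rule and parameters are assumptions.

```python
# Damage spreading under synchronous vs. asynchronous (random-order) CA updating.
import numpy as np

N, T = 200, 100

def rule(left, centre, right):
    # Two-state totalistic rule: a cell becomes 1 iff exactly two of the three cells are 1.
    return ((left + centre + right) == 2).astype(int)

def evolve(state, asynchronous, seed=0):
    rng = np.random.default_rng(seed)
    s = state.copy()
    for _ in range(T):
        if asynchronous:
            for i in rng.permutation(N):               # random-order sequential update
                s[i] = rule(s[(i - 1) % N], s[i], s[(i + 1) % N])
        else:
            s = rule(np.roll(s, 1), s, np.roll(s, -1)) # classical synchronous update
    return s

init = np.random.default_rng(7).integers(0, 2, N)
perturbed = init.copy()
perturbed[N // 2] ^= 1                                 # flip one cell

for asynchronous in (False, True):
    a = evolve(init, asynchronous, seed=1)
    b = evolve(perturbed, asynchronous, seed=1)        # identical update order for both runs
    mode = "asynchronous" if asynchronous else "synchronous"
    print(f"{mode:12s} final damage: {int(np.abs(a - b).sum())}")
```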

  19. Withdrawal of corticosteroids in inflammatory bowel disease patients after dependency periods ranging from 2 to 45 years: a proposed method.

    LENUS (Irish Health Repository)

    Murphy, S J

    2012-02-01

    BACKGROUND: Even in the biologic era, corticosteroid dependency in IBD patients is common and causes a lot of morbidity, but methods of withdrawal are not well described. AIM: To assess the effectiveness of a corticosteroid withdrawal method. METHODS: Twelve patients (10 men, 2 women; 6 ulcerative colitis, 6 Crohn's disease), median age 53.5 years (range 29-75) were included. IBD patients with quiescent disease refractory to conventional weaning were transitioned to oral dexamethasone, educated about symptoms of the corticosteroid withdrawal syndrome (CWS) and weaned under the supervision of an endocrinologist. When patients failed to wean despite a slow weaning pace and their IBD remaining quiescent, low dose synthetic ACTH stimulation testing was performed to assess for adrenal insufficiency. Multivariate analysis was performed to assess predictors of a slow wean. RESULTS: Median durations for disease and corticosteroid dependency were 21 (range 3-45) and 14 (range 2-45) years respectively. Ten patients (83%) were successfully weaned after a median follow-up from final wean of 38 months (range 5-73). Disease flares occurred in two patients, CWS in five and ACTH testing was performed in 10. Multivariate analysis showed that longer duration of corticosteroid use appeared to be associated with a slower wean (P = 0.056). CONCLUSIONS: Corticosteroid withdrawal using this protocol had a high success rate and durable effect and was effective in patients with long-standing (up to 45 years) dependency. As symptoms of CWS mimic symptoms of IBD disease flares, gastroenterologists may have difficulty distinguishing them, which may be a contributory factor to the frequency of corticosteroid dependency in IBD patients.

  20. Number needed to benefit from information (NNBI): proposal from a mixed methods research study with practicing family physicians.

    Science.gov (United States)

    Pluye, Pierre; Grad, Roland M; Johnson-Lafleur, Janique; Granikov, Vera; Shulha, Michael; Marlow, Bernard; Ricarte, Ivan Luiz Marques

    2013-01-01

    We wanted to describe family physicians' use of information from an electronic knowledge resource for answering clinical questions, and their perception of subsequent patient health outcomes; and to estimate the number needed to benefit from information (NNBI), defined as the number of patients for whom clinical information was retrieved for 1 to benefit. We undertook a mixed methods research study, combining quantitative longitudinal and qualitative research studies. Participants were 41 family physicians from primary care clinics across Canada. Physicians were given access to 1 electronic knowledge resource on handheld computer in 2008-2009. For the outcome assessment, participants rated their searches using a validated method. Rated searches were examined during interviews guided by log reports that included ratings. Cases were defined as clearly described searches where clinical information was used for a specific patient. For each case, interviewees described information-related patient health outcomes. For the mixed methods data analysis, quantitative and qualitative data were merged into clinical vignettes (each vignette describing a case). We then estimated the NNBI. In 715 of 1,193 searches for information conducted during an average of 86 days, the search objective was directly linked to a patient. Of those searches, 188 were considered to be cases. In 53 cases, participants associated the use of information with at least 1 patient health benefit. This finding suggested an NNBI of 14 (715/53). The NNBI may be used in further experimental research to compare electronic knowledge resources. A low NNBI can encourage clinicians to search for information more frequently. If all searches had benefits, the NNBI would be 1. In addition to patient benefits, learning and knowledge reinforcement outcomes are frequently reported.
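
    A small worked example of the NNBI arithmetic reported above (715 patient-linked searches, 53 with at least one reported health benefit); as with number-needed-to-treat, the ratio is conventionally rounded up to the next whole patient, which gives the reported value of 14.

```python
# NNBI = patient-linked searches / searches with a reported benefit, rounded up.
import math

patient_linked_searches = 715
searches_with_benefit = 53
nnbi = math.ceil(patient_linked_searches / searches_with_benefit)
print("NNBI =", nnbi)   # -> 14
```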

  1. Sugammadex: An Update

    Directory of Open Access Journals (Sweden)

    Ezri Tiberiu

    2016-01-01

    Full Text Available The purpose of this update is to provide recent knowledge and debates regarding the use of sugammadex in the fields of anesthesia and critical care. The review is not intended to provide a comprehensive description of sugammadex and its clinical use.

  2. Supreme Court Update

    Science.gov (United States)

    Taylor, Kelley R.

    2009-01-01

    "Chief Justice Flubs Oath." "Justice Ginsburg Has Cancer Surgery." At the start of this year, those were the news headlines about the U.S. Supreme Court. But January 2009 also brought news about key education cases--one resolved and two others on the docket--of which school administrators should take particular note. The Supreme Court updates on…

  3. Update of telephone exchange

    CERN Multimedia

    2006-01-01

    As part of the upgrade of telephone services, the CERN switching centre will be updated on Monday 3 July between 8.00 p.m. and 3.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service

  4. Update of telephone exchange

    CERN Multimedia

    2006-01-01

    As part of the upgrade of telephone services, the CERN switching centre will be updated on Wednesday 14 June between 8.00 p.m. and midnight. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service

  5. Update of telephone exchange

    CERN Multimedia

    2006-01-01

    As part of the upgrade of telephone services, the CERN switching centre will be updated between Monday 23 October, 8.00 p.m., and Tuesday 24 October, 2.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service

  6. Update of telephone exchange

    CERN Multimedia

    2006-01-01

    As part of the upgrade of telephone services, the CERN switching centre will be updated on Monday 3 July between 8.00 p.m. and 3.00 a.m. Telephone services may be disrupted and possibly even interrupted during this operation. We apologise in advance for any inconvenience this may cause. CERN TELECOM Service

  7. [Cardiology update in 2016].

    Science.gov (United States)

    Gabus, Vincent; Tran, Van Nam; Regamey, Julien; Pascale, Patrizio; Monney, Pierre; Hullin, Roger; Vogt, Pierre

    2017-01-11

    In 2016 the European Society of Cardiology (ESC) published new guidelines. These documents update the knowledge in various fields such as atrial fibrillation, heart failure, cardiovascular prevention and dyslipidemia. Of course it is impossible to summarize these guidelines in detail. Nevertheless, we decided to highlight the major modifications, and to emphasize some key points that are especially useful for the primary care physician.

  8. OSATE Overview & Community Updates

    Science.gov (United States)

    2015-02-15

    Author: Julien Delange. Presentation topics include OSATE main language capabilities; modeling patterns and model samples for beginners; Error-Model (EMV2) examples and model constructs; and a demonstration of tools. (Standard report documentation form fields have been omitted.)

  9. Proposals for the Operationalisation of the Discourse Theory of Laclau and Mouffe Using a Triangulation of Lexicometrical and Interpretative Methods

    Directory of Open Access Journals (Sweden)

    Georg Glasze

    2007-05-01

    Full Text Available The discourse theory of Ernesto LACLAU and Chantal MOUFFE brings together three elements: the FOUCAULTian notion of discourse, the (post-)MARXist notion of hegemony, and the poststructuralist writings of Jacques DERRIDA and Roland BARTHES. Discourses are regarded as temporary fixations of differential relations. Meaning, i.e. any social "objectivity", is conceptualised as an effect of such a fixation. The discussion on an appropriate operationalisation of such a discourse theory is just beginning. In this paper, it is argued that a triangulation of two linguistic methods is appropriate to reveal temporary fixations: by means of corpus-driven lexicometric procedures as well as by the analysis of narrative patterns, the regularities of the linkage of elements can be analysed (for example, in diachronic comparisons). The example of a geographic research project shows how, in so doing, the historically contingent constitution of an international community and "world region" can be analysed. URN: urn:nbn:de:0114-fqs0702143

  10. Model parameter updating using Bayesian networks

    International Nuclear Information System (INIS)

    Treml, C.A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.
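    A much-simplified sketch of the idea of adapting conditional probabilities from repeated black-box runs and then inferring the parameter bin that best reproduces the experimentally derived reference output. It is a two-node discrete stand-in, not the paper's MRAC framework; black_box_model, the bin edges and the reference value are all hypothetical, and only upper and lower parameter bounds are assumed, as in the description above.

        import numpy as np

        # Hypothetical stand-in for the black-box simulation: one updateable parameter,
        # one scalar output feature of interest (the paper's application is a resonance
        # control cooling system model, which is not shown here).
        def black_box_model(theta, rng):
            return 2.0 * theta + rng.normal(0.0, 0.05)

        rng = np.random.default_rng(0)
        param_bins = np.linspace(0.0, 1.0, 11)        # only lower/upper bounds on the parameter are assumed
        output_bins = np.linspace(-0.2, 2.2, 13)
        counts = np.ones((len(param_bins) - 1, len(output_bins) - 1))   # Laplace-smoothed counts
        reference_output = 1.2                        # feature value taken from the "model reference"

        for _ in range(500):                          # repeated model executions
            theta = rng.uniform(param_bins[0], param_bins[-1])
            y = black_box_model(theta, rng)
            i = np.clip(np.digitize(theta, param_bins) - 1, 0, counts.shape[0] - 1)
            j = np.clip(np.digitize(y, output_bins) - 1, 0, counts.shape[1] - 1)
            counts[i, j] += 1.0                       # adapt conditional probabilities from the input/output pair

        cpt = counts / counts.sum(axis=1, keepdims=True)          # P(output bin | parameter bin)
        j_ref = np.clip(np.digitize(reference_output, output_bins) - 1, 0, counts.shape[1] - 1)
        posterior = cpt[:, j_ref] / cpt[:, j_ref].sum()            # uniform prior over parameter bins
        best = int(np.argmax(posterior))
        print("inferred parameter interval:", (param_bins[best], param_bins[best + 1]))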

  11. Clays as green catalysts in the cholesterol esterification: spectroscopic characterization and polymorphs identification by thermal analysis methods. An interdisciplinary laboratorial proposal for the undergraduate level

    International Nuclear Information System (INIS)

    Maria, Teresa M R.; Nunes, Rui M. D.; Pereira, Mariette M.; Eusebio, M. Ermelinda S.

    2009-01-01

    A laboratory experiment that enables the professor to introduce the topic of sustainable development in pharmaceutical chemistry to undergraduate students is proposed, using a simple synthetic procedure. Cholesteryl acetate is prepared by the esterification of cholesterol using Montmorillonite K10 as a heterogeneous catalyst. Cholesterol and cholesteryl acetate are characterized by spectroscopic (1H NMR, 13C NMR, FTIR) and thermal analysis techniques. The thermal methods are used to introduce the concepts of polymorphism and the nature of mesophases. (author)

  12. Optimal updating magnitude in adaptive flat-distribution sampling.

    Science.gov (United States)

    Zhang, Cheng; Drake, Justin A; Ma, Jianpeng; Pettitt, B Montgomery

    2017-11-07

    We present a study on the optimization of the updating magnitude for a class of free energy methods based on flat-distribution sampling, including the Wang-Landau (WL) algorithm and metadynamics. These methods rely on adaptive construction of a bias potential that offsets the potential of mean force by histogram-based updates. The convergence of the bias potential can be improved by decreasing the updating magnitude with an optimal schedule. We show that while the asymptotically optimal schedule for the single-bin updating scheme (commonly used in the WL algorithm) is given by the known inverse-time formula, that for the Gaussian updating scheme (commonly used in metadynamics) is often more complex. We further show that the single-bin updating scheme is optimal for very long simulations, and it can be generalized to a class of bandpass updating schemes that are similarly optimal. These bandpass updating schemes target only a few long-range distribution modes and their optimal schedule is also given by the inverse-time formula. Constructed from orthogonal polynomials, the bandpass updating schemes generalize the WL and Langfeld-Lucini-Rago algorithms as an automatic parameter tuning scheme for umbrella sampling.
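    A toy sketch of single-bin (Wang-Landau style) flat-distribution sampling with the inverse-time updating magnitude discussed above, applied to a system whose density of states is known exactly (binomial coefficients) so the converged bias can be checked. The spin system, step count and schedule constant are illustrative assumptions; the Gaussian and bandpass updating schemes analysed in the paper are not shown.

        import math, random

        random.seed(1)
        N = 20                        # toy system: N independent binary spins
        spins = [0] * N
        k = 0                         # "energy" = number of up spins; exact density of states is C(N, k)
        S = [0.0] * (N + 1)           # bias potential = running estimate of the log density of states
        n_bins = N + 1

        for t in range(1, 2_000_001):
            i = random.randrange(N)                     # propose flipping one spin
            k_new = k + (1 if spins[i] == 0 else -1)
            if math.log(random.random()) < S[k] - S[k_new]:
                spins[i] ^= 1
                k = k_new
            f = min(1.0, n_bins / t)                    # inverse-time updating magnitude
            S[k] += f                                   # single-bin update of the visited bin

        exact = [math.lgamma(N + 1) - math.lgamma(j + 1) - math.lgamma(N - j + 1) for j in range(N + 1)]
        errors = [(S[j] - S[0]) - (exact[j] - exact[0]) for j in range(N + 1)]
        print("max |error| in the recovered log density of states:", round(max(abs(e) for e in errors), 3))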

  13. Settlement behavior of the container for high-level nuclear waste disposal. Centrifuge model tests and proposal for simple evaluation method for settlement behavior

    International Nuclear Information System (INIS)

    Nakamura, Kunihiko; Tanaka, Yukihisa

    2004-01-01

    In Japan, bentonite will be used as a buffer material in high-level nuclear waste disposal. If the buffer material softens as groundwater of varying composition infiltrates it and the container sinks deeply, the resulting decrease in buffer thickness may compromise the required functions. It is therefore very important to consider the settlement of the container. In this study, the influences of distilled water and artificial seawater on the settlement of the container were investigated and a simple evaluation method for the settlement was proposed. The following findings were obtained. (1) Under distilled water, the amount of settlement decreases exponentially as dry density increases. (2) While the amount of settlement under 10% artificial seawater was almost equal to that in distilled water, the container floated under 100% artificial seawater. (3) A simple evaluation method for the settlement of the container was proposed based on the diffuse double layer theory, and the effectiveness of the proposed method was demonstrated by the results of several experiments. (author)

  14. Relationship between gastroesophageal reflux disease and Ph nose and salivary: proposal of a simple method outpatient in patients adults.

    Science.gov (United States)

    Caruso, Arturo Armone; Del Prete, Salvatore; Ferrara, Lydia; Serra, Raffaele; Telesca, Donato Alessandro; Ruggiero, Simona; Russo, Teresa; Sivero, Luigi

    2016-01-01

    The frequency of gastroesophageal reflux disease (GERD) is increasing, partly because inspection of the upper digestive tract has become easier, but especially because of a real spread of the disease as a consequence of modern lifestyles, poor dietary habits, and stress arising from social pressures. It is a common chronic gastrointestinal disorder in Europe and the United States. The aim of our study was to highlight a relationship between gastroesophageal reflux disease and salivary pH as evidenced by indicator strips, especially in the outpatient setting. Twenty adult subjects (10 males and 10 females) aged between 18 and 50 years (GROUP A) were selected. As a control, a homogeneous group of 20 patients without GERD or any type of allergy (GROUP B) was enrolled. The method provided excellent results, showing no difference in the measured values compared with traditional instrumental measurement. Our study allowed us to observe a strong correlation between salivary pH, the pH of the nasal cavities and the interaction between the two regions, and it could form the basis for a diagnosis of GERD, especially in primary health care clinics and in the initial stage of the disease.

  15. Trends and regional variations in provision of contraception methods in a commercially insured population in the United States based on nationally proposed measures.

    Science.gov (United States)

    Law, A; Yu, J S; Wang, W; Lin, J; Lynen, R

    2017-09-01

    Three measures to assess the provision of effective contraception methods among reproductive-aged women have recently been endorsed for national public reporting. Based on these measures, this study examined real-world trends and regional variations of contraceptive provision in a commercially insured population in the United States. Women 15-44years old with continuous enrollment in each year from 2005 to 2014 were identified from a commercial claims database. In accordance with the proposed measures, percentages of women (a) provided most effective or moderately effective (MEME) methods of contraception and (b) provided a long-acting reversible contraceptive (LARC) method were calculated in two populations: women at risk for unintended pregnancy and women who had a live birth within 3 and 60days of delivery. During the 10-year period, the percentages of women at risk for unintended pregnancy provided MEME contraceptive methods increased among 15-20-year-olds (24.5%-35.9%) and 21-44-year-olds (26.2%-31.5%), and those provided a LARC method also increased among 15-20-year-olds (0.1%-2.4%) and 21-44-year-olds (0.8%-3.9%). Provision of LARC methods increased most in the North Central and West among both age groups of women. Provision of MEME contraceptives and LARC methods to women who had a live birth within 60days postpartum also increased across age groups and regions. This assessment indicates an overall trend of increasing provision of MEME contraceptive methods in the commercial sector, albeit with age group and regional variations. If implemented, these proposed measures may have impacts on health plan contraceptive access policy. Copyright © 2017 Elsevier Inc. All rights reserved.

  16. BN-600 MOX Core Benchmark Analysis. Results from Phases 4 and 6 of a Coordinated Research Project on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects

    International Nuclear Information System (INIS)

    2013-12-01

    For those Member States that have or have had significant fast reactor development programmes, it is of utmost importance that they have validated up to date codes and methods for fast reactor physics analysis in support of R and D and core design activities in the area of actinide utilization and incineration. In particular, some Member States have recently focused on fast reactor systems for minor actinide transmutation and on cores optimized for consuming rather than breeding plutonium; the physics of the breeder reactor cycle having already been widely investigated. Plutonium burning systems may have an important role in managing plutonium stocks until the time when major programmes of self-sufficient fast breeder reactors are established. For assessing the safety of these systems, it is important to determine the prediction accuracy of transient simulations and their associated reactivity coefficients. In response to Member States' expressed interest, the IAEA sponsored a coordinated research project (CRP) on Updated Codes and Methods to Reduce the Calculational Uncertainties of the LMFR Reactivity Effects. The CRP started in November 1999 and, at the first meeting, the members of the CRP endorsed a benchmark on the BN-600 hybrid core for consideration in its first studies. Benchmark analyses of the BN-600 hybrid core were performed during the first three phases of the CRP, investigating different nuclear data and levels of approximation in the calculation of safety related reactivity effects and their influence on uncertainties in transient analysis prediction. In an additional phase of the benchmark studies, experimental data were used for the verification and validation of nuclear data libraries and methods in support of the previous three phases. The results of phases 1, 2, 3 and 5 of the CRP are reported in IAEA-TECDOC-1623, BN-600 Hybrid Core Benchmark Analyses, Results from a Coordinated Research Project on Updated Codes and Methods to Reduce the

  17. Proposal for Testing and Validation of Vacuum Ultra-Violet Atomic Laser-Induced Fluorescence as a Method to Analyze Carbon Grid Erosion in Ion Thrusters

    Science.gov (United States)

    Stevens, Richard

    2003-01-01

    Previous investigation under award NAG3-2510 sought to determine the best method of LIF to determine the carbon density in a thruster plume. Initial reports from other groups were ambiguous as to the number of carbon clusters that might be present in the plume of a thruster. Carbon clusters would certainly affect the ability to perform LIF; if they were the dominant species, then perhaps the LIF method should target clusters. The results of quadrupole mass spectroscopy on sputtered carbon determined that minimal numbers of clusters were sputtered from graphite under impact from keV krypton. There were some investigations in the keV range by other groups that hinted at clusters, but at the time the proposal was presented to NASA, there were no data from low-energy sputtering available. Thus, the proposal sought to develop a method to characterize the population only of atoms sputtered from a graphite target in a test cell. Most of the groundwork had been established by the previous two years of investigation. The proposal covering 2003 sought to develop an anti-Stokes Raman shifting cell to generate VUV light and test this cell on two different laser systems, ArF and YAG-pumped dye. The second goal was to measure the lowest detectable amounts of carbon atoms by 156.1 nm and 165.7 nm LIF. If equipment was functioning properly, it was expected that these goals would be met easily during the timeframe of the proposal, and that is the reason only modest funding was requested. The PI was only funded at half-time by Glenn during the summer months. All other work time was paid for by Whitworth College. The college also funded a student, Charles Shawley, who worked on the project during the spring.

  18. Genome Update: alignment of bacterial chromosomes

    DEFF Research Database (Denmark)

    Ussery, David; Jensen, Mette; Poulsen, Tine Rugh

    2004-01-01

    There are four new microbial genomes listed in this month's Genome Update, three belonging to Gram-positive bacteria and one belonging to an archaeon that lives at pH 0; all of these genomes are listed in Table 1⇓. The method of genome comparison this month is that of genome alignment and, as an ...

  19. 2016 updated MASCC/ESMO consensus recommendations

    DEFF Research Database (Denmark)

    Roila, Fausto; Warr, David; Hesketh, Paul J

    2017-01-01

    PURPOSE: An update of the recommendations for the prophylaxis of acute and delayed emesis induced by moderately emetogenic chemotherapy published after the last MASCC/ESMO antiemetic consensus conference in 2009 has been carried out. METHODS: A systematic literature search using PubMed from Janua...

  20. The CAM-ICU has now a French "official" version. The translation process of the 2014 updated Complete Training Manual of the Confusion Assessment Method for the Intensive Care Unit in French (CAM-ICU.fr).

    Science.gov (United States)

    Chanques, Gérald; Garnier, Océane; Carr, Julie; Conseil, Matthieu; de Jong, Audrey; Rowan, Christine M; Ely, E Wesley; Jaber, Samir

    2017-10-01

    Delirium is common in Intensive-Care-Unit (ICU) patients but under-recognized by bed-side clinicians when not using validated delirium-screening tools. The Confusion-Assessment-Method for the ICU (CAM-ICU) has demonstrated very good psychometric properties, and has been translated into many different languages though not into French. We undertook this opportunity to describe the translation process. The translation was performed following recommended guidelines. The updated method published in 2014 including introduction letters, worksheet and flowsheet for bed-side use, the method itself, case-scenarios for training and Frequently-Asked-Questions (32 pages) was translated into French language by a neuropsychological researcher who was not familiar with the original method. Then, the whole method was back-translated by a native English-French bilingual speaker. The new English version was compared to the original one by the Vanderbilt University ICU-delirium-team. Discrepancies were discussed between the two teams before final approval of the French version. The entire process took one year. Among the 3692 words of the back-translated version of the method itself, 18 discrepancies occurred. Eight (44%) lead to changes in the final version. Details of the translation process are provided. The French version of CAM-ICU is now available for French-speaking ICUs. The CAM-ICU is provided with its complete training-manual that was challenging to translate following recommended process. While many such translations have been done for other clinical tools, few have published the details of the process itself. We hope that the availability of such teaching material will now facilitate a large implementation of delirium-screening in French-speaking ICUs. Copyright © 2017 Société française d'anesthésie et de réanimation (Sfar). All rights reserved.

  1. Sequence History Update Tool

    Science.gov (United States)

    Khanampompan, Teerapat; Gladden, Roy; Fisher, Forest; DelGuercio, Chris

    2008-01-01

    The Sequence History Update Tool performs Web-based sequence statistics archiving for Mars Reconnaissance Orbiter (MRO). Using a single UNIX command, the software takes advantage of sequencing conventions to automatically extract the needed statistics from multiple files. This information is then used to populate a PHP database, which is then seamlessly formatted into a dynamic Web page. This tool replaces a previous tedious and error-prone process of manually editing HTML code to construct a Web-based table. Because the tool manages all of the statistics gathering and file delivery to and from multiple data sources spread across multiple servers, there is also a considerable time and effort savings. With the use of The Sequence History Update Tool what previously took minutes is now done in less than 30 seconds, and now provides a more accurate archival record of the sequence commanding for MRO.

  2. Photovoltaic Shading Testbed for Module-Level Power Electronics: 2016 Performance Data Update

    Energy Technology Data Exchange (ETDEWEB)

    Deline, Chris [National Renewable Energy Lab. (NREL), Golden, CO (United States); Meydbray, Jenya [PV Evolution Labs (PVEL), Davis, CA (United States); Donovan, Matt [PV Evolution Labs (PVEL), Davis, CA (United States)

    2016-09-01

    The 2012 NREL report 'Photovoltaic Shading Testbed for Module-Level Power Electronics' provides a standard methodology for estimating the performance benefit of distributed power electronics under partial shading conditions. Since the release of the report, experiments have been conducted for a number of products and for different system configurations. Drawing from these experiences, updates to the test and analysis methods are recommended. Proposed changes in data processing have the benefit of reducing the sensitivity to measurement errors and weather variability, as well as bringing the updated performance score in line with measured and simulated values of the shade recovery benefit of distributed PV power electronics. Also, due to the emergence of new technologies including sub-module embedded power electronics, the shading method has been extended to include power electronics that operate at a finer granularity than the module level. An update to the method is proposed to account for these emerging technologies that respond to shading differently than module-level devices. The partial shading test remains a repeatable test procedure that attempts to simulate shading situations as would be experienced by typical residential or commercial rooftop photovoltaic (PV) systems. Performance data for multiple products tested using this method are discussed, based on equipment from Enphase, Solar Edge, Maxim Integrated and SMA. In general, the annual recovery of shading losses from the module-level electronics evaluated is 25-35%, with the major difference between different trials being related to the number of parallel strings in the test installation rather than differences between the equipment tested. Appendix D data has been added in this update.

  3. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data

    International Nuclear Information System (INIS)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-01-01

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy. (paper)
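    A stripped-down sketch of the coefficient/dictionary alternation that orthogonality makes cheap: sparse coding reduces to hard-thresholding of D^T Y, and the dictionary update becomes an orthogonal Procrustes problem solved by one SVD. The k-space data-consistency update and the gradual increase of sparsity level described above are omitted, and the synthetic "patches" are just random columns, so this is an illustration of the orthogonal-dictionary idea rather than the authors' full reconstruction method.

        import numpy as np

        def learn_orthogonal_dictionary(Y, sparsity, n_iter=20, seed=0):
            """Alternate hard-thresholded sparse coding and an orthogonal Procrustes
            dictionary update on a d x n patch matrix Y."""
            rng = np.random.default_rng(seed)
            d = Y.shape[0]
            D, _ = np.linalg.qr(rng.standard_normal((d, d)))        # random orthogonal start
            X = np.zeros_like(Y)
            for _ in range(n_iter):
                # sparse coding: with an orthogonal D this reduces to thresholding D^T Y
                X = D.T @ Y
                thresh = -np.sort(-np.abs(X), axis=0)[sparsity - 1]  # s-th largest magnitude per column
                X[np.abs(X) < thresh] = 0.0
                # dictionary update: orthogonal Procrustes problem, solved by an SVD of Y X^T
                U, _, Vt = np.linalg.svd(Y @ X.T)
                D = U @ Vt
            return D, X

        # toy usage on synthetic 8x8 "patches" flattened into length-64 columns
        Y = np.random.default_rng(1).standard_normal((64, 500))
        D, X = learn_orthogonal_dictionary(Y, sparsity=6)
        print(np.allclose(D.T @ D, np.eye(64)),
              round(float(np.linalg.norm(Y - D @ X) / np.linalg.norm(Y)), 3))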

  4. Sparsity-promoting orthogonal dictionary updating for image reconstruction from highly undersampled magnetic resonance data.

    Science.gov (United States)

    Huang, Jinhong; Guo, Li; Feng, Qianjin; Chen, Wufan; Feng, Yanqiu

    2015-07-21

    Image reconstruction from undersampled k-space data accelerates magnetic resonance imaging (MRI) by exploiting image sparseness in certain transform domains. Employing image patch representation over a learned dictionary has the advantage of being adaptive to local image structures and thus can better sparsify images than using fixed transforms (e.g. wavelets and total variations). Dictionary learning methods have recently been introduced to MRI reconstruction, and these methods demonstrate significantly reduced reconstruction errors compared to sparse MRI reconstruction using fixed transforms. However, the synthesis sparse coding problem in dictionary learning is NP-hard and computationally expensive. In this paper, we present a novel sparsity-promoting orthogonal dictionary updating method for efficient image reconstruction from highly undersampled MRI data. The orthogonality imposed on the learned dictionary enables the minimization problem in the reconstruction to be solved by an efficient optimization algorithm which alternately updates representation coefficients, orthogonal dictionary, and missing k-space data. Moreover, both sparsity level and sparse representation contribution using updated dictionaries gradually increase during iterations to recover more details, assuming the progressively improved quality of the dictionary. Simulation and real data experimental results both demonstrate that the proposed method is approximately 10 to 100 times faster than the K-SVD-based dictionary learning MRI method and simultaneously improves reconstruction accuracy.

  5. Proposal of a simple screening method for a rapid preliminary evaluation of ''heavy metals'' mobility in soils of contaminated sites

    Energy Technology Data Exchange (ETDEWEB)

    Pinto, Valentina; Chiusolo, Francesca; Cremisini, Carlo [ENEA - Italian Agency for New Technologies, Energy and Environment, Rome (Italy). Section PROTCHIM

    2010-09-15

    Risks associated with "heavy metals" (HM) soil contamination depend not only on their total content but, mostly, on their mobility. Many extraction procedures have been developed to evaluate HM mobility in contaminated soils, but they are generally time consuming (especially the sequential extraction procedures (SEPs)) and consequently applicable to a limited number of samples. For this reason, a simple screening method, applicable even "in field", has been proposed in order to obtain a rapid evaluation of HM mobility in polluted soils, mainly focused on the fraction associated with Fe and Mn oxides/hydroxides. A buffer solution of trisodium citrate and hydroxylamine hydrochloride was used as the extractant for a single-step leaching test. The choice of this buffered solution was strictly related to the possibility of directly determining, via titration with dithizone (DZ), the content of Zn, Cu, Pb and Cd, which are among the most representative contaminants in highly mineralised soils. Moreover, the extraction solution is similar, apart from the pH value, to the one used in the second step of the BCR SEP. The analysis of bivalent ions through DZ titration was exploited in order to further simplify and quicken the whole procedure. The proposed method generically measures, in a few minutes, the concentration of total extractable "heavy metals" expressed as mol L^-1 without distinguishing between elements. The proposed screening method has been developed and applied on soil samples collected from rural, urban and mining areas, representing different situations of soil contamination. Results were compared with data obtained from the BCR procedure. The screening method proved to be a reliable tool for a rapid evaluation of metal mobility. Therefore, it could be very useful, even "in field", both to guide the sampling activity on site and to monitor the efficacy of the subsequent

  6. Emissions trading and competitive positions. The European Proposal for a Directive establishing a Framework for Greenhouse Gas Emissions Trading and Methods for the initial Allocation of Pollution Rights

    International Nuclear Information System (INIS)

    Grimeaud, D.; Peeters, M.

    2002-10-01

    The study on the intention to introduce emissions trading on a European Union level was conducted on the basis of the following three questions: Which methods can be used (by the Member States) to distribute the tradable emissions rights and which legal preconditions should be observed considering the EU-Treaty and the relevant directive proposal? Wherever necessary and possible, international agreements on climate change and international trade law will be mentioned. Which safeguards are available for fair competition and which system of emissions trading is advisable in this perspective? How should the PSR (performance standard rate) system, which is preferred by industry, be valued? The structure of this study is as follows: in chapter 2 insight is given into the various methods that can be used to start an emissions trading system, i.e. the way tradable pollution rights are distributed (initial allocation). Chapter 3 will further examine the system of the initial allocation of pollution rights as it has been chosen in the proposal for the European directive. The aim is to give an exact qualification of the method of emissions trading, especially the method of initial allocation, that is used in the directive proposal. Chapter 4 examines whether safeguards are available to prevent competition distortions between firms that fall under the scope of the emissions trading scheme. Special attention will be given to conditions that result from the EU-Treaty in this context, such as the prohibition of state aid. In this chapter, international trade law will be dealt with as well. Chapter 5 will present an executive summary and answer the specific question of whether the PSR system is legally acceptable or perhaps even recommendable

  7. Annual Pension Fund Update

    CERN Multimedia

    Pension Fund

    2011-01-01

    All members and beneficiaries of the Pension Fund are invited to attend the Annual Pension Fund Update to be held in the CERN Council Chamber on Tuesday 20 September 2011 from 10-00 to 12-00 a.m. Copies of the 2010 Financial Statements are available from departmental secretariats. Coffee and croissants will be served prior to the meeting as of 9-30 a.m.

  8. Medi SPICE : an update

    OpenAIRE

    Mc Caffery, Fergal; Dorling, Alec; Casey, Valentine

    2010-01-01

    peer-reviewed This paper provides an update on the development of a software process assessment and improvement model (Medi SPICE) specifically for the medical device industry. The development of Medi SPICE was launched at the SPICE 2009 Conference. Medi SPICE will consist of a Process Reference Model and a Process Assessment Model. The Medi SPICE Process Assessment Model will be used to perform conformant assessments of the software process capability of medical device suppliers in accord...

  9. Ontario Hydro's DSP update

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    Ontario Hydro's Demand/Supply Plan (DSP), the 25 year plan which was submitted in December 1989, is currently being reviewed by the Environmental Assessment Board (EAB). Since 1989 there have been several changes which have led Ontario Hydro to update the original Demand/Supply Plan. This information sheet gives a quick overview of what has changed and how Ontario Hydro is adapting to that change

  10. Perforated peptic ulcer - an update

    Science.gov (United States)

    Chung, Kin Tong; Shelat, Vishalkumar G

    2017-01-01

    Peptic ulcer disease (PUD) affects 4 million people worldwide annually. The incidence of PUD has been estimated at around 1.5% to 3%. Perforated peptic ulcer (PPU) is a serious complication of PUD and patients with PPU often present with acute abdomen that carries high risk for morbidity and mortality. The lifetime prevalence of perforation in patients with PUD is about 5%. PPU carries a mortality ranging from 1.3% to 20%. Thirty-day mortality rate reaching 20% and 90-d mortality rate of up to 30% have been reported. In this review we have summarized the current evidence on PPU to update readers. This literature review includes the most updated information such as common causes, clinical features, diagnostic methods, non-operative and operative management, post-operative complications and different scoring systems of PPU. With the advancement of medical technology, PUD can now be treated with medications instead of elective surgery. The classic triad of sudden onset of abdominal pain, tachycardia and abdominal rigidity is the hallmark of PPU. Erect chest radiograph may miss 15% of cases with air under the diaphragm in patients with bowel perforation. Early diagnosis, prompt resuscitation and urgent surgical intervention are essential to improve outcomes. Exploratory laparotomy and omental patch repair remains the gold standard. Laparoscopic surgery should be considered when expertise is available. Gastrectomy is recommended in patients with large or malignant ulcer. PMID:28138363

  11. Perforated peptic ulcer - an update.

    Science.gov (United States)

    Chung, Kin Tong; Shelat, Vishalkumar G

    2017-01-27

    Peptic ulcer disease (PUD) affects 4 million people worldwide annually. The incidence of PUD has been estimated at around 1.5% to 3%. Perforated peptic ulcer (PPU) is a serious complication of PUD and patients with PPU often present with acute abdomen that carries high risk for morbidity and mortality. The lifetime prevalence of perforation in patients with PUD is about 5%. PPU carries a mortality ranging from 1.3% to 20%. Thirty-day mortality rate reaching 20% and 90-d mortality rate of up to 30% have been reported. In this review we have summarized the current evidence on PPU to update readers. This literature review includes the most updated information such as common causes, clinical features, diagnostic methods, non-operative and operative management, post-operative complications and different scoring systems of PPU. With the advancement of medical technology, PUD can now be treated with medications instead of elective surgery. The classic triad of sudden onset of abdominal pain, tachycardia and abdominal rigidity is the hallmark of PPU. Erect chest radiograph may miss 15% of cases with air under the diaphragm in patients with bowel perforation. Early diagnosis, prompt resuscitation and urgent surgical intervention are essential to improve outcomes. Exploratory laparotomy and omental patch repair remains the gold standard. Laparoscopic surgery should be considered when expertise is available. Gastrectomy is recommended in patients with large or malignant ulcer.

  12. Online updating of context-aware landmark detectors for prostate localization in daily treatment CT images

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Xiubin [College of Geographic and Biologic Information, Nanjing University of Posts and Telecommunications, Nanjing, Jiangsu 210015, China and IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 (United States); Gao, Yaozong [IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 (United States); Shen, Dinggang, E-mail: dgshen@med.unc.edu [IDEA Lab, Department of Radiology and BRIC, University of North Carolina at Chapel Hill, 130 Mason Farm Road, Chapel Hill, North Carolina 27510 and Department of Brain and Cognitive Engineering, Korea University, Seoul (Korea, Republic of)

    2015-05-15

    Purpose: In image guided radiation therapy, it is crucial to fast and accurately localize the prostate in the daily treatment images. To this end, the authors propose an online update scheme for landmark-guided prostate segmentation, which can fully exploit valuable patient-specific information contained in the previous treatment images and can achieve improved performance in landmark detection and prostate segmentation. Methods: To localize the prostate in the daily treatment images, the authors first automatically detect six anatomical landmarks on the prostate boundary by adopting a context-aware landmark detection method. Specifically, in this method, a two-layer regression forest is trained as a detector for each target landmark. Once all the newly detected landmarks from new treatment images are reviewed or adjusted (if necessary) by clinicians, they are further included into the training pool as new patient-specific information to update all the two-layer regression forests for the next treatment day. As more and more treatment images of the current patient are acquired, the two-layer regression forests can be continually updated by incorporating the patient-specific information into the training procedure. After all target landmarks are detected, a multiatlas random sample consensus (multiatlas RANSAC) method is used to segment the entire prostate by fusing multiple previously segmented prostates of the current patient after they are aligned to the current treatment image. Subsequently, the segmented prostate of the current treatment image is again reviewed (or even adjusted if needed) by clinicians before including it as a new shape example into the prostate shape dataset for helping localize the entire prostate in the next treatment image. Results: The experimental results on 330 images of 24 patients show the effectiveness of the authors’ proposed online update scheme in improving the accuracies of both landmark detection and prostate segmentation
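    A highly simplified sketch of the online-update loop only: each treatment day's clinician-reviewed landmark is appended to the training pool before the detector is refit for the next fraction. It does not reproduce the authors' two-layer context-aware regression forests or the multiatlas RANSAC segmentation; the images, features and review noise are synthetic stand-ins, and scikit-learn's RandomForestRegressor replaces the custom forest.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)

        def extract_features(image):
            """Stand-in for the context/appearance features computed around candidate voxels."""
            return image.reshape(1, -1)

        # population training pool (feature vectors -> 3-D landmark coordinates), synthetic here
        X_pool = rng.standard_normal((200, 64))
        y_pool = rng.standard_normal((200, 3))
        detector = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_pool, y_pool)

        for day in range(1, 6):                                    # successive treatment fractions
            todays_image = rng.standard_normal(64)
            predicted = detector.predict(extract_features(todays_image))[0]
            reviewed = predicted + rng.normal(0.0, 0.05, size=3)   # clinician-reviewed (possibly adjusted) landmark
            # append the reviewed, patient-specific example and retrain the detector for the next fraction
            X_pool = np.vstack([X_pool, extract_features(todays_image)])
            y_pool = np.vstack([y_pool, reviewed])
            detector = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_pool, y_pool)
            print("day", day, "training pool size:", len(X_pool))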

  13. Online updating of context-aware landmark detectors for prostate localization in daily treatment CT images

    International Nuclear Information System (INIS)

    Dai, Xiubin; Gao, Yaozong; Shen, Dinggang

    2015-01-01

    Purpose: In image guided radiation therapy, it is crucial to fast and accurately localize the prostate in the daily treatment images. To this end, the authors propose an online update scheme for landmark-guided prostate segmentation, which can fully exploit valuable patient-specific information contained in the previous treatment images and can achieve improved performance in landmark detection and prostate segmentation. Methods: To localize the prostate in the daily treatment images, the authors first automatically detect six anatomical landmarks on the prostate boundary by adopting a context-aware landmark detection method. Specifically, in this method, a two-layer regression forest is trained as a detector for each target landmark. Once all the newly detected landmarks from new treatment images are reviewed or adjusted (if necessary) by clinicians, they are further included into the training pool as new patient-specific information to update all the two-layer regression forests for the next treatment day. As more and more treatment images of the current patient are acquired, the two-layer regression forests can be continually updated by incorporating the patient-specific information into the training procedure. After all target landmarks are detected, a multiatlas random sample consensus (multiatlas RANSAC) method is used to segment the entire prostate by fusing multiple previously segmented prostates of the current patient after they are aligned to the current treatment image. Subsequently, the segmented prostate of the current treatment image is again reviewed (or even adjusted if needed) by clinicians before including it as a new shape example into the prostate shape dataset for helping localize the entire prostate in the next treatment image. Results: The experimental results on 330 images of 24 patients show the effectiveness of the authors’ proposed online update scheme in improving the accuracies of both landmark detection and prostate segmentation

  14. Agent Communication for Dynamic Belief Update

    Science.gov (United States)

    Kobayashi, Mikito; Tojo, Satoshi

    Thus far, various formalizations of rational/logical agent models have been proposed. In this paper, we incorporate the notions of communication channel and belief modality into update logic and introduce Belief Update Logic (BUL). First, we discuss how the inform action of FIPA-ACL can be reformalized in terms of a communication channel, which represents a connection between agents. Thus, our agents can send a message only when they believe there is, and there actually is, a channel between themselves and a receiver. Then, we present a static belief logic (BL) and show its soundness and completeness. Next, we extend this logic to BUL, which can update a Kripke model by the inform action; we show that in the updated model the belief operator still satisfies K45. Thereafter, we show that every sentence of BUL can be translated into BL; thus, we can contend that BUL is also sound and complete. Furthermore, we discuss the features of BUL, including the case of inconsistent information, as well as channel transmission. Finally, we summarize our contribution and discuss some future issues.
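    A toy Kripke-style sketch of an inform action under BUL-like preconditions (the sender believes the announced atom, the atom actually holds, and a channel exists); the update simply restricts the receiver's accessibility relation. All worlds, atoms and channels are invented for illustration, and the sketch does not reproduce the paper's full semantics or its K45 preservation result.

        # Worlds are labelled by the atomic facts true in them; all names are assumptions.
        worlds = {"w1": {"p"}, "w2": {"p"}, "w3": set()}
        actual = "w1"
        channels = {("a", "b")}                                    # agent a has a channel to agent b
        R = {"a": {w: {"w1", "w2"} for w in worlds},               # a already believes p
             "b": {w: {"w1", "w2", "w3"} for w in worlds}}         # b does not yet believe p

        def believes(agent, atom, world):
            return all(atom in worlds[v] for v in R[agent][world])

        def inform(sender, receiver, atom):
            """Update the model by an inform action under BUL-like preconditions."""
            if (sender, receiver) not in channels:
                raise ValueError("no communication channel between the agents")
            if atom not in worlds[actual] or not believes(sender, atom, actual):
                raise ValueError("the sender may only inform what it believes and what actually holds")
            for w in worlds:                                       # receiver drops worlds where the atom fails
                R[receiver][w] = {v for v in R[receiver][w] if atom in worlds[v]}

        print(believes("b", "p", actual))                          # False before the inform action
        inform("a", "b", "p")
        print(believes("b", "p", actual))                          # True afterwards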

  15. Proposal and field practice of a 'hiyarihatto' activity method for promotion of statements of participants for nuclear power plant organization

    International Nuclear Information System (INIS)

    Aoyagi, Saizo; Fujino, Hidenori; Ishii, Hirotake; Shimoda, Hiroshi; Sakuda, Hiroshi; Yoshikawa, Hidekazu; Sugiman, Toshio

    2011-01-01

    In a 'hiyarihatto' activity, workers report and discuss incident cases related to their work. Such an activity is particularly effective for cultivating participants' attitudes about safety. Nevertheless, a conventional face-to-face hiyarihatto activity includes features that are inappropriate for conduct in a nuclear power plant organization. For example, workers at nuclear power plants are geographically distributed and busy. Therefore, they have great difficulty in participating in a face-to-face hiyarihatto activity. Furthermore, workers' hesitation in discussing problems inhibits the continuation of their active participation. This study is conducted to propose a hiyarihatto activity with an asynchronous and distributed computer-mediated communication (CMC) for a nuclear power plant organization, with the demonstration of its effectiveness through field practice. The proposed method also involves the introduction of special participants who follow action guidelines for the promotion of the continuation of the activity. The method was used in an actual nuclear power plant organization. Results showed that the method is effective under some conditions, such as during periods without facility inspection. Special participants promoted the activity in some cases. Moreover, other factors affecting the activity and some improvements were identified. (author)

  16. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel ‘V-plot’ methodology to display accuracy values

    Science.gov (United States)

    Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Background: Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test’s performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. Methods and findings: We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). Conclusion: No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test’s performance against a reference gold standard. PMID:29387424
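    A small simulation in the spirit of the study design described above: the numerical relationship between the two hypothetical assays is held fixed while the underlying cholesterol distribution changes, and the resulting accuracy values differ markedly. The 5 mmol/L cutoff, the noise level and the three distributions are illustrative assumptions, and the V-plot itself is not reproduced.

        import numpy as np

        rng = np.random.default_rng(0)
        CUTOFF = 5.0                                   # illustrative mmol/L threshold for dichotomising both assays

        def accuracy_for(chol_gold):
            """Fixed numerical relationship: Chol_rapid = Chol_gold + small measurement noise."""
            chol_rapid = chol_gold + rng.normal(0.0, 0.3, size=chol_gold.size)
            return float(np.mean((chol_rapid > CUTOFF) == (chol_gold > CUTOFF)))

        # identical measurement error, three different severity distributions of the tested sample
        samples = {
            "values clustered near the cutoff": rng.normal(5.0, 0.4, 10_000),
            "values spread widely":             rng.normal(5.0, 2.0, 10_000),
            "mostly low (healthy) values":      rng.normal(3.5, 0.8, 10_000),
        }
        for name, sample in samples.items():
            print(f"{name}: accuracy = {accuracy_for(sample):.2f}")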

  17. Developing an Agent-Based Simulation System for Post-Earthquake Operations in Uncertainty Conditions: A Proposed Method for Collaboration among Agents

    Directory of Open Access Journals (Sweden)

    Navid Hooshangi

    2018-01-01

    Full Text Available Agent-based modeling is a promising approach for developing simulation tools for natural hazards in different areas, such as during urban search and rescue (USAR) operations. The present study aimed to develop a dynamic agent-based simulation model of post-earthquake USAR operations using a geospatial information system and multi-agent systems (GIS and MASs, respectively). We also propose an approach for dynamic task allocation and establishing collaboration among agents based on the contract net protocol (CNP) and interval-based Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) methods, which consider uncertainty in natural hazards information during agents’ decision-making. The decision-making weights were calculated by the analytic hierarchy process (AHP). In order to implement the system, an earthquake environment was simulated and the damage to buildings and the number of injuries were calculated for Tehran’s District 3: 23%, 37%, 24% and 16% of buildings were in the slight, moderate, extensive and complete vulnerability classes, respectively. The number of injured persons was calculated to be 17,238. Numerical results in 27 scenarios showed that the proposed method is more accurate than the CNP method in terms of USAR operational time (at least a 13% decrease) and the number of human fatalities (at least a 9% decrease). In an interval uncertainty analysis of our proposed simulated system, the lower and upper bounds of the uncertain responses are evaluated. The overall results showed that considering uncertainty in task allocation can be highly advantageous in a disaster environment. Such systems can be used to manage and prepare for natural hazards.
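    A crisp (non-interval) sketch of the weighting and ranking step only: AHP weights are taken from the principal eigenvector of an assumed pairwise-comparison matrix, and a standard TOPSIS closeness score ranks candidate rescue teams for one task. The interval-based TOPSIS variant, the CNP negotiation and all numerical values are simplifications or assumptions, not the study's data.

        import numpy as np

        # decision matrix: rows = candidate rescue teams, columns = criteria
        # (distance to site, team capability, current workload); all values are illustrative
        M = np.array([[2.0, 0.8, 3.0],
                      [5.0, 0.9, 1.0],
                      [1.5, 0.6, 4.0]])
        benefit = np.array([False, True, False])          # capability is a benefit criterion, the others are costs

        # AHP: weights from the principal eigenvector of an assumed pairwise-comparison matrix
        A = np.array([[1.0, 1/3, 2.0],
                      [3.0, 1.0, 4.0],
                      [0.5, 1/4, 1.0]])
        eigvals, eigvecs = np.linalg.eig(A)
        w = np.abs(np.real(eigvecs[:, np.argmax(np.real(eigvals))]))
        w = w / w.sum()

        # crisp TOPSIS ranking (the study uses an interval-based variant to carry uncertainty)
        V = M / np.linalg.norm(M, axis=0) * w             # weighted, vector-normalised decision matrix
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)
        print("AHP weights:", np.round(w, 3))
        print("closeness scores:", np.round(closeness, 3), "-> allocate the task to team", int(np.argmax(closeness)))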

  18. Tritium extraction methods proposed for a solid breeder blanket. Subtask WP-B 6.1 of the European Blanket Program 1996

    International Nuclear Information System (INIS)

    Albrecht, H.

    1997-04-01

    Ten different methods for the extraction of tritium from the purge gas of a ceramic blanket are described and evaluated with respect to their applicability for ITER and DEMO. The methods are based on the conditions that the purge gas is composed of helium with an addition of up to 0.1% of H 2 or O 2 and H 2 O to facilitate the release of tritium, and that tritium occurs in the purge gas in two main chemical forms, i.e. HT and HTO. Individual process steps of many methods are identical; in particular, the application of cold traps, molecular sieve beds, and diffusors are proposed in several cases. Differences between the methods arise mainly from the ways in which various process steps are combined and from the operating conditions which are chosen with respect to temperature and pressure. Up to now, none of the methods has been demonstrated to be reliably applicable for the purge gas conditions foreseen for the operation of an ITER blanket test module (or larger ceramic blanket designs such as for DEMO). These conditions are characterized by very high gas flow rates and extremely low concentrations of HT and HTO. Therefore, a proposal has been made (FZK concept) which is expected to have the best potential for applicability to ITER and DEMO and to incorporate the smallest development risk. In this concept, the extraction of tritium and excess hydrogen is accomplished by using a cold trap for freezing out HTO/H 2 O and a 5A molecular sieve bed for the adsorption of HT/H 2 . (orig.) [de

  19. Method of dynamic fuzzy symptom vector in intelligent diagnosis

    International Nuclear Information System (INIS)

    Sun Hongyan; Jiang Xuefeng

    2010-01-01

    To meet the need for real-time updating of diagnostic symptoms that arises from the accumulation of diagnostic knowledge, and to bridge the large gaps in the units and values of diagnostic symptoms in multi-parameter intelligent diagnosis, a dynamic fuzzy symptom vector method is proposed. The concept of the dynamic fuzzy symptom vector is defined. An ontology is used to specify the vector elements, and an ontology-based vector transmission method is built. The way symptom values change is analyzed, and a fuzzy normalization method based on fuzzy membership functions is constructed. A worked instance shows that the dynamic fuzzy symptom vector method efficiently solves the problems of symptom updating and of unifying symptom values and units. (authors)
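    A minimal sketch of the fuzzy-normalization step described above: raw symptom readings with different units and ranges are mapped onto dimensionless membership degrees in [0, 1] so they can sit in one symptom vector. The symptom names, ranges and ramp functions are invented for illustration, and the ontology-based element specification and vector transmission are not shown.

        def rising(x, low, high):
            """Degree to which x counts as 'high' on a linear ramp between low and high."""
            return max(0.0, min(1.0, (x - low) / (high - low)))

        def falling(x, low, high):
            """Degree to which x counts as 'low' on a linear ramp between low and high."""
            return 1.0 - rising(x, low, high)

        # heterogeneous raw readings: temperature (deg C), vibration (mm/s), flow margin (%)
        raw = {"temperature": 78.0, "vibration": 3.2, "flow_margin": 12.0}
        symptom_vector = {
            "overheating":    rising(raw["temperature"], 60.0, 90.0),
            "high_vibration": rising(raw["vibration"], 1.0, 5.0),
            "low_flow":       falling(raw["flow_margin"], 5.0, 30.0),
        }
        print(symptom_vector)          # every entry is a dimensionless membership degree in [0, 1]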

  20. In situ hybridization detection methods for HPV16 E6/E7 mRNA in identifying transcriptionally active HPV infection of oropharyngeal carcinoma: an updating.

    Science.gov (United States)

    Volpi, Chiara C; Ciniselli, Chiara M; Gualeni, Ambra V; Plebani, Maddalena; Alfieri, Salvatore; Verderio, Paolo; Locati, Laura; Perrone, Federica; Quattrone, Pasquale; Carbone, Antonino; Pilotti, Silvana; Gloghini, Annunziata

    2018-04-01

    The aim of this study is to compare 2 in situ hybridization (ISH) detection methods for human papilloma virus (HPV) 16 E6/E7 mRNA, that is, the RNAscope 2.0 High Definition (HD) and the upgraded RNAscope 2.5 HD version. The RNAscope 2.5 HD has recently replaced the RNAscope 2.0 HD detection kit. Therefore, this investigation starts from the need to analytically validate the new mRNA ISH assay and, possibly, to refine the current algorithm for HPV detection in oropharyngeal squamous cell carcinoma with the final goal of applying it to daily laboratory practice. The study was based on HPV status and on generated data, interpreted by a scoring algorithm. The results highlighted that the compared RNAscope HPV tests had a good level of interchangeability and enabled to identify oropharyngeal squamous cell carcinoma that are truly driven by high-risk HPV infection. This was also supported by the comparison of the RNAscope HPV test with HPV E6/E7 mRNA real-time reverse-transcription polymerase chain reaction in a fraction of cases where material for HPV E6/E7 mRNA real-time reverse-transcription polymerase chain reaction was available. Furthermore, the algorithm that associates p16 immunohistochemistry with the identification of HPV mRNA by RNAscope was more effective than the one that associated p16 immunohistochemistry with the identification of HPV DNA by ISH. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Preconditioner Updates Applied to CFD Model Problems

    Czech Academy of Sciences Publication Activity Database

    Birken, P.; Duintjer Tebbens, Jurjen; Meister, A.; Tůma, Miroslav

    2008-01-01

    Vol. 58, No. 11 (2008), pp. 1628-1641 ISSN 0168-9274 R&D Projects: GA AV ČR 1ET400300415; GA AV ČR KJB100300703 Institutional research plan: CEZ:AV0Z10300504 Keywords: finite volume methods * update preconditioning * Krylov subspace methods * Euler equations * conservation laws Subject RIV: BA - General Mathematics Impact factor: 0.952, year: 2008

  2. Update in women's health.

    Science.gov (United States)

    Ganschow, Pamela S; Jacobs, Elizabeth A; Mackinnon, Jennifer; Charney, Pamela

    2009-06-01

    The aim of this clinical update is to summarize articles and guidelines published in the last year with the potential to change current clinical practice as it relates to women's health. We used two independent search strategies to identify articles relevant to women's health published between March 1, 2007 and February 29, 2008. First, we reviewed the Cochrane Database of Systematic Reviews and journal indices from the ACP Journal Club, Annals of Internal Medicine, Archives of Internal Medicine, British Medical Journal, Circulation, Diabetes, JAMA, JGIM, Journal of Women's Health, Lancet, NEJM, Obstetrics and Gynecology, and Women's Health Journal Watch. Second, we performed a MEDLINE search using the medical subject heading term "sex factors." The authors, who all have clinical and/or research experience in the area of women's health, reviewed all article titles, abstracts, and, when indicated, full publications. We excluded articles related to obstetrical aspects of women's health focusing on those relevant to general internists. We had two acceptance criteria, scientific rigor and potential to impact women's health. We also identified new and/or updated women's health guidelines released during the same time period. We identified over 250 publications with potential relevance to women's health. Forty-six articles were selected for presentation as part of the Clinical Update, and nine were selected for a more detailed discussion in this paper. Evidence-based women's health guidelines are listed in Table 1. Table 1 (Important Women's Health Guidelines in 2007-2008: New or Updated; columns: Topic | Issuing organization | Updated recommendations and comments) begins: Mammography screening in women 40-49 | ACP | Individualized risk assessment and informed decision making should be used to guide decisions about mammography screening in this age group. To aid in the risk assessment, a discussion of the risk factors, which if present in a woman in her 40s increases her risk to above that of an

  3. Updating Dosimetry for Emergency Response Dose Projections.

    Science.gov (United States)

    DeCair, Sara

    2016-02-01

    In 2013, the U.S. Environmental Protection Agency (EPA) proposed an update to the 1992 Protective Action Guides (PAG) Manual. The PAG Manual provides guidance to state and local officials planning for radiological emergencies. EPA requested public comment on the proposed revisions, while making them available for interim use by officials faced with an emergency situation. Developed with interagency partners, EPA's proposal incorporates newer dosimetric methods, identifies tools and guidelines developed since the current document was issued, and extends the scope of the PAGs to all significant radiological incidents, including radiological dispersal devices or improvised nuclear devices. In order to best serve the emergency management community, scientific policy direction had to be set on how to use International Commission on Radiological Protection Publication 60 age groups in dose assessment when implementing emergency guidelines. Certain guidelines that lend themselves to different PAGs for different subpopulations are the PAGs for potassium iodide (KI), food, and water. These guidelines provide age-specific recommendations because of the radiosensitivity of the thyroid and young children with respect to ingestion and inhalation doses in particular. Taking protective actions like using KI, avoiding certain foods or using alternative sources of drinking water can be relatively simple to implement by the parents of young children. Clear public messages can convey which age groups should take which action, unlike how an evacuation or relocation order should apply to entire households or neighborhoods. New in the PAG Manual is planning guidance for the late phase of an incident, after the situation is stabilized and efforts turn toward recovery. Because the late phase can take years to complete, decision makers are faced with managing public exposures in areas not fully remediated. The proposal includes quick-reference operational guidelines to inform re-entry to

  4. Effects of disease severity distribution on the performance of quantitative diagnostic methods and proposal of a novel 'V-plot' methodology to display accuracy values.

    Science.gov (United States)

    Petraco, Ricardo; Dehbi, Hakim-Moulay; Howard, James P; Shun-Shin, Matthew J; Sen, Sayan; Nijjer, Sukhjinder S; Mayet, Jamil; Davies, Justin E; Francis, Darrel P

    2018-01-01

    Diagnostic accuracy is widely accepted by researchers and clinicians as an optimal expression of a test's performance. The aim of this study was to evaluate the effects of disease severity distribution on values of diagnostic accuracy as well as propose a sample-independent methodology to calculate and display accuracy of diagnostic tests. We evaluated the diagnostic relationship between two hypothetical methods to measure serum cholesterol (Chol_rapid and Chol_gold) by generating samples with statistical software and (1) keeping the numerical relationship between methods unchanged and (2) changing the distribution of cholesterol values. Metrics of categorical agreement were calculated (accuracy, sensitivity and specificity). Finally, a novel methodology to display and calculate accuracy values was presented (the V-plot of accuracies). No single value of diagnostic accuracy can be used to describe the relationship between tests, as accuracy is a metric heavily affected by the underlying sample distribution. Our novel proposed methodology, the V-plot of accuracies, can be used as a sample-independent measure of a test's performance against a reference gold standard.

  5. WIMS Library updating

    International Nuclear Information System (INIS)

    Ravnik, M.; Trkov, A.; Holubar, A.

    1992-01-01

    At the end of 1990 the WIMS Library Update Project (WLUP) was initiated at the International Atomic Energy Agency. The project was organized as an international research project, coordinated at the J. Stefan Institute. Up to now, 22 laboratories from 19 countries have joined the project. Phase 1 of the project, which included WIMS input optimization for five experimental benchmark lattices, has been completed. The work presented in this paper also describes the results of Phase 2 of the Project, in which the cross sections based on the ENDF/B-IV evaluated nuclear data library have been processed. (author)

  6. Fusion Energy Update

    International Nuclear Information System (INIS)

    Whitson, M.O.

    1982-01-01

    Fusion Energy Update (CFU) provides monthly abstracting and indexing coverage of current scientific and technical reports, journal articles, conference papers and proceedings, books, patents, theses, and monographs for all sources on fusion energy. All information announced in CFU, plus additional backup information, is included in the energy information data base of the Department of Energy's Technical Information Center. The subject matter covered by CFU includes plasma physics, the physics and engineering of blankets, magnet coils and fields, power supplies and circuitry, cooling systems, fuel systems, radiation hazards, power conversion systems, inertial confinement systems, and component development and testing

  7. Context updates are hierarchical

    Directory of Open Access Journals (Sweden)

    Anton Karl Ingason

    2016-10-01

    Full Text Available This squib studies the order in which elements are added to the shared context of interlocutors in a conversation. It focuses on context updates within one hierarchical structure and argues that structurally higher elements are entered into the context before lower elements, even if the structurally higher elements are pronounced after the lower elements. The crucial data are drawn from a comparison of relative clauses in two head-initial languages, English and Icelandic, and two head-final languages, Korean and Japanese. The findings have consequences for any theory of a dynamic semantics.

  8. Shipment security update - 2003

    International Nuclear Information System (INIS)

    Patterson, John; Anne, Catherine

    2003-01-01

    At the 2002 RERTR, NAC reported on the interim measures taken by the U.S. Nuclear Regulatory Commission to enhance the security afforded to shipments of spent nuclear fuel. Since that time, there have been a number of additional actions focused on shipment security including training programs sponsored by the U.S. Department of Transportation and the Electric Power Research Council, investigation by the Government Accounting Office, and individual measures taken by shippers and transportation agents. The paper will present a status update regarding this dynamic set of events and provide an objective assessment of the cost, schedule and technical implications of the changing security landscape. (author)

  9. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

    Quantitative techniques have been successfully employed in verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless...... sensor networks. The analysis aims to find out the probability of a network key being compromised at a specific time point, which results in fluctuations over time for a specific key update method called Leave-based key update. For such a problem, the use of current tools is limited in many ways......

  10. Frequency response function (FRF) based updating of a laser spot welded structure

    Science.gov (United States)

    Zin, M. S. Mohd; Rani, M. N. Abdul; Yunus, M. A.; Sani, M. S. M.; Wan Iskandar Mirza, W. I. I.; Mat Isa, A. A.

    2018-04-01

    The objective of this paper is to present frequency response function (FRF) based updating as a method for matching the finite element (FE) model of a laser spot welded structure with a physical test structure. The FE model of the welded structure was developed using CQUAD4 and CWELD element connectors, and NASTRAN was used to calculate the natural frequencies, mode shapes and FRF. Minimization of the discrepancies between the finite element and experimental FRFs was carried out using the exceptional numerical capability of NASTRAN Sol 200. The experimental work was performed under free-free boundary conditions using LMS SCADAS. A vast improvement in the finite element FRF was achieved using FRF-based updating with the two different objective functions proposed.

  11. A comparison of updating algorithms for large N reduced models

    Energy Technology Data Exchange (ETDEWEB)

    Pérez, Margarita García [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); González-Arroyo, Antonio [Instituto de Física Teórica UAM-CSIC, Universidad Autónoma de Madrid,Nicolás Cabrera 13-15, E-28049-Madrid (Spain); Departamento de Física Teórica, C-XI Universidad Autónoma de Madrid,E-28049 Madrid (Spain); Keegan, Liam [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland); Okawa, Masanori [Graduate School of Science, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Core of Research for the Energetic Universe, Hiroshima University,Higashi-Hiroshima, Hiroshima 739-8526 (Japan); Ramos, Alberto [PH-TH, CERN,CH-1211 Geneva 23 (Switzerland)

    2015-06-29

    We investigate Monte Carlo updating algorithms for simulating SU(N) Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole SU(N) matrix at once, or iterating through SU(2) subgroups of the SU(N) matrix, we find the same critical exponent in both cases, and only a slight difference between the two.

  12. A comparison of updating algorithms for large $N$ reduced models

    CERN Document Server

    Pérez, Margarita García; Keegan, Liam; Okawa, Masanori; Ramos, Alberto

    2015-01-01

    We investigate Monte Carlo updating algorithms for simulating $SU(N)$ Yang-Mills fields on a single-site lattice, such as for the Twisted Eguchi-Kawai model (TEK). We show that performing only over-relaxation (OR) updates of the gauge links is a valid simulation algorithm for the Fabricius and Haan formulation of this model, and that this decorrelates observables faster than using heat-bath updates. We consider two different methods of implementing the OR update: either updating the whole $SU(N)$ matrix at once, or iterating through $SU(2)$ subgroups of the $SU(N)$ matrix, we find the same critical exponent in both cases, and only a slight difference between the two.

  13. Second-generation speed limit map updating applications

    DEFF Research Database (Denmark)

    Tradisauskas, Nerius; Agerholm, Niels; Juhl, Jens

    2011-01-01

    Intelligent Speed Adaptation is an Intelligent Transport System developed to significantly improve road safety in helping car drivers maintain appropriate driving behaviour. The system works in connection with the speed limits on the road network. It is thus essential to keep the speed limit map...... used in the Intelligent Speed Adaptation scheme updated. The traditional method of updating speed limit maps on the basis of long time interval observations needed to be replaced by a more efficient speed limit updating tool. In a Danish Intelligent Speed Adaptation trial a web-based tool was therefore...... for map updating should preferably be made on the basis of a commercial map provider, such as Google Maps, and that the real challenge is to oblige road authorities to carry out updates....

  14. Technical Update: Preimplantation Genetic Diagnosis and Screening.

    Science.gov (United States)

    Dahdouh, Elias M; Balayla, Jacques; Audibert, François; Wilson, R Douglas; Audibert, François; Brock, Jo-Ann; Campagnolo, Carla; Carroll, June; Chong, Karen; Gagnon, Alain; Johnson, Jo-Ann; MacDonald, William; Okun, Nanette; Pastuck, Melanie; Vallée-Pouliot, Karine

    2015-05-01

    To update and review the techniques and indications of preimplantation genetic diagnosis (PGD) and preimplantation genetic screening (PGS). Discussion about the genetic and technical aspects of preimplantation reproductive techniques, particularly those using new cytogenetic technologies and embryo-stage biopsy. Clinical outcomes of reproductive techniques following the use of PGD and PGS are included. This update does not discuss in detail the adverse outcomes that have been recorded in association with assisted reproductive technologies. Published literature was retrieved through searches of The Cochrane Library and Medline in April 2014 using appropriate controlled vocabulary (aneuploidy, blastocyst/physiology, genetic diseases, preimplantation diagnosis/methods, fertilization in vitro) and key words (e.g., preimplantation genetic diagnosis, preimplantation genetic screening, comprehensive chromosome screening, aCGH, SNP microarray, qPCR, and embryo selection). Results were restricted to systematic reviews, randomized controlled trials/controlled clinical trials, and observational studies published from 1990 to April 2014. There were no language restrictions. Searches were updated on a regular basis and incorporated in the update to January 2015. Additional publications were identified from the bibliographies of retrieved articles. Grey (unpublished) literature was identified through searching the websites of health technology assessment and health technology-related agencies, clinical practice guideline collections, clinical trial registries, and national and international medical specialty societies. The quality of evidence in this document was rated using the criteria described in the Report of the Canadian Task Force on Preventive Health Care. (Table 1) BENEFITS, HARMS, AND COSTS: This update will educate readers about new preimplantation genetic concepts, directions, and technologies. The major harms and costs identified are those of assisted reproductive

  15. Identifying null meta-analyses that are ripe for updating

    Directory of Open Access Journals (Sweden)

    Fang Manchun

    2003-07-01

    Full Text Available Abstract. Background: As an increasingly large number of meta-analyses are published, quantitative methods are needed to help clinicians and systematic review teams determine when meta-analyses are not up to date. Methods: We propose new methods for determining when non-significant meta-analytic results might be overturned, based on a prediction of the number of participants required in new studies. To guide decision making, we introduce the "new participant ratio", the ratio of the actual number of participants in new studies to the predicted number required to obtain statistical significance. A simulation study was conducted to study the performance of our methods and a real meta-analysis provides further evidence. Results: In our three simulation configurations, our diagnostic test for determining whether a meta-analysis is out of date had sensitivity of 55%, 62%, and 49% with corresponding specificity of 85%, 80%, and 90% respectively. Conclusions: Simulations suggest that our methods are able to detect out-of-date meta-analyses. These quick and approximate methods show promise for use by systematic review teams to help decide whether to commit the considerable resources required to update a meta-analysis. Further investigation and evaluation of the methods is required before they can be recommended for general use.
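
    A rough sketch of how such a "new participant ratio" could be computed under simplifying assumptions (a fixed-effect meta-analysis of mean differences, with the pooled estimate assumed unchanged as new evidence accrues). This is an illustration only, not the authors' exact prediction method, and all numbers are hypothetical.

```python
# Illustrative sketch only: a simplified "new participant ratio" computation,
# assuming a fixed-effect meta-analysis of mean differences and that the pooled
# estimate stays unchanged as new studies accrue. Not the authors' exact method.
import math

def predicted_participants_for_significance(effects, variances, sigma2, alpha_z=1.96):
    """Predict total extra participants needed for the pooled z to reach alpha_z."""
    weights = [1.0 / v for v in variances]
    w_current = sum(weights)
    pooled = sum(w * e for w, e in zip(weights, effects)) / w_current
    if pooled == 0:
        return float("inf")  # a null pooled effect can never become significant
    w_needed = (alpha_z / abs(pooled)) ** 2       # weight required for |z| >= alpha_z
    extra_weight = max(w_needed - w_current, 0.0)
    # For a two-arm mean difference with equal arms, var ~ 4*sigma2/N_total,
    # so a study's weight is N/(4*sigma2)  =>  participants ~ weight * 4 * sigma2.
    return extra_weight * 4.0 * sigma2

def new_participant_ratio(actual_new_participants, predicted_needed):
    """Ratio > 1 suggests the meta-analysis may be ripe for updating."""
    return actual_new_participants / predicted_needed

# Example with hypothetical numbers
effects = [0.10, 0.05, 0.12]          # study mean differences
variances = [0.02, 0.03, 0.025]       # their variances
needed = predicted_participants_for_significance(effects, variances, sigma2=1.0)
print(needed, new_participant_ratio(500, needed))
```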

  16. Toward a holistic environmental impact assessment of marble quarrying and processing: proposal of a novel easy-to-use IPAT-based method.

    Science.gov (United States)

    Capitano, Cinzia; Peri, Giorgia; Rizzo, Gianfranco; Ferrante, Patrizia

    2017-03-01

    Marble is a natural dimension stone that is widely used in building due to its resistance and esthetic qualities. Unfortunately, some concerns have arisen regarding its production process because quarrying and processing activities demand significant amounts of energy and greatly affect the environment. Further, performing an environmental analysis of a production process such as that of marble requires the consideration of many environmental aspects (e.g., noise, vibrations, dust and waste production, energy consumption). Unfortunately, the current impact accounting tools do not seem to be capable of considering all of the major aspects of the (marble) production process that may affect the environment and thus cannot provide a comprehensive and concise assessment of all environmental aspects associated with the marble production process. Therefore, innovative, easy, and reliable methods for evaluating its environmental impact are necessary, and they must be accessible for the non-technician. The present study intends to provide a contribution in this sense by proposing a reliable and easy-to-use evaluation method to assess the significance of the environmental impacts associated with the marble production process. In addition, an application of the method to an actual marble-producing company is presented to demonstrate its practicability. Because of its relative ease of use, the method presented here can also be used as a "self-assessment" tool for pursuing a virtuous environmental policy because it enables company owners to easily identify the segments of their production chain that most require environmental enhancement.
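
    For context, the IPAT identity that gives the method its name is the classic relation below (generic textbook form; how the paper adapts it to rank the environmental aspects of marble production is not detailed in the abstract).

```latex
% Generic IPAT identity (Impact = Population x Affluence x Technology);
% the paper adapts this idea to score environmental aspects, details not shown here.
\[
  I \;=\; P \times A \times T
\]
```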

  17. WIMS nuclear data library and its updating

    Energy Technology Data Exchange (ETDEWEB)

    Bakhtyar, S; Salahuddin, A; Arshad, M

    1995-10-01

    This report gives a brief overview of the status of reactor physics computer code WIMS-D/4 and its library. It presents the details of WIMS-D/4 Library Update Project (WLUP), initiated by International Atomic Energy Agency (IAEA) with the goal of providing updated nuclear data library to the user of WIMS-D/4. The WLUP was planned to be executed in several stages. In this report the calculations performed for the first stage are presented. A number of benchmarks for light water and heavy water lattices proposed by IAEA have been analysed and the results have been compared with the average of experimental values, the IAEA reference values and the average of calculated results from different international laboratories. (author) 8 figs.

  18. WIMS nuclear data library and its updating

    International Nuclear Information System (INIS)

    Bakhtyar, S.; Salahuddin, A.; Arshad, M.

    1995-10-01

    This report gives a brief overview of the status of reactor physics computer code WIMS-D/4 and its library. It presents the details of WIMS-D/4 Library Update Project (WLUP), initiated by International Atomic Energy Agency (IAEA) with the goal of providing updated nuclear data library to the user of WIMS-D/4. The WLUP was planned to be executed in several stages. In this report the calculations performed for the first stage are presented. A number of benchmarks for light water and heavy water lattices proposed by IAEA have been analysed and the results have been compared with the average of experimental values, the IAEA reference values and the average of calculated results from different international laboratories. (author) 8 figs

  19. Information dissemination model for social media with constant updates

    Science.gov (United States)

    Zhu, Hui; Wu, Heng; Cao, Jin; Fu, Gang; Li, Hui

    2018-07-01

    With the development of social media tools and the pervasiveness of smart terminals, social media has become a significant source of information for many individuals. However, false information can spread rapidly, which may result in negative social impacts and serious economic losses. Thus, reducing the unfavorable effects of false information has become an urgent challenge. In this paper, a new competitive model called DMCU is proposed to describe the dissemination of information with constant updates in social media. In the model, we focus on the competitive relationship between the original false information and updated information, and then propose the priority of related information. To better evaluate the effectiveness of the proposed model, data sets containing actual social media activity are utilized in experiments. Simulation results demonstrate that the DMCU model can precisely describe the process of information dissemination with constant updates, and that it can be used to forecast information dissemination trends on social media.
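
    A generic discrete-time sketch of two competing pieces of information (the original false information and a constantly injected update) is shown below, purely to illustrate the competitive-dissemination idea. It is not the DMCU model itself, and all rates are hypothetical.

```python
# Generic competitive-spread sketch (NOT the DMCU model): original false info (F)
# and a constantly injected update (U) compete for susceptible users (S) in
# discrete time. All parameters are illustrative only.
def simulate(beta_f=0.30, beta_u=0.45, injection=0.01, steps=50):
    S, F, U = 0.99, 0.01, 0.0
    history = []
    for _ in range(steps):
        pushed = min(injection, S)      # the platform pushes the update each step
        S -= pushed
        U += pushed
        new_f = beta_f * S * F          # susceptible users adopting the false info
        new_u = beta_u * S * U          # susceptible users adopting the update
        total = new_f + new_u
        if total > S:                   # cannot convert more users than remain
            scale = S / total
            new_f, new_u = new_f * scale, new_u * scale
        S, F, U = S - new_f - new_u, F + new_f, U + new_u
        history.append((S, F, U))
    return history

for t, (S, F, U) in enumerate(simulate()[::10]):
    print(f"step {t*10:2d}: S={S:.3f} F={F:.3f} U={U:.3f}")
```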

  20. WIMS Library updating

    Energy Technology Data Exchange (ETDEWEB)

    Ravnik, M; Trkov, A [Inst. Jozef Stefan, Ljubljana (Slovenia); Holubar, A [Ustav Jaderneho Vyzkumu CSKAE, Rez (Serbia and Montenegro)

    1992-07-01

    At the end of 1990 the WIMS Library Update Project (WLUP) was initiated at the International Atomic Energy Agency. The project was organized as an international research project, coordinated at the J. Stefan Institute. Up to now, 22 laboratories from 19 countries have joined the project. Phase 1 of the project, which included WIMS input optimization for five experimental benchmark lattices, has been completed. The work presented in this paper also describes the results of Phase 2 of the Project, in which the cross sections based on the ENDF/B-IV evaluated nuclear data library have been processed. (author) [Slovenian] At the end of 1990 the project to update the WIMS cross-section library (WIMS Library Updating Project, WLUP) was started at the International Atomic Energy Agency. Twenty-two laboratories from 19 countries participate in the project, which is coordinated at the Jozef Stefan Institute. Phase 1 of the project, covering the optimization of the WIMS input model for five experimental test problems, has been completed. The results of Phase 2, in which cross sections based on the ENDF/B-IV file were processed, are also presented. (author)

  1. Cross-Cultural Adaptation and Validation of the MPAM-R to Brazilian Portuguese and Proposal of a New Method to Calculate Factor Scores

    Science.gov (United States)

    Albuquerque, Maicon R.; Lopes, Mariana C.; de Paula, Jonas J.; Faria, Larissa O.; Pereira, Eveline T.; da Costa, Varley T.

    2017-01-01

    In order to understand the reasons that lead individuals to practice physical activity, researchers developed the Motives for Physical Activity Measure-Revised (MPAM-R) scale. In 2010, a translation of the MPAM-R to Portuguese and its validation were performed. However, the psychometric measures were not acceptable. In addition, factor scores in some sports psychology scales are calculated as the mean of the scores of the items in the factor. Nevertheless, it seems appropriate that items with higher factor loadings, extracted by Factor Analysis, have greater weight in the factor score, while items with lower factor loadings have less weight in the factor score. The aims of the present study were to translate and validate a Portuguese version of the MPAM-R and to investigate agreement between two methods used to calculate factor scores. Three hundred volunteers who were involved in physical activity programs for at least 6 months were recruited. Confirmatory Factor Analysis of the 30 items indicated that the version did not fit the model. After excluding four items, the final model with 26 items showed acceptable model fit measures by Exploratory Factor Analysis, and it conceptually supported the five factors of the original proposal. When the two methods of calculating factor scores were compared, our results showed that only the “Enjoyment” and “Appearance” factors showed agreement between methods. So, the Portuguese version of the MPAM-R can be used in a Brazilian context, and a new proposal for the calculation of the factor score seems to be promising. PMID:28293203
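
    The contrast between the two scoring rules can be illustrated with a small sketch: a plain mean of a factor's items versus a factor-loading-weighted mean. The item responses and loadings below are hypothetical, not the MPAM-R's published values.

```python
# Illustrative comparison of two factor-score rules (hypothetical loadings and
# responses, not the MPAM-R's published values): the plain mean of a factor's
# items versus a loading-weighted mean, where high-loading items count more.
def mean_score(responses):
    return sum(responses) / len(responses)

def loading_weighted_score(responses, loadings):
    return sum(r * l for r, l in zip(responses, loadings)) / sum(loadings)

responses = [5, 3, 4, 2]               # one participant's answers on a 1-7 scale
loadings  = [0.82, 0.55, 0.74, 0.40]   # hypothetical EFA loadings for the factor

print(mean_score(responses))                                   # 3.5
print(round(loading_weighted_score(responses, loadings), 2))   # high-loading items dominate
```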

  2. Numerical model updating technique for structures using firefly algorithm

    Science.gov (United States)

    Sai Kubair, K.; Mohan, S. C.

    2018-03-01

    Numerical model updating is a technique used for updating existing numerical models of structures in civil, mechanical, automotive, marine, aerospace engineering, etc. The basic concept behind this technique is updating the numerical models to closely match the experimental data obtained from real or prototype test structures. The present work involves the development of a numerical model using MATLAB as a computational tool, together with mathematical equations that define the experimental model. The firefly algorithm is used as an optimization tool in this study. In this updating process, a response parameter of the structure has to be chosen, which helps to correlate the numerical model developed with the experimental results obtained. The variables for the updating can be either material or geometrical properties of the model or both. In this study, to verify the proposed technique, a cantilever beam is analyzed for its tip deflection and a space frame is analyzed for its natural frequencies. Both models are updated with their respective response values obtained from experimental results. The numerical results after updating show that a close match can be achieved between the experimental and numerical models.
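
    A minimal sketch of the kind of optimization loop involved is given below, assuming a one-parameter, single-degree-of-freedom model whose natural frequency is tuned to a hypothetical measured value. The firefly parameters (beta0, gamma, alpha) are generic defaults rather than the study's settings, and the study itself used MATLAB rather than Python.

```python
# Minimal firefly-algorithm sketch for model updating (illustrative only):
# tune a stiffness parameter k so that a 1-DOF model's natural frequency matches
# a "measured" value. All parameter values are generic, not the paper's settings.
import math, random

f_measured = 12.0                      # Hz, hypothetical experimental frequency
mass = 2.0                             # kg, assumed known

def objective(k):                      # discrepancy between model and test
    f_model = math.sqrt(k / mass) / (2 * math.pi)
    return abs(f_model - f_measured)

def firefly_update(n=15, iters=100, lo=1e3, hi=5e4,
                   beta0=1.0, gamma=1e-9, alpha=500.0):
    x = [random.uniform(lo, hi) for _ in range(n)]
    for _ in range(iters):
        bright = [-objective(k) for k in x]        # brighter = smaller error
        for i in range(n):
            for j in range(n):
                if bright[j] > bright[i]:          # move firefly i toward brighter j
                    r2 = (x[i] - x[j]) ** 2
                    beta = beta0 * math.exp(-gamma * r2)
                    x[i] += beta * (x[j] - x[i]) + alpha * (random.random() - 0.5)
                    x[i] = min(max(x[i], lo), hi)
                    bright[i] = -objective(x[i])
        alpha *= 0.97                              # cool the random walk
    return min(x, key=objective)

k_best = firefly_update()
print(k_best, math.sqrt(k_best / mass) / (2 * math.pi))
```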

  3. Particle Filtering Methods for Incorporating Intelligence Updates

    Science.gov (United States)

    2017-03-01

    3.2.1 Particle Filtering through Bayesian Bootstrap Sampling: Although SIS helps resolve the computational and complexity issues [...], this insight was called the Bayesian bootstrap filter, or more commonly, the particle filter. Multiple particles are sampled from [...]
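
    A minimal bootstrap ("Bayesian bootstrap") particle filter sketch is given below for a generic one-dimensional random-walk state with noisy observations, showing only the propagate-weight-resample cycle; it is not the report's specific intelligence-update formulation.

```python
# Minimal bootstrap particle filter sketch (generic 1-D random walk with noisy
# observations); illustrates the propagate -> weight -> resample cycle only.
import math, random

def gaussian_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def bootstrap_filter(observations, n_particles=500, proc_sigma=1.0, obs_sigma=2.0):
    particles = [random.gauss(0.0, 5.0) for _ in range(n_particles)]
    estimates = []
    for y in observations:
        # 1. propagate each particle through the state (random-walk) model
        particles = [p + random.gauss(0.0, proc_sigma) for p in particles]
        # 2. weight particles by the likelihood of the new observation
        weights = [gaussian_pdf(y, p, obs_sigma) for p in particles]
        total = sum(weights) or 1e-300
        weights = [w / total for w in weights]
        # 3. resample with replacement (the "bootstrap" step)
        particles = random.choices(particles, weights=weights, k=n_particles)
        estimates.append(sum(particles) / n_particles)
    return estimates

true_path = [0.5 * t for t in range(30)]                   # hidden drifting state
obs = [x + random.gauss(0.0, 2.0) for x in true_path]      # noisy measurements
print(bootstrap_filter(obs)[-1], true_path[-1])
```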

  4. Hamstring injuries: update article

    Directory of Open Access Journals (Sweden)

    Lucio Ernlund

    Full Text Available ABSTRACT Hamstring (HS muscle injuries are the most common injury in sports. They are correlated to long rehabilitations and have a great tendency to recur. The HS consist of the long head of the biceps femoris, semitendinosus, and semimembranosus. The patient's clinical presentation depends on the characteristics of the lesion, which may vary from strain to avulsions of the proximal insertion. The most recognized risk factor is a previous injury. Magnetic resonance imaging is the method of choice for the injury diagnosis and classification. Many classification systems have been proposed; the current classifications aim to describe the injury and correlate it to the prognosis. The treatment is conservative, with the use of anti-inflammatory drugs in the acute phase followed by a muscle rehabilitation program. Proximal avulsions have shown better results with surgical repair. When the patient is pain free, shows recovery of strength and muscle flexibility, and can perform the sport's movements, he/she is able to return to play. Prevention programs based on eccentric strengthening of the muscles have been indicated both to prevent the initial injury as well as preventing recurrence.

  5. Oriented Polar Molecules in a Solid Inert-Gas Matrix: A Proposed Method for Measuring the Electric Dipole Moment of the Electron

    Directory of Open Access Journals (Sweden)

    A. C. Vutha

    2018-01-01

    Full Text Available We propose a very sensitive method for measuring the electric dipole moment of the electron using polar molecules embedded in a cryogenic solid matrix of inert-gas atoms. The polar molecules can be oriented in the ẑ-direction by an applied electric field, as has recently been demonstrated by Park et al. The trapped molecules are prepared into a state that has its electron spin perpendicular to ẑ, and a magnetic field along ẑ causes precession of this spin. An electron electric dipole moment d_e would affect this precession due to the up to 100 GV/cm effective electric field produced by the polar molecule. The large number of polar molecules that can be embedded in a matrix, along with the expected long coherence times for the precession, allows for the possibility of measuring d_e to an accuracy that surpasses current measurements by many orders of magnitude. Because the matrix can inhibit molecular rotations and lock the orientation of the polar molecules, it may not be necessary to have an electric field present during the precession. The proposed technique can be applied using a variety of polar molecules and inert gases, which, along with other experimental variables, should allow for careful study of systematic uncertainties in the measurement.

  6. Oriented Polar Molecules in a Solid Inert-Gas Matrix: A Proposed Method for Measuring the Electric Dipole Moment of the Electron

    Science.gov (United States)

    Vutha, A.; Horbatsch, M.; Hessels, E.

    2018-01-01

    We propose a very sensitive method for measuring the electric dipole moment of the electron using polar molecules embedded in a cryogenic solid matrix of inert-gas atoms. The polar molecules can be oriented in the $\\hat{\\rm{z}}$ direction by an applied electric field, as has recently been demonstrated by Park, et al. [Angewandte Chemie {\\bf 129}, 1066 (2017)]. The trapped molecules are prepared into a state which has its electron spin perpendicular to $\\hat{\\rm{z}}$, and a magnetic field along $\\hat{\\rm{z}}$ causes precession of this spin. An electron electric dipole moment $d_e$ would affect this precession due to the up to 100~GV/cm effective electric field produced by the polar molecule. The large number of polar molecules that can be embedded in a matrix, along with the expected long coherence times for the precession, allows for the possibility of measuring $d_e$ to an accuracy that surpasses current measurements by many orders of magnitude. Because the matrix can inhibit molecular rotations and lock the orientation of the polar molecules, it may not be necessary to have an electric field present during the precession. The proposed technique can be applied using a variety of polar molecules and inert gases, which, along with other experimental variables, should allow for careful study of systematic uncertainties in the measurement.

  7. Large-strain optical fiber sensing and real-time FEM updating of steel structures under the high temperature effect

    International Nuclear Information System (INIS)

    Huang, Ying; Fang, Xia; Xiao, Hai; Bevans, Wesley James; Chen, Genda; Zhou, Zhi

    2013-01-01

    Steel buildings are subjected to fire hazards during or immediately after a major earthquake. Under combined gravity and thermal loads, they have non-uniformly distributed stiffness and strength, and thus collapse progressively with large deformation. In this study, large-strain optical fiber sensors for high temperature applications and a temperature-dependent finite element model updating method are proposed for accurate prediction of structural behavior in real time. The optical fiber sensors can measure strains up to 10% at approximately 700 °C. Their measurements are in good agreement with those from strain gauges up to 0.5%. In comparison with the experimental results, the proposed model updating method can reduce the predicted strain errors from over 75% to below 20% at 800 °C. The minimum number of sensors in a fire zone that can properly characterize the vertical temperature distribution of heated air due to the gravity effect should be included in the proposed model updating scheme to achieve a predetermined simulation accuracy. (paper)

  8. Automated finite element updating using strain data for the lifetime reliability assessment of bridges

    International Nuclear Information System (INIS)

    Okasha, Nader M.; Frangopol, Dan M.; Orcesi, André D.

    2012-01-01

    The importance of improving the understanding of the performance of structures over their lifetime under uncertainty with information obtained from structural health monitoring (SHM) has been widely recognized. However, frameworks that efficiently integrate monitoring data into the life-cycle management of structures are yet to be developed. The objective of this paper is to propose and illustrate an approach for updating the lifetime reliability of aging bridges using monitored strain data obtained from crawl tests. It is proposed to use automated finite element model updating techniques as a tool for updating the resistance parameters of the structure. In this paper, the results from crawl tests are used to update the finite element model and, in turn, update the lifetime reliability. The original and updated lifetime reliabilities are computed using advanced computational tools. The approach is illustrated on an existing bridge.

  9. Additional comments on 'A proposed method for measuring the electric dipole moment of the neutron using acceleration in an electric field gradient and ultracold neutron interferometry'

    CERN Document Server

    Lamoreaux, S K

    1999-01-01

    We have previously (Lamoreaux and Golub, Los Alamos archive (xxx) nucl-ex/9901007vs, Nucl. Instr. and Meth., 433 (1999)) presented an analysis, using classical, semi-classical and quantum mechanical techniques, of the proposal of Freedman et al., (Nucl. Instr. and Meth., A 396 (1997) 181) to search for the neutron electric dipole moment by the use of acceleration of ultracold neutrons in an inhomogeneous electric field followed by amplification of the resulting displacement by several methods involving spin independent interactions (gravity) or reflection from curved (spin independent) mirrors. Following the appearance of some more recent comments (Peshkin, Los Alamos archive (xxx) nucl-ex/9903012 v2; Dombeck and Ringo, Nucl. Instr. and Meth., A 433 (1999)) it now seems reasonable to publish a revised version of our quantum mechanical treatment (Section 2 B of ) with a more detailed exposition.

  10. Spumaretroviruses: Updated taxonomy and nomenclature.

    Science.gov (United States)

    Khan, Arifa S; Bodem, Jochen; Buseyne, Florence; Gessain, Antoine; Johnson, Welkin; Kuhn, Jens H; Kuzmak, Jacek; Lindemann, Dirk; Linial, Maxine L; Löchelt, Martin; Materniak-Kornas, Magdalena; Soares, Marcelo A; Switzer, William M

    2018-03-01

    Spumaretroviruses, commonly referred to as foamy viruses, are complex retroviruses belonging to the subfamily Spumaretrovirinae, family Retroviridae, which naturally infect a variety of animals including nonhuman primates (NHPs). Additionally, cross-species transmissions of simian foamy viruses (SFVs) to humans have occurred following exposure to tissues of infected NHPs. Recent research has led to the identification of previously unknown exogenous foamy viruses, and to the discovery of endogenous spumaretrovirus sequences in a variety of host genomes. Here, we describe an updated spumaretrovirus taxonomy that has been recently accepted by the International Committee on Taxonomy of Viruses (ICTV) Executive Committee, and describe a virus nomenclature that is generally consistent with that used for other retroviruses, such as lentiviruses and deltaretroviruses. This taxonomy can be applied to distinguish different, but closely related, primate (e.g., human, ape, simian) foamy viruses as well as those from other hosts. This proposal accounts for host-virus co-speciation and cross-species transmission. Published by Elsevier Inc.

  11. Update on equine allergies.

    Science.gov (United States)

    Fadok, Valerie A

    2013-12-01

    Horses develop many skin and respiratory disorders that have been attributed to allergy. These disorders include pruritic skin diseases, recurrent urticaria, allergic rhinoconjunctivitis, and reactive airway disease. Allergen-specific IgE has been detected in these horses, and allergen-specific immunotherapy is used to ameliorate clinical signs. The best understood atopic disease in horses is insect hypersensitivity, but the goal of effective treatment with allergen-specific immunotherapy remains elusive. In this review, updates in pathogenesis of allergic states and a brief mention of the new data on what is known in humans and dogs and how that relates to equine allergic disorders are discussed. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Hypertrophic Cardiomyopathy: Clinical Update.

    Science.gov (United States)

    Geske, Jeffrey B; Ommen, Steve R; Gersh, Bernard J

    2018-05-01

    Hypertrophic cardiomyopathy (HCM) is the most common heritable cardiomyopathy, manifesting as left ventricular hypertrophy in the absence of a secondary cause. The genetic underpinnings of HCM arise largely from mutations of sarcomeric proteins; however, the specific underlying mutation often remains undetermined. Patient presentation is phenotypically diverse, ranging from asymptomatic to heart failure or sudden cardiac death. Left ventricular hypertrophy and abnormal ventricular configuration result in dynamic left ventricular outflow obstruction in most patients. The goal of therapeutic interventions is largely to reduce dynamic obstruction, with treatment modalities spanning lifestyle modifications, pharmacotherapies, and septal reduction therapies. A small subset of patients with HCM will experience sudden cardiac death, and risk stratification remains a clinical challenge. This paper presents a clinical update for diagnosis, family screening, clinical imaging, risk stratification, and management of symptoms in patients with HCM. Copyright © 2018 American College of Cardiology Foundation. Published by Elsevier Inc. All rights reserved.

  13. Y2K UPDATE

    CERN Multimedia

    Sverre JARP

    1999-01-01

    Concerning Y2K preparation, please note the following:Everybody, who has a NICE installation on his/her PC, needs to log in to NICE at least once before Xmas to get the Y2K update installed. This applies especially to dual boot systems.The test schedule on Y2Kplus.cern.ch will be prolonged. The last restart took place on 10 November and two more will take place on 24 November and 8 December, respectively. The Oracle users responsible for the maintenance of Oracle Forms applications which include PL/SQL blocks where date fields are handled with the default format are requested to contact oracle.support@cern.ch at their earliest convenience.Sverre Jarp (CERN Y2K co-ordinator, phone: 74944)

  14. Amblyopia update: new treatments.

    Science.gov (United States)

    Vagge, Aldo; Nelson, Leonard B

    2016-09-01

    This review article is an update on the current treatments for amblyopia. In particular, the authors focus on the concepts of brain plasticity and their implications for novel treatment strategies for both children and adults affected by amblyopia. A variety of strategies has been developed to treat amblyopia in children and adults. New evidence on the pathogenesis of amblyopia has been obtained both in animal models and in clinical trials. Mainly, these studies have challenged the classical concept that amblyopia becomes untreatable after the 'end' of the sensitive or critical period of visual development, because of a lack of sufficient plasticity in the adult brain. New treatments for amblyopia in children and adults are desirable and should be encouraged. However, further studies should be completed before such therapies are widely accepted into clinical practice.

  15. Memory updating and mental arithmetic

    Directory of Open Access Journals (Sweden)

    Cheng-Ching eHan

    2016-02-01

    Full Text Available Is domain-general memory updating ability predictive of calculation skills or are such skills better predicted by the capacity for updating specifically numerical information? Here, we used multidigit mental multiplication (MMM) as a measure for calculating skill as this operation requires the accurate maintenance and updating of information in addition to skills needed for arithmetic more generally. In Experiment 1, we found that only individual differences with regard to a task updating numerical information following addition (MUcalc) could predict the performance of MMM, perhaps owing to common elements between the task and MMM. In Experiment 2, new updating tasks were designed to clarify this: a spatial updating task with no numbers, a numerical task with no calculation, and a word task. The results showed that both MUcalc and the spatial task were able to predict the performance of MMM but only with the more difficult problems, while other updating tasks did not predict performance. It is concluded that relevant processes involved in updating the contents of working memory support mental arithmetic in adults.

  16. Oil sands development update

    International Nuclear Information System (INIS)

    1999-01-01

    A detailed review and update of oil sands development in Alberta are provided, covering the production and economic aspects of the industry. It is pointed out that at present oil sands account for 28 per cent of Canadian crude oil production, expected to reach 50 per cent by 2005. Based on recent announcements, a total of 26 billion dollars worth of projects are in progress or planned; 20 billion dollars worth of this development is in the Athabasca area, the remainder in Cold Lake and other areas. The current update envisages up to 1,800,000 barrels per day by 2008, creating 47,000 new jobs and total government revenues through direct and indirect taxes of 118 billion dollars. Provinces other than Alberta also benefit from these developments, since 60 per cent of all employment and income created by oil sands production is in other parts of Canada. Up to 60 per cent of the expansion is for goods and services and of this, 50 to 55 per cent will be purchased from Canadian sources. The remaining 40 per cent of the new investment is for engineering and construction, of which 95 per cent is Canadian content. The Aboriginal workforce, by common consent of existing operators, matches regional representation (about 13 per cent), and new developers are expected to match these standards. Planned or ongoing development in environmental protection through improved technologies and optimization, energy efficiency and improved tailings management, and active support of flexibility mechanisms such as emission credits trading, joint implementation and carbon sinks are very high on the industry's agenda. The importance of offsets is discussed extensively along with key considerations for international negotiations, as well as further research of other options such as sequestration, environmentally benign disposal of waste, and enhanced voluntary action

  17. ADAS Update and Maintainability

    Science.gov (United States)

    Watson, Leela R.

    2010-01-01

    Since 2000, both the National Weather Service Melbourne (NWS MLB) and the Spaceflight Meteorology Group (SMG) have used a local data integration system (LDIS) as part of their forecast and warning operations. The original LDIS was developed by the Applied Meteorology Unit (AMU) in 1998 (Manobianco and Case 1998) and has undergone subsequent improvements. Each has benefited from three-dimensional (3-D) analyses that are delivered to forecasters every 15 minutes across the peninsula of Florida. The intent is to generate products that enhance short-range weather forecasts issued in support of NWS MLB and SMG operational requirements within East Central Florida. The current LDIS uses the Advanced Regional Prediction System (ARPS) Data Analysis System (ADAS) package as its core, which integrates a wide variety of national, regional, and local observational data sets. It assimilates all available real-time data within its domain and is run at a finer spatial and temporal resolution than current national or regional-scale analysis packages. As such, it provides local forecasters with a more comprehensive understanding of evolving fine-scale weather features. Over the years, the LDIS has become problematic to maintain since it depends on AMU-developed shell scripts that were written for an earlier version of the ADAS software. The goals of this task were to update the NWS MLB/SMG LDIS with the latest version of ADAS, incorporate new sources of observational data, and upgrade and modify the AMU-developed shell scripts written to govern the system. In addition, the previously developed ADAS graphical user interface (GUI) was updated. Operationally, these upgrades will result in more accurate depictions of the current local environment to help with short-range weather forecasting applications, while also offering an improved initialization for local versions of the Weather Research and Forecasting (WRF) model used by both groups.

  18. Evaluation of acoustic resonance at branch section in main steam line. Part 2. Proposal of method for predicting resonance frequency in steam flow

    International Nuclear Information System (INIS)

    Uchiyama, Yuta; Morita, Ryo

    2012-01-01

    Flow-induced acoustic resonances of piping systems containing closed side branches are sometimes encountered in power plants. Acoustic standing waves with large-amplitude pressure fluctuations in closed side branches are excited by the unstable shear layer that separates the mean flow in the main piping from the stagnant fluid in the branch. In a U.S. NPP, the steam dryer was damaged by high-cycle fatigue due to acoustic-induced vibration under a power uprate condition. Our previous research developed a method for evaluating the acoustic resonance at branch sections in actual power plants by using CFD. In that method, the sound speed in wet steam is evaluated theoretically on the assumption of homogeneous flow, although it may differ from the actual sound speed in wet steam, so it is necessary to introduce the most suitable model of the practical sound speed in wet steam. In addition, we tried to develop a simplified method for predicting the amplitude and frequency of the pressure fluctuation in wet steam flow. Our previous experimental research clarified the resonance amplitude of the fluctuating pressure at the top of the branch in wet steam. However, the resonance frequency in steam conditions could not be estimated with the theoretical equation, because the end correction in steam conditions and the sound speed in wet steam were not clarified, for the same reason as in the CFD work. Therefore, in this study, we evaluated the end correction for both dry and wet steam and the sound speed in wet steam from the experimental results. As a result, a method for predicting the resonance frequency with the theoretical equation in both wet and dry steam conditions is proposed. (author)
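
    For reference, the "theoretical equation" for the resonance frequency of a closed side branch is commonly written as a quarter-wave relation with an end correction (textbook form; in the notation below, L is the branch length, δ the end correction, c the sound speed, and n the mode number, the latter two being exactly the quantities the study evaluates for wet and dry steam).

```latex
% Quarter-wave resonance of a closed side branch of length L with end correction \delta
% (textbook form; c is the sound speed in the working fluid, n = 1, 2, ...).
\[
  f_n \;=\; \frac{(2n-1)\,c}{4\,(L + \delta)}
\]
```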

  19. Dynapenia and aging: an update.

    Science.gov (United States)

    Manini, Todd M; Clark, Brian C

    2012-01-01

    In 2008, we published an article arguing that the age-related loss of muscle strength is only partially explained by the reduction in muscle mass and that other physiologic factors explain muscle weakness in older adults (Clark BC, Manini TM. Sarcopenia =/= dynapenia. J Gerontol A Biol Sci Med Sci. 2008;63:829-834). Accordingly, we proposed that these events (strength and mass loss) be defined independently, leaving the term "sarcopenia" to be used in its original context to describe the age-related loss of muscle mass. We subsequently coined the term "dynapenia" to describe the age-related loss of muscle strength and power. This article will give an update on both the biological and clinical literature on dynapenia-serving to best synthesize this translational topic. Additionally, we propose a working decision algorithm for defining dynapenia. This algorithm is specific to screening for and defining dynapenia using age, presence or absence of risk factors, a grip strength screening, and if warranted a test for knee extension strength. A definition for a single risk factor such as dynapenia will provide information in building a risk profile for the complex etiology of physical disability. As such, this approach mimics the development of risk profiles for cardiovascular disease that include such factors as hypercholesterolemia, hypertension, hyperglycemia, etc. Because of a lack of data, the working decision algorithm remains to be fully developed and evaluated. However, these efforts are expected to provide a specific understanding of the role that dynapenia plays in the loss of physical function and increased risk for disability among older adults.

  20. Working Memory Updating Latency Reflects the Cost of Switching between Maintenance and Updating Modes of Operation

    Science.gov (United States)

    Kessler, Yoav; Oberauer, Klaus

    2014-01-01

    Updating and maintenance of information are 2 conflicting demands on working memory (WM). We examined the time required to update WM (updating latency) as a function of the sequence of updated and not-updated items within a list. Participants held a list of items in WM and updated a variable subset of them in each trial. Four experiments that vary…

  1. N Reactor updated safety analysis report, NUSAR

    International Nuclear Information System (INIS)

    1978-01-01

    An update of the N Reactor safety analysis is presented to reconfirm that the continued operation does not pose undue risk to DOE personnel and property, the public, or the environment. A reanalysis of LOCA and reactivity transients utilizing current codes and methods is made. The principal aspects of the overall submission, a general description, and site characteristics including geography and demography, nearby industrial, transportation and military facilities, meteorology, hydraulic engineering, and geology and seismology are described

  2. Real-time numerical shake prediction and updating for earthquake early warning

    Science.gov (United States)

    Wang, Tianyun; Jin, Xing; Wei, Yongxiang; Huang, Yandan

    2017-12-01

    Ground motion prediction is important for earthquake early warning systems, because the region's peak ground motion indicates the potential disaster. In order to predict the peak ground motion quickly and precisely with limited station wave records, we propose a real-time numerical shake prediction and updating method. Our method first predicts the ground motion based on a ground motion prediction equation after P-wave detection at several stations, denoted as the initial prediction. In order to correct the prediction error of the initial prediction, an updating scheme based on real-time simulation of wave propagation is designed. A data assimilation technique is incorporated to predict the distribution of seismic wave energy precisely. Radiative transfer theory and Monte Carlo simulation are used for modeling wave propagation in 2-D space, and the peak ground motion is calculated as quickly as possible. Our method has the potential to predict the shakemap, allowing the potential disaster to be predicted before the real disaster happens. The 2008 MS 8.0 Wenchuan earthquake is studied as an example to show the validity of the proposed method.
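
    The "initial prediction" step relies on a ground motion prediction equation; a generic attenuation-style relation of that kind is sketched below with hypothetical coefficients, not the regression actually used in the paper.

```python
# Generic ground-motion-prediction-equation (GMPE) sketch for an "initial
# prediction" step: ln(PGA) modeled as a function of magnitude and distance.
# Coefficients a, b, c, d, h are hypothetical, not the paper's regression values.
import math

def predict_pga(magnitude, epicentral_km,
                a=-3.5, b=1.0, c=-1.5, d=0.006, h=10.0):
    """Return peak ground acceleration (g) from a simple attenuation relation."""
    r = math.sqrt(epicentral_km ** 2 + h ** 2)   # distance with a focal-depth term
    ln_pga = a + b * magnitude + c * math.log(r) - d * r
    return math.exp(ln_pga)

for dist in (10, 50, 100):
    print(dist, round(predict_pga(7.0, dist), 4))
```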

  3. Effective public involvement in the HoST-D Programme for dementia home care support: From proposal and design to methods of data collection (innovative practice).

    Science.gov (United States)

    Giebel, Clarissa; Roe, Brenda; Hodgson, Anthony; Britt, David; Clarkson, Paul

    2017-01-01

    Public involvement is an important element in health and social care research. However, it is rarely evaluated in research. This paper discusses the utility and impact of public involvement of carers and people with dementia in a five-year programme on effective home support in dementia, from proposal and design to methods of data collection, and provides a useful guide for future research on how to effectively involve the public. The Home SupporT in Dementia (HoST-D) Programme comprises two elements of public involvement: a small reference group and a virtual lay advisory group. Involving carers and people with dementia is based on the six key values of involvement - respect, support, transparency, responsiveness, fairness of opportunity, and accountability. Carers and people with dementia gave opinions on study information, methods of data collection, an economic model, case vignettes, and a memory aid booklet, which were all taken into account. Public involvement has provided benefits to the programme whilst being considerate of the time constraints and geographical locations of members.

  4. Thesis Proposal

    DEFF Research Database (Denmark)

    Sloth, Erik

    2010-01-01

    The structure of the Thesis proposal is as follows: first, my concrete empirical research projects, which will result in the articles of the dissertation, are presented. I then present the theoretical considerations concerning the concept of experience and consumer culture theory that form the background for how I have arrived at...

  5. A proposal for a test method for assessment of hazard property HP 12 ("Release of an acute toxic gas") in hazardous waste classification - Experience from 49 waste.

    Science.gov (United States)

    Hennebert, Pierre; Samaali, Ismahen; Molina, Pauline

    2016-12-01

    A stepwise method for assessment of the HP 12 is proposed and tested with 49 waste samples. The hazard property HP 12 is defined as "Release of an acute toxic gas": waste which releases acute toxic gases (Acute Tox. 1, 2 or 3) in contact with water or an acid. When a waste contains a substance assigned to one of the following supplemental hazards EUH029, EUH031 and EUH032, it shall be classified as hazardous by HP 12 according to test methods or guidelines (EC, 2014a, 2014b). When the substances with the cited hazard statement codes react with water or an acid, they can release HCl, Cl2, HF, HCN, PH3, H2S, SO2 (and two other gases very unlikely to be emitted, hydrazoic acid HN3 and selenium oxide SeO2 - a solid with low vapor pressure). Hence, a method is proposed: For a set of 49 waste, water addition did not produce gas. Nearly all the solid waste produced a gas in contact with hydrochloric acid in 5 min in an automated calcimeter with a volume >0.1 L of gas per kg of waste. Since a plateau of pressure is reached only for half of the samples in 5 min, 6 h trials with calorimetric bombs or glass flasks were done and confirmed the results. Identification of the gases by portable probes showed that most of the tested samples emit mainly CO2. Toxic gases are emitted by four waste: metallic dust from the aluminum industry (CO), two air pollution control residues of industrial waste incinerators (H2S) and a halogenated solvent (organic volatile compound(s)). HF has not been measured in these trials started before the present definition of HP 12. According to the definition of HP 12, only the H2S emission of substances with hazard statement EUH031 is accounted for. In view of the calcium content of the two air pollution control residues, the presence of calcium sulphide (EUH031) can be assumed. These two waste are therefore classified potentially hazardous for HP 12, from a total of 49 waste. They are also classified as hazardous for other properties (HP 7

  6. A general framework for updating belief distributions.

    Science.gov (United States)

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
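
    The loss-based update described in the abstract is commonly written in the Gibbs-posterior form below (this is a standard rendering rather than a quotation from the paper; w is a loss-scale or learning-rate parameter that calibrates the loss against the prior).

```latex
% Loss-based belief update (general Bayesian / Gibbs-posterior form); w > 0 is a
% loss scale calibrating the loss against the prior. With w = 1 and
% l(theta, x) = -log f(x | theta), the standard Bayes update is recovered.
\[
  \pi(\theta \mid x) \;\propto\; \exp\{-w\,\ell(\theta, x)\}\,\pi(\theta)
\]
```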

  7. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    Science.gov (United States)

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.
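
    As a simplified illustration of how an inspection outcome can update a probabilistic fatigue life (this is not the paper's crawl-test and finite-element procedure), the sketch below conditions a hypothetical lognormal life distribution on survival to the inspection year.

```python
# Illustrative sketch (not the paper's system-reliability method): Bayesian
# updating of a probabilistic fatigue life after an inspection at year t_insp
# finds no fatigue failure. The lognormal prior parameters are hypothetical.
import math, random

def sample_prior_life(median_years=120.0, cov=0.6):
    """Draw a fatigue life from a lognormal prior (hypothetical parameters)."""
    sigma = math.sqrt(math.log(1.0 + cov ** 2))
    return math.exp(math.log(median_years) + sigma * random.gauss(0.0, 1.0))

def updated_failure_probability(t_target, t_insp, n=200_000):
    """P(life <= t_target | life > t_insp), estimated by Monte Carlo rejection."""
    survived, failed_by_target = 0, 0
    for _ in range(n):
        life = sample_prior_life()
        if life > t_insp:              # keep only samples consistent with the inspection
            survived += 1
            if life <= t_target:
                failed_by_target += 1
    return failed_by_target / survived

print(updated_failure_probability(t_target=75.0, t_insp=0.0))    # prior estimate
print(updated_failure_probability(t_target=75.0, t_insp=40.0))   # updated after inspection
```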

  8. Xanthones of Lichen Source: A 2016 Update.

    Science.gov (United States)

    Le Pogam, Pierre; Boustie, Joël

    2016-03-02

    An update of xanthones encountered in lichens is proposed as more than 20 new xanthones have been described since the publication of the compendium of lichen metabolites by Huneck and Yoshimura in 1996. The last decades witnessed major advances regarding the elucidation of biosynthetic schemes leading to these fascinating compounds, accounting for the unique substitution patterns of a very vast majority of lichen xanthones. Besides a comprehensive analysis of the structures of xanthones described in lichens, their bioactivities and the emerging analytical strategies used to pinpoint them within lichens are presented here together with physico-chemical properties (including NMR data) as reported since 1996.

  9. Summary Report of Laboratory Testing to Establish the Effectiveness of Proposed Treatment Methods for Unremediated and Remediated Nitrate Salt Waste Streams

    Energy Technology Data Exchange (ETDEWEB)

    Anast, Kurt Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The inadvertent creation of transuranic waste carrying hazardous waste codes D001 and D002 requires the treatment of the material to eliminate the hazardous characteristics and allow its eventual shipment and disposal at the Waste Isolation Pilot Plant (WIPP). This report documents the effectiveness of two treatment methods proposed to stabilize both the unremediated and remediated nitrate salt waste streams (UNS and RNS, respectively). The two technologies include the addition of zeolite (with and without the addition of water as a processing aid) and cementation. Surrogates were developed to evaluate both the solid and liquid fractions expected from parent waste containers, and both the solid and liquid fractions were tested. Both technologies are shown to be effective at eliminating the characteristic of ignitability (D001), and the addition of zeolite was determined to be effective at eliminating corrosivity (D002), with the preferred option of zeolite addition currently planned for implementation at the Waste Characterization, Reduction, and Repackaging Facility. During the course of this work, we established the need to evaluate and demonstrate the effectiveness of the proposed remedy for debris material, if required. The evaluation determined that Wypalls absorbed with saturated nitrate salt solutions exhibit the ignitability characteristic (all other expected debris is not classified as ignitable). Follow-on studies will be developed to demonstrate the effectiveness of stabilization for ignitable Wypall debris. Finally, liquid surrogates containing saturated nitrate salts did not exhibit the characteristic of ignitability in their pure form (those neutralized with Kolorsafe and mixed with sWheat did exhibit D001). As a result, additional nitrate salt solutions (those exhibiting the oxidizer characteristic) will be tested to demonstrate the effectiveness of the remedy.

  10. SU-F-J-142: Proposed Method to Broaden Inclusion Potential of Patients Able to Use the Calypso Tracking System in Prostate Radiotherapy

    Energy Technology Data Exchange (ETDEWEB)

    Fiedler, D; Kuo, H; Bodner, W; Tome, W [Montefiore Medical Center, Bronx, NY (United States)

    2016-06-15

    Purpose: To introduce a non-standard method of patient setup, using BellyBoard immobilization, to better utilize the localization and tracking potential of an RF-beacon system with EBRT for prostate cancer. Methods: An RF-beacon phantom was imaged using a wide bore CT scanner, both in a standard level position and with a known rotation (4° pitch and 7.5° yaw). A commercial treatment planning system (TPS) was used to determine positional coordinates of each beacon, and the centroid of the three beacons for both setups. For each setup at the Linac, kV AP and Rt Lateral images were obtained. A full characterization of the RF-beacon system in clinical mode was completed for various beacons’ array-to-centroid distances, which includes vertical, lateral, and longitudinal offset data, as well as pitch and yaw offset measurements for the tilted phantom. For the single patient who has been set up using the proposed BellyBoard method, a supine simulation was first obtained. When abdominal protrusion was found to be exceeding the limits of the RF-Beacon system through distance-based analysis in the TPS, the patient is re-simulated prone with the BellyBoard. Array to centroid distance is measured again in the TPS, and if found to be within the localization or tracking region it is applied. Results: Characterization of limitations for the RF-beacon system in clinical mode showed acceptable consistency of offset determination for phantom setup accuracy. The nonstandard patient setup method reduced the beacons’ centroid-to-array distance by 8.32 cm, from 25.13 cm to 16.81 cm; from completely out of tracking range (greater than 20 cm) to within setup tracking range (less than 20 cm). Conclusion: Using the RF-beacon system in combination with this novel patient setup can allow patients who would otherwise not be candidates for beacon enhanced EBRT to now be able to benefit from the reduced PTV margins of this treatment method.
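
    The decision step described above boils down to computing the centroid of the three implanted beacons from TPS coordinates and comparing the array-to-centroid distance with the nominal 20 cm tracking limit quoted in the abstract. A minimal sketch follows; all coordinates are invented for illustration.

```python
import numpy as np

TRACKING_LIMIT_CM = 20.0  # nominal array-to-centroid tracking limit cited above

def beacon_centroid(beacons_cm):
    """Centroid of the three transponder positions (TPS coordinates, cm)."""
    return np.mean(np.asarray(beacons_cm, dtype=float), axis=0)

def array_to_centroid_distance(array_pos_cm, beacons_cm):
    return float(np.linalg.norm(beacon_centroid(beacons_cm) - np.asarray(array_pos_cm)))

# Illustrative numbers only: same beacons, supine vs prone (BellyBoard) array positions
beacons = [(1.2, -2.0, 0.5), (-0.8, -1.5, 1.0), (0.3, -2.4, -0.6)]
for setup, array_pos in {"supine": (0.0, 23.0, 0.0), "prone": (0.0, 15.0, 0.0)}.items():
    d = array_to_centroid_distance(array_pos, beacons)
    print(f"{setup}: {d:.1f} cm -> within tracking range: {d < TRACKING_LIMIT_CM}")
```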

  11. SU-F-J-142: Proposed Method to Broaden Inclusion Potential of Patients Able to Use the Calypso Tracking System in Prostate Radiotherapy

    International Nuclear Information System (INIS)

    Fiedler, D; Kuo, H; Bodner, W; Tome, W

    2016-01-01

    Purpose: To introduce a non-standard method of patient setup, using BellyBoard immobilization, to better utilize the localization and tracking potential of an RF-beacon system with EBRT for prostate cancer. Methods: An RF-beacon phantom was imaged using a wide bore CT scanner, both in a standard level position and with a known rotation (4° pitch and 7.5° yaw). A commercial treatment planning system (TPS) was used to determine positional coordinates of each beacon, and the centroid of the three beacons for both setups. For each setup at the Linac, kV AP and Rt Lateral images were obtained. A full characterization of the RF-beacon system in clinical mode was completed for various beacons’ array-to-centroid distances, which includes vertical, lateral, and longitudinal offset data, as well as pitch and yaw offset measurements for the tilted phantom. For the single patient who has been set up using the proposed BellyBoard method, a supine simulation was first obtained. When abdominal protrusion was found to be exceeding the limits of the RF-Beacon system through distance-based analysis in the TPS, the patient is re-simulated prone with the BellyBoard. Array to centroid distance is measured again in the TPS, and if found to be within the localization or tracking region it is applied. Results: Characterization of limitations for the RF-beacon system in clinical mode showed acceptable consistency of offset determination for phantom setup accuracy. The nonstandard patient setup method reduced the beacons’ centroid-to-array distance by 8.32 cm, from 25.13 cm to 16.81 cm; from completely out of tracking range (greater than 20 cm) to within setup tracking range (less than 20 cm). Conclusion: Using the RF-beacon system in combination with this novel patient setup can allow patients who would otherwise not be candidates for beacon enhanced EBRT to now be able to benefit from the reduced PTV margins of this treatment method.

  12. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch.
    Date | Change type | Affected areas
    April 11 | Update of switch in LHC 4 | LHC 4 Point
    April 14 | Update of switch in LHC 5 | LHC 5 Point
    April 15 | Update of switches in LHC 3 and LHC 2 Points | LHC 3 and LHC 2
    April 22 | Update of switch N4 | Meyrin Ouest
    April 23 | Update of switch N6 | Prévessin Site
    Ap...

  13. Pancreatitis. An update; Pankreatitis. Ein Update

    Energy Technology Data Exchange (ETDEWEB)

    Schreyer, A.G. [Universitaetsklinikum Regensburg, Institut fuer Roentgendiagnostik, Regensburg (Germany); Grenacher, L. [Diagnostik Muenchen, MVZ Radiologie, Muenchen (Germany); Juchems, M. [Klinikum Konstanz, Diagnostische und Interventionelle Radiologie, Konstanz (Germany)

    2016-04-15

    Acute and chronic pancreatitis are increasingly common, severe diseases in the western world with far-reaching consequences for the individual patient as well as for the socioeconomic situation. This article gives an overview of the contribution of radiological imaging to the diagnostics and therapy of both forms of the disease. Acute pancreatitis can be subdivided into severe (20 %) and mild manifestations. Only in severe forms of pancreatitis should the diagnostics be performed with computed tomography (CT) or magnetic resonance imaging (MRI) for assessing necrosis or potential infections. In chronic pancreatitis, transabdominal ultrasound should initially be adequate for assessment of the pancreas. For the differential diagnosis between pancreatic carcinoma and chronic pancreatitis, MRI with magnetic resonance cholangiopancreatography (MRCP) followed by an endoscopic ultrasound-guided fine needle aspiration is the method of choice. For the primary diagnosis of acute and chronic pancreatitis, ultrasound examination is the modality of first choice, followed by radiological CT and MRI with MRCP examinations. (orig.)

  14. Automatic domain updating technique for improving computational efficiency of 2-D flood-inundation simulation

    Science.gov (United States)

    Tanaka, T.; Tachikawa, Y.; Ichikawa, Y.; Yorozu, K.

    2017-12-01

    Flooding is one of the most hazardous disasters and causes serious damage to people and property around the world. To prevent or mitigate flood damage through early warning systems and/or river management planning, numerical modelling of flood-inundation processes is essential. In the literature, flood-inundation models have been extensively developed and improved to simulate flood flow over complex topography at high resolution. With increasing demands on flood-inundation modelling, its computational burden is now one of the key issues. Improvements to the computational efficiency of the full shallow water equations have been made from various perspectives, such as approximations of the momentum equations, parallelization techniques, and coarsening approaches. To complement these techniques and further improve the computational efficiency of flood-inundation simulations, this study proposes an Automatic Domain Updating (ADU) method for 2-D flood-inundation simulation. The ADU method traces the wet and dry interface and automatically updates the simulation domain in response to the progress and recession of flood propagation. The updating algorithm is as follows: first, register the simulation cells potentially flooded at the initial stage (such as floodplains near river channels); then, whenever a registered cell becomes flooded, register its surrounding cells. The cost of this additional process is kept low by checking only cells at the wet and dry interface, and the computation time is reduced by skipping the processing of non-flooded areas. The algorithm is easily applied to any type of 2-D flood-inundation model. The proposed ADU method is implemented with the 2-D local inertial equations for the Yodo River basin, Japan. Case studies for two flood events show that the simulation finishes in a computation time two to ten times shorter while producing the same results as the simulation without the ADU method.
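
    A minimal sketch of the domain-updating step described above (illustrative only: the grid handling, wetting threshold and data structures are assumptions, and in the real model this step is coupled to the 2-D local inertial solver):

```python
# Minimal sketch of Automatic Domain Updating on a 2-D grid: only cells in the
# active set are processed; when an active cell becomes wet, its neighbours
# are added so the computation tracks the wet/dry interface.
import numpy as np

def neighbours(i, j, shape):
    for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        ni, nj = i + di, j + dj
        if 0 <= ni < shape[0] and 0 <= nj < shape[1]:
            yield ni, nj

def update_active_domain(depth, active, wet_threshold=1e-3):
    """Add neighbours of newly wetted cells to the active simulation domain."""
    newly_wet = [(i, j) for (i, j) in active if depth[i, j] > wet_threshold]
    for i, j in newly_wet:
        for n in neighbours(i, j, depth.shape):
            active.add(n)
    return active

# Usage: start with cells next to the river channel registered as active, then
# call update_active_domain(depth, active) each time step after the shallow-water
# update has been applied to the active cells only.
depth = np.zeros((100, 100)); depth[50, 50] = 0.2   # a wet cell appears
active = {(50, 50)}
active = update_active_domain(depth, active)
print(len(active))  # 5: the wet cell plus its four neighbours
```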

  15. Aqua/Aura Updated Inclination Adjust Maneuver Performance Prediction Model

    Science.gov (United States)

    Boone, Spencer

    2017-01-01

    This presentation will discuss the updated Inclination Adjust Maneuver (IAM) performance prediction model that was developed for Aqua and Aura following the 2017 IAM series. This updated model uses statistical regression methods to identify potential long-term trends in maneuver parameters, yielding improved predictions when re-planning past maneuvers. The presentation has been reviewed and approved by Eric Moyer, ESMO Deputy Project Manager.
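
    As a generic illustration of the kind of trend fitting mentioned (not the actual Aqua/Aura model; the parameter history below is invented), a least-squares linear trend over past maneuver parameters can be extrapolated to the next maneuver:

```python
import numpy as np

# Hypothetical history of a maneuver performance parameter (e.g. an achieved
# delta-v scale factor), for illustration only.
maneuver_index = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
scale_factor   = np.array([0.982, 0.985, 0.984, 0.988, 0.990, 0.989, 0.993])

slope, intercept = np.polyfit(maneuver_index, scale_factor, deg=1)
next_prediction = slope * 8 + intercept
print(f"fitted trend: {slope:.4f} per maneuver; predicted next value: {next_prediction:.4f}")
```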

  16. Updating the OMERACT filter

    DEFF Research Database (Denmark)

    D'Agostino, Maria-Antonietta; Boers, Maarten; Kirwan, John

    2014-01-01

    OBJECTIVE: The Outcome Measures in Rheumatology (OMERACT) Filter provides a framework for the validation of outcome measures for use in rheumatology clinical research. However, imaging and biochemical measures may face additional validation challenges because of their technical nature. The Imaging...... using the original OMERACT Filter and the newly proposed structure. Breakout groups critically reviewed the extent to which the candidate biomarkers complied with the proposed stepwise approach, as a way of examining the utility of the proposed 3-dimensional structure. RESULTS: Although...... was obtained for a proposed tri-axis structure to assess validation of imaging and soluble biomarkers; nevertheless, additional work is required to better evaluate its place within the OMERACT Filter 2.0....

  17. Ontario regulatory update

    International Nuclear Information System (INIS)

    Thompson, P.

    1998-01-01

    This paper provides a summary of recent events which when combined add up to a gradual but unmistakable movement of the energy sector in Ontario towards a fully competitive market. Some of the events precipitating this movement towards competition include the passing of the Energy Competition Act of 1998 (Bill 35), electricity deregulation, regulatory reform of the natural gas sector, and changes to the consumer protection legislation. The role of the Ontario Energy Board was also updated to bring it in line with the demands of the competitive marketplace. Among the new roles that the Board will assume are to facilitate competition, to maintain fair and reasonable rates, and to facilitate rational expansion. Another objective is to provide opportunities for including energy efficiency in government policies. Implications of the changes in the OEB's mandate for market participants were also discussed, including (1) regulated gas sales and delivery mechanisms, (2) transactional services, (3) contract restructuring, (4) consumer protection, (5) supervision of competitive market participants, and (6) market surveillance

  18. Advanced Stirling Convertor Update

    Science.gov (United States)

    Wood, J. Gary; Carroll, Cliff; Matejczyk, Dan; Penswick, L. B.; Soendker, E.

    2006-01-01

    This paper reports on the 88 We Advanced Stirling Convertor (ASC) currently being developed under Phase II of a NASA NRA program for possible use in advanced high specific power radioisotope space power systems. An early developmental unit, the Frequency Test Bed (FTB) which was built and tested in Phase I demonstrated 36% efficiency. The ASC-1 currently being developed under Phase II, uses a high temperature heater head to allow for operation at 850 °C and is expected to have an efficiency approaching 40% (based on AC electrical out) at a temperature ratio of 3.1. The final lightweight ASC-2 convertor to be developed in Phase III is expected to have a mass of approximately 1 kg. The implementation of the ASC would allow for much higher specific power radioisotope power systems, requiring significantly less radioisotope fuel than current systems. The first run of the ASC-1 occurred in September 2005, and full temperature operation was achieved in early October 2005. Presented is an update on progress on the ASC program as well as the plans for future development. Also presented are efforts being performed to ensure the ASC has the required long life already demonstrated in free-piston Stirling cryocoolers.
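
    For orientation, the quoted temperature ratio and efficiency can be compared with the Carnot limit (a back-of-the-envelope check added here, not a figure from the program):

```latex
\eta_{\mathrm{Carnot}} \;=\; 1 - \frac{T_{\mathrm{cold}}}{T_{\mathrm{hot}}}
                       \;=\; 1 - \frac{1}{3.1} \;\approx\; 0.68,
\qquad
\frac{\eta_{\mathrm{ASC}}}{\eta_{\mathrm{Carnot}}} \;\approx\; \frac{0.40}{0.68} \;\approx\; 0.59
```

    In other words, an efficiency approaching 40% at a temperature ratio of 3.1 corresponds to roughly 59% of the ideal limit.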

  19. Tetraplegia Management Update.

    Science.gov (United States)

    Fridén, Jan; Gohritz, Andreas

    2015-12-01

    Tetraplegia is a profound impairment of mobility manifesting as a paralysis of all 4 extremities owing to cervical spinal cord injury. The purpose of this article is to provide an update and analyze current management, treatment options, and outcomes of surgical reconstruction of arm and hand function. Surgical restoration of elbow and wrist extension or handgrip has tremendous potential to improve autonomy, mobility, and critical abilities, for example eating, personal care, self-catheterization, and productive work, in at least 70% of tetraplegic patients. Tendon and nerve transfers, tenodeses, and joint stabilizations reliably enable improved arm and hand usability, reduce muscle imbalance and pain in spasticity, and prevent joint contractures. One-stage combined procedures have shown considerable advantages over traditional multistage approaches. Immediate activation of transferred muscles reduces the risk of adhesions, facilitates relearning, avoids adverse effects of immobilization, and enhances functional recovery. Transfers of axillary, musculocutaneous, and radial nerve fascicles from above the spinal cord injury are effective and promising options to enhance motor outcome and sensory protection, especially in groups with limited resources. Improved communication between medical disciplines, therapists, patients, and their relatives should help more individuals benefit from these advances and could empower many thousands of tetraplegic individuals "to take life into their own hands" and live more independently. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  20. Update in Internal Medicine

    Science.gov (United States)

    López-Jiménez, Francisco; Brito, Máximo; Aude, Y. Wady; Scheinberg, Phillip; Kaplan, Mariana; Dixon, Denise A.; Schneiderman, Neil; Trejo, Jorge F.; López-Salazar, Luis Humberto; Ramírez-Barba, Ector Jaime; Kalil, Roberto; Ortiz, Carmen; Goyos, José; Buenaño, Alvaro; Kottiech, Samer; Lamas, Gervasio A.

    2009-01-01

    More than 500,000 new medical articles are published every year and the time available to keep up to date is scarcer every day. Nowadays, the task of selecting useful, consistent, and relevant information for clinicians is a priority in many major medical journals. This review has the aim of gathering the most important findings in clinical medicine in the last few years. It is focused on results from randomized clinical trials and well-designed observational research. Findings were included preferentially if they showed solid results, and we avoided as much as possible including preliminary data or results that included only non-clinical outcomes. Some of the most relevant findings reported here include the significant benefit of statins in patients with coronary artery disease even with average cholesterol levels. It also provides a substantial review of the most significant trials assessing the effectiveness of IIb/IIIa receptor blockers. In gastroenterology, many advances have been made in H. pylori eradication, including the finding that the cure of H. pylori infection may be followed by gastroesophageal reflux disease. Some new antivirals have shown encouraging results in patients with chronic hepatitis. In the infectious disease arena, the late breaking trials in anti-retroviral disease are discussed, as well as the new trends regarding antibiotic resistance. This review also addresses the role of leukotriene modifiers in the treatment of asthma and discusses the benefit of using methylprednisolone in patients with adult respiratory distress syndrome, among many other advances in internal medicine. PMID:11068074

  1. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment.

    Science.gov (United States)

    Dimou, Kaotar; Emond, Claude

    2017-06-01

    approaches that were developed worldwide, and then to suggest an original methodology based on the characterisation of the hazard. For this research, our team conducted a systematic literature review over the past 20 years. This approach is important in understanding the conceptual basis for CB and the model’s overall effectiveness. These considerations will lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers’ exposure, while identifying the strengths and weaknesses of the approach. Developing this practice method will help to provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population.

  2. Nanomaterials, and Occupational Health and Safety—A Literature Review About Control Banding and a Semi-Quantitative Method Proposed for Hazard Assessment

    International Nuclear Information System (INIS)

    Dimou, Kaotar; Emond, Claude

    2017-01-01

    approaches that were developed worldwide, and then to suggest an original methodology based on the characterisation of the hazard. For this research, our team conducted a systematic literature review over the past 20 years. This approach is important in understanding the conceptual basis for CB and the model’s overall effectiveness. These considerations will lead to the proposal of an original hazard assessment method based on physico-chemical and biological characteristics. Such a method should help the entire industry better understand the ability of the CB approach to limit workers’ exposure, while identifying the strengths and weaknesses of the approach. Developing this practice method will help to provide relevant recommendations to workers who handle hazardous chemicals such as ENMs and to the general population. (paper)

  3. Uses and updating of the Benders method in the integer-mixed programming in the planning of the electric power systems expansion; Usos y actualizacion del metodo de Benders en la programacion entera-mixta y en la planeacion de la expansion de los sistemas electricos de potencia

    Energy Technology Data Exchange (ETDEWEB)

    De la Torre Vega, Eli

    1997-04-01

    In the first chapter, the derivation of the Benders cuts from the properties of duality is presented, along with the properties of the Benders cuts and the initial Benders algorithm for solving any mixed-integer linear programming problem. In the second chapter, the problem of planning the expansion of generation and transmission means in an electric power system is presented, together with the different mathematical programming structures it gives rise to and how the Benders method can be adapted to them. In the third chapter the theoretical contributions of this work are presented: a) how to initialize the master problem to take advantage of the experience acquired after having solved a similar problem, so that the succession of mixed-integer linear programming problems arising when solving the generation and transmission expansion planning problem of an electric power system can be solved more efficiently; b) how to generate a master problem whose optimal continuous solution corresponds to the continuous optimum of the mixed-integer problem, so that the search for integer solutions is carried out in the vicinity of the continuous optimum; c) how to generate an integer solution, close to the continuous optimum of the mixed-integer problem, that has a high probability of being feasible and may even be the optimal integer solution, in less time than that required to solve the problem exactly. In addition, other ideas that can be incorporated into the Benders method are presented. To show the effectiveness of the proposed ideas, chapter 4 presents the results obtained when solving several problems using: 1. the updated Benders method, 2. the branch and bound method, 3. the Benders update when adding restrictions, and 4. the Benders update when progressively treating more variables as integer. Finally a summary is made of the achievements, of the conclusions obtained and
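
    For readers who want the structure the first chapter derives, the standard Benders decomposition of a mixed-integer linear program can be sketched as follows (generic textbook notation, not the thesis's own):

```latex
% Problem:  min_{x \ge 0,\; y \in Y \text{ integer}}  c^{\top}x + d^{\top}y
%           s.t.  A x + B y \ge b.
% For fixed y, the continuous subproblem and its LP dual are
z(y) \;=\; \min_{x \ge 0}\ \{\, c^{\top}x : A x \ge b - B y \,\}
      \;=\; \max_{u \ge 0}\ \{\, u^{\top}(b - B y) : A^{\top}u \le c \,\}.
% Each dual extreme point u^{k} gives a Benders optimality cut and each dual
% extreme ray u^{r} gives a feasibility cut, so the master problem reads
\min_{y \in Y,\ \eta}\ d^{\top}y + \eta
\quad \text{s.t.} \quad
\eta \ge (u^{k})^{\top}(b - B y)\ \ \forall k,
\qquad
(u^{r})^{\top}(b - B y) \le 0\ \ \forall r.
```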

  4. Uncertainty for Part Density Determination: An Update

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Mario Orlando [Los Alamos National Laboratory

    2016-12-14

    Accurate and precise density measurements by hydrostatic weighing require the use of an analytical balance, configured with a suspension system, to measure the weight of a part both in water and in air. Additionally, the densities of these fluid media (water and air) must be precisely known for the part density determination. To validate the accuracy and precision of these measurements, uncertainty statements are required. The work in this report is a revision of an original report written more than a decade ago, specifically applying principles and guidelines suggested by the Guide to the Expression of Uncertainty in Measurement (GUM) for determining the part density uncertainty through sensitivity analysis. In this work, updated derivations are provided, an original example is revised with the updated derivations, and an appendix is devoted solely to uncertainty evaluations using Monte Carlo techniques, specifically using the NIST Uncertainty Machine, as a viable alternative method.
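
    A minimal sketch of the kind of Monte Carlo evaluation mentioned as an alternative to the analytical sensitivity approach is given below. The weighing equation used is a commonly quoted form of the hydrostatic relation, and the input values and standard uncertainties are invented for illustration; none of the numbers come from the report.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 200_000

# Illustrative inputs (mean, standard uncertainty); not values from the report.
m_air   = rng.normal(100.000, 0.001, N)    # balance reading in air, g
m_water = rng.normal(89.000, 0.002, N)     # balance reading in water, g
rho_w   = rng.normal(0.99820, 0.00005, N)  # water density, g/cm^3
rho_a   = rng.normal(0.00118, 0.00002, N)  # air density, g/cm^3

# Commonly used hydrostatic-weighing relation (assumed here for the sketch)
rho_part = m_air * (rho_w - rho_a) / (m_air - m_water) + rho_a

print(f"density = {rho_part.mean():.5f} g/cm^3, "
      f"standard uncertainty = {rho_part.std(ddof=1):.5f} g/cm^3")
```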

  5. How to identify partial exposures to ionizing radiation? Proposal for a cytogenetic method; Como identificar exposicoes parciais as radiacoes ionizantes? Proposta de um metodo citogenetico

    Energy Technology Data Exchange (ETDEWEB)

    Fernandes, T.S.; Silva, E.B.; Pinto, M.M.P.L.; Amaral, A., E-mail: thiagosalazar@hotmail.com [Universidade Federal de Pernambuco (LAMBDA/UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear. Lab. de Modelagem e Biodosimetria Aplicada; Lloyd, David [Health Protection Agency, Oxford (United Kingdom). Radiation Protection Division

    2013-08-15

    In radiological incidents or occupational exposures to ionizing radiation, the majority of exposures involve not the whole body but only part of it. In this context, if cytogenetic dosimetry is performed, the absorbed dose will be underestimated because the irradiated cells are diluted with non-irradiated cells. Considering the norms of NR 32 - Safety and Health in the Work of Health Services - which recommends cytogenetic dosimetry in the investigation of accidental exposures to ionizing radiation, it is necessary to develop a tool that provides better identification of partial exposures. With this aim, a partial-body exposure was simulated by mixing, in vitro, 70% of blood irradiated with 4 Gy of X-rays with 30% of unirradiated blood from the same healthy donor. Aliquots of this mixture were cultured for 48 and 72 hours. Prolonging the culture time from 48 to 72 hours produced no significant change in the yield of dicentrics. However, when only M1 cells (first-division cells) were analyzed, the frequency of dicentrics per cell increased. Prolonging the culture time allowed cells delayed in mitosis by irradiation to reach metaphase, thus providing enough time for the damage to be visualized. The results of this research present the proposed method as an important tool in the investigation of exposed individuals, allowing the cytogenetic analysis to be associated with the real percentage of irradiated cells and contributing significantly to decision making in occupational health. (author)
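
    The dilution effect that motivates the method can be illustrated with a simple calculation. The linear-quadratic coefficients below are generic placeholder values rather than the study's calibration curve, and the calculation deliberately ignores the mitotic delay that the M1/72-hour analysis is designed to overcome:

```python
import numpy as np

# Generic linear-quadratic dicentric dose-response coefficients (placeholder
# values for illustration; real biodosimetry uses a laboratory calibration curve).
alpha, beta = 0.04, 0.06   # dicentrics/cell per Gy and per Gy^2

def yield_dic(dose):
    return alpha * dose + beta * dose**2

def dose_from_yield(y):
    # invert y = alpha*D + beta*D^2 for D >= 0
    return (-alpha + np.sqrt(alpha**2 + 4 * beta * y)) / (2 * beta)

true_dose, irradiated_fraction = 4.0, 0.7
y_whole_body = yield_dic(true_dose)
y_observed = irradiated_fraction * y_whole_body   # dilution by unirradiated cells

print(f"apparent whole-body dose: {dose_from_yield(y_observed):.2f} Gy "
      f"(true partial-body dose: {true_dose} Gy)")
```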

  6. A proposal for evaluation method of crack growth due to cyclic overload for piping materials based on an elastic-plastic fracture mechanics parameter

    International Nuclear Information System (INIS)

    Yamaguchi, Yoshihito; Katsuyama, Jinya; Onizawa, Kunio; Li, Yinsheng; Sugino, Hideharu

    2011-01-01

    The magnitude of the Niigata-ken Chuetsu-Oki earthquake in 2007 exceeded the level assumed in seismic design. It has therefore become an important issue to evaluate crack growth behavior under cyclic overloads such as large earthquakes. Fatigue crack growth is usually evaluated with Paris's law using the stress intensity factor range (ΔK). However, ΔK is inappropriate for loading conditions beyond small-scale yielding. In this study, the crack growth behavior of piping materials was investigated based on an elastic-plastic fracture mechanics parameter, the J-integral. It was shown that the crack growth due to cyclic overload beyond small-scale yielding can be treated as the sum of fatigue and ductile crack growth. A retardation effect on crack growth was observed after the excessive loading. A modified Wheeler model based on the J-integral is proposed for predicting this retardation effect. Finally, an evaluation method for crack growth behavior under cyclic overload is suggested. (author)
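
    For orientation, the fatigue part of such an evaluation is often written as a Paris-type law in terms of the elastic-plastic parameter (a generic textbook form; the paper's fitted constants and its J-integral-based Wheeler retardation factor are not reproduced here):

```latex
% Paris-type crack growth law expressed with the J-integral range (generic form)
\frac{da}{dN} \;=\; C\,(\Delta J)^{m},
\qquad
\Delta J \;\simeq\; \frac{(\Delta K)^{2}}{E'} \;\;\text{under small-scale yielding,}
```

    so the formulation reduces to the usual ΔK-based Paris law when plasticity is confined to a small zone around the crack tip.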

  7. Construction and Updating of Event Models in Auditory Event Processing

    Science.gov (United States)

    Huff, Markus; Maurer, Annika E.; Brich, Irina; Pagenkopf, Anne; Wickelmaier, Florian; Papenmeier, Frank

    2018-01-01

    Humans segment the continuous stream of sensory information into distinct events at points of change. Between 2 events, humans perceive an event boundary. Present theories propose changes in the sensory information to trigger updating processes of the present event model. Increased encoding effort finally leads to a memory benefit at event…

  8. Argonne Wakefield Accelerator Update '92

    International Nuclear Information System (INIS)

    Rosing, M.; Balka, L.; Chojnacki, E.; Gai, W.; Ho, C.; Konecny, R.; Power, J.; Schoessow, P.; Simpson, J.

    1992-01-01

    The Argonne Wakefield Accelerator (AWA) is an experiment designed to test various ideas related to wakefield technology. Construction is now underway for a 100 nC electron beam in December of 1992. This report updates this progress

  9. Internet Journal of Medical Update

    African Journals Online (AJOL)

    admin

    Internet Journal of Medical Update 2010 July;5(2):8-14. Internet Journal ... hospitalizations. This study of Nigerian patients with diabetes examined the adequacy of ..... Physicians need .... relationship between patient education and glycaemic ...

  10. Vasectomy reversal: a clinical update

    Directory of Open Access Journals (Sweden)

    Abhishek P Patel

    2016-01-01

    Vasectomy is a safe and effective method of contraception used by 42-60 million men worldwide. Approximately 3%-6% of men opt for a vasectomy reversal due to the death of a child or divorce and remarriage, change in financial situation, desire for more children within the same marriage, or to alleviate the dreaded postvasectomy pain syndrome. Unlike vasectomy, vasectomy reversal is a much more technically challenging procedure that is performed only by a minority of urologists and places a larger financial strain on the patient since it is usually not covered by insurance. Interest in this procedure has increased since the operating microscope became available in the 1970s, which consequently led to improved patency and pregnancy rates following the procedure. In this clinical update, we discuss patient evaluation, variables that may influence reversal success rates, factors to consider in choosing to perform vasovasostomy versus vasoepididymostomy, and the usefulness of vasectomy reversal to alleviate postvasectomy pain syndrome. We also review the use of robotics for vasectomy reversal and other novel techniques and instrumentation that have emerged in recent years to aid in the success of this surgery.

  11. Engineering Evaluation of Proposed Alternative Salt Transfer Method for the Molten Salt Reactor Experiment for the Oak Ridge National Laboratory, Oak Ridge, Tennessee

    International Nuclear Information System (INIS)

    Carlberg, Jon A.; Roberts, Kenneth T.; Kollie, Thomas G.; Little, Leslie E.; Brady, Sherman D.

    2009-01-01

    This evaluation was performed by Pro2Serve in accordance with the Technical Specification for an Engineering Evaluation of the Proposed Alternative Salt Transfer Method for the Molten Salt Reactor Experiment at the Oak Ridge National Laboratory (BJC 2009b). The evaluators reviewed the Engineering Evaluation Work Plan for Molten Salt Reactor Experiment Residual Salt Removal, Oak Ridge National Laboratory, Oak Ridge, Tennessee (DOE 2008). The Work Plan (DOE 2008) involves installing a salt transfer probe and new drain line into the Fuel Drain Tanks and Fuel Flush Tank and connecting them to the new salt transfer line at the drain tank cell shield. The probe is to be inserted through the tank ball valve and the molten salt to the bottom of the tank. The tank would then be pressurized through the Reactive Gas Removal System to force the salt into the salt canisters. The Evaluation Team reviewed the work plan, interviewed site personnel, reviewed numerous documents on the Molten Salt Reactor (Sects. 7 and 8), and inspected the probes planned to be used for the transfer. Based on several concerns identified during this review, the team recommends not proceeding with the salt transfer via the proposed alternate salt transfer method. The major concerns identified during this evaluation are: (1) Structural integrity of the tanks - The main concern is with the corrosion that occurred during the fluorination phase of the uranium removal process. This may also apply to the salt transfer line for the Fuel Flush Tank. Corrosion Associated with Fluorination in the Oak Ridge National Laboratory Fluoride Volatility Process (Litman 1961) shows that this problem is significant. (2) Continued generation of Fluorine - Although the generation of Fluorine will be at a lower rate than experienced before the uranium removal, it will continue to be generated. This needs to be taken into consideration regardless of what actions are taken with the salt. (3) More than one phase of material

  12. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2010-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update and Fundamentals is the single best source for the latest developments, trends, and issues in communication technology. Featuring the fundamental framework along with the history and background of communication technologies, Communication Technology Update and Fundamentals, 12th edition helps you stay ahead of these ever-changing and emerging technologies.As always, every chapter ha

  13. New uses of sulfur - update

    Energy Technology Data Exchange (ETDEWEB)

    Almond, K.P.

    1995-07-01

    An update to an extensive bibliography on alternate uses of sulfur was presented. Alberta Sulphur Research Ltd. previously compiled a bibliography in volume 24 of this quarterly bulletin. This update provides an additional 44 new publications. The current research covered focuses on the use of sulfur in oil and gas applications, mining and metallurgy, concretes and other structural materials, waste management, rubber and textile products, and asphalts and other paving and highway applications.

  14. Communication technology update and fundamentals

    CERN Document Server

    Grant, August E

    2008-01-01

    New communication technologies are being introduced at an astonishing rate. Making sense of these technologies is increasingly difficult. Communication Technology Update is the single best source for the latest developments, trends, and issues in communication technology. Now in its 11th edition, Communication Technology Update has become an indispensable information resource for business, government, and academia. As always, every chapter has been completely rewritten to reflect the latest developments and market statistics, and now covers mobile computing, dig

  15. Autoimmune pancreatitis. An update; Autoimmunpankreatitis. Ein Update

    Energy Technology Data Exchange (ETDEWEB)

    Helmberger, T. [Klinikum Bogenhausen, Staedt. Klinikum, Institut fuer Diagnostische und Interventionelle Radiologie, Neuroradiologie und Nuklearmedizin, Muenchen (Germany)

    2016-04-15

    Autoimmune pancreatitis (AIP) is a rare disease, the pathophysiological understanding of which has been greatly improved over the last years. The most common form, type 1 AIP belongs to the IgG4-related diseases and must be distinguished from type 2 AIP, which is a much rarer entity associated with chronic inflammatory bowel disease. Clinically, there is an overlap with pancreatic cancer. Imaging and further criteria, such as serological and histological parameters are utilized for a differentiation between both entities in order to select the appropriate therapy and to avoid the small but ultimately unnecessary number of pancreatectomies. The diagnostics of AIP are complex, whereby the consensus criteria of the International Association of Pancreatology have become accepted as the parameters for discrimination. These encompass five cardinal criteria and one therapeutic criterion. By applying these criteria AIP can be diagnosed with a sensitivity of 84.9 %, a specificity of 100 % and an accuracy of 93.8 %. The diagnosis of AIP is accomplished by applying several parameters of which two relate to imaging. As for the routine diagnostics of the pancreas these are ultrasound, computed tomography (CT) and magnetic resonance imaging (MRI). Important for the differential diagnosis is the exclusion of signs of local and remote tumor spread for which CT and MRI are established. The essential diagnostic parameter of histology necessitates sufficient sample material, which cannot usually be acquired by a fine needle biopsy. CT or MRI are the reference standard methods for identification of the optimal puncture site and imaging-assisted (TruCut) biopsy. In patients presenting with unspecific upper abdominal pain, painless jaundice combined with the suspicion of a pancreatic malignancy in imaging but a mismatch of secondary signs of malignancy, AIP should also be considered as a differential diagnosis. As the diagnosis of AIP only partially relies on imaging radiologists also

  16. Summary Report of Comprehensive Laboratory Testing to Establish the Effectiveness of Proposed Treatment Methods for Unremediated and Remediated Nitrate Salt Waste Streams

    Energy Technology Data Exchange (ETDEWEB)

    Anast, Kurt Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Funk, David John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hargis, Kenneth Marshall [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-10-04

    The inadvertent creation of transuranic waste carrying hazardous waste codes D001 and D002 requires the treatment of the material to eliminate the hazardous characteristics and allow its eventual shipment and disposal at the Waste Isolation Pilot Plant (WIPP). This report documents the effectiveness of two treatment methods proposed to stabilize both the unremediated and remediated nitrate salt waste streams (UNS and RNS, respectively) at Los Alamos National Laboratory (LANL). The two technologies include the addition of zeolite (with and without the addition of water as a processing aid) and cementation. Surrogates were developed to evaluate both the solid and liquid fractions expected from parent waste containers, and both the solid and liquid fractions were tested. Both technologies are shown to be effective at eliminating the characteristic of ignitability (D001), and the addition of zeolite was determined to be effective at eliminating corrosivity (D002), with the preferred option of adding zeolite currently planned for implementation at LANL’s Waste Characterization, Reduction, and Repackaging Facility (WCRRF). The course of this work verified the need to evaluate and demonstrate the effectiveness of the proposed remedy for debris material, if required. The evaluation determined that WypAlls, cheesecloth, and Celotex absorbed with saturated nitrate salt solutions exhibit the ignitability characteristic (all other expected debris is not classified as ignitable). Finally, liquid surrogates containing saturated nitrate salts did not exhibit the characteristic of ignitability in their pure form (those neutralized with Kolorsafe and mixed with sWheat did exhibit D001). Sensitivity testing and an analysis were conducted to evaluate the waste form for reactivity. Tests included subjecting surrogate material to mechanical impact, friction, electrostatic discharge and thermal insults. The testing confirmed that the waste does not exhibit the characteristic of

  17. Update to the R33 cross section file format

    International Nuclear Information System (INIS)

    Vickridge, I.C.

    2003-01-01

    In September 1991, in response to the workshop on cross sections for Ion Beam Analysis (IBA) held in Namur (July 1991, Nuclear Instruments and Methods B66(1992)), a simple ascii format was proposed to facilitate transfer and collation of nuclear reaction cross section data for IBA and especially for Nuclear Reaction Analysis (NRA). Although intended only as a discussion document, the ascii format - referred to as the R33 (Report 33) format - has become a de facto standard. In the decade since this first proposal there have been spectacular advances in computing power and in software usability; however, the cross-platform compatibility of the ascii character set has ensured that the need for an ascii format remains. Nuclear reaction cross section data for Nuclear Reaction Analysis have been collected and archived on internet web sites over the last decade. These data have largely been entered in the R33 format, although there is a series of elastic cross sections, expressed as the ratio to the corresponding Rutherford cross sections, that have been entered in a format referred to as RTR (ratio to Rutherford). During this time the R33 format has been modified and added to - firstly to take into account angular distributions, which were not catered for in the first proposal, and more recently to cater for elastic cross sections expressed as the ratio-to-Rutherford, which it is useful to have for some elastic scattering programs. It is thus timely to formally update the R33 format. There also exist the large nuclear cross section data collections of the Nuclear Data Network - of which the core centres are the OECD NEA Nuclear Data Bank, the IAEA Nuclear Data Section, the Brookhaven National Laboratory National Nuclear Data Centre and CJD IPPE Obninsk, Russia. The R33 format is now proposed to become a legal computational format for the NDN. It is thus also necessary to provide an updated formal definition of the R33 format in order to provide

  18. NB market update

    International Nuclear Information System (INIS)

    Marshall, W. K.

    2004-01-01

    The 2004 New Brunswick proclamation introduced several changes to the industry. This paper presents an update of the current New Brunswick electricity market from the perspective of the recently created New Brunswick System Operator (NBSO). A comparison was made between the modified industry and the previous industry structure. Significant changes included: corporate restructuring and market implementation; the formation of the independent system operator; and an increase in Public Utilities Board regulatory authority. The main objectives of the NBSO were reviewed, including its intention to reliably plan and operate the integrated power system as well as facilitating and operating the electricity market. Details of directors and officers were provided along with a list of legislated functions which included entering agreements with transmitters; provision and procurement of ancillary services; maintenance of integrated system; coordination of external activities; participation with standards authorities; planning and development of transmission; and the facilitation of a competitive market. An outline of the NBSO, Transco and Public Utilities Board relationships were presented. Details of the market advisory committee were outlined, with information concerning contracts, operations and services agreements. Transmission and ancillary services were also discussed, as well as issues concerning interruptible load agreements. A chart of the New Brunswick electricity market structure was presented, along with a market overview including details of capacity, ancillary services and suppliers. Market rules and amendments were presented, as well as market participation guides. Details of generation resource adequacy requirements and the imposition of penalties were outlined. Scheduling and dispatch issues were overviewed, as well as settlement processes, inputs and their sources, including settlements for variances. Future development possibilities included an expansion of

  19. Nova Scotia electricity update

    International Nuclear Information System (INIS)

    Crandlemire, A.L.

    2004-01-01

    This paper provides an update of electricity issues concerning Nova Scotia such as supply, capacity, emission commitments, as well as co-generation and the Electricity Marketplace Governance Committee (EMGC). The goals of the strategy were reliability combined with competitive prices and greater environmental responsibility. The scope of these objectives included new capacity, transmission, renewables and co-generation. Other objectives included encouraging wholesale market competition; meeting reciprocity requirements; and a 50 MW renewable energy target. Recommendations of the EMGC included wholesale market competition; a broader market scope with a cost benefit analysis; Open Access Transmission Tariff (OATT); a scheduling and information system; network integration and a point to point service; and a separation of transmission and generation business units. Other recommendations included an open competitive process for new generation; a consideration of emissions and overall efficiency; a Renewable Energy Portfolio Standard (RPS) to start in 2006; the separation of RPS tags from electricity; and net metering of renewables. These recommendations were accepted in 2003, followed by the new Electricity Act in 2004, which made OATT mandatory, established RPS and opened to the wholesale market. Capacity at present was considered to be tight, with preparations for the new regulations under way. Reductions in air pollution were reported at 25 per cent, with renewable energy projects such as 2 windmills currently under way, as well as various other projects. Opportunities for provincial Atlantic cooperation were identified as being management of reserve requirements; trading of lowest cost electricity; new generation on a regional scale; stronger transmission ties; a system operator; a regional approach to RPS; regional management of air emissions; and regional opportunities for Carbon dioxide reductions. tabs., figs

  20. Decentralized Consistent Updates in SDN

    KAUST Repository

    Nguyen, Thanh Dang

    2017-04-10

    We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.

  1. Nonlinear model updating applied to the IMAC XXXII Round Robin benchmark system

    Science.gov (United States)

    Kurt, Mehmet; Moore, Keegan J.; Eriten, Melih; McFarland, D. Michael; Bergman, Lawrence A.; Vakakis, Alexander F.

    2017-05-01

    We consider the application of a new nonlinear model updating strategy to a computational benchmark system. The approach relies on analyzing system response time series in the frequency-energy domain by constructing both Hamiltonian and forced and damped frequency-energy plots (FEPs). The system parameters are then characterized and updated by matching the backbone branches of the FEPs with the frequency-energy wavelet transforms of experimental and/or computational time series. The main advantage of this method is that no nonlinearity model is assumed a priori, and the system model is updated solely based on simulation and/or experimental measured time series. By matching the frequency-energy plots of the benchmark system and its reduced-order model, we show that we are able to retrieve the global strongly nonlinear dynamics in the frequency and energy ranges of interest, identify bifurcations, characterize local nonlinearities, and accurately reconstruct time series. We apply the proposed methodology to a benchmark problem, which was posed to the system identification community prior to the IMAC XXXII (2014) and XXXIII (2015) Conferences as a "Round Robin Exercise on Nonlinear System Identification". We show that we are able to identify the parameters of the non-linear element in the problem with a priori knowledge about its position.

  2. Updating the Psoriatic Arthritis (PsA) Core Domain Set

    DEFF Research Database (Denmark)

    Orbai, Ana-Maria; de Wit, Maarten; Mease, Philip J

    2017-01-01

    OBJECTIVE: To include the patient perspective in accordance with the Outcome Measures in Rheumatology (OMERACT) Filter 2.0 in the updated Psoriatic Arthritis (PsA) Core Domain Set for randomized controlled trials (RCT) and longitudinal observational studies (LOS). METHODS: At OMERACT 2016, research...... conducted to update the PsA Core Domain Set was presented and discussed in breakout groups. The updated PsA Core Domain Set was voted on and endorsed by OMERACT participants. RESULTS: We conducted a systematic literature review of domains measured in PsA RCT and LOS, and identified 24 domains. We conducted...... and breakout groups at OMERACT 2016 in which findings were presented and discussed. The updated PsA Core Domain Set endorsed with 90% agreement by OMERACT 2016 participants included musculoskeletal disease activity, skin disease activity, fatigue, pain, patient's global assessment, physical function, health...

  3. Food irradiation: An update

    International Nuclear Information System (INIS)

    Morrison, Rosanna M.

    1984-01-01

    Recent regulatory and commercial activity regarding food irradiation is highlighted. The effects of irradiation, used to kill insects and microorganisms which cause food spoilage, are discussed. Special attention is given to the current regulatory status of food irradiation in the USA; proposed FDA regulation regarding the use of irradiation; pending irradiation legislation in the US Congress; and industrial applications of irradiation

  4. A Training Program to Enhance Postgraduate Students' Research Skills in Preparing a Research Proposal in the Field of Curriculum and Instruction Methods of Arabic Language

    Science.gov (United States)

    Alfakih, Ahmed Hassan

    2017-01-01

    The study examined the impact of a training program on enhancing postgraduate students' research skills in preparing a research proposal. The nature of the skills required to prepare a research proposal was first determined using a questionnaire. A training program for improving such skills was then constructed and seven postgraduate students in…

  5. 76 FR 73564 - Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration

    Science.gov (United States)

    2011-11-29

    ... Federal Acquisition Regulation; Updates to Contract Reporting and Central Contractor Registration AGENCIES... Procurement Data System (FPDS). Additionally, changes are proposed for the clauses requiring contractor registration in the Central Contractor Registration (CCR) database and DUNS number reporting. DATES: Interested...

  6. Método de custeio UEP: uma proposta para uma agroindústria avícola = UEP costing method: a proposal for an aviary agribusiness

    Directory of Open Access Journals (Sweden)

    Silvana Milanese

    2012-07-01

    The general objective of this paper is to develop a proposal to implement the Unit Effort of Production (UEP) costing method for an aviary agribusiness. To do so, a descriptive research is carried out to approach the problem in a qualitative and quantitative way through a case study. The results show that: a) the company has a different production process because its final products come from the disassembly of one single raw material, the chicken; b) transformation costs are significant, especially at the operational posts PO25, PO27 and PO13; c) with the UEP method, the Boneless Chicken Breast and Bone-in Chicken Breast products attain a profitability of 34.43% and 25.60%, respectively. It can be concluded that the UEP method identifies the transformation costs as well as generates information that supports improvements in the production processes.
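
    For readers unfamiliar with the UEP approach, the allocation logic it implies can be sketched roughly as follows; the post potentials, processing times, volumes and costs are invented for the example and are not the case-study data:

```python
# Toy sketch of the UEP (production effort unit) costing logic: each operational
# post has a potential in UEP/hour; a product's UEP content is the sum of
# (time at post * post potential); transformation cost is allocated per UEP.
posts_uep_per_hour = {"PO13": 4.0, "PO25": 6.5, "PO27": 5.0}   # invented potentials

# hours of each product at each operational post (invented)
routing = {
    "chicken breast fillet": {"PO13": 0.010, "PO25": 0.004, "PO27": 0.006},
    "bone-in chicken breast": {"PO13": 0.008, "PO25": 0.003, "PO27": 0.004},
}
monthly_volumes = {"chicken breast fillet": 120_000, "bone-in chicken breast": 90_000}
monthly_transformation_cost = 180_000.0   # invented, in currency units

uep_per_unit = {p: sum(t * posts_uep_per_hour[po] for po, t in r.items())
                for p, r in routing.items()}
total_uep = sum(uep_per_unit[p] * monthly_volumes[p] for p in routing)
cost_per_uep = monthly_transformation_cost / total_uep

for p in routing:
    print(f"{p}: {uep_per_unit[p]:.4f} UEP/unit, "
          f"transformation cost {uep_per_unit[p] * cost_per_uep:.4f} per unit")
```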

  7. Updating of states in operational hydrological models

    Science.gov (United States)

    Bruland, O.; Kolberg, S.; Engeland, K.; Gragne, A. S.; Liston, G.; Sand, K.; Tøfte, L.; Alfredsen, K.

    2012-04-01

    Operationally, the main purpose of hydrological models is to provide runoff forecasts. The quality of the model state and the accuracy of the weather forecast, together with the model quality, define the runoff forecast quality. Input and model errors accumulate over time and may leave the model in a poor state. Usually model states can be related to observable conditions in the catchment. Updating these states, knowing their relation to observable catchment conditions, directly influences the forecast quality. Norway is internationally at the forefront of hydropower scheduling on both short and long time horizons. The inflow forecasts are fundamental to this scheduling. Their quality directly influences the producers' profit as they optimize hydropower production to market demand while minimizing spill of water and maximizing available hydraulic head. The quality of the inflow forecasts strongly depends on the quality of the models applied and the quality of the information they use. In this project the focus has been on improving the quality of the model states on which the forecast is based. Runoff and snow storage are two observable quantities that reflect the model state and are used in this project for updating. Generally the methods used can be divided into three groups: the first re-estimates the forcing data in the updating period; the second alters the weights in the forecast ensemble; and the third directly changes the model states. The uncertainty related to the forcing data through the updating period is due both to uncertainty in the actual observations and to how well the gauging stations represent the catchment with respect to temperature and precipitation. The project looks at methodologies that automatically re-estimate the forcing data and test the results against the observed response. Model uncertainty is reflected in a joint distribution of model parameters estimated using the DREAM algorithm.
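
    As a minimal illustration of the third group of methods (directly changing a model state), the sketch below nudges a conceptual storage state toward the value implied by an observed discharge; the linear-reservoir model and the nudging gain are assumptions for the example, not the operational models used in the project:

```python
# Minimal sketch: direct state updating of a linear-reservoir model Q = k * S.
# The storage state S is nudged toward the value implied by observed discharge.

def nudge_storage(S_sim, q_obs, k, gain=0.5):
    """Move simulated storage part-way toward the storage implied by q_obs."""
    S_implied = q_obs / k
    return S_sim + gain * (S_implied - S_sim)

k = 0.05                   # reservoir outflow coefficient, 1/day (assumed)
S_sim, q_obs = 120.0, 7.5  # simulated storage (mm) and observed runoff (mm/day)

S_updated = nudge_storage(S_sim, q_obs, k)
print(f"storage: {S_sim} -> {S_updated} mm; "
      f"simulated runoff: {k*S_sim:.2f} -> {k*S_updated:.2f} mm/day (obs {q_obs})")
```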

  8. Eastern US seismic hazard characterization update

    International Nuclear Information System (INIS)

    Savy, J.B.; Boissonnade, A.C.; Mensing, R.W.; Short, C.M.

    1993-06-01

    In January 1989, LLNL published the results of a multi-year project, funded by NRC, on estimating seismic hazard at nuclear plant sites east of the Rockies. The goal of this study was twofold: to develop a good central estimate (median) of the seismic hazard and to characterize the uncertainty in the estimates of this hazard. In 1989, LLNL was asked by DOE to develop site-specific estimates of the seismic hazard at the Savannah River Site (SRS) in South Carolina as part of the New Production Reactor (NPR) project. For the purpose of the NPR, a complete review of the methodology and of the data acquisition process was performed. Work done under the NPR project has shown that first-order improvement in the estimates of the uncertainty (i.e., lower mean hazard values) could be easily achieved by updating the modeling of the seismicity and ground motion attenuation uncertainty. To this effect, NRC sponsored LLNL to perform a re-elicitation to update the seismicity and ground motion experts' inputs and to revise methods to combine seismicity and ground motion inputs in the seismic hazard analysis for nuclear power plant sites east of the Rocky Mountains. The objective of the recent study was to include the first-order improvements that reflect the latest knowledge in seismicity and ground motion modeling and produce an update of all the hazard results produced in the 1989 study. In particular, it had been demonstrated that eliciting seismicity information in terms of rates of earthquakes rather than a- and b-values, and changing the elicitation format to a one-on-one interview, improved our ability to express the uncertainty of earthquake rates of occurrence at large magnitudes. Thus, NRC sponsored this update study to refine the model of uncertainty and to re-elicit the experts' interpretations of the zonation and seismicity, as well as the ground motion models, based on the current state of knowledge.

  9. Supercollider: Magnet update

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1990-07-15

    The heart of the proposed US Superconducting Supercollider (SSC) is the set of superconducting magnets to hold its beams in orbit. Approximately 8,000 dipoles and 2,000 quadrupoles are needed, as well as many other special magnets. In addition the 2 TeV high energy booster would also be a superconducting machine, using about 1,200 magnets. In all, some 12,000 superconducting magnets would need to be precision built at the lowest possible cost.

  10. Supercollider: Magnet update

    International Nuclear Information System (INIS)

    Anon.

    1990-01-01

    The heart of the proposed US Superconducting Supercollider (SSC) is the set of superconducting magnets to hold its beams in orbit. Approximately 8,000 dipoles and 2,000 quadrupoles are needed, as well as many other special magnets. In addition the 2 TeV high energy booster would also be a superconducting machine, using about 1,200 magnets. In all, some 12,000 superconducting magnets would need to be precision built at the lowest possible cost

  11. The qualitative research proposal

    Directory of Open Access Journals (Sweden)

    H Klopper

    2008-09-01

    Full Text Available Qualitative research in the health sciences has had to overcome many prejudices and a number of misunderstandings, but today qualitative research is as acceptable as quantitative research designs and is widely funded and published. Writing the proposal of a qualitative study, however, can be a challenging feat, due to the emergent nature of the qualitative research design and the description of the methodology as a process. Even today, many sub-standard proposals are still seen at post-graduate evaluation committees and in applications submitted for funding. This problem has led the researcher to develop a framework to guide the qualitative researcher in writing the proposal of a qualitative study, based on the following research questions: (i) What is the process of writing a qualitative research proposal? and (ii) What does the structure and layout of a qualitative proposal look like? The purpose of this article is to discuss the process of writing the qualitative research proposal, as well as to describe the structure and layout of a qualitative research proposal. The process of writing a qualitative research proposal is discussed with regard to the most important questions that need to be answered in your research proposal, with consideration of the guidelines of being practical, being persuasive, making broader links, aiming for crystal clarity and planning before you write. The structure of the qualitative research proposal is discussed with regard to the key sections of the proposal, namely the cover page, abstract, introduction, review of the literature, research problem and research questions, research purpose and objectives, research paradigm, research design, research method, ethical considerations, dissemination plan, budget and appendices.

  12. Claimed effects, outcome variables and methods of measurement for health claims on foods proposed under European Community Regulation 1924/2006 in the area of appetite ratings and weight management.

    Science.gov (United States)

    Martini, Daniela; Biasini, Beatrice; Rossi, Stefano; Zavaroni, Ivana; Bedogni, Giorgio; Musci, Marilena; Pruneti, Carlo; Passeri, Giovanni; Ventura, Marco; Galli, Daniela; Mirandola, Prisco; Vitale, Marco; Dei Cas, Alessandra; Bonadonna, Riccardo C; Del Rio, Daniele

    2018-06-01

    All the requests for authorisation to bear health claims under Articles 13(5) and 14 in the context of appetite ratings and weight management have received a negative opinion from the European Food Safety Authority (EFSA), mainly because of the insufficient substantiation of the claimed effects (CEs). This manuscript results from an investigation aimed at collecting, collating and critically analysing the information related to outcome variables (OVs) and methods of measurement (MMs) in the context of appetite ratings and weight management compliant with Regulation 1924/2006. Based on the literature review, the appropriateness of OVs and MMs was evaluated for specific CEs. This work might help EFSA in the development of updated guidance addressed to stakeholders interested in bearing health claims in the area of weight management. Moreover, it could guide applicants in the design of randomised controlled trials aimed at substantiating such claims.

  13. Proposed Method for Disaggregation of Secondary Data: The Model for External Reliance of Localities in the Coastal Management Zone (MERLIN-CMZ)

    Science.gov (United States)

    The Model for External Reliance of Localities In (MERLIN) Coastal Management Zones is a proposed solution to allow scaling of variables to smaller, nested geographies. Utilizing a Principal Components Analysis and data normalization techniques, smaller scale trends are linked to ...

  14. Updating design information questionnaire (DIQ) experiences

    International Nuclear Information System (INIS)

    Palafox-Garcia, P.

    2001-01-01

    , we had to update our two DIQs and go through the previously mentioned steps. This means that, in order to obtain the updated Facility Attachment for our MBAs, we have to wait at least two years, and meanwhile the IAEA Safeguards Inspectors have to manage without the proper Facility Attachment during their inspections at the facility. The main problem is that, while we are dealing with the updated DIQs, unexpected changes have occurred in one of the MBAs. The situation we face in that MBA is that, now that the Fuel Fabrication Pilot Plant (FFPP) has been stopped, they want to take advantage of part of this area and some of its equipment in order to work at the facility at a research level; but as long as the nuclear material is still there, they have to comply with all the security, physical protection, radiological and safeguards requirements. They are therefore planning to leave only a small quantity of nuclear material at the MBA, so that complying with the security regulations is not so difficult, laborious and expensive. Most of the nuclear material is planned to be packed in 200-litre metallic drums and sent to the other MBA. 4. Conclusion - The updated DIQ we are dealing with was sent after the FFPP was stopped, and when we receive the updated Facility Attachment we will send a third, further updated DIQ; this will happen as soon as all the permissions and requirements needed for the maneuvers described above have been resolved, and we do not know how long that will take, but they are working on it. As a proposal, in order to avoid such a long wait for the new Facility Attachment, the IAEA should establish a delivery time for each review with the facilities. (author)

  15. Cybercrimes: A Proposed Taxonomy and Challenges

    Directory of Open Access Journals (Sweden)

    Harmandeep Singh Brar

    2018-01-01

    Full Text Available Cybersecurity is one of the most important concepts of the cyberworld, providing protection of cyberspace against various types of cybercrime. This paper provides an updated survey of cybersecurity. We survey recent prominent security research and categorize recent incidents in the context of various fundamental principles of cybersecurity. We have proposed a new taxonomy of cybercrime which can cover all types of cyberattacks. We have analyzed various cyberattacks according to the updated cybercrime taxonomy to identify the challenges in the field of cybersecurity and highlight various research directions as future work in this field.

  16. Disruption of the Right Temporoparietal Junction Impairs Probabilistic Belief Updating.

    Science.gov (United States)

    Mengotti, Paola; Dombert, Pascasie L; Fink, Gereon R; Vossel, Simone

    2017-05-31

    Generating and updating probabilistic models of the environment is a fundamental modus operandi of the human brain. Although crucial for various cognitive functions, the neural mechanisms of these inference processes remain to be elucidated. Here, we show the causal involvement of the right temporoparietal junction (rTPJ) in updating probabilistic beliefs and we provide new insights into the chronometry of the process by combining online transcranial magnetic stimulation (TMS) with computational modeling of behavioral responses. Female and male participants performed a modified location-cueing paradigm, where false information about the percentage of cue validity (%CV) was provided in half of the experimental blocks to prompt updating of prior expectations. Online double-pulse TMS over rTPJ 300 ms (but not 50 ms) after target appearance selectively decreased participants' updating of false prior beliefs concerning %CV, reflected in a decreased learning rate of a Rescorla-Wagner model. Online TMS over rTPJ also impacted on participants' explicit beliefs, causing them to overestimate %CV. These results confirm the involvement of rTPJ in updating of probabilistic beliefs, thereby advancing our understanding of this area's function during cognitive processing. SIGNIFICANCE STATEMENT Contemporary views propose that the brain maintains probabilistic models of the world to minimize surprise about sensory inputs. Here, we provide evidence that the right temporoparietal junction (rTPJ) is causally involved in this process. Because neuroimaging has suggested that rTPJ is implicated in divergent cognitive domains, the demonstration of an involvement in updating internal models provides a novel unifying explanation for these findings. We used computational modeling to characterize how participants change their beliefs after new observations. By interfering with rTPJ activity through online transcranial magnetic stimulation, we showed that participants were less able to update
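
    The abstract attributes the TMS effect to a decreased learning rate in a Rescorla-Wagner model. As a minimal sketch of that kind of update rule (not the authors' actual implementation; the initial belief, trial outcomes and learning rates are assumed for illustration):

```python
def rescorla_wagner_update(belief, outcome, learning_rate):
    """One Rescorla-Wagner step: move the belief toward the observed outcome
    by a fraction (the learning rate) of the prediction error."""
    prediction_error = outcome - belief
    return belief + learning_rate * prediction_error

# Illustrative trial sequence: belief about cue validity (%CV expressed as a
# probability), updated after each trial; 1 = valid cue, 0 = invalid cue.
for learning_rate in (0.4, 0.1):   # a lower rate mimics slower belief updating
    belief = 0.5                   # assumed initial belief
    for cue_was_valid in (1, 1, 0, 1, 0, 0):
        belief = rescorla_wagner_update(belief, cue_was_valid, learning_rate)
    print(learning_rate, round(belief, 3))
```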

  17. A review on model updating of joint structure for dynamic analysis purpose

    Directory of Open Access Journals (Sweden)

    Zahari S.N.

    2016-01-01

    Full Text Available Structural joints provide connections between structural elements (beams, plates, etc.) in order to construct a whole assembled structure. There are many types of structural joints, such as bolted joints, riveted joints and welded joints. Joint structures contribute significantly to structural stiffness and to the dynamic behaviour of structures; hence, the main objectives of this paper are to review methods of model updating for joint structures and to discuss guidelines for performing model updating for dynamic analysis purposes. This review paper first outlines some of the existing finite element modelling work on joint structures. Experimental modal analysis is the next step, used to obtain the modal parameters (natural frequencies and mode shapes) needed to validate the model and to quantify the discrepancy between the experimental results and their simulated counterparts. Model updating is then carried out to minimize the differences between the two sets of results. There are two methods of model updating: the direct method and the iterative method. Sensitivity analysis is employed using SOL200 in NASTRAN, selecting suitable updating parameters to avoid ill-conditioning problems. It is best to consider both geometrical and material properties in the updating procedure rather than choosing only a number of geometrical properties alone. The iterative method is considered the better model updating procedure because the physical meaning of the updated parameters is preserved, although it requires more computational effort than the direct method.
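
    As a toy illustration of the iterative, sensitivity-based updating loop described above (unrelated to NASTRAN/SOL200; the one-parameter model, finite-difference step and tolerance are assumptions), a stiffness parameter can be corrected until the predicted natural frequency matches a measured one:

```python
import numpy as np

def natural_frequency(stiffness, mass=1.0):
    """Natural frequency (Hz) of a single degree-of-freedom system."""
    return np.sqrt(stiffness / mass) / (2.0 * np.pi)

def iterative_update(measured_freq, stiffness, rel_step=1e-4, tol=1e-6, max_iter=50):
    """Sensitivity-based iterative updating of one parameter (stiffness) so
    that the predicted natural frequency matches the measured one."""
    for _ in range(max_iter):
        residual = measured_freq - natural_frequency(stiffness)
        if abs(residual) < tol:
            break
        # Finite-difference sensitivity of the frequency w.r.t. stiffness
        sensitivity = (natural_frequency(stiffness * (1 + rel_step))
                       - natural_frequency(stiffness)) / (stiffness * rel_step)
        stiffness += residual / sensitivity   # Gauss-Newton style correction
    return stiffness

# Illustrative: update an assumed initial stiffness of 150.0 to reproduce a
# "measured" natural frequency of 2.2 Hz.
print(round(iterative_update(measured_freq=2.2, stiffness=150.0), 2))
```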

  18. Thalamocortical dysrhythmia: a theoretical update in tinnitus

    Directory of Open Access Journals (Sweden)

    Dirk eDe Ridder

    2015-06-01

    Full Text Available Tinnitus is the perception of a sound in the absence of an external sound source. Pathophysiologically it has been attributed to bottom-up deafferentation and/or a top-down noise-cancelling deficit. Both mechanisms are proposed to alter auditory thalamocortical signal transmission, resulting in thalamocortical dysrhythmia (TCD). In deafferentation, TCD is characterized by a slowing down of resting-state alpha to theta activity associated with an increase in surrounding gamma activity, resulting in persistent cross-frequency coupling between theta and gamma activity. Theta burst-firing increases network synchrony and recruitment, a mechanism which might enable long-range synchrony, which in turn could represent a means for finding the missing thalamocortical information and for gaining access to consciousness. Theta oscillations could function as a carrier wave to integrate the tinnitus-related focal auditory gamma activity in a consciousness-enabling network, as envisioned by the global workspace model. This model suggests that focal activity in the brain does not reach consciousness unless the focal activity becomes functionally coupled to a consciousness-enabling network, aka the global workspace. In limited deafferentation the missing information can be retrieved from the auditory cortical neighborhood, decreasing surround inhibition and resulting in TCD. When the deafferentation is too wide in bandwidth, it is hypothesized that the missing information is retrieved from theta-mediated parahippocampal auditory memory. This suggests that, depending on the amount of deafferentation, TCD might change into a persistent, and thus pathological, parahippocampo-cortical theta-gamma rhythm. From a Bayesian point of view, in which the brain is conceived as a prediction machine that updates its memory-based predictions through sensory updating, tinnitus is the result of a prediction error between the predicted and sensed auditory input. The decrease in sensory updating

  19. Updated safety analysis of ITER

    International Nuclear Information System (INIS)

    Taylor, Neill; Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid

    2011-01-01

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  20. Updated safety analysis of ITER

    Energy Technology Data Exchange (ETDEWEB)

    Taylor, Neill, E-mail: neill.taylor@iter.org [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France); Baker, Dennis; Ciattaglia, Sergio; Cortes, Pierre; Elbez-Uzan, Joelle; Iseli, Markus; Reyes, Susana; Rodriguez-Rodrigo, Lina; Rosanvallon, Sandrine; Topilski, Leonid [ITER Organization, CS 90 046, 13067 St Paul Lez Durance Cedex (France)

    2011-10-15

    An updated version of the ITER Preliminary Safety Report has been produced and submitted to the licensing authorities. It is revised and expanded in response to requests from the authorities after their review of an earlier version in 2008, to reflect enhancements in ITER safety provisions through design changes, to incorporate new and improved safety analyses and to take into account other ITER design evolution. The updated analyses show that changes to the Tokamak cooling water system design have enhanced confinement and reduced potential radiological releases as well as removing decay heat with very high reliability. New and updated accident scenario analyses, together with fire and explosion risk analyses, have shown that design provisions are sufficient to minimize the likelihood of accidents and reduce potential consequences to a very low level. Taken together, the improvements provided a stronger demonstration of the very good safety performance of the ITER design.

  1. Bayesian updating of reliability of civil infrastructure facilities based on condition-state data and fault-tree model

    International Nuclear Information System (INIS)

    Ching Jianye; Leu, S.-S.

    2009-01-01

    This paper considers a difficult but practical circumstance of civil infrastructure management: deterioration/failure data of the infrastructure system are absent while only condition-state data of its components are available. The goal is to develop a framework for estimating time-varying reliabilities of civil infrastructure facilities under such a circumstance. A novel method of analyzing time-varying condition-state data that only reports operational/non-operational status of the components is proposed to update the reliabilities of civil infrastructure facilities. The proposed method assumes that the degradation arrivals can be modeled as a Poisson process with unknown time-varying arrival rate and damage impact and that the target system can be represented as a fault-tree model. To accommodate large uncertainties, a Bayesian algorithm is proposed, and the reliability of the infrastructure system can be quickly updated based on the condition-state data. Use of the new method is demonstrated with a real-world example of hydraulic spillway gate system.
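
    The paper's Bayesian algorithm updates a Poisson degradation-arrival model from condition-state data through a fault-tree model; those details are not reproduced here. As a much simpler hedged sketch of the underlying idea, a conjugate Gamma-Poisson update of an arrival rate from observed event counts could look like this (the prior parameters and data are assumptions):

```python
def update_poisson_rate(prior_shape, prior_rate, n_events, exposure_time):
    """Conjugate Gamma-Poisson update: posterior of a Poisson arrival rate
    after observing n_events over exposure_time."""
    posterior_shape = prior_shape + n_events
    posterior_rate = prior_rate + exposure_time
    posterior_mean = posterior_shape / posterior_rate
    return posterior_shape, posterior_rate, posterior_mean

# Illustrative: assumed Gamma(2, 10) prior on the degradation arrival rate
# (events per year); 3 degradations observed over 5 years of condition data.
shape, rate, mean = update_poisson_rate(2.0, 10.0, n_events=3, exposure_time=5.0)
print(round(mean, 3))   # posterior mean arrival rate of about 0.333 per year
```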

  2. A hardenability test proposal

    Energy Technology Data Exchange (ETDEWEB)

    Murthy, N.V.S.N. [Ingersoll-Rand (I) Ltd., Bangalore (India)

    1996-12-31

    A new approach to hardenability evaluation and its application to heat-treatable steels will be discussed. This will include an overview of the current methods and their deficiencies, and a discussion of the necessity for a new approach. Hardenability terminology will be expanded to avoid the ambiguity and over-simplification encountered with the current system. A new hardenability definition is proposed. Hardenability specification methods are simplified and rationalized. The new hardenability evaluation system proposed here utilizes a test specimen with varying diameter as an alternative to the cylindrical Jominy hardenability test specimen and is readily applicable to the evaluation of a wide variety of steels with different cross-section sizes.

  3. Federal Education Update, December 2004. Commission Update 04-17.

    Science.gov (United States)

    California Postsecondary Education Commission, 2004

    2004-01-01

    This update presents some of the major issues affecting education occurring at the national level. These include: Higher Education Act Extended for One Year; New Law Increases Loan Forgiveness for Teachers; Domestic Appropriations Measures Completed; Change in Federal Student Aid Rules; Bush Advisor Nominated To Be Education Secretary In Second…

  4. Neuro-ophthalmology update.

    Science.gov (United States)

    Weber, Konrad P; Straumann, Dominik

    2014-07-01

    This review summarizes the most relevant articles from the field of neuro-ophthalmology published in the Journal of Neurology from January 2012 to July 2013. With the advent of video-oculography, several articles describe new applications for eye movement recordings as a diagnostic tool for a wide range of disorders. In myasthenia gravis, anti-Kv1.4 and anti-Lrp4 have been characterized as promising novel autoantibodies for the diagnosis of hitherto 'seronegative' myasthenia gravis. Several articles address new diagnostic and therapeutic approaches to neuromyelitis optica, which further sharpen its profile as a distinct entity. Additionally, 4-aminopyridine has become a standard therapeutic for patients with cerebellar downbeat nystagmus. Finally, revised diagnostic criteria have been proposed for chronic relapsing inflammatory optic neuropathy based on a careful literature review over the last decade.

  5. [Cardiology update in 2015].

    Science.gov (United States)

    Pascale, Patrizio; Regamey, Julien; Iglesias, Juan F; Gabus, Vincent; Clair, Mathieu; Yerly, Patrick; Hullin, Roger; Müller, Olivier; Eeckhout, Éric; Vogt, Pierre

    2016-01-13

    The present review provides a selected choice of clinical trials and therapeutic advances in the field of cardiology in 2015. A new treatment option in heart failure will become available this year in Switzerland. In interventional cardiology, new trials have been published on the duration of dual antiplatelet therapy, the new stents with bioresorbable scaffold and the long-term results of TAVR in patients who are not surgical candidates or at high surgical risk. Regarding AF, the BRIDGE trial provides new evidence to guide the management of patients during warfarin interruption for surgery. Recent publications are changing the paradigm of AF treatment by showing a major impact of the management of cardiometabolic risk factors. Finally, refined criteria for ECG interpretation in athletes have been recently proposed to reduce the burden of false-positive screening.

  6. Clinical Voices - an update

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Weed, Ethan

    Anomalous aspects of speech and voice, including pitch, fluency, and voice quality, are reported to characterise many mental disorders. However, it has proven difficult to quantify and explain this oddness of speech by employing traditional statistical methods. In this talk we will show how...

  7. A critical examination of some of the field indicators that have been proposed in connection with sound power determination using the intensity method

    DEFF Research Database (Denmark)

    Jacobsen, Finn

    1996-01-01

    A considerable number of 'field indicators' or 'quality indicators' have been proposed in connection with sound power determination based on measurement of intensity. For example, the ISO 9614-1 standard prescribes the use of four indicators, and in the North American ANSI S12.12 standard no less th...

  8. Adaptive Fault Detection for Complex Dynamic Processes Based on JIT Updated Data Set

    Directory of Open Access Journals (Sweden)

    Jinna Li

    2012-01-01

    Full Text Available A novel fault detection technique is proposed to explicitly account for the nonlinear, dynamic, and multimodal problems that exist in practical and complex dynamic processes. The just-in-time (JIT) detection method and a k-nearest neighbor (KNN) rule-based statistical process control (SPC) approach are integrated to construct a flexible and adaptive detection scheme for control processes with nonlinear, dynamic, and multimodal behaviour. The Mahalanobis distance, representing the correlation among samples, is used to simplify and update the raw data set, which is the first merit of this paper. Based on it, the control limit is computed in terms of both the KNN rule and the SPC method, so that we can identify online whether the current data are normal or not. Note that the control limit changes as the database is updated, so that an adaptive fault detection technique is obtained which can effectively eliminate the impact of data drift and shift on the performance of the detection process; this is the second merit of this paper. The efficiency of the developed method is demonstrated by numerical examples and an industrial case.
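
    A rough sketch of the two ingredients named in the abstract, Mahalanobis-distance-based simplification of the data set and a KNN-based control limit, is given below. The thresholds, quantile, and toy data are assumptions, and this is not the authors' exact procedure.

```python
import numpy as np

def mahalanobis_distances(data, reference):
    """Mahalanobis distance of each row in `data` from the reference set."""
    mean = reference.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(reference, rowvar=False))
    diff = data - mean
    return np.sqrt(np.einsum('ij,jk,ik->i', diff, cov_inv, diff))

def knn_control_limit(training_data, k=3, quantile=0.99):
    """KNN-based control limit: the chosen quantile of each training sample's
    average distance to its k nearest neighbours."""
    d = np.linalg.norm(training_data[:, None, :] - training_data[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                  # exclude each sample itself
    knn_avg = np.sort(d, axis=1)[:, :k].mean(axis=1)
    return np.quantile(knn_avg, quantile)

# Illustrative: keep samples close (in the Mahalanobis sense) to the bulk of
# the data, then compute the KNN control limit from the simplified set.
rng = np.random.default_rng(0)
raw = rng.normal(size=(200, 4))                  # toy "normal operation" data
keep = mahalanobis_distances(raw, raw) < 3.0     # assumed simplification threshold
limit = knn_control_limit(raw[keep], k=3)
print(round(limit, 3))   # a new sample whose KNN distance exceeds this is flagged
```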

  9. The PMIPv6-Based Group Binding Update for IoT Devices

    Directory of Open Access Journals (Sweden)

    Jianfeng Guan

    2016-01-01

    Full Text Available The Internet of Things (IoT) has been booming with the rapid increase of various wearable devices, vehicle-embedded devices, and so on, and providing effective mobility management for these IoT devices becomes a challenge due to the different application scenarios as well as the limited energy and bandwidth. Recently, many researchers have focused on this topic and proposed several solutions based on the combination of IoT features and traditional mobility management protocols, in which most of the schemes treat the IoT devices as mobile networks and adopt NEtwork MObility (NEMO) and its variants to provide mobility support. However, these solutions face a heavy signaling cost problem. Since IoT devices are generally combined to realize complex functions, these devices may have similar movement behaviors. Clearly analyzing these characteristics and using them in mobility management will reduce the signaling cost and improve scalability. Motivated by this, we propose a PMIPv6-based group binding update method. In particular, we describe its group creation procedure, analyze its impact on mobility management, and derive its reduction ratio in terms of signaling cost. The final results show that the introduction of group binding update can remarkably reduce the signaling cost.
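
    As a back-of-the-envelope illustration of a signaling cost reduction ratio (not the paper's derived formula), one can compare a single group binding update against per-device updates; the cost values below are assumptions.

```python
def signaling_reduction_ratio(n_devices, per_device_cost, group_update_cost):
    """Fraction of binding-update signaling saved when one group update
    replaces individual updates for n_devices with similar mobility."""
    individual_total = n_devices * per_device_cost
    return 1.0 - group_update_cost / individual_total

# Illustrative: 50 devices moving together; a group update is assumed to cost
# roughly twice a single device's binding update.
ratio = signaling_reduction_ratio(50, per_device_cost=1.0, group_update_cost=2.0)
print(round(ratio, 3))   # about 0.96, i.e. 96% of the signaling avoided in this toy case
```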

  10. Bone scintiscanning updated.

    Science.gov (United States)

    Lentle, B C; Russell, A S; Percy, J S; Scott, J R; Jackson, F I

    1976-03-01

    Use of modern materials and methods has given bone scintiscanning a larger role in clinical medicine. The safety and ready availability of newer agents have led to its greater use in investigating both benign and malignant disease of bone and joint. Present evidence suggests that abnormal accumulation of 99mTc-polyphosphate and its analogues results from ionic deposition at crystal surfaces in immature bone, this process being facilitated by an increase in bone vascularity. There is also a component of matrix localization. These factors are in keeping with the concept that abnormal scintiscan sites represent areas of increased osteoblastic activity, although this may be an oversimplification. Increasing evidence shows that the bone scintiscan is more sensitive than conventional radiography in detecting focal disease of bone, and its ability to reflect the immediate status of bone further complements radiographic findings. The main limitation of this method relates to nonspecificity of the results obtained.

  11. Belief update as social choice

    NARCIS (Netherlands)

    van Benthem, J.; Girard, P.; Roy, O.; Marion, M.

    2011-01-01

    Dynamic epistemic-doxastic logics describe the new knowledge or new beliefs of agents after some informational event has happened. Technically, this requires an update rule that turns a doxastic-epistemic model M (recording the current information state of the agents) and a dynamic ‘event

  12. Deductive Updating Is Not Bayesian

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2015-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based theories such as mental model theory and probabilistic theories. This study looks at conclusion updating after the addition of statistical information to examine the hypothesis that deductive reasoning cannot be explained by probabilistic…

  13. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date / Change type / Affected areas: March 26 / Update of the voice messaging system / All CERN sites; April 4 / Updat...

  14. Update of CERN exchange network

    CERN Multimedia

    2003-01-01

    An update of the CERN exchange network will be done next April. Disturbances or even interruptions of telephony services may occur from 4th to 24th April during evenings from 18:30 to 00:00 but will not exceed more than 4 consecutive hours (see tentative planning below). In addition, the voice messaging system will be shut down on 26th March from 18:00 to 00:00. Calls supposed to be routed to the voice messaging system will not be possible during the shutdown. CERN divisions are invited to avoid any change requests (set-ups, moves or removals) of telephones and fax machines from 4th to 25th April. Everything will be done to minimize potential inconveniences which may occur during this update. There will be no loss of telephone functionalities. CERN GSM portable phones won't be affected by this change. Should you need more details, please send us your questions by email to Standard.Telephone@cern.ch. Date / Change type / Affected areas: April 8 / Update of switch in LHC 7 / LHC 7 Point; April 9 / Update of...

  15. Treatability study sample exemption: update

    International Nuclear Information System (INIS)

    1997-01-01

    This document is a RCRA Information Brief intended to update the information in the 1991 Small-Scale Treatability Study Information Brief, and to address questions about the waste and treatability study sample exemptions that have arisen since References 3 and 5 were published

  16. A Mathematics Software Database Update.

    Science.gov (United States)

    Cunningham, R. S.; Smith, David A.

    1987-01-01

    Contains an update of an earlier listing of software for mathematics instruction at the college level. Topics are: advanced mathematics, algebra, calculus, differential equations, discrete mathematics, equation solving, general mathematics, geometry, linear and matrix algebra, logic, statistics and probability, and trigonometry. (PK)

  17. Internet Journal of Medical Update

    African Journals Online (AJOL)

    Arun Kumar Agnihotri

    The two surveys on 'Gambling Addiction' published in this issue of the Internet Journal of. Medical Update recommend that there is a need to improve training of psychiatrists in India as regards identification, assessment and treatment of gambling addicts. I think the findings will inform the development and implementation of ...

  18. Evidence-based guideline update

    DEFF Research Database (Denmark)

    Tfelt-Hansen, Peer Carsten

    2013-01-01

    Peer Carsten Tfelt-Hansen, Glostrup, Denmark: According to the recent American Academy of Neurology (AAN) guideline update, a drug can be recommended as possibly effective for migraine prevention if it had demonstrated efficacy in one Class II study.(1) Eight drugs are recommended as possibly...

  19. Eczema and ceramides: an update

    DEFF Research Database (Denmark)

    Jungersted, Jakob Mutanu; Agner, Tove

    2013-01-01

    types of treatment. We also consider the genetic influence on stratum corneum lipids. The review is an update on research indexed in PubMed following the discovery of the filaggrin mutations in atopic dermatitis in 2006, but when newer publications cannot stand alone, we include publications from before...

  20. Upgrade trigger: Biannual performance update

    CERN Document Server

    Aaij, Roel; Couturier, Ben; Esen, Sevda; De Cian, Michel; De Vries, Jacco Andreas; Dziurda, Agnieszka; Fitzpatrick, Conor; Fontana, Marianna; Grillo, Lucia; Hasse, Christoph; Jones, Christopher Rob; Le Gac, Renaud; Matev, Rosen; Neufeld, Niko; Nikodem, Thomas; Polci, Francesco; Del Buono, Luigi; Quagliani, Renato; Schwemmer, Rainer; Seyfert, Paul; Stahl, Sascha; Szumlak, Tomasz; Vesterinen, Mika Anton; Wanczyk, Joanna; Williams, Mark Richard James; Yin, Hang; Zacharjasz, Emilia Anna

    2017-01-01

    This document presents the performance of the LHCb Upgrade trigger reconstruction sequence, incorporating changes to the underlying reconstruction algorithms and detector description since the Trigger and Online Upgrade TDR. An updated extrapolation is presented using the most recent example of an Event Filter Farm node.