WorldWideScience

Sample records for advanced analysis methods

  1. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

This report describes a task analysis methodology for advanced HSI designs. Task analysis was performed using procedure-based hierarchical task analysis and task decomposition methods, and the results were recorded in a database. Using the TA results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for an evaluation of the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, as well as analyses for the design of information structures and interaction structures, will be necessary.

  2. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  3. Advanced Analysis Methods in High Energy Physics

    Energy Technology Data Exchange (ETDEWEB)

    Pushpalatha C. Bhat

    2001-10-03

    During the coming decade, high energy physics experiments at the Fermilab Tevatron and around the globe will use very sophisticated equipment to record unprecedented amounts of data in the hope of making major discoveries that may unravel some of Nature's deepest mysteries. The discovery of the Higgs boson and signals of new physics may be around the corner. The use of advanced analysis techniques will be crucial in achieving these goals. The author discusses some of the novel methods of analysis that could prove to be particularly valuable for finding evidence of any new physics, for improving precision measurements and for exploring parameter spaces of theoretical models.

  4. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

This book provides comprehensive coverage of recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts covering fundamentals, basic implementation methods, and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis, and behavioral modeling for analog integrated circuits. Among the recent advances, Binary Decision Diagram (BDD) based approaches are studied in depth; the BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book: • Provides an overview of classical symbolic analysis methods and a comprehensive presentation of the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int…

  5. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the…

  6. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High-resolution methods a…

  7. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
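    As a rough illustration of the first-order mean-value step that underlies AMV-type methods (the most-probable-point correction of the full AMV algorithm is not shown), the sketch below linearizes a hypothetical implicit response about the input means and compares the resulting moments with brute-force Monte Carlo. The response function, means and standard deviations are invented for the example.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    mu = np.array([10.0, 2.0])      # input means (assumed for the example)
    sigma = np.array([1.0, 0.2])    # input standard deviations (assumed)

    def response(x):
        """Hypothetical implicit performance function (stand-in for a finite element model)."""
        return x[..., 0] ** 2 / (1.0 + x[..., 1])

    # First-order mean-value step: linearize the response about the input means.
    eps = 1e-6
    grad = np.array([(response(mu + eps * np.eye(2)[i]) - response(mu - eps * np.eye(2)[i])) / (2 * eps)
                     for i in range(2)])
    mv_mean = response(mu)
    mv_std = np.sqrt(np.sum((grad * sigma) ** 2))

    # Brute-force Monte Carlo reference (the expensive route that AMV-type methods try to avoid).
    z = response(rng.normal(mu, sigma, size=(100_000, 2)))
    print(f"mean-value estimate: mean={mv_mean:.3f}, std={mv_std:.3f}")
    print(f"Monte Carlo        : mean={z.mean():.3f}, std={z.std():.3f}")
    ```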

  8. Recent advances in computational structural reliability analysis methods

    Science.gov (United States)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-10-01

The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity has recently been seen in the research and development community, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus here is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  9. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

VTT's know-how in reliability and safety design and analysis techniques has been established over several years of analyzing the reliability of the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and further developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety continues in a number of research and development projects.

  10. Advanced methods for BWR transient and stability analysis

    Energy Technology Data Exchange (ETDEWEB)

Schmidt, A; Wehle, F; Opel, S; Velten, R [AREVA NP, Erlangen (Germany)]

    2008-07-01

    The design of advanced Boiling Water Reactor (BWR) fuel assemblies and cores is governed by the basic requirement of safe, reliable and flexible reactor operation with optimal fuel utilization. AREVA NP's comprehensive steady state and transient BWR methodology allows the designer to respond quickly and effectively to customer needs. AREVA NP uses S-RELAP5/RAMONA as the appropriate methodology for the representation of the entire plant. The 3D neutron kinetics and thermal-hydraulics code has been developed for the prediction of system, fuel and core behavior and provides additional margins for normal operation and transients. Of major importance is the extensive validation of the methodology. The validation is based on measurements at AREVA NP's test facilities, and comparison of the predictions with a great wealth of measured data gathered from BWR plants during many years of operation. Three of the main fields of interest are stability analysis, operational transients and reactivity initiated accidents (RIAs). The introduced 3D methodology for operational transients shows significant margin regarding the operational limit of critical power ratio, which has been approved by the German licensing authority. Regarding BWR stability a large number of measurements at different plants under various conditions have been performed and successfully post-calculated with RAMONA. This is the basis of reliable pre-calculations of the locations of regional and core-wide stability boundaries. (authors)

  11. Advanced methods for BWR transient and stability analysis

    International Nuclear Information System (INIS)

    Schmidt, A.; Wehle, F.; Opel, S.; Velten, R.

    2008-01-01

    The design of advanced Boiling Water Reactor (BWR) fuel assemblies and cores is governed by the basic requirement of safe, reliable and flexible reactor operation with optimal fuel utilization. AREVA NP's comprehensive steady state and transient BWR methodology allows the designer to respond quickly and effectively to customer needs. AREVA NP uses S-RELAP5/RAMONA as the appropriate methodology for the representation of the entire plant. The 3D neutron kinetics and thermal-hydraulics code has been developed for the prediction of system, fuel and core behavior and provides additional margins for normal operation and transients. Of major importance is the extensive validation of the methodology. The validation is based on measurements at AREVA NP's test facilities, and comparison of the predictions with a great wealth of measured data gathered from BWR plants during many years of operation. Three of the main fields of interest are stability analysis, operational transients and reactivity initiated accidents (RIAs). The introduced 3D methodology for operational transients shows significant margin regarding the operational limit of critical power ratio, which has been approved by the German licensing authority. Regarding BWR stability a large number of measurements at different plants under various conditions have been performed and successfully post-calculated with RAMONA. This is the basis of reliable pre-calculations of the locations of regional and core-wide stability boundaries. (authors)

  12. Advanced methods for the analysis of industrial experience feedback databases

    International Nuclear Information System (INIS)

    Lannoy, A.; Procaccia, H.

    1994-05-01

This is a presentation, through different conceptions of industrial experience feedback databases, of the principal methods for treatment and analysis of the collected data, ranging from frequency statistics and factorial analysis to Bayesian statistical decision theory, which is a real decision-support tool for decision makers, designers and operators. Examples in various fields are given (OREDA: Offshore REliability DAta bank for marine drilling platforms; CEDB: Component Event Data Bank for the European electric power industry; RDF 93: reliability of electronic components of France Telecom; EVT: failure EVenTs data bank of the French nuclear power plants operated by EDF). (A.B.). refs., figs., tabs

  13. Advanced methods of variance analysis for nuclear prospective scenarios

    International Nuclear Information System (INIS)

    Blazquez, J.; Montalvo, C.; Balbas, M.; Garcia-Berrocal, A.

    2011-01-01

Traditional variance propagation techniques are not very reliable when the uncertainties reach 100% in relative value, so less conventional methods are used instead, such as the Beta distribution, fuzzy logic and the Monte Carlo method.
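    As a minimal illustration of the Monte Carlo propagation mentioned in the abstract, the sketch below pushes two invented scenario inputs with large relative uncertainty (modelled, purely as an assumption, with lognormal and Beta distributions) through a toy scenario output and reports the resulting spread; it is not the authors' model.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Hypothetical scenario inputs with large relative uncertainty, modelled here (as an
    # assumption) with lognormal and Beta distributions rather than Gaussians.
    growth = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # demand growth factor
    share = rng.beta(a=2.0, b=5.0, size=n)                # nuclear share of the mix
    capacity = 100.0 * growth * share                     # illustrative scenario output (GWe)

    print(f"mean = {capacity.mean():.1f} GWe, std = {capacity.std():.1f} GWe "
          f"({capacity.std() / capacity.mean():.0%} relative)")
    print("5%-95% interval [GWe]:", np.percentile(capacity, [5, 95]).round(1))
    ```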

  14. Promising method advancement in palynology: a supplement to pollen analysis

    DEFF Research Database (Denmark)

    Enevold, Renée; Odgaard, Bent Vad

    2016-01-01

The analysis of Non Pollen Palynomorphs (NPPs) has evolved over the last few decades to be a fruitful supplement to palynological surveys and has especially proven to be a useful addition when interpreting anthropogenic disturbance of the natural environment. NPPs in anthropogenic soils… from sediment-, pollen- and macrofossil analyses. (Figure: fungal ascospore, x1000. Photo: Renée Enevold.)

  15. Advances in Modal Analysis Using a Robust and Multiscale Method

    Science.gov (United States)

    Picard, Cécile; Frisson, Christian; Faure, François; Drettakis, George; Kry, Paul G.

    2010-12-01

    This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.

  16. Advances in Modal Analysis Using a Robust and Multiscale Method

    Directory of Open Access Journals (Sweden)

    Frisson Christian

    2010-01-01

This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.
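    A minimal sketch of the modal core of such a method: assemble (here, simply invent) mass and stiffness matrices, solve the generalized eigenproblem K v = w^2 M v for modal frequencies and shapes, and synthesize a damped impulse response. The 3-DOF chain and the damping ratio are placeholders for the hexahedral finite element model described in the abstract.

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # Toy 3-DOF lumped chain, standing in for assembled hexahedral finite element matrices.
    M = np.diag([1.0, 1.0, 1.0])                            # mass matrix [kg]
    K = 1e4 * np.array([[ 2.0, -1.0,  0.0],
                        [-1.0,  2.0, -1.0],
                        [ 0.0, -1.0,  1.0]])                # stiffness matrix [N/m]

    # Generalized symmetric eigenproblem K v = w^2 M v -> modal frequencies and mode shapes.
    eigvals, modes = eigh(K, M)
    freqs_hz = np.sqrt(eigvals) / (2.0 * np.pi)
    print("modal frequencies [Hz]:", freqs_hz.round(2))

    # A crude modal impulse response: damped sum of the modes (damping ratio assumed).
    t = np.linspace(0.0, 0.5, 20_000)
    zeta = 0.01
    impulse = sum(np.exp(-zeta * 2 * np.pi * f * t) * np.sin(2 * np.pi * f * t) for f in freqs_hz)
    ```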

  17. Functional efficiency comparison between split- and parallel-hybrid using advanced energy flow analysis methods

    Energy Technology Data Exchange (ETDEWEB)

    Guttenberg, Philipp; Lin, Mengyan [Romax Technology, Nottingham (United Kingdom)

    2009-07-01

The following paper presents a comparative efficiency analysis of the Toyota Prius versus the Honda Insight using advanced Energy Flow Analysis methods. The sample study shows that even very different hybrid concepts like a split- and a parallel-hybrid can be compared at a high level of detail, and it demonstrates the benefit by showing exemplary results. (orig.)

  18. Advanced differential quadrature methods

    CERN Document Server

    Zong, Zhi

    2009-01-01

Modern Tools to Perform Numerical Differentiation. The original direct differential quadrature (DQ) method has been known to fail for problems with strong nonlinearity and material discontinuity as well as for problems involving singularity, irregularity, and multiple scales. But now researchers in applied mathematics, computational mechanics, and engineering have developed a range of innovative DQ-based methods to overcome these shortcomings. Advanced Differential Quadrature Methods explores new DQ methods and uses these methods to solve problems beyond the capabilities of the direct DQ method. After a basic introduction to the direct DQ method, the book presents a number of DQ methods, including complex DQ, triangular DQ, multi-scale DQ, variable order DQ, multi-domain DQ, and localized DQ. It also provides a mathematical compendium that summarizes Gauss elimination, the Runge-Kutta method, complex analysis, and more. The final chapter contains three codes written in the FORTRAN language, enabling readers to q…
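    A compact sketch of the direct DQ idea the book starts from: build the first-derivative weighting matrix from Lagrange-polynomial products (Shu's formulation) and check it against a known derivative. The grid choice and test function are illustrative only.

    ```python
    import numpy as np

    def dq_weights(x):
        """First-derivative DQ weighting matrix for grid points x (Lagrange/Shu formulation)."""
        n = len(x)
        a = np.zeros((n, n))
        # M(x_i) = prod_{k != i} (x_i - x_k)
        m = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i]) for i in range(n)])
        for i in range(n):
            for j in range(n):
                if i != j:
                    a[i, j] = m[i] / ((x[i] - x[j]) * m[j])
            a[i, i] = -a[i].sum()          # negated row sum of the off-diagonal weights
        return a

    # Check on a smooth test function: the DQ matrix applied to f(x) approximates f'(x).
    x = np.cos(np.pi * np.arange(11) / 10)     # Chebyshev-Gauss-Lobatto points on [-1, 1]
    A = dq_weights(x)
    print(f"max error in d/dx sin(x): {np.max(np.abs(A @ np.sin(x) - np.cos(x))):.2e}")
    ```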

  19. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

Within the framework of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing as far as possible realistic plant-specific fire risk analyses (fire PSA). The present Technical Report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the course of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi-type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study, it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering effects of the stochastics (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided to be used within the PSA. (orig.)

  20. Features of an advanced human reliability analysis method, AGAPE-ET

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun

    2005-01-01

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided

  1. Features of an advanced human reliability analysis method, AGAPE-ET

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun [Korea Atomic Energy Research Institute, Taejeon (Korea, Republic of)

    2005-11-15

    This paper presents the main features of an advanced human reliability analysis (HRA) method, AGAPE-ET. It has the capabilities to deal with the diagnosis failures and the errors of commission (EOC), which have not been normally treated in the conventional HRAs. For the analysis of the potential for diagnosis failures, an analysis framework, which is called the misdiagnosis tree analysis (MDTA), and a taxonomy of the misdiagnosis causes with appropriate quantification schemes are provided. For the identification of the EOC events from the misdiagnosis, some procedural guidance is given. An example of the application of the method is also provided.

  2. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.

  3. An advanced analysis method of initial orbit determination with too short arc data

    Science.gov (United States)

    Li, Binzhe; Fang, Li

    2018-02-01

This paper studies initial orbit determination (IOD) based on space-based angle measurements. Commonly, these space-based observations have short durations; as a result, classical initial orbit determination algorithms, such as the Laplace and Gauss methods, give poor results. In this paper, an advanced analysis method for initial orbit determination is developed for space-based observations. The admissible region and triangulation are introduced in the method, and a genetic algorithm is used to add constraints on the parameters. Simulation results show that the algorithm can successfully complete the initial orbit determination.

  4. Comparing sensitivity analysis methods to advance lumped watershed model identification and evaluation

    Directory of Open Access Journals (Sweden)

    Y. Tang

    2007-01-01

This study seeks to identify sensitivity tools that will advance our understanding of lumped hydrologic models for the purposes of model improvement, calibration efficiency and improved measurement schemes. Four sensitivity analysis methods were tested: (1) local analysis using parameter estimation software (PEST), (2) regional sensitivity analysis (RSA), (3) analysis of variance (ANOVA), and (4) Sobol's method. The methods' relative efficiencies and effectiveness have been analyzed and compared. These four sensitivity methods were applied to the lumped Sacramento soil moisture accounting model (SAC-SMA) coupled with SNOW-17. Results from this study characterize model sensitivities for two medium-sized watersheds within the Juniata River Basin in Pennsylvania, USA. Comparative results for the four sensitivity methods are presented for a 3-year time series with 1 h, 6 h, and 24 h time intervals. The results of this study show that model parameter sensitivities are heavily impacted by the choice of analysis method as well as the model time interval. Differences between the two adjacent watersheds also suggest strong influences of local physical characteristics on the sensitivity methods' results. This study also contributes a comprehensive assessment of the repeatability, robustness, efficiency, and ease-of-implementation of the four sensitivity methods. Overall, ANOVA and Sobol's method were shown to be superior to RSA and PEST. Relative to one another, ANOVA has reduced computational requirements and Sobol's method yielded more robust sensitivity rankings.
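    For readers unfamiliar with Sobol's method, the following sketch computes first-order Sobol indices with a Saltelli-style pick-and-freeze estimator on a standard test function that stands in for the SAC-SMA/SNOW-17 model; the parameter ranges, sample size and model are assumptions made only for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def model(x):
        """Stand-in for the hydrologic model: the Ishigami test function."""
        return np.sin(x[:, 0]) + 7.0 * np.sin(x[:, 1]) ** 2 + 0.1 * x[:, 2] ** 4 * np.sin(x[:, 0])

    d, n = 3, 50_000
    A = rng.uniform(-np.pi, np.pi, size=(n, d))
    B = rng.uniform(-np.pi, np.pi, size=(n, d))
    yA, yB = model(A), model(B)
    var_y = np.var(np.concatenate([yA, yB]))

    # Saltelli-style pick-and-freeze estimator of first-order Sobol' indices.
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]                       # column i taken from the second sample
        s1 = np.mean(yB * (model(ABi) - yA)) / var_y
        print(f"first-order index S{i + 1}: {s1:.2f}")
    ```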

  5. Advances in Numerical Methods

    CERN Document Server

    Mastorakis, Nikos E

    2009-01-01

Features contributions focused on significant aspects of current numerical methods and computational mathematics. This book carries chapters on advanced methods and various variations of known techniques that can solve difficult scientific problems efficiently.

  6. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM), such as regression splines or decision tree induction, can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA) and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore…

  7. Advances in iterative methods

    International Nuclear Information System (INIS)

    Beauwens, B.; Arkuszewski, J.; Boryszewicz, M.

    1981-01-01

    Results obtained in the field of linear iterative methods within the Coordinated Research Program on Transport Theory and Advanced Reactor Calculations are summarized. The general convergence theory of linear iterative methods is essentially based on the properties of nonnegative operators on ordered normed spaces. The following aspects of this theory have been improved: new comparison theorems for regular splittings, generalization of the notions of M- and H-matrices, new interpretations of classical convergence theorems for positive-definite operators. The estimation of asymptotic convergence rates was developed with two purposes: the analysis of model problems and the optimization of relaxation parameters. In the framework of factorization iterative methods, model problem analysis is needed to investigate whether the increased computational complexity of higher-order methods does not offset their increased asymptotic convergence rates, as well as to appreciate the effect of standard relaxation techniques (polynomial relaxation). On the other hand, the optimal use of factorization iterative methods requires the development of adequate relaxation techniques and their optimization. The relative performances of a few possibilities have been explored for model problems. Presently, the best results have been obtained with optimal diagonal-Chebyshev relaxation
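    A tiny model-problem sketch in the spirit of the relaxation studies described above: weighted Jacobi iteration on a 1-D Poisson matrix, reporting the iteration count for a given tolerance. The grid size, relaxation factor and tolerance are arbitrary choices; the Chebyshev and factorization-based accelerations discussed in the abstract are not shown.

    ```python
    import numpy as np

    # Model problem: 1-D Poisson equation -u'' = f on n interior points of the unit interval.
    n = 16
    h = 1.0 / (n + 1)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    b = np.ones(n)

    # Weighted (relaxed) Jacobi iteration: x <- x + w * D^{-1} (b - A x).
    w = 2.0 / 3.0                          # relaxation factor (a common choice, not optimized)
    d_inv = 1.0 / np.diag(A)
    x = np.zeros(n)
    for it in range(5000):
        r = b - A @ x
        if np.linalg.norm(r) <= 1e-8 * np.linalg.norm(b):
            break
        x += w * d_inv * r
    print(f"iterations: {it}, relative residual: {np.linalg.norm(r) / np.linalg.norm(b):.2e}")
    ```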

  8. Sensitivity analysis of infectious disease models: methods, advances and their application

    Science.gov (United States)

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol' methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
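    As an illustration of one of the surveyed techniques, the sketch below combines Latin hypercube sampling with partial rank correlation coefficients (LHS-PRCC) for a toy outbreak summary; the three parameters, their ranges and the output function are invented stand-ins for an actual transmission model.

    ```python
    import numpy as np
    from scipy.stats import qmc, rankdata

    def attack_rate(beta, gamma, contact):
        """Hypothetical toy outbreak summary (crude stand-in for an SIR-type model output)."""
        r0 = beta * contact / gamma
        return 1.0 - np.exp(-r0)

    # Latin hypercube sample of the three parameters over assumed ranges.
    n = 1000
    lo, hi = np.array([0.1, 0.05, 1.0]), np.array([0.9, 0.5, 20.0])
    X = qmc.scale(qmc.LatinHypercube(d=3, seed=7).random(n), lo, hi)
    Y = attack_rate(X[:, 0], X[:, 1], X[:, 2])

    def prcc(X, y, j):
        """Partial rank correlation: correlate rank residuals after removing the other inputs."""
        R = np.column_stack([rankdata(X[:, k]) for k in range(X.shape[1])])
        Z = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
        res_x = R[:, j] - Z @ np.linalg.lstsq(Z, R[:, j], rcond=None)[0]
        res_y = rankdata(y) - Z @ np.linalg.lstsq(Z, rankdata(y), rcond=None)[0]
        return np.corrcoef(res_x, res_y)[0, 1]

    for j, name in enumerate(["beta", "gamma", "contact"]):
        print(f"PRCC({name}) = {prcc(X, Y, j):+.2f}")
    ```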

  9. A Proposal on the Advanced Sampling Based Sensitivity and Uncertainty Analysis Method for the Eigenvalue Uncertainty Analysis

    International Nuclear Information System (INIS)

    Kim, Song Hyun; Song, Myung Sub; Shin, Chang Ho; Noh, Jae Man

    2014-01-01

In using the perturbation theory, the uncertainty of the response can be estimated by a single transport simulation, and it therefore requires a small computational load. However, it has the disadvantage that the computation methodology must be modified whenever a different response type, such as the multiplication factor, flux, or power distribution, is to be estimated. Hence, it is suitable for analyzing a few responses with many perturbed parameters. The statistical approach is a sampling-based method which uses randomly sampled cross sections from covariance data for analyzing the uncertainty of the response. XSUSA is a code based on the statistical approach. The cross sections are only modified with the sampling-based method; thus, general transport codes can be directly utilized for the S/U analysis without any code modifications. However, to calculate the uncertainty distribution from the result, the code simulation must be repeated enough times with randomly sampled cross sections, and this inefficiency is a known disadvantage of the stochastic method. In this study, to increase the estimation efficiency of the sampling-based S/U method, an advanced sampling and estimation method for the cross sections was proposed and verified. The main feature of the proposed method is that the cross section averaged from each single sampled cross section is used. For the use of the proposed method, validation was performed using the perturbation theory.

  10. Advances in exergy analysis: a novel assessment of the Extended Exergy Accounting method

    International Nuclear Information System (INIS)

    Rocco, M.V.; Colombo, E.; Sciubba, E.

    2014-01-01

… additional insight into and more relevant information for every comparative analysis of energy conversion systems, both at a global and a local level. In the paper, traditional and advanced exergy analysis methods are briefly discussed, and EEA theoretical foundations and details for its application are described in detail. Methods: The method converts not only material and energy flows, but externalities as well (labour, capital and environmental costs), into flows of equivalent primary exergy, so that all exchanges between the system and the environment can be completely accounted for on a rigorous thermodynamic basis. The current emphasis that decision makers and public opinion alike seem to be placing on sustainability generates the need for continued research in the field of systems analysis, and a preliminary review confirms that exergy may constitute a coherent and rational basis for developing global and local analysis methods. Moreover, extended exergy accounting possesses some specific and peculiar characteristics that make it more suitable for life-cycle and cradle-to-grave (or well-to-wheel) applications. Results: A taxonomy for the classification of exergy-based methods is proposed. A novel assessment of the EEA method is provided, its advantages and drawbacks are discussed and areas in need of further theoretical investigation are identified. Conclusions: Since EEA is a life-cycle method, it is argued that it represents an improvement with regard to other current methods, in that it provides additional insight into the phenomenological aspects of any “energy conversion chain”. The paper demonstrates that the Extended Exergy cost function can be used within the traditional and very well formalized Thermoeconomic framework, replacing the economic cost function in order to evaluate and optimize the consumption of resources of a system in a more complete and rational way. Practical implications: This paper contains some specific proposals as to the further development…

  11. Identification of advanced human factors engineering analysis, design and evaluation methods

    International Nuclear Information System (INIS)

    Plott, C.; Ronan, A. M.; Laux, L.; Bzostek, J.; Milanski, J.; Scheff, S.

    2006-01-01

NUREG-0711 Rev.2, 'Human Factors Engineering Program Review Model,' provides comprehensive guidance to the Nuclear Regulatory Commission (NRC) in assessing the human factors practices employed by license applicants for Nuclear Power Plant control room designs. As software-based human-system interface (HSI) technologies supplant traditional hardware-based technologies, the NRC may encounter new HSI technologies or seemingly unconventional approaches to human factors design, analysis, and evaluation methods which NUREG-0711 does not anticipate. A comprehensive survey was performed to identify advanced human factors engineering analysis, design and evaluation methods, tools, and technologies that the NRC may encounter in near-term future licensee applications. A review was conducted to identify human factors methods, tools, and technologies relevant to each review element of NUREG-0711. Additionally, emerging trends in technology which have the potential to impact review elements, such as Augmented Cognition and various wireless tools and technologies, were identified. The purpose of this paper is to provide an overview of the survey results and to highlight issues that could be revised or adapted to meet with emerging trends. (authors)

  12. Advances in Biosensing Methods

    Directory of Open Access Journals (Sweden)

    Reema Taneja

    2007-02-01

A fractal analysis is presented for the binding and dissociation (if applicable) kinetics of analyte-receptor reactions occurring on biosensor surfaces. The applications of the biosensors have appeared in the recent literature. The examples provided together give the reader a perspective of the advances in biosensors that are being used to detect analytes of interest. This should also stimulate interest in applying biosensors to other areas of application. The fractal analysis permits the evaluation of the rate constants for binding and dissociation (if applicable) for the analyte-receptor reactions occurring on biosensor surfaces. The fractal dimension provides a quantitative measure of the degree of heterogeneity on the biosensor surface. Predictive relations are presented that relate the binding coefficient to the degree of heterogeneity, or the fractal dimension, on the biosensor surface.

  13. Application of advanced data reduction methods to gas turbine dynamic analysis

    International Nuclear Information System (INIS)

    Juhl, P.B.

    1978-01-01

This paper discusses the application of advanced data reduction methods to the evaluation of dynamic data from gas turbines and turbine components. The use of the Fast Fourier Transform and of real-time spectrum analyzers is discussed, as is the use of power spectral density and probability density functions for analyzing random data. Examples of the application of these modern techniques to gas turbine testing are presented, and the use of the computer to automate the data reduction procedures is discussed. (orig.)
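    A minimal sketch of the kind of spectral reduction described: a Welch averaged-periodogram estimate of the power spectral density for a synthetic vibration record (two tones plus noise standing in for turbine dynamic data). The sampling rate and tone frequencies are assumptions for the example.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 10_000.0                                    # sampling rate [Hz] (assumed)
    t = np.arange(0, 2.0, 1.0 / fs)
    rng = np.random.default_rng(0)
    # Synthetic stand-in for a turbine vibration record: two tones plus broadband noise.
    x = (np.sin(2 * np.pi * 600 * t) + 0.5 * np.sin(2 * np.pi * 1750 * t)
         + 0.3 * rng.normal(size=t.size))

    # Power spectral density by Welch's averaged-periodogram method.
    f, pxx = welch(x, fs=fs, nperseg=4096)
    print(f"dominant frequency: {f[np.argmax(pxx)]:.0f} Hz")

    # The amplitude probability density of the record can be inspected with a histogram.
    pdf, edges = np.histogram(x, bins=50, density=True)
    ```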

  14. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

In this work, a method was proposed for quantifying human errors that may occur during operation execution using soft controls. Soft controls in advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probabilities in order to evaluate the reliability of the system and prevent errors. This work suggests a modified K-HRA method for quantifying the error probability.

  15. Technology Alignment and Portfolio Prioritization (TAPP): Advanced Methods in Strategic Analysis, Technology Forecasting and Long Term Planning for Human Exploration and Operations, Advanced Exploration Systems and Advanced Concepts

    Science.gov (United States)

    Funaro, Gregory V.; Alexander, Reginald A.

    2015-01-01

The Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center is expanding its current technology assessment methodologies. ACO is developing a framework called TAPP that uses a variety of methods, such as association mining and rule learning from data mining, structure development using a Technological Innovation System (TIS), and social network modeling to measure structural relationships. The role of ACO is to 1) produce a broad spectrum of ideas and alternatives for a variety of NASA's missions, 2) determine mission architecture feasibility and appropriateness to NASA's strategic plans, and 3) define a project in enough detail to establish an initial baseline capable of meeting mission objectives. ACO's role supports the decision-making process associated with the maturation of concepts for traveling through, living in, and understanding space. ACO performs concept studies and technology assessments to determine the degree of alignment between mission objectives and new technologies. The first step in technology assessment is to identify the current technology maturity in terms of a technology readiness level (TRL). The second step is to determine the difficulty associated with advancing a technology from one state to the next state. NASA has used TRLs since 1970 and ACO formalized them in 1995. The DoD, ESA, Oil & Gas, and DoE have adopted TRLs as a means to assess technology maturity. However, "with the emergence of more complex systems and system of systems, it has been increasingly recognized that TRL assessments have limitations, especially when considering [the] integration of complex systems." When performing the second step in a technology assessment, NASA requires that an Advancement Degree of Difficulty (AD2) method be utilized. NASA has developed or used a variety of methods to perform this step: Expert Opinion or Delphi Approach, Value Engineering or Value Stream, Analytical Hierarchy Process (AHP), Technique for the Order of…

  16. Advanced surrogate model and sensitivity analysis methods for sodium fast reactor accident assessment

    International Nuclear Information System (INIS)

    Marrel, A.; Marie, N.; De Lozzo, M.

    2015-01-01

Within the framework of the Generation IV Sodium Fast Reactors, safety in case of severe accidents is assessed. From this statement, CEA has developed a new physical tool to model the accident initiated by the Total Instantaneous Blockage (TIB) of a sub-assembly. This TIB simulator depends on many uncertain input parameters. This paper aims at proposing a global methodology combining several advanced statistical techniques in order to perform a global sensitivity analysis of this TIB simulator. The objective is to identify the most influential uncertain inputs for the various TIB outputs involved in the safety analysis. The proposed statistical methodology, combining several advanced statistical techniques, makes it possible to take into account the constraints on the TIB simulator outputs (positivity constraints) and to deal simultaneously with various outputs. To do this, a space-filling design is used and the corresponding TIB model simulations are performed. Based on this learning sample, an efficient constrained Gaussian process metamodel is fitted to each TIB model output. Then, using the metamodels, classical sensitivity analyses are made for each TIB output. Multivariate global sensitivity analyses based on aggregated indices are also performed, providing additional valuable information. Main conclusions on the influence of each uncertain input are derived. Highlights: • Physical-statistical tool for Sodium Fast Reactors TIB accident. • 27 uncertain parameters (core state, lack of physical knowledge) are highlighted. • Constrained Gaussian process efficiently predicts TIB outputs (safety criteria). • Multivariate sensitivity analyses reveal that three inputs are mainly influential. • The type of corium propagation (thermal or hydrodynamic) is the most influential
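    A rough sketch of the surrogate step only (without the positivity constraints or the aggregated multivariate indices of the paper): fit a Gaussian process metamodel to a small design of simulator runs, then interrogate the cheap surrogate with a large Monte Carlo sample. The simulator, design and kernel settings are placeholders, not the authors' configuration.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import ConstantKernel, RBF

    rng = np.random.default_rng(3)

    def simulator(x):
        """Hypothetical expensive simulator output (stand-in for a TIB safety criterion)."""
        return np.sin(3.0 * x[:, 0]) + x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 1]

    # Learning sample from a design of experiments (plain random here; the paper uses a
    # space-filling design and a constrained GP formulation).
    X_train = rng.uniform(0.0, 1.0, size=(60, 2))
    y_train = simulator(X_train)

    gp = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(length_scale=[0.2, 0.2]),
                                  normalize_y=True)
    gp.fit(X_train, y_train)

    # The cheap surrogate then supports large Monte Carlo studies (e.g. sensitivity analysis).
    X_mc = rng.uniform(0.0, 1.0, size=(20_000, 2))
    y_hat = gp.predict(X_mc)
    print(f"surrogate output: mean = {y_hat.mean():.3f}, variance = {y_hat.var():.3f}")
    ```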

  17. Advanced x-ray stress analysis method for a single crystal using different diffraction plane families

    International Nuclear Information System (INIS)

    Imafuku, Muneyuki; Suzuki, Hiroshi; Sueyoshi, Kazuyuki; Akita, Koichi; Ohya, Shin-ichi

    2008-01-01

A generalized formula for the x-ray stress analysis of a single crystal with an unknown stress-free lattice parameter was proposed. This method enables us to evaluate plane stress states with any combination of diffraction planes. We can choose and combine the appropriate x-ray sources and diffraction plane families, depending on the sample orientation and the apparatus, whenever the diffraction condition is satisfied. The analysis of plane stress distributions in an iron single crystal was demonstrated by combining the diffraction data for the Fe{211} and Fe{310} plane families.

  18. ADVANCEMENTS IN TIME-SPECTRA ANALYSIS METHODS FOR LEAD SLOWING-DOWN SPECTROSCOPY

    International Nuclear Information System (INIS)

    Smith, Leon E.; Anderson, Kevin K.; Gesh, Christopher J.; Shaver, Mark W.

    2010-01-01

    Direct measurement of Pu in spent nuclear fuel remains a key challenge for safeguarding nuclear fuel cycles of today and tomorrow. Lead slowing-down spectroscopy (LSDS) is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic mass with an uncertainty lower than the approximately 10 percent typical of today's confirmatory assay methods. Pacific Northwest National Laboratory's (PNNL) previous work to assess the viability of LSDS for the assay of pressurized water reactor (PWR) assemblies indicated that the method could provide direct assay of Pu-239 and U-235 (and possibly Pu-240 and Pu-241) with uncertainties less than a few percent, assuming suitably efficient instrumentation, an intense pulsed neutron source, and improvements in the time-spectra analysis methods used to extract isotopic information from a complex LSDS signal. This previous simulation-based evaluation used relatively simple PWR fuel assembly definitions (e.g. constant burnup across the assembly) and a constant initial enrichment and cooling time. The time-spectra analysis method was founded on a preliminary analytical model of self-shielding intended to correct for assay-signal nonlinearities introduced by attenuation of the interrogating neutron flux within the assembly.

  19. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections

    Directory of Open Access Journals (Sweden)

    Stefano Stacul

    2018-02-01

A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.

  20. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections.

    Science.gov (United States)

    Stacul, Stefano; Squeglia, Nunziante

    2018-02-15

    A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil by a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction by increasing the stiffness of shallow portions of soil and modeled using the Modified Kovacs model; pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile groups analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of this method was verified by comparing results from data from full scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data on a laterally loaded fixed-head pile group composed by reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.

  1. Direct methods for limit and shakedown analysis of structures: advanced computational algorithms and material modelling

    CERN Document Server

    Pisano, Aurora; Weichert, Dieter

    2015-01-01

    Articles in this book examine various materials and how to determine directly the limit state of a structure, in the sense of limit analysis and shakedown analysis. Apart from classical applications in mechanical and civil engineering contexts, the book reports on the emerging field of material design beyond the elastic limit, which has further industrial design and technological applications. Readers will discover that “Direct Methods” and the techniques presented here can in fact be used to numerically estimate the strength of structured materials such as composites or nano-materials, which represent fruitful fields of future applications.   Leading researchers outline the latest computational tools and optimization techniques and explore the possibility of obtaining information on the limit state of a structure whose post-elastic loading path and constitutive behavior are not well defined or well known. Readers will discover how Direct Methods allow rapid and direct access to requested information in...

  2. Recent advances in sample preparation methods for analysis of endocrine disruptors from various matrices.

    Science.gov (United States)

    Singh, Baljinder; Kumar, Ashwini; Malik, Ashok Kumar

    2014-01-01

    Due to the high toxicity of endocrine disruptors (EDs), studies are being undertaken to design effective techniques for separation and detection of EDs in various matrices. Recently, research activities in this area have shown that a diverse range of chromatographic techniques are available for the quantification and analysis of EDs. Therefore, on the basis of significant, recent original publications, we aimed at providing an overview of different separation and detection methods for the determination of trace-level concentrations of selected EDs. The biological effects of EDs and current pretreatment techniques applied to EDs are also discussed. Various types of chromatographic techniques are presented for quantification, highlighting time- and cost-effective techniques that separate and quantify trace levels of multiple EDs from various environmental matrices. Reports related to methods for the quantification of EDs from various matrices primarily published since 2008 have been cited.

  3. 3-D inelastic analysis methods for hot section components. Volume 2: Advanced special functions models

    Science.gov (United States)

    Wilson, R. B.; Banerjee, P. K.

    1987-01-01

    This Annual Status Report presents the results of work performed during the third year of the 3-D Inelastic Analysis Methods for Hot Sections Components program (NASA Contract NAS3-23697). The objective of the program is to produce a series of computer codes that permit more accurate and efficient three-dimensional analyses of selected hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The computer codes embody a progression of mathematical models and are streamlined to take advantage of geometrical features, loading conditions, and forms of material response that distinguish each group of selected components.

  4. Development of advanced methods for analysis of experimental data in diffusion

    Science.gov (United States)

    Jaques, Alonso V.

There are numerous experimental configurations and data analysis techniques for the characterization of diffusion phenomena. However, the mathematical methods for estimating diffusivities traditionally do not take into account the effects of experimental errors in the data, and often require smooth, noiseless data sets to perform the necessary analysis steps. The current methods used for data smoothing require strong assumptions which can introduce numerical "artifacts" into the data, affecting confidence in the estimated parameters. The Boltzmann-Matano method is used extensively in the determination of concentration-dependent diffusivities, D(C), in alloys. In the course of analyzing experimental data, numerical integrations and differentiations of the concentration profile are performed; these steps require smoothing of the data prior to analysis. We present here an approach to the Boltzmann-Matano method that is based on a regularization method for estimating a differentiation operation on the data, i.e., estimating the concentration gradient term, which is important in the analysis process for determining the diffusivity. This approach, therefore, has the potential to be less subjective, and in numerical simulations it shows increased accuracy in the estimated diffusion coefficients. We also present a regression approach to estimate linear multicomponent diffusion coefficients that eliminates the need to pre-treat or pre-condition the concentration profile. This approach fits the data to a functional form of the mathematical expression for the concentration profile, and allows us to determine the diffusivity matrix directly from the fitted parameters. Reformulation of the equation for the analytical solution is done in order to reduce the size of the problem and accelerate the convergence. The objective function for the regression can incorporate point estimates of the error in the concentration, improving the statistical confidence in the estimated diffusivity matrix.
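    For orientation, a plain (unregularized) numerical Boltzmann-Matano evaluation is sketched below on a synthetic constant-D profile: locate the Matano plane, then evaluate D(C) from the concentration gradient and the integral of (x - x_M) dC. The diffusion time, geometry and true D are invented; the regularization and multicomponent regression discussed in the abstract are not shown.

    ```python
    import numpy as np
    from scipy.special import erfc

    # Synthetic diffusion-couple profile after time t (stand-in for measured data).
    t = 3600.0                                        # annealing time [s]
    x = np.linspace(-2e-3, 2e-3, 401)                 # position [m]
    D_true = 1e-11                                    # m^2/s, used only to build the example
    C = 0.5 * erfc(x / (2.0 * np.sqrt(D_true * t)))   # normalized concentration, 1 -> 0

    # Matano plane x_M: the position about which the integral of (x - x_M) dC vanishes.
    dCdx = np.gradient(C, x)
    x_M = np.trapz(x * dCdx, x) / np.trapz(dCdx, x)

    def matano_D(idx):
        """D at C[idx]: -1/(2 t dC/dx) times the integral of (x - x_M) dC up to x[idx]."""
        integral = np.trapz((x[: idx + 1] - x_M) * dCdx[: idx + 1], x[: idx + 1])
        return -integral / (2.0 * t * dCdx[idx])

    for frac in (0.3, 0.5, 0.7):
        i = int(np.argmin(np.abs(C - frac)))
        print(f"C = {frac:.1f}: D = {matano_D(i):.2e} m^2/s (true value {D_true:.0e})")
    ```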

  5. Advanced methods for the analysis, design, and optimization of SMA-based aerostructures

    International Nuclear Information System (INIS)

    Hartl, D J; Lagoudas, D C; Calkins, F T

    2011-01-01

    Engineers continue to apply shape memory alloys to aerospace actuation applications due to their high energy density, robust solid-state actuation, and silent and shock-free operation. Past design and development of such actuators relied on experimental trial and error and empirically derived graphical methods. Over the last two decades, however, it has been repeatedly demonstrated that existing SMA constitutive models can capture stabilized SMA transformation behaviors with sufficient accuracy. This work builds upon past successes and suggests a general framework by which predictive tools can be used to assess the responses of many possible design configurations in an automated fashion. By applying methods of design optimization, it is shown that the integrated implementation of appropriate analysis tools can guide engineers and designers to the best design configurations. A general design optimization framework is proposed for the consideration of any SMA component or assembly of such components that applies when the set of design variables includes many members. This is accomplished by relying on commercially available software and utilizing tools already well established in the design optimization community. Such tools are combined with finite element analysis (FEA) packages that consider a multitude of structural effects. The foundation of this work is a three-dimensional thermomechanical constitutive model for SMAs applicable for arbitrarily shaped bodies. A reduced-order implementation also allows computationally efficient analysis of structural components such as wires, rods, beams and shells. The use of multiple optimization schemes, the consideration of assembled components, and the accuracy of the implemented constitutive model in full and reduced-order forms are all demonstrated

  6. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    International Nuclear Information System (INIS)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A.

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety

  7. Development of task analysis method for operator tasks in main control room of an advanced nuclear power plant

    International Nuclear Information System (INIS)

    Lin Chiuhsiang Joe; Hsieh Tsungling

    2016-01-01

    Task analysis methods provide an insight for quantitative and qualitative predictions of how people will use a proposed system, though the different versions have different emphases. Most of the methods can attest to the coverage of the functionality of a system, and all provide estimates of task performance time. However, most of the tasks that operators deal with in the digital work environment of the main control room of an advanced nuclear power plant require high mental activity. Such mental tasks overlap, must be dealt with at the same time, and can largely be assumed to be highly parallel in nature. Therefore, the primary aim of this paper was to develop a method that adopts CPM-GOMS (cognitive-perceptual-motor goals, operators, methods, and selection rules) as the basic pattern of mental task analysis for the advanced main control room. A within-subjects experimental design was used to examine the validity of the modified CPM-GOMS. Thirty participants performed two task types, a high-compatibility type and a low-compatibility type. The results indicated that performance was significantly higher on the high-compatibility task type than on the low-compatibility task type; that is, the modified CPM-GOMS could distinguish between high- and low-compatibility mental tasks. (author)

  8. Bayesian analysis of general failure data from an ageing distribution: advances in numerical methods

    International Nuclear Information System (INIS)

    Procaccia, H.; Villain, B.; Clarotti, C.A.

    1996-01-01

    EDF and ENEA carried out a joint research program to develop the numerical methods and computer codes needed for Bayesian analysis of component lives in the case of ageing. Early results of this study were presented at ESREL'94. Since then, the following further steps have been taken: input data have been generalized to the case where observed lives are censored both on the right and on the left; allowable life distributions are Weibull and gamma, whose parameters are both unknown and can be statistically dependent; allowable priors are histograms relative to different parametrizations of the life distribution of concern; and first- and second-order moments of the posterior distributions can be computed. In particular, the covariance gives important information about the degree of statistical dependence between the parameters of interest. An application of the code to the appearance of stress corrosion cracking in a tube of the PWR Steam Generator system is presented. (authors)
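
    The numerical machinery described above can be sketched in a few lines. The Python example below is illustrative only (not the EDF/ENEA code): it evaluates a Weibull likelihood with failures plus right- and left-censored lives on a parameter grid, applies a histogram prior, and computes posterior means and the parameter covariance by numerical summation; the data and grids are hypothetical.

      import numpy as np
      from scipy.stats import weibull_min

      def posterior_moments(times, status, shapes, scales, prior):
          """status: 1 = failure, 0 = right-censored, -1 = left-censored."""
          logL = np.zeros((len(shapes), len(scales)))
          for i, k in enumerate(shapes):
              for j, s in enumerate(scales):
                  logL[i, j] = (weibull_min.logpdf(times[status == 1], k, scale=s).sum()
                                + weibull_min.logsf(times[status == 0], k, scale=s).sum()
                                + weibull_min.logcdf(times[status == -1], k, scale=s).sum())
          post = prior * np.exp(logL - logL.max())      # unnormalized posterior on the grid
          post /= post.sum()
          K, S = np.meshgrid(shapes, scales, indexing="ij")
          mk, ms = (post * K).sum(), (post * S).sum()   # posterior means
          cov = (post * (K - mk) * (S - ms)).sum()      # shape-scale covariance
          return mk, ms, cov

      t = np.array([12.0, 15.5, 20.1, 8.0, 25.0, 30.0])     # component lives, years
      st = np.array([1, 1, 1, -1, 0, 0])                    # censoring indicators
      k_grid, s_grid = np.linspace(0.5, 4.0, 40), np.linspace(5.0, 60.0, 60)
      flat_prior = np.ones((40, 60)) / (40 * 60)            # histogram prior (here uniform)
      print(posterior_moments(t, st, k_grid, s_grid, flat_prior))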

  9. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
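
    As a concrete illustration of the distributed approach, the hedged Python sketch below builds a D-optimal subset of candidate conditions for a two-factor quadratic model by greedy exchange and then fits a response surface by least squares; the candidate grid, starting design, and simulated response are placeholders, not the study's actual test matrix.

      import numpy as np
      from itertools import product

      def model_matrix(X):
          """Quadratic RSM model terms for two coded factors."""
          x1, x2 = X[:, 0], X[:, 1]
          return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

      candidates = np.array(list(product(np.linspace(-1, 1, 5), repeat=2)))  # 25 conditions
      chosen = [0, 4, 20, 24, 12]                 # corners plus centre as a starting design
      for _ in range(7):                          # add points that maximize det(F'F)
          dets = [np.linalg.det(model_matrix(candidates[chosen + [i]]).T
                                @ model_matrix(candidates[chosen + [i]]))
                  for i in range(len(candidates))]
          chosen.append(int(np.argmax(dets)))

      X = candidates[chosen]                      # distributed test plan
      y = 3 + 2*X[:, 0] - X[:, 1] + 0.5*X[:, 0]*X[:, 1] + np.random.normal(0, 0.1, len(X))
      beta, *_ = np.linalg.lstsq(model_matrix(X), y, rcond=None)   # RSM coefficients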

  10. Intercomparison of analysis methods for seismically isolated nuclear structures. Part 1: Advanced test data and numerical methods. Working material

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of the meeting was to review the proposed contributions from CRP participating organizations, to discuss in detail the experimental data on seismic isolators, to review the numerical methods for the analysis of the seismic isolators, and to perform a first comparison of the calculation results. The aim of the CRP was to validate reliable numerical methods to be used for detailed evaluation of the dynamic behaviour of both isolation devices and isolated nuclear structures of different nuclear power plant types. The full maturity of seismic isolation for nuclear applications was stressed, as was the excellent behaviour of isolated structures during the recent earthquakes in Japan and the USA. Participants from Italy, the USA, Japan, the Russian Federation, the Republic of Korea, the United Kingdom, India and the European Commission presented overview papers on their present programs and the status of their contributions to the CRP

  11. Intercomparison of analysis methods for seismically isolated nuclear structures. Part 1: Advanced test data and numerical methods. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1993-07-01

    The purpose of the meeting was to review the proposed contributions from CRP participating organizations, to discuss in detail the experimental data on seismic isolators, to review the numerical methods for the analysis of the seismic isolators, and to perform a first comparison of the calculation results. The aim of the CRP was to validate reliable numerical methods to be used for detailed evaluation of the dynamic behaviour of both isolation devices and isolated nuclear structures of different nuclear power plant types. The full maturity of seismic isolation for nuclear applications was stressed, as was the excellent behaviour of isolated structures during the recent earthquakes in Japan and the USA. Participants from Italy, the USA, Japan, the Russian Federation, the Republic of Korea, the United Kingdom, India and the European Commission presented overview papers on their present programs and the status of their contributions to the CRP.

  12. Application of advanced irradiation analysis methods to light water reactor pressure vessel test and surveillance programs

    International Nuclear Information System (INIS)

    Odette, R.; Dudey, N.; McElroy, W.; Wullaert, R.; Fabry, A.

    1977-01-01

    Inaccurate characterization and inappropriate application of neutron irradiation exposure variables contribute a substantial amount of uncertainty to embrittlement analysis of light water reactor pressure vessels. Damage analysis involves characterization of the irradiation environment (dosimetry), correlation of test and surveillance metallurgical and dosimetry data, and projection of such data to service conditions. Errors in available test and surveillance dosimetry data are estimated to contribute a factor of approximately 2 to the data scatter. Non-physical (empirical) correlation procedures and the need to extrapolate to the vessel may add further error. Substantial reductions in these uncertainties in future programs can be obtained from a more complete application of available damage analysis tools which have been developed for the fast reactor program. An approach to reducing embrittlement analysis errors is described, and specific examples of potential applications are given. The approach is based on damage analysis techniques validated and calibrated in benchmark environments

  13. A Bayesian analysis of rare B decays with advanced Monte Carlo methods

    International Nuclear Information System (INIS)

    Beaujean, Frederik

    2012-01-01

    Searching for new physics in rare B meson decays governed by b → s transitions, we perform a model-independent global fit of the short-distance couplings C7, C9, and C10 of the ΔB=1 effective field theory. We assume the standard-model set of b → sγ and b → sl+l- operators with real-valued Ci. A total of 59 measurements by the experiments BaBar, Belle, CDF, CLEO, and LHCb of observables in B → K*γ, B → K(*)l+l-, and Bs → μ+μ- decays are used in the fit. Our analysis is the first of its kind to harness the full power of the Bayesian approach to probability theory. All main sources of theory uncertainty explicitly enter the fit in the form of nuisance parameters. We make optimal use of the experimental information to simultaneously constrain the Wilson coefficients as well as hadronic form factors - the dominant theory uncertainty. Generating samples from the posterior probability distribution to compute marginal distributions and predict observables by uncertainty propagation is a formidable numerical challenge for two reasons. First, the posterior has multiple well separated maxima and degeneracies. Second, the computation of the theory predictions is very time consuming. A single posterior evaluation requires O(1s), and a few million evaluations are needed. Population Monte Carlo (PMC) provides a solution to both issues; a mixture density is iteratively adapted to the posterior, and samples are drawn in a massively parallel way using importance sampling. The major shortcoming of PMC is the need for cogent knowledge of the posterior at the initial stage. In an effort towards a general black-box Monte Carlo sampling algorithm, we present a new method to extract the necessary information in a reliable and automatic manner from Markov chains with the help of hierarchical clustering. Exploiting the latest 2012 measurements, the fit reveals a flipped-sign solution in addition to a standard-model-like solution for the couplings Ci. The
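
    The adaptive importance-sampling idea behind PMC can be sketched compactly. The toy Python example below (not the analysis code, and in one dimension rather than the full physics posterior) iteratively adapts a Gaussian mixture proposal to a bimodal target and draws weighted samples from it; the target, component count and iteration budget are illustrative.

      import numpy as np
      from scipy.stats import norm

      def log_target(x):                       # toy bimodal "posterior"
          return np.logaddexp(norm.logpdf(x, -3.0, 0.5), norm.logpdf(x, 3.0, 0.8))

      K, N = 4, 5000
      means = np.linspace(-6.0, 6.0, K)        # initial mixture proposal
      sigmas = np.full(K, 2.0)
      weights = np.full(K, 1.0 / K)

      for step in range(10):                   # PMC iterations
          comp = np.random.choice(K, size=N, p=weights)
          x = np.random.normal(means[comp], sigmas[comp])
          log_q = np.logaddexp.reduce(
              [np.log(weights[k]) + norm.logpdf(x, means[k], sigmas[k]) for k in range(K)],
              axis=0)
          w = np.exp(log_target(x) - log_q)    # importance weights
          w /= w.sum()
          for k in range(K):                   # update each component from its weighted samples
              r = w * (comp == k)
              if r.sum() > 1e-12:
                  weights[k] = r.sum()
                  means[k] = (r * x).sum() / r.sum()
                  sigmas[k] = np.sqrt((r * (x - means[k])**2).sum() / r.sum()) + 1e-3
          weights /= weights.sum()

      posterior_mean = (w * x).sum()           # estimate from the final weighted sample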

  14. A Bayesian analysis of rare B decays with advanced Monte Carlo methods

    Energy Technology Data Exchange (ETDEWEB)

    Beaujean, Frederik

    2012-11-12

    Searching for new physics in rare B meson decays governed by b → s transitions, we perform a model-independent global fit of the short-distance couplings C7, C9, and C10 of the ΔB=1 effective field theory. We assume the standard-model set of b → sγ and b → sl+l- operators with real-valued Ci. A total of 59 measurements by the experiments BaBar, Belle, CDF, CLEO, and LHCb of observables in B → K*γ, B → K(*)l+l-, and Bs → μ+μ- decays are used in the fit. Our analysis is the first of its kind to harness the full power of the Bayesian approach to probability theory. All main sources of theory uncertainty explicitly enter the fit in the form of nuisance parameters. We make optimal use of the experimental information to simultaneously constrain the Wilson coefficients as well as hadronic form factors - the dominant theory uncertainty. Generating samples from the posterior probability distribution to compute marginal distributions and predict observables by uncertainty propagation is a formidable numerical challenge for two reasons. First, the posterior has multiple well separated maxima and degeneracies. Second, the computation of the theory predictions is very time consuming. A single posterior evaluation requires O(1s), and a few million evaluations are needed. Population Monte Carlo (PMC) provides a solution to both issues; a mixture density is iteratively adapted to the posterior, and samples are drawn in a massively parallel way using importance sampling. The major shortcoming of PMC is the need for cogent knowledge of the posterior at the initial stage. In an effort towards a general black-box Monte Carlo sampling algorithm, we present a new method to extract the necessary information in a reliable and automatic manner from Markov chains with the help of hierarchical clustering. Exploiting the latest 2012 measurements, the fit

  15. Advanced methods in X-ray and neutron structure analysis of materials

    International Nuclear Information System (INIS)

    1990-01-01

    The publication contains abstracts of 186 contributions presented at the international conference in the oral or poster form. Attention was paid particularly to crystal and molecular structures, diffraction analysis of physical phenomena, and to powder diffraction and real structures. (Z.M.)

  16. Sensitivity analysis of exergy destruction in a real combined cycle power plant based on advanced exergy method

    International Nuclear Information System (INIS)

    Boyaghchi, Fateme Ahmadi; Molaie, Hanieh

    2015-01-01

    Highlights: • The advanced exergy destruction components of a real CCPP are calculated. • The effects of TIT and rc variation on the exergy destruction parts of the cycle are investigated. • TIT and rc growth increase the improvement potential of most components. • TIT and rc growth decrease the unavoidable part in some components. - Abstract: Advanced exergy analysis extends engineering knowledge beyond the respective conventional methods by improving the design and operation of energy conversion systems. In advanced exergy analysis, the exergy destruction is split into endogenous/exogenous and avoidable/unavoidable parts. In this study, an advanced exergy analysis of a real combined cycle power plant (CCPP) with supplementary firing is carried out. The endogenous/exogenous irreversibilities of each component, as well as their combination with the avoidable/unavoidable irreversibilities, are determined. A parametric study is presented discussing the sensitivity of various performance indicators to the turbine inlet temperature (TIT) and the compressor pressure ratio (rc). It is observed that the thermal and exergy efficiencies increase when TIT and rc rise. Results show that the combustion chamber (CC) concentrates most of the exergy destruction (more than 62%), dominantly in unavoidable endogenous form, which decreases by 11.89% and 13.12%, while the avoidable endogenous exergy destruction increases and is multiplied by factors of 1.3 and 8.6 with increasing TIT and rc, respectively. In addition, TIT growth strongly increases the endogenous avoidable exergy destruction in the high pressure superheater (HP.SUP), CC and low pressure evaporator (LP.EVAP). It also increases the exogenous avoidable exergy destruction of the HP.SUP and low pressure steam turbine (LP.ST), and leads to a large decrement in the endogenous exergy destruction of the preheater (PRE), by about 98.8%. Furthermore, rc growth sharply increases the endogenous avoidable exergy destruction of gas
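
    The four-way splitting referred to in the abstract can be illustrated with simple arithmetic. The Python snippet below is a hedged sketch with hypothetical numbers (not the plant's actual values): the unavoidable part comes from a run with the component at its best achievable performance, the endogenous part from a run in which only that component is irreversible, and the combined parts follow by subtraction.

      # hypothetical exergy destruction values for one component, in MW
      E_D_real         = 120.0   # real operating conditions
      E_D_unavoidable  = 85.0    # component at its best achievable performance
      E_D_endogenous   = 95.0    # only this component irreversible, rest ideal
      E_D_unavoid_endo = 70.0    # unavoidable-endogenous part from a dedicated run

      E_D_avoidable   = E_D_real - E_D_unavoidable        # recoverable by better design
      E_D_exogenous   = E_D_real - E_D_endogenous         # caused by the other components
      E_D_avoid_endo  = E_D_endogenous - E_D_unavoid_endo
      E_D_unavoid_exo = E_D_unavoidable - E_D_unavoid_endo
      E_D_avoid_exo   = E_D_exogenous - E_D_unavoid_exo

      print(f"avoidable-endogenous part: {E_D_avoid_endo:.1f} MW (priority for improvement)")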

  17. Utilization of Advanced Diagnostic Methods for Texture and Rut Depth Analysis on a Testing Pavement Section

    Directory of Open Access Journals (Sweden)

    Slabej Martin

    2015-05-01

    Full Text Available The qualitative characteristics of a pavement broadly reflect its serviceability, which summarizes the characteristics of the pavement that provide fast, smooth, economical and, above all, safe driving of motor vehicles. The key factor in pavement serviceability and road safety is the quality of the surface properties. In the framework of research activities performed in the Research Centre founded under the auspices of the University of Žilina, individual parameters of pavement serviceability were monitored by pavement surface scanning. This paper describes the creation of a 3D road-surface model and its analysis and evaluation from the viewpoint of two pavement serviceability parameters - rut depth and texture. Measurements were performed on an experimental pavement section currently used in an Accelerated Pavement Testing experiment. The long-term goal is to ascertain functions predicting the degradation of these two pavement serviceability parameters.

  18. Advanced statistical methods in data science

    CERN Document Server

    Chen, Jiahua; Lu, Xuewen; Yi, Grace; Yu, Hao

    2016-01-01

    This book gathers invited presentations from the 2nd Symposium of the ICSA-CANADA Chapter held at the University of Calgary from August 4-6, 2015. The aim of this Symposium was to promote advanced statistical methods in big-data sciences, to allow researchers to exchange ideas on statistics and data science, and to embrace the challenges and opportunities of statistics and data science in the modern world. It addresses diverse themes in advanced statistical analysis in big-data sciences, including methods for administrative data analysis, survival data analysis, missing data analysis, high-dimensional and genetic data analysis, longitudinal and functional data analysis, the design and analysis of studies with response-dependent and multi-phase designs, time series and robust statistics, and statistical inference based on likelihood, empirical likelihood and estimating functions. The editorial group selected 14 high-quality presentations from this successful symposium and invited the presenters to prepare a fu...

  19. Advanced Economic Analysis

    Science.gov (United States)

    Greenberg, Marc W.; Laing, William

    2013-01-01

    An Economic Analysis (EA) is a systematic approach to the problem of choosing the best method of allocating scarce resources to achieve a given objective. An EA helps guide decisions on the "worth" of pursuing an action that departs from status quo ... an EA is the crux of decision-support.

  20. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay; Law, Kody; Suciu, Carina

    2017-01-01

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.
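
    The telescoping idea can be shown in a few lines of Python. The sketch below is the plain i.i.d.-sampling MLMC setting that the article generalises, estimating E[S_T] for a geometric Brownian motion with Euler discretisation; level l uses 2**l time steps, and the fine and coarse paths on each level share the same Brownian increments. Parameters and sample allocations are illustrative.

      import numpy as np

      def level_samples(l, M, S0=1.0, mu=0.05, sigma=0.2, T=1.0):
          """Samples of P_0 on level 0, or of P_l - P_{l-1} on level l > 0."""
          nf = 2 ** l
          dt = T / nf
          dW = np.sqrt(dt) * np.random.randn(M, nf)
          Sf = np.full(M, S0)
          for i in range(nf):                        # fine Euler path
              Sf += mu * Sf * dt + sigma * Sf * dW[:, i]
          if l == 0:
              return Sf
          Sc = np.full(M, S0)
          dWc = dW[:, 0::2] + dW[:, 1::2]            # coarse path re-uses the same noise
          for i in range(nf // 2):
              Sc += mu * Sc * (2 * dt) + sigma * Sc * dWc[:, i]
          return Sf - Sc

      L = 5
      M = [200000, 100000, 50000, 25000, 12000, 6000]      # samples per level
      mlmc_estimate = sum(level_samples(l, M[l]).mean() for l in range(L + 1))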

  1. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.

  2. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    Science.gov (United States)

    Ivanco, Marie L.; Domack, Marcia S.; Stoner, Mary Cecilia; Hehir, Austin R.

    2016-01-01

    Low Technology Readiness Levels (TRLs) and high levels of uncertainty make it challenging to develop cost estimates of new technologies in the R&D phase. It is however essential for NASA to understand the costs and benefits associated with novel concepts, in order to prioritize research investments and evaluate the potential for technology transfer and commercialization. This paper proposes a framework to perform a cost-benefit analysis of a technology in the R&D phase. This framework was developed and used to assess the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. Following the definition of a case study for a cryogenic tank cylinder of specified geometry, data was gathered through interviews with Subject Matter Experts (SMEs), with particular focus placed on production costs and process complexity. This data served as the basis to produce process flowcharts and timelines, mass estimates, and rough order-of-magnitude cost and schedule estimates. The scalability of the results was subsequently investigated to understand the variability of the results based on tank size. Lastly, once costs and benefits were identified, the Analytic Hierarchy Process (AHP) was used to assess the relative value of these achieved benefits for potential stakeholders. These preliminary, rough order-of-magnitude results predict a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Compared to the composite manufacturing technique, these results predict cost savings of 35 to 58 percent; however, the ANNST concept was heavier. In this study, the predicted return on investment of equipment required for the ANNST method was ten cryogenic tank barrels

  3. Advanced methods of fatigue assessment

    CERN Document Server

    Radaj, Dieter

    2013-01-01

    The book in hand presents advanced methods of brittle fracture and fatigue assessment. The Neuber concept of fictitious notch rounding is enhanced with regard to theory and application. The stress intensity factor concept for cracks is extended to pointed and rounded corner notches as well as to locally elastic-plastic material behaviour. The averaged strain energy density within a circular sector volume around the notch tip is shown to be suitable for strength assessments. Finally, the various implications of cyclic plasticity on fatigue crack growth are explained, with emphasis being laid on the ΔJ-integral approach.   This book continues the expositions of the authors’ well known reference work in German language ‘Ermüdungsfestigkeit – Grundlagen für Ingenieure’ (Fatigue strength – fundamentals for engineers).

  4. Advanced construction methods in ACR

    International Nuclear Information System (INIS)

    Elgohary, M.; Choy, E.; Yu, S.K.W.

    2002-01-01

    The ACR - Advanced CANDU Reactor, developed by Atomic Energy of Canada Limited (AECL), is designed with constructability as a major requirement during all project phases, from the concept design stage to the detail design stage. This necessitated a much more comprehensive approach to including constructability considerations in the design to ensure that the construction duration is met. For the ACR-700, a project schedule of 48 months has been developed for the nth replicated unit, with a 36-month construction period from First Concrete to Fuel Load. An overall construction strategy has been developed for the ACR that builds on the success of the construction methods proven on the Qinshan CANDU 6 project. The strategy comprises the 'Open Top' construction technique using a Very Heavy Lift crane and parallel construction activities, with extensive modularization and prefabrication. In addition, significant applications of up-to-date construction technology will be implemented, e.g. large-volume concrete pours, prefabricated rebar, use of climbing forms, composite structures, prefabricated permanent formwork, and automatic welding. Utilization of the latest electronic technology tools, such as 3D CADDs modelling, yields a very high quality, clash-free product that allows construction to be completed 'right the first time' and eliminates rework. Integration of 3D CADDs models and scheduling tools such as Primavera has allowed development of actual construction sequences and an iterative approach to schedule verification and improvement. Modularization and prefabrication are major features of the ACR design in order to achieve the project schedule. For the reactor building, approximately 80% of the volume will be installed as modules or prefabricated assemblies. This ensures critical path activities are achieved. This paper examines the advanced construction methods implemented in the design in order to

  5. Evaluation of core physics analysis methods for conversion of the INL advanced test reactor to low-enrichment fuel

    International Nuclear Information System (INIS)

    DeHart, M. D.; Chang, G. S.

    2012-01-01

    Computational neutronics studies to support the possible conversion of the ATR to LEU are underway. Simultaneously, INL is engaged in a physics methods upgrade project to put into place modern computational neutronics tools for future support of ATR fuel cycle and experiment analysis. A number of experimental measurements have been performed in the ATRC in support of the methods upgrade project, and are being used to validate the new core physics methods. The current computational neutronics work is focused on performance of scoping calculations for the ATR core loaded with a candidate LEU fuel design. This will serve as independent confirmation of analyses that have been performed previously, and will evaluate some of the new computational methods for analysis of a candidate LEU fuel for ATR. (authors)

  6. Advances in fingerprint analysis.

    Science.gov (United States)

    Hazarika, Pompi; Russell, David A

    2012-04-10

    Fingerprints have been used in forensic investigations for the identification of individuals since the late 19th century. However, it is now clear that fingerprints can provide significantly more information about an individual. Here, we highlight the considerable advances in fingerprinting technology that can simultaneously provide chemical information regarding the drugs ingested and the explosives and drugs handled by a person as well as the identity of that individual. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Broadband Studies of Seismic Sources at Regional and Teleseismic Distances Using Advanced Time Series Analysis Methods. Volume 1.

    Science.gov (United States)

    1991-03-21

    discussion of spectral factorability and motivations for broadband analysis, the report is subdivided into four main sections. In Section 1.0, we...estimates. The motivation for developing our multi-channel deconvolution method was to gain information about seismic sources, most notably, nuclear...with complex constraints for estimating the rupture history. Such methods (applied mostly to data sets that also include strong motion data), were

  8. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos [McGill Univ., Montreal, QC (Canada)]

    2016-09-06

    This is the final report for DE-SC0007096 - Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales - PI: Pavlos Kollias. The final report outlines the main findings of the research conducted using the aforementioned award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  9. Analysis of numerical methods

    CERN Document Server

    Isaacson, Eugene

    1994-01-01

    This excellent text for advanced undergraduates and graduate students covers norms, numerical solution of linear systems and matrix factoring, iterative solutions of nonlinear equations, eigenvalues and eigenvectors, polynomial approximation, and other topics. It offers a careful analysis and stresses techniques for developing new methods, plus many examples and problems. 1966 edition.

  10. Advanced Fine Particulate Characterization Methods

    Energy Technology Data Exchange (ETDEWEB)

    Steven Benson; Lingbu Kong; Alexander Azenkeng; Jason Laumb; Robert Jensen; Edwin Olson; Jill MacKenzie; A.M. Rokanuzzaman

    2007-01-31

    The characterization and control of emissions from combustion sources are of significant importance in improving local and regional air quality. Such emissions include fine particulate matter, organic carbon compounds, and NO{sub x} and SO{sub 2} gases, along with mercury and other toxic metals. This project involved four activities: Further Development of Analytical Techniques for PM{sub 10} and PM{sub 2.5} Characterization and Source Apportionment and Management, Organic Carbonaceous Particulate and Metal Speciation for Source Apportionment Studies, Quantum Modeling, and High-Potassium Carbon Production with Biomass-Coal Blending. The key accomplishments included the development of improved automated methods to characterize the inorganic and organic components of particulate matter. The methods involved the use of scanning electron microscopy and x-ray microanalysis for the inorganic fraction and a combination of extractive methods with near-edge x-ray absorption fine structure to characterize the organic fraction. These methods have direct application to source apportionment studies of PM because they provide detailed inorganic analysis along with total organic and elemental carbon (OC/EC) quantification. Quantum modeling using density functional theory (DFT) calculations was used to further elucidate a recently developed mechanistic model for mercury speciation in coal combustion systems and interactions on activated carbon. Reaction energies, enthalpies, free energies and binding energies of Hg species to the prototype molecules were derived from the data obtained in these calculations. Bimolecular rate constants for the various elementary steps in the mechanism have been estimated using the hard-sphere collision theory approximation, and the results seem to indicate that extremely fast kinetics could be involved in these surface reactions. Activated carbon was produced from a blend of lignite coal from the Center Mine in North Dakota and

  11. Failure and damage analysis of advanced materials

    CERN Document Server

    Sadowski, Tomasz

    2015-01-01

    The papers in this volume present basic concepts and new developments in failure and damage analysis with focus on advanced materials such as composites, laminates, sandwiches and foams, and also new metallic materials. Starting from some mathematical foundations (limit surfaces, symmetry considerations, invariants) new experimental results and their analysis are shown. Finally, new concepts for failure prediction and analysis will be introduced and discussed as well as new methods of failure and damage prediction for advanced metallic and non-metallic materials. Based on experimental results the traditional methods will be revised.

  12. Advanced thermodynamic (exergetic) analysis

    International Nuclear Information System (INIS)

    Tsatsaronis, G; Morosuk, T

    2012-01-01

    Exergy analysis is a powerful tool for developing, evaluating and improving an energy conversion system. However, the lack of a formal procedure in using the results obtained by an exergy analysis is one of the reasons for exergy analysis not being very popular among energy practitioners. Such a formal procedure cannot be developed as long as the interactions among components of the overall system are not being taken properly into account. Splitting the exergy destruction into unavoidable and avoidable parts in a component provides a realistic measure of the potential for improving the thermodynamic efficiency of this component. Alternatively splitting the exergy destruction into endogenous and exogenous parts provides information on the interactions among system components. Distinctions between avoidable and unavoidable exergy destruction on one side and endogenous and exogenous exergy destruction on the other side allow the engineer to focus on the thermodynamic inefficiencies that can be avoided and to consider the interactions among system components. The avoidable endogenous and the avoidable exogenous exergy destruction provide the best guidance for improving the thermodynamic performance of energy conversion systems.

  13. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sezen, Halil [The Ohio State Univ., Columbus, OH (United States). Dept. of Civil, Environmental and Geodetic Engineering; Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States). College of Engineering, Nuclear Engineering Program, Dept. of Mechanical and Aerospace Engineering; Denning, R. [The Ohio State Univ., Columbus, OH (United States); Vaidya, N. [Rizzo Associates, Pittsburgh, PA (United States)

    2017-12-29

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  14. Advanced accelerator methods: The cyclotrino

    International Nuclear Information System (INIS)

    Welch, J.J.; Bertsche, K.J.; Friedman, P.G.; Morris, D.E.; Muller, R.A.

    1987-04-01

    Several new and unusual advanced techniques in the small cyclotron are described. The cyclotron is run at low energy, using negative ions and at high harmonics. Electrostatic focusing is used exclusively. The ion source and injection system are in the center; this arrangement unfortunately does not provide enough current, but the new system design should solve this problem. An electrostatic extractor that runs at low voltage, under 5 kV, and a microchannel plate detector able to discriminate low-energy ions from the 14C are used. The resolution is sufficient for 14C dating, and a higher-intensity source should allow dating of a milligram-size sample of 30,000-year-old material with less than 10% uncertainty

  15. Comparison of the Performance of Two Advanced Spectral Methods for the Analysis of Times Series in Paleoceanography

    Directory of Open Access Journals (Sweden)

    Eulogio Pardo-Igúzquiza

    2015-08-01

    Full Text Available Many studies have revealed the cyclicity of past ocean/atmosphere dynamics at a wide range of time scales (from decadal to millennial), based on the spectral analysis of time series of climate proxies obtained from deep sea sediment cores. Among the many techniques available for spectral analysis, the maximum entropy method and the Thomson multitaper approach have frequently been used because of their good statistical properties and high resolution with short time series. The novelty of the present study is that we compared the two methods according to the performance of the statistical tests used to assess the significance of their power spectrum estimates. The statistical significance of the maximum entropy estimates was assessed by a random permutation test (Pardo-Igúzquiza and Rodríguez-Tovar, 2000), while the statistical significance of the Thomson multitaper method was assessed by an F-test (Thomson, 1982). We compared the results obtained in a case study using simulated data, where the spectral content of the time series was known, and in a case study with real data. In both cases the results are similar: while the cycles identified as significant by maximum entropy and the permutation test have a clear physical interpretation, the F-test with the Thomson multitaper estimator tends to classify the low-frequency peaks as not significant and tends to flag more spurious peaks in the middle and high frequencies as significant. Nevertheless, the best strategy is to use both techniques and to exploit the advantages of each of them.
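
    For readers unfamiliar with the multitaper estimator, the hedged Python sketch below computes a Thomson multitaper power spectrum for a short synthetic proxy record using SciPy's DPSS tapers; the F-test for line components and the maximum-entropy/permutation-test branch of the comparison are not reproduced here, and the record itself is synthetic.

      import numpy as np
      from scipy.signal.windows import dpss

      def multitaper_psd(x, dt, NW=4, K=7):
          """Average the K eigenspectra obtained with orthogonal Slepian tapers."""
          n = len(x)
          tapers = dpss(n, NW, Kmax=K)
          freqs = np.fft.rfftfreq(n, dt)
          spectra = [np.abs(np.fft.rfft(tap * (x - x.mean())))**2 * dt / n for tap in tapers]
          return freqs, np.mean(spectra, axis=0)

      # synthetic record: 23 kyr and 41 kyr cycles plus noise, sampled every 1 kyr
      t = np.arange(0, 800.0)
      x = np.sin(2*np.pi*t/23) + 0.7*np.sin(2*np.pi*t/41) + np.random.normal(0, 1, t.size)
      f, S = multitaper_psd(x, dt=1.0)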

  16. Electromagnetic analysis of HTSC by advanced fluxoid dynamics method; Kairyogata jisoku ryoshi doryokugakuho ni yoru koon chodendotai no denji kaiseki

    Energy Technology Data Exchange (ETDEWEB)

    Demachi, K.; Nakano, M.; Miya, K. [The Univ. of Tokyo, Tokyo (Japan). Graduate School of Engineering

    2000-05-29

    In this study, an improved fluxoid quantum dynamics method was developed, which applies the lattice treatment of the lattice gas automaton method to fluxoid quantum dynamics, reducing the required memory and speeding up the calculation. Using as the analysis object a BSCCO single crystal in which pinning centers were introduced by heavy ion irradiation, the validity of the technique was shown by comparison with experimental results. In the improved fluxoid quantum dynamics method, the two-dimensional space is discretized by a triangular lattice. A fluxoid quantum exists only on a lattice point, receives forces from the surrounding fluxoid quanta, the Meissner magnetic field and the pinning centers, and then moves to one of the six neighboring lattice points. Since the use of the triangular lattice limits the possible patterns of relative fluxoid positions, high-speed computation becomes possible by storing the interaction forces in a database. The dependence of the torque in axial-type superconducting magnetic bearings on the irradiation dose was analyzed, and the possibility of controlling the rotational loss by heavy ion irradiation was shown. (NEDO)

  17. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    International Nuclear Information System (INIS)

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced ''best estimate'' predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations

  18. Advanced midwifery practice: An evolutionary concept analysis.

    Science.gov (United States)

    Goemaes, Régine; Beeckman, Dimitri; Goossens, Joline; Shawe, Jill; Verhaeghe, Sofie; Van Hecke, Ann

    2016-11-01

    The concept of 'advanced midwifery practice' is explored to a limited extent in the international literature. However, a clear conception of advanced midwifery practice is vital to advance the discipline and to achieve both internal and external legitimacy. This concept analysis aims to clarify advanced midwifery practice and identify its components. A review of the literature was executed using Rodgers' evolutionary method of concept analysis to analyze the attributes, references, related terms, antecedents and consequences of advanced midwifery practice. An international consensus definition of advanced midwifery practice is currently lacking. Four major attributes of advanced midwife practitioners (AMPs) are identified: autonomy in practice, leadership, expertise, and research skills. A consensus was found on the need for preparation at master's level for AMPs. Such midwives have a broad and internationally varied scope of practice, fulfilling different roles such as clinicians, clinical and professional leaders, educators, consultants, managers, change agents, researchers, and auditors. Evidence illustrating the important part AMPs play on a clinical and strategic level is mounting. The findings of this concept analysis support a wide variety in the emergence, titles, roles, and scope of practice of AMPs. Research on clinical and strategic outcomes of care provided by AMPs supports further implementation of these roles. As the indistinctness of AMPs' titles and roles is one of the barriers to implementation, a clear conceptualization of advanced midwifery practice seems essential for successful implementation. An international debate and consensus on the defining elements of advanced midwifery practice could enhance the further development of midwifery as a profession and is a prerequisite for its successful implementation. Due to rising numbers of AMPs, extension of practice and elevated quality requirements in healthcare, more outcomes research exclusively

  19. Advancing Alternative Analysis: Integration of Decision Science.

    Science.gov (United States)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina M; Blake, Ann; Carroll, William F; Corbett, Charles J; Hansen, Steffen Foss; Lempert, Robert J; Linkov, Igor; McFadden, Roger; Moran, Kelly D; Olivetti, Elsa; Ostrom, Nancy K; Romero, Michelle; Schoenung, Julie M; Seager, Thomas P; Sinsheimer, Peter; Thayer, Kristina A

    2017-06-13

    Decision analysis, a systematic approach to solving complex problems, offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate the safety and viability of potential substitutes for hazardous chemicals. We assessed whether decision science may assist the alternatives analysis decision maker in comparing alternatives across a range of metrics. A workshop was convened that included representatives from government, academia, business, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and were prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect on other groups' findings. We concluded that the further incorporation of decision science into alternatives analysis would advance the ability of companies and regulators to select alternatives to harmful ingredients and would also advance the science of decision analysis. We advance four recommendations: (a) engaging the systematic development and evaluation of decision approaches and tools; (b) using case studies to advance the integration of decision analysis into alternatives analysis; (c) supporting transdisciplinary research; and (d) supporting education and outreach efforts. https://doi.org/10.1289/EHP483.

  20. Advanced continuous cultivation methods for systems microbiology.

    Science.gov (United States)

    Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo

    2015-09-01

    Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential, as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state, termed growth space analysis (GSA), can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in a physiological steady state similar to that of chemostats. This increases the resolution and throughput of GSA compared with chemostats and, moreover, enables the dynamics of metabolism to be followed and metabolic switch-points and optimal growth conditions to be detected. We also describe the concept, challenges and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories.

  1. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

    The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information for optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancements to analytical capabilities using scanning electron microscopy (SEM), and improvements to the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on their fate in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets to calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations on mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method to determine viscosity incorporating grey-scale binning of the SEM image was developed. A backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
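
    The Newton-Raphson balance calculations mentioned above can be illustrated with a minimal example. The sketch below (hypothetical property model and numbers, not the EERC spreadsheets) finds an adiabatic flame temperature by iterating on the energy balance n * integral of cp(T) dT = q, with a simple linear cp(T):

      def adiabatic_flame_temperature(q_release, n_prod, a=28.0, b=0.004,
                                      T_in=298.15, tol=1e-6, max_iter=50):
          """Solve n_prod * integral of cp(T) = a + b*T from T_in to T equal to q_release."""
          T = 2000.0                                       # initial guess, K
          for _ in range(max_iter):
              f = n_prod * (a * (T - T_in) + 0.5 * b * (T**2 - T_in**2)) - q_release
              dfdT = n_prod * (a + b * T)                  # derivative of the residual
              step = f / dfdT
              T -= step                                    # Newton-Raphson update
              if abs(step) < tol:
                  break
          return T

      # hypothetical: 802 kJ released per mol fuel, 10 mol of products
      print(adiabatic_flame_temperature(q_release=802e3, n_prod=10.0))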

  2. Recent Advances in the Analysis of Macromolecular Interactions Using the Matrix-Free Method of Sedimentation in the Analytical Ultracentrifuge

    Directory of Open Access Journals (Sweden)

    Stephen E. Harding

    2015-03-01

    Full Text Available Sedimentation in the analytical ultracentrifuge is a matrix-free solution technique, with no immobilisation, columns, or membranes required, and can be used to study self-association and complex or “hetero”-interactions, stoichiometry, reversibility and interaction strength of a wide variety of macromolecular types and across a very large dynamic range (dissociation constants from 10−12 M to 10−1 M). We extend an earlier review specifically highlighting advances in sedimentation velocity and sedimentation equilibrium in the analytical ultracentrifuge applied to protein interactions and mucoadhesion, and review recent applications in protein self-association (tetanus toxoid, agrin), protein-like carbohydrate association (aminocelluloses), carbohydrate-protein interactions (polysaccharide-gliadin), nucleic acid-protein (G-duplexes), nucleic acid-carbohydrate (DNA-chitosan) and finally carbohydrate-carbohydrate (xanthan-chitosan) and ternary polysaccharide complex interactions.

  3. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  4. Independent component analysis: recent advances

    OpenAIRE

    Hyvärinen, Aapo

    2013-01-01

    Independent component analysis is a probabilistic method for learning a linear transform of a random vector. The goal is to find components that are maximally independent and non-Gaussian (non-normal). Its fundamental difference from classical multivariate statistical methods is the assumption of non-Gaussianity, which enables the identification of the original, underlying components, in contrast to classical methods. The basic theory of independent component analysis was mainly developed in th...
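
    A compact illustration of the basic estimation problem follows. The Python sketch uses scikit-learn's FastICA (one common ICA estimator, not the only algorithm discussed in the review) to unmix two non-Gaussian sources from two observed linear mixtures; the sources and mixing matrix are synthetic.

      import numpy as np
      from sklearn.decomposition import FastICA

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 8.0, 2000)
      s1 = np.sign(np.sin(3 * t))                 # square wave (sub-Gaussian source)
      s2 = rng.laplace(size=t.size)               # heavy-tailed (super-Gaussian) source
      S = np.c_[s1, s2]
      A = np.array([[1.0, 0.5], [0.4, 1.0]])      # unknown mixing matrix
      X = S @ A.T                                 # observed mixtures

      ica = FastICA(n_components=2, random_state=0)
      S_est = ica.fit_transform(X)                # recovered components, up to scale and order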

  5. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  6. Advancing Dose-Response Assessment Methods for Environmental Regulatory Impact Analysis: A Bayesian Belief Network Approach Applied to Inorganic Arsenic.

    Science.gov (United States)

    Zabinski, Joseph W; Garcia-Vargas, Gonzalo; Rubio-Andrade, Marisela; Fry, Rebecca C; Gibson, Jacqueline MacDonald

    2016-05-10

    Dose-response functions used in regulatory risk assessment are based on studies of whole organisms and fail to incorporate genetic and metabolomic data. Bayesian belief networks (BBNs) could provide a powerful framework for incorporating such data, but no prior research has examined this possibility. To address this gap, we develop a BBN-based model predicting birthweight at gestational age from arsenic exposure via drinking water and maternal metabolic indicators using a cohort of 200 pregnant women from an arsenic-endemic region of Mexico. We compare BBN predictions to those of prevailing slope-factor and reference-dose approaches. The BBN outperforms prevailing approaches in balancing false-positive and false-negative rates. Whereas the slope-factor approach had 2% sensitivity and 99% specificity and the reference-dose approach had 100% sensitivity and 0% specificity, the BBN's sensitivity and specificity were 71% and 30%, respectively. BBNs offer a promising opportunity to advance health risk assessment by incorporating modern genetic and metabolomic data.

  7. Advancing cloud lifecycle representation in numerical models using innovative analysis methods that bridge arm observations over a breadth of scales

    Energy Technology Data Exchange (ETDEWEB)

    Tselioudis, George [Columbia Univ., New York, NY (United States)

    2016-03-04

    From its location on the subtropics-midlatitude boundary, the Azores is influenced by both the subtropical high pressure and the midlatitude baroclinic storm regimes, and therefore experiences a wide range of cloud structures, from fair-weather scenes to stratocumulus sheets to deep convective systems. This project combined three types of data sets to study cloud variability in the Azores: a satellite analysis of cloud regimes, a reanalysis characterization of storminess, and a 19-month field campaign that occurred on Graciosa Island. Combined analysis of the three data sets provides a detailed picture of cloud variability and the respective dynamic influences, with emphasis on low clouds that constitute a major uncertainty source in climate model simulations. The satellite cloud regime analysis shows that the Azores cloud distribution is similar to the mean global distribution and can therefore be used to evaluate cloud simulation in global models. Regime analysis of low clouds shows that stratocumulus decks occur under the influence of the Azores high-pressure system, while shallow cumulus clouds are sustained by cold-air outbreaks, as revealed by their preference for post-frontal environments and northwesterly flows. An evaluation of CMIP5 climate model cloud regimes over the Azores shows that all models severely underpredict shallow cumulus clouds, while most models also underpredict the occurrence of stratocumulus cloud decks. It is demonstrated that carefully selected case studies can be related through regime analysis to climatological cloud distributions, and a methodology is suggested utilizing process-resolving model simulations of individual cases to better understand cloud-dynamics interactions and attempt to explain and correct climate model cloud deficiencies.

  8. Parameter Identification with the Random Perturbation Particle Swarm Optimization Method and Sensitivity Analysis of an Advanced Pressurized Water Reactor Nuclear Power Plant Model for Power Systems

    Directory of Open Access Journals (Sweden)

    Li Wang

    2017-02-01

    Full Text Available The ability to obtain appropriate parameters for an advanced pressurized water reactor (PWR) unit model is of great significance for power system analysis. The model involves nonlinear relationships, long transition times and intercoupled parameters that are difficult to obtain from practical tests, which makes parameter identification complex and difficult. In this paper, a model and a parameter identification method for the PWR primary loop system were investigated. A parameter identification process was proposed, using a particle swarm optimization (PSO) algorithm based on random perturbation (RP-PSO). The identification process included model variable initialization based on the differential equations of each sub-module and a program setting method, parameter obtainment through sub-module identification in the Matlab/Simulink software (MathWorks Inc., Natick, MA, USA), as well as adaptation analysis for the integrated model. Extensive parameter identification work was carried out, the results of which verified the effectiveness of the method. It was found that changes in some parameters, such as the fuel temperature and coolant temperature feedback coefficients, changed the model gain, for which the trajectory sensitivities were not zero; obtaining appropriate values for these parameters therefore has significant effects on the simulation results. The trajectory sensitivities of some parameters in the core neutron dynamics module were interrelated, making those parameters difficult to identify. The model parameter sensitivity could differ with the model input conditions, reflecting how difficult the parameters are to identify under various input conditions.
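
    To make the RP-PSO idea concrete, the hedged sketch below identifies two parameters of a toy first-order step-response model by minimizing a least-squares mismatch with "measured" data; the occasional random perturbation of the global best is the RP element. The model, bounds, coefficients and perturbation schedule are placeholders, not the paper's PWR model.

      import numpy as np

      def model(theta, t):                        # toy step response: gain * (1 - exp(-t/tau))
          gain, tau = theta
          return gain * (1.0 - np.exp(-t / tau))

      def cost(theta, t, y_meas):
          return np.sum((model(theta, t) - y_meas) ** 2)

      rng = np.random.default_rng(1)
      t = np.linspace(0.0, 10.0, 50)
      y_meas = model([2.0, 3.0], t) + rng.normal(0.0, 0.02, t.size)   # synthetic "measurement"

      n, lo, hi = 30, np.array([0.1, 0.1]), np.array([5.0, 10.0])
      x = rng.uniform(lo, hi, (n, 2))
      v = np.zeros((n, 2))
      pbest = x.copy()
      pcost = np.array([cost(p, t, y_meas) for p in x])
      gbest, gcost = pbest[pcost.argmin()].copy(), pcost.min()

      for it in range(200):
          r1, r2 = rng.random((n, 2)), rng.random((n, 2))
          v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
          x = np.clip(x + v, lo, hi)
          c = np.array([cost(p, t, y_meas) for p in x])
          better = c < pcost
          pbest[better], pcost[better] = x[better], c[better]
          if pcost.min() < gcost:
              gbest, gcost = pbest[pcost.argmin()].copy(), pcost.min()
          # random perturbation of the global best to help escape local minima
          trial = np.clip(gbest + rng.normal(0.0, 0.05, 2) * (hi - lo), lo, hi)
          if cost(trial, t, y_meas) < gcost:
              gbest, gcost = trial, cost(trial, t, y_meas)

      print("identified parameters:", gbest)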

  9. New theory of discriminant analysis after R. Fisher: advanced research by the feature selection method for microarray data

    CERN Document Server

    Shinmura, Shuichi

    2016-01-01

    This is the first book to compare eight LDFs on different types of datasets: Fisher's iris data, medical data with collinearities, Swiss banknote data, which is linearly separable data (LSD), student pass/fail determination using student attributes, 18 pass/fail determinations using exam scores, Japanese automobile data, and six microarray datasets (the datasets), which are LSD. We developed 100-fold cross-validation for small samples (Method 1) instead of the LOO method. We proposed a simple model selection procedure that chooses the best model as the one with minimum M2, and Revised IP-OLDF, based on the MNM criterion, was found to have a better M2 than the other LDFs on the above datasets. We compared two statistical LDFs and six MP-based LDFs: Fisher's LDF, logistic regression, three SVMs, Revised IP-OLDF, and two other OLDFs. Only a hard-margin SVM (H-SVM) and Revised IP-OLDF could discriminate LSD theoretically (Problem 2). We solved the defect of the generalized inverse matrices (Problem 3). For ...
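
    The record does not define M2 or Revised IP-OLDF; as a hedged illustration of the resampling-based model selection idea, the sketch below ranks a few standard linear discriminant functions on Fisher's iris data by their mean error over repeated stratified cross-validation, a stand-in for the book's 100-fold procedure and minimum-M2 criterion. The candidate models and fold settings are assumptions made for illustration.

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.svm import SVC
from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

# Restrict to two iris species so a single linear discriminant applies.
X, y = load_iris(return_X_y=True)
mask = y != 0
X, y = X[mask], y[mask]

candidates = {
    "Fisher LDF": LinearDiscriminantAnalysis(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "hard-margin-like SVM": SVC(kernel="linear", C=1e6),
}

# Repeated stratified k-fold as a stand-in for the book's 100-fold resampling;
# the mean validation error plays the role of the minimum-M2 criterion.
cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=20, random_state=0)
for name, model in candidates.items():
    acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
    print(f"{name:22s} mean validation error = {1.0 - acc.mean():.3f}")
```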

  10. Advanced analysis technology for MOX fuel

    International Nuclear Information System (INIS)

    Hiyama, T.; Kamimura, K.

    1997-01-01

    PNC has developed MOX fuels for the advanced thermal reactor (ATR) and the fast breeder reactor (FBR). MOX samples have been chemically analysed to characterize the MOX fuel for JOYO, MONJU, FUGEN and other reactors. The analysis of MOX samples in a glove box requires complicated and highly skilled operations. Therefore, for quality control analysis of MOX fuel in a fabrication plant, simple, rapid and accurate analysis methods are necessary. To solve these problems, instrumental analysis methods and techniques were developed. This paper describes some of the recent developments at PNC, among them the determination of the oxygen-to-metal atomic ratio (O/M) in MOX by non-dispersive infrared spectrophotometry after inert gas fusion. 7 refs, 9 figs, 4 tabs

  11. Development of the high-order decoupled direct method in three dimensions for particulate matter: enabling advanced sensitivity analysis in air quality models

    Directory of Open Access Journals (Sweden)

    W. Zhang

    2012-03-01

    Full Text Available The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity analysis of ISORROPIA, the inorganic aerosol module of CMAQ. A case-specific approach has been applied, and the sensitivities of activity coefficients and water content are explicitly computed. Stand-alone tests are performed for ISORROPIA by comparing the sensitivities (first- and second-order) computed by HDDM with the brute force (BF) approximations. A similar comparison has also been carried out for CMAQ sensitivities simulated using a week-long winter episode for a continental US domain. Second-order sensitivities of aerosol species (e.g., sulfate, nitrate, and ammonium) with respect to domain-wide SO2, NOx, and NH3 emissions show agreement with BF results, yet exhibit less noise in locations where BF results are demonstrably inaccurate. Second-order sensitivity analysis elucidates poorly understood nonlinear responses of secondary inorganic aerosols to their precursors and competing species. Adding second-order sensitivity terms to the Taylor series projection of the nitrate concentrations under a 50% reduction in domain-wide NOx or SO2 emission rates improves the prediction with statistical significance.
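
    As a small illustration of the Taylor-series projection mentioned at the end of the abstract, the sketch below projects a concentration under an emission perturbation using first- and second-order sensitivity coefficients; the numerical values are fabricated and do not come from the CMAQ/HDDM study.

```python
def project_concentration(c0, s1, s2, delta_eps):
    """Second-order Taylor projection of a concentration for a fractional
    emission perturbation delta_eps (e.g. -0.5 for a 50% reduction):
        C(eps) ~= C0 + s1 * delta_eps + 0.5 * s2 * delta_eps**2
    where s1 and s2 are the first- and second-order semi-normalized
    sensitivities of C with respect to the emission rate."""
    return c0 + s1 * delta_eps + 0.5 * s2 * delta_eps**2

# Illustrative (made-up) values for a nitrate response to a NOx reduction.
c0, s1, s2 = 4.0, 2.5, -3.0        # ug/m3 and ug/m3 per unit perturbation
delta = -0.5                       # 50% domain-wide emission reduction
first_order = c0 + s1 * delta
second_order = project_concentration(c0, s1, s2, delta)
print(f"first-order projection : {first_order:.2f} ug/m3")
print(f"second-order projection: {second_order:.2f} ug/m3")
```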

  12. Advanced methods in diagnosis and therapy

    International Nuclear Information System (INIS)

    1987-01-01

    This important meeting covers the following topics. Use and optimization of monoclonal antibodies in oncology: - Tumor markers: clinical follow-up of patients through tumor marker serum determinations. - Cancer and medical imaging: the use of monoclonal antibodies in immunoscintigraphy. - Immunoradiotherapy: monoclonal antibodies as therapeutic vectors. Advanced methods in diagnosis: - Contribution of monoclonal antibodies to modern immunochemistry (RIA, EIA). - Interest of monoclonal antibodies in immunohistochemical pathology diagnosis. - Future prospects for in vitro diagnosis: receptors and oncogenes. - Immunofluoroassay: a new sensitive immunoanalytical procedure with broad applications. Recent advances in brachytherapy: - Interest of computer processing. Blood products irradiation: - Interest for transfusion and bone marrow transplantation [fr]

  13. Mathematics for natural scientists II advanced methods

    CERN Document Server

    Kantorovich, Lev

    2016-01-01

    This book covers the advanced mathematical techniques useful for physics and engineering students, presented in a form accessible to physics students, avoiding precise mathematical jargon and laborious proofs. Instead, all proofs are given in a simplified form that is clear and convincing for a physicist. Examples, where appropriate, are given from physics contexts. Both solved and unsolved problems are provided in each chapter. Mathematics for Natural Scientists II: Advanced Methods is the second of two volumes. It follows the first volume on Fundamentals and Basics.

  14. Advancing UAS methods for monitoring coastal environments

    Science.gov (United States)

    Ridge, J.; Seymour, A.; Rodriguez, A. B.; Dale, J.; Newton, E.; Johnston, D. W.

    2017-12-01

    Utilizing fixed-wing Unmanned Aircraft Systems (UAS), we are working to improve coastal monitoring by increasing the accuracy, precision, temporal resolution, and spatial coverage of habitat distribution maps. Generally, multirotor aircraft are preferred for precision imaging, but recent advances in fixed-wing technology have greatly increased their capabilities and application for fine-scale (decimeter-centimeter) measurements. Present mapping methods employed by North Carolina coastal managers involve expensive, time consuming and localized observation of coastal environments, which often lack the necessary frequency to make timely management decisions. For example, it has taken several decades to fully map oyster reefs along the NC coast, making it nearly impossible to track trends in oyster reef populations responding to harvesting pressure and water quality degradation. It is difficult for the state to employ manned flights for collecting aerial imagery to monitor intertidal oyster reefs, because flights are usually conducted after seasonal increases in turbidity. In addition, post-storm monitoring of coastal erosion from manned platforms is often conducted days after the event and collects oblique aerial photographs which are difficult to use for accurately measuring change. Here, we describe how fixed wing UAS and standard RGB sensors can be used to rapidly quantify and assess critical coastal habitats (e.g., barrier islands, oyster reefs, etc.), providing for increased temporal frequency to isolate long-term and event-driven (storms, harvesting) impacts. Furthermore, drone-based approaches can accurately image intertidal habitats as well as resolve information such as vegetation density and bathymetry from shallow submerged areas. We obtain UAS imagery of a barrier island and oyster reefs under ideal conditions (low tide, turbidity, and sun angle) to create high resolution (cm scale) maps and digital elevation models to assess habitat condition

  15. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  16. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and draft values are presented. Analytical methods based on the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for the probabilistic durability assessment of load-carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, whether homogeneous, sandwich or composite, and of some kinds of equipment. The analysis models can also be applied in other engineering fields.
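
    The abstract does not give the reliability-index formulas themselves; as a minimal, generic illustration of the kind of quantity involved, the sketch below computes the classical Cornell reliability index and the corresponding failure probability for a normally distributed resistance and action effect. The numbers and the correlation value are hypothetical.

```python
from math import sqrt
from statistics import NormalDist

def reliability_index(mu_r, sigma_r, mu_s, sigma_s, rho=0.0):
    """Cornell reliability index for the safety margin M = R - S with
    (possibly correlated) normal resistance R and action effect S."""
    mu_m = mu_r - mu_s
    sigma_m = sqrt(sigma_r**2 + sigma_s**2 - 2.0 * rho * sigma_r * sigma_s)
    beta = mu_m / sigma_m
    pf = NormalDist().cdf(-beta)       # probability of failure
    return beta, pf

# Hypothetical member: resistance 300 +/- 30, action effect 180 +/- 40 (same units).
beta, pf = reliability_index(300, 30, 180, 40, rho=0.2)
print(f"reliability index beta = {beta:.2f}, failure probability = {pf:.2e}")
```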

  17. Recent advances in segmented gamma scanner analysis

    International Nuclear Information System (INIS)

    Sprinkle, J.K. Jr.; Hsue, S.T.

    1987-01-01

    The segmented gamma scanner (SGS) is used in many facilities to assay low-density scrap and waste generated in the facilities. The procedures for using the SGS can cause a negative bias if the sample does not satisfy the assumptions made in the method. Some process samples do not comply with the assumptions. This paper discusses the effect of the presence of lumps on the SGS assay results, describes a method to detect the presence of lumps, and describes an approach to correct for the lumps. Other recent advances in SGS analysis are also discussed

  18. Editorial: Latest methods and advances in biotechnology.

    Science.gov (United States)

    Lee, Sang Yup; Jungbauer, Alois

    2014-01-01

    The latest "Biotech Methods and Advances" special issue of Biotechnology Journal continues the BTJ tradition of featuring the latest breakthroughs in biotechnology. The special issue is edited by our Editors-in-Chief, Prof. Sang Yup Lee and Prof. Alois Jungbauer and covers a wide array of topics in biotechnology, including the perennial favorite workhorses of the biotech industry, Chinese hamster ovary (CHO) cell and Escherichia coli. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

    Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis has been given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application especially in the USA. (orig.) [de

  20. Advancing Alternative Analysis: Integration of Decision Science

    DEFF Research Database (Denmark)

    Malloy, Timothy F; Zaunbrecher, Virginia M; Batteate, Christina

    2016-01-01

    Decision analysis-a systematic approach to solving complex problems-offers tools and frameworks to support decision making that are increasingly being applied to environmental challenges. Alternatives analysis is a method used in regulation and product design to identify, compare, and evaluate......, and civil society and included experts in toxicology, decision science, alternatives assessment, engineering, and law and policy. Participants were divided into two groups and prompted with targeted questions. Throughout the workshop, the groups periodically came together in plenary sessions to reflect......) engaging the systematic development and evaluation of decision approaches and tools; (2) using case studies to advance the integration of decision analysis into alternatives analysis; (3) supporting transdisciplinary research; and (4) supporting education and outreach efforts....

  1. Advanced Source Deconvolution Methods for Compton Telescopes

    Science.gov (United States)

    Zoglauer, Andreas

    The next generation of space telescopes utilizing Compton scattering for astrophysical observations is destined to one day unravel the mysteries behind Galactic nucleosynthesis, to determine the origin of the positron annihilation excess near the Galactic center, and to uncover the hidden emission mechanisms behind gamma-ray bursts. Besides astrophysics, Compton telescopes are establishing themselves in heliophysics, planetary sciences, medical imaging, accelerator physics, and environmental monitoring. Since the COMPTEL days, great advances in the achievable energy and position resolution have been made, creating an extremely vast, but also extremely sparsely sampled, data space. Unfortunately, the optimum way to analyze the data from the next generation of Compton telescopes, one that retrieves all source parameters (location, spectrum, polarization, flux) and achieves the best possible resolution and sensitivity at the same time, has not yet been found. This is especially important for all science objectives looking at the inner Galaxy: the large number of expected sources, the high background (internal and Galactic diffuse emission), and the limited angular resolution make it the most taxing case for data analysis. In general, two key challenges exist: First, what are the best data space representations to answer the specific science questions? Second, what is the best way to deconvolve the data to fully retrieve the source parameters? For modern Compton telescopes, the existing data space representations can either correctly reconstruct the absolute flux (binned mode) or achieve the best possible resolution (list mode), but not both at the same time. Here we propose to develop a two-stage hybrid reconstruction method which combines the best aspects of both. Using a proof-of-concept implementation we can for the first time show that it is possible to alternate during each deconvolution step between a binned-mode approach to get the flux right and a
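
    The proposed two-stage hybrid reconstruction is only outlined in the abstract; as a generic illustration of the deconvolution step used in binned-mode Compton imaging, the sketch below runs a standard ML-EM (Richardson-Lucy type) update on a toy one-dimensional system. The response matrix and counts are fabricated, and this is not the authors' algorithm.

```python
import numpy as np

def mlem(response, counts, n_iter=50):
    """Maximum-likelihood expectation-maximization image update:
        lambda_j <- lambda_j / s_j * sum_i R_ij * counts_i / (R @ lambda)_i
    with sensitivity s_j = sum_i R_ij."""
    _, n_pix = response.shape
    lam = np.ones(n_pix)                      # flat starting image
    sens = response.sum(axis=0)
    for _ in range(n_iter):
        expected = response @ lam
        ratio = np.where(expected > 0, counts / expected, 0.0)
        lam *= (response.T @ ratio) / np.where(sens > 0, sens, 1.0)
    return lam

rng = np.random.default_rng(0)
true_image = np.zeros(20); true_image[5] = 100.0; true_image[14] = 40.0
response = rng.random((60, 20)); response /= response.sum(axis=0)   # toy response matrix
counts = rng.poisson(response @ true_image)
print(np.round(mlem(response, counts), 1))
```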

  2. Advanced Techniques of Stress Analysis

    Directory of Open Access Journals (Sweden)

    Simion TATARU

    2013-12-01

    Full Text Available This article aims to assess the stress analysis technique based on 3D models, comparing it with the traditional technique in which the model is built directly in the stress analysis program. The comparison of the two methods is made with reference to the rear fuselage of the IAR-99 aircraft, a structure with a high degree of complexity that allows a meaningful evaluation of both approaches. Three databases are envisaged: the database with the idealized model obtained using ANSYS while working directly from the documentation, without automatic generation of nodes and elements (with few exceptions); the rear fuselage database obtained at this stage with Pro/ENGINEER; and the one obtained by using ANSYS together with the second database. Each of the three databases will then be used according to arising needs. The main objective is to develop the parameterized model of the rear fuselage using the computer-aided design software Pro/ENGINEER. A review of research on the use of virtual reality with interactive analysis performed by the finite element method is given to show the state of the art achieved in this field.

  3. Advanced Measuring (Instrumentation) Methods for Nuclear Installations: A Review

    Directory of Open Access Journals (Sweden)

    Wang Qiu-kuan

    2012-01-01

    Full Text Available Nuclear technology has been widely used around the world. Measurement research in nuclear installations involves many aspects, such as nuclear reactors, the nuclear fuel cycle, safety and security, nuclear accident and post-accident analysis, and environmental applications. In recent decades, many advanced measuring devices and techniques have been widely applied in nuclear installations. This paper mainly introduces the development of measuring (instrumentation) methods for nuclear installations and the applications of these instruments and methods.

  4. Methods for geochemical analysis

    Science.gov (United States)

    Baedecker, Philip A.

    1987-01-01

    The laboratories for analytical chemistry within the Geologic Division of the U.S. Geological Survey are administered by the Office of Mineral Resources. The laboratory analysts provide analytical support to those programs of the Geologic Division that require chemical information and conduct basic research in analytical and geochemical areas vital to the furtherance of Division program goals. Laboratories for research and geochemical analysis are maintained at the three major centers in Reston, Virginia, Denver, Colorado, and Menlo Park, California. The Division has an expertise in a broad spectrum of analytical techniques, and the analytical research is designed to advance the state of the art of existing techniques and to develop new methods of analysis in response to special problems in geochemical analysis. The geochemical research and analytical results are applied to the solution of fundamental geochemical problems relating to the origin of mineral deposits and fossil fuels, as well as to studies relating to the distribution of elements in varied geologic systems, the mechanisms by which they are transported, and their impact on the environment.

  5. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    Full Text Available The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have become more stringent than ever. At the same time, the number of research and educational institutions, that is, the number of potential researchers and stakeholders, has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP), keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in which materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machinery, automation architecture, software systems and interfaces are reviewed.

  6. Advanced instrumentation and analysis methods for in-pile thermal and nuclear measurements: from out-of-pile studies to irradiation campaigns

    International Nuclear Information System (INIS)

    Reynard-Carette, C.; Lyoussi, A.

    2015-01-01

    nuclear heating. The last one consists in the development of accurate measurement and analysis methods. The paper will be dedicated to a complete review of the experimental and numerical work performed since 2009, presented in two parts. The first part will detail a new thermal approach implemented to improve nuclear heating measurements by radiometric calorimeters. New experimental tools (calorimeter prototypes and set-ups such as the BETHY bench) developed to perform preliminary out-of-pile studies under suitable conditions will be presented (temperature and velocity of the external cooling fluid, heat source localization and intensity inside the calorimetric cells). Then the responses of two kinds of sensors, their calibration curves and their thermal behaviors will be compared for various parameters. Finally, validated numerical thermal and Monte Carlo work will be discussed in order to propose new improvements. The second part of the paper will focus on the work carried out to design, develop and test the first prototype of the multi-sensor device called CARMEN [7-9]. The two mock-ups dedicated respectively to neutron and photon measurements will be detailed. The results obtained during two irradiation campaigns at the periphery of the OSIRIS reactor will be shown. The new analysis method will be discussed. (authors)

  7. Advanced instrumentation and analysis methods for in-pile thermal and nuclear measurements: from out-of-pile studies to irradiation campaigns

    Energy Technology Data Exchange (ETDEWEB)

    Reynard-Carette, C. [Aix Marseille Universite, CNRS, Universite de Toulon, IM2NP UMR 7334, 13397, Marseille (France); Lyoussi, A. [CEA, DEN, DER, Instrumentation Sensors and Dosimetry Laboratory, Cadarache, F-13108 (France)

    2015-07-01

    nuclear heating. The last one consists in the development of accurate measurement and analysis methods. The paper will be dedicated to a complete review of the experimental and numerical work performed since 2009, presented in two parts. The first part will detail a new thermal approach implemented to improve nuclear heating measurements by radiometric calorimeters. New experimental tools (calorimeter prototypes and set-ups such as the BETHY bench) developed to perform preliminary out-of-pile studies under suitable conditions will be presented (temperature and velocity of the external cooling fluid, heat source localization and intensity inside the calorimetric cells). Then the responses of two kinds of sensors, their calibration curves and their thermal behaviors will be compared for various parameters. Finally, validated numerical thermal and Monte Carlo work will be discussed in order to propose new improvements. The second part of the paper will focus on the work carried out to design, develop and test the first prototype of the multi-sensor device called CARMEN [7-9]. The two mock-ups dedicated respectively to neutron and photon measurements will be detailed. The results obtained during two irradiation campaigns at the periphery of the OSIRIS reactor will be shown. The new analysis method will be discussed. (authors)

  8. Advanced methods in teaching reactor physics

    International Nuclear Information System (INIS)

    Snoj, Luka; Kromar, Marjan; Zerovnik, Gasper; Ravnik, Matjaz

    2011-01-01

    Modern computer codes allow detailed neutron transport calculations. In combination with advanced 3D visualization software capable of treating large amounts of data in real time they form a powerful tool that can be used as a convenient modern educational tool for (nuclear power plant) operators, nuclear engineers, students and specialists involved in reactor operation and design. Visualization is applicable not only in education and training, but also as a tool for fuel management, core analysis and irradiation planning. The paper treats the visualization of neutron transport in different moderators, neutron flux and power distributions in two nuclear reactors (TRIGA type research reactor and typical PWR). The distributions are calculated with MCNP and CORD-2 computer codes and presented using Amira software.

  9. Advanced methods in teaching reactor physics

    Energy Technology Data Exchange (ETDEWEB)

    Snoj, Luka, E-mail: luka.snoj@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Kromar, Marjan, E-mail: marjan.kromar@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Zerovnik, Gasper, E-mail: gasper.zerovnik@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Ravnik, Matjaz [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2011-04-15

    Modern computer codes allow detailed neutron transport calculations. In combination with advanced 3D visualization software capable of treating large amounts of data in real time they form a powerful tool that can be used as a convenient modern educational tool for (nuclear power plant) operators, nuclear engineers, students and specialists involved in reactor operation and design. Visualization is applicable not only in education and training, but also as a tool for fuel management, core analysis and irradiation planning. The paper treats the visualization of neutron transport in different moderators, neutron flux and power distributions in two nuclear reactors (TRIGA type research reactor and typical PWR). The distributions are calculated with MCNP and CORD-2 computer codes and presented using Amira software.

  10. Advances in iterative methods for nonlinear equations

    CERN Document Server

    Busquier, Sonia

    2016-01-01

    This book focuses on the approximation of nonlinear equations using iterative methods. Nine contributions are presented on the construction and analysis of these methods, the coverage encompassing convergence, efficiency, robustness, dynamics, and applications. Many problems are stated in the form of nonlinear equations, using mathematical modeling. In particular, a wide range of problems in Applied Mathematics and in Engineering can be solved by finding the solutions to these equations. The book reveals the importance of studying convergence aspects in iterative methods and shows that selection of the most efficient and robust iterative method for a given problem is crucial to guaranteeing a good approximation. A number of sample criteria for selecting the optimal method are presented, including those regarding the order of convergence, the computational cost, and the stability, including the dynamics. This book will appeal to researchers whose field of interest is related to nonlinear problems and equations...
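
    As a small worked example of the convergence behaviour such iterative methods are judged by, here is a basic Newton iteration for a scalar nonlinear equation; the test equation cos(x) = x is an arbitrary choice, not one taken from the book.

```python
from math import cos, sin

def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method x_{k+1} = x_k - f(x_k)/f'(x_k); returns all iterates."""
    xs = [x0]
    for _ in range(max_iter):
        x = xs[-1]
        step = f(x) / df(x)
        xs.append(x - step)
        if abs(step) < tol:
            break
    return xs

# Solve cos(x) = x, a standard test equation; the residual roughly squares
# at every step, showing the quadratic convergence of the method.
f = lambda x: cos(x) - x
df = lambda x: -sin(x) - 1.0
for k, x in enumerate(newton(f, df, 1.0)):
    print(f"iteration {k}: x = {x:.15f}, residual = {abs(f(x)):.2e}")
```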

  11. Advanced tools for in vivo skin analysis.

    Science.gov (United States)

    Cal, Krzysztof; Zakowiecki, Daniel; Stefanowska, Justyna

    2010-05-01

    A thorough examination of the skin is essential for accurate disease diagnostics, evaluation of the effectiveness of topically applied drugs and the assessment of the results of dermatologic surgeries such as skin grafts. Knowledge of skin parameters is also important in the cosmetics industry, where the effects of skin care products are evaluated. Due to significant progress in the electronics and computer industries, sophisticated analytic devices are increasingly available for day-to-day diagnostics. The aim of this article is to review several advanced methods for in vivo skin analysis in humans: magnetic resonance imaging, electron paramagnetic resonance, laser Doppler flowmetry and time domain reflectometry. The molecular bases of these techniques are presented, and several interesting applications in the field are discussed. Methods for in vivo assessment of the biomechanical properties of human skin are also reviewed.

  12. Advances in Statistical Methods for Substance Abuse Prevention Research

    Science.gov (United States)

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
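
    Among the methods mentioned, mediation analysis can be illustrated compactly. The sketch below estimates the product-of-coefficients indirect effect on simulated prevention-style data using ordinary least squares; the variable names (program, norms, use) and effect sizes are hypothetical.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 500
program = rng.integers(0, 2, n)                                # 0/1 program exposure
norms = 0.5 * program + rng.standard_normal(n)                 # hypothesized mediator
use = -0.4 * norms + 0.1 * program + rng.standard_normal(n)    # outcome

# Path a: effect of the program on the mediator.
a_model = sm.OLS(norms, sm.add_constant(program)).fit()
# Path b (and direct effect c'): mediator and program on the outcome.
X = sm.add_constant(np.column_stack([norms, program]))
b_model = sm.OLS(use, X).fit()

a, b = a_model.params[1], b_model.params[1]
print(f"a = {a:.3f}, b = {b:.3f}, mediated (indirect) effect a*b = {a * b:.3f}")
print(f"direct effect c' = {b_model.params[2]:.3f}")
```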

  13. Computational advances in transition phase analysis

    International Nuclear Information System (INIS)

    Morita, K.; Kondo, S.; Tobita, Y.; Shirakawa, N.; Brear, D.J.; Fischer, E.A.

    1994-01-01

    In this paper, historical perspective and recent advances are reviewed on computational technologies to evaluate a transition phase of core disruptive accidents in liquid-metal fast reactors. An analysis of the transition phase requires treatment of multi-phase multi-component thermohydraulics coupled with space- and energy-dependent neutron kinetics. Such a comprehensive modeling effort began when the SIMMER-series computer code development program was initiated in the late 1970s in the USA. Successful application of the latest SIMMER-II in the USA, western Europe and Japan has proved its effectiveness, but, at the same time, several areas that require further research have been identified. Based on the experience and lessons learned during the SIMMER-II application through the 1980s, a new project of SIMMER-III development is underway at the Power Reactor and Nuclear Fuel Development Corporation (PNC), Japan. The models and methods of SIMMER-III are briefly described with emphasis on recent advances in multi-phase multi-component fluid dynamics technologies and their expected implication on a future reliable transition phase analysis. (author)

  14. Advances in human reliability analysis in Mexico

    International Nuclear Information System (INIS)

    Nelson, Pamela F.; Gonzalez C, M.; Ruiz S, T.; Guillen M, D.; Contreras V, A.

    2010-10-01

    Human Reliability Analysis (HRA) is a very important part of Probabilistic Risk Analysis (PRA), and constant work is dedicated to improving methods, guidance and data in order to approach realism in the results, as well as looking for ways to use these to reduce accident frequency at plants. Further, in order to advance in these areas, several HRA studies are being performed globally. Mexico has participated in the International HRA Empirical Study, whose objective is to benchmark HRA methods by comparing HRA predictions to actual crew performance in a simulator, as well as in the empirical study on a US nuclear power plant currently in progress. The focus of the first study was to develop an understanding of how methods are applied by various analysts, and to characterize the methods for their capability to guide the analysts in identifying potential human failures and the associated causes and performance shaping factors. The HRA benchmarking study has been performed using the Halden simulator, 14 European crews, and 15 HRA teams (NRC, EPRI, and foreign teams using different HRA methods). This effort in Mexico is reflected through the work being performed on updating the Laguna Verde PRA to comply with the ASME PRA standard. In order to be considered an HRA with technical adequacy, that is, to be considered capability category II for risk-informed applications, the methodology used for the HRA in the original PRA is not considered sufficiently detailed, and the methodology had to be upgraded. The HCR/CBDT/THERP method was chosen, since it is used in many nuclear plants of similar design. The HRA update includes identification and evaluation of human errors that can occur during testing and maintenance, as well as human errors that can occur during an accident using the Emergency Operating Procedures. The review of procedures for maintenance, surveillance and operation is a necessary step in HRA and provides insight into the possible

  15. Advances in social media analysis

    CERN Document Server

    Cocea, Mihaela; Wiratunga, Nirmalie; Goker, Ayse

    2015-01-01

    This volume presents a collection of carefully selected contributions in the area of social media analysis. Each chapter opens up a number of research directions that have the potential to be taken on further in this rapidly growing area of research. The chapters are diverse enough to serve a number of directions of research with Sentiment Analysis as the dominant topic in the book. The authors have provided a broad range of research achievements from multimodal sentiment identification to emotion detection in a Chinese microblogging website. The book will be useful to research students, academics and practitioners in the area of social media analysis.  .

  16. An Exploratory Analysis for the Selection and Implementation of Advanced Manufacturing Technology by Fuzzy Multi-criteria Decision Making Methods: A Comparative Study

    Science.gov (United States)

    Nath, Surajit; Sarkar, Bijan

    2017-08-01

    Advanced Manufacturing Technologies (AMTs) offer opportunities for manufacturing organizations to enhance their competitiveness and, in turn, their manufacturing effectiveness. Proper selection and evaluation of AMTs is one of the most significant tasks in today's modern world, but it involves considerable uncertainty and vagueness because many conflicting criteria must be dealt with, so evaluators are often unable to provide crisp data for the criteria. Fuzzy Multi-criteria Decision Making (MCDM) methods help greatly in dealing with this problem. This paper focuses on the application of two promising fuzzy MCDM methods, namely COPRAS-G and EVAMIX, and a comparative study between them on some rarely considered criteria. Each of the two methods is a powerful evaluation tool with its own merits. Although the two methods perform at almost the same level, their approaches are quite distinct. This distinctiveness is illustrated through a numerical example of AMT selection.

  17. Advance of core design method for ATR

    International Nuclear Information System (INIS)

    Maeda, Seiichirou; Ihara, Toshiteru; Iijima, Takashi; Seino, Hideaki; Kobayashi, Tetsurou; Takeuchi, Michio; Sugawara, Satoru; Matsumoto, Mitsuo.

    1995-01-01

    The core characteristics of the ATR demonstration plant have been revised, for example by increasing the fuel burnup and the channel power, which is achieved by changing the number of fuel rods per fuel assembly from 28 to 36. Research and development on the core design method for the ATR have continued. The calculational errors of the core analysis code have been evaluated using operational data from FUGEN and the full-scale simulated test results in the DCA (Deuterium Critical Assembly) and HTL (Heat Transfer Loop) at the O-arai engineering center. It is confirmed that the calculational error of the power distribution is smaller than the design value for the ATR demonstration plant. A critical heat flux correlation curve for the 36-rod fuel cluster has been developed, and a probability evaluation method based on this curve, which is more rational for evaluating fuel dryout, has been adopted. (author)

  18. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability

    Science.gov (United States)

    Methods for controlling natural variability, predicting environmental conditions from biological observations method, biological trait data, species sensitivity distributions, propensity scores, Advanced Analyses of Data Analysis references.

  19. CADDIS Volume 4. Data Analysis: Advanced Analyses - Controlling for Natural Variability: SSD Plot Diagrams

    Science.gov (United States)

    Methods for controlling natural variability, predicting environmental conditions from biological observations method, biological trait data, species sensitivity distributions, propensity scores, Advanced Analyses of Data Analysis references.

  20. Advanced nuclear energy analysis technology

    International Nuclear Information System (INIS)

    Gauntt, Randall O.; Murata, Kenneth K.; Romero, Vicente Josce; Young, Michael Francis; Rochau, Gary Eugene

    2004-01-01

    A two-year effort focused on applying ASCI technology developed for the analysis of weapons systems to the state-of-the-art accident analysis of a nuclear reactor system was proposed. The Sandia SIERRA parallel computing platform for ASCI codes includes high-fidelity thermal, fluids, and structural codes whose coupling through SIERRA can be specifically tailored to the particular problem at hand to analyze complex multiphysics problems. Presently, however, the suite lacks several physics modules unique to the analysis of nuclear reactors. The NRC MELCOR code, not presently part of SIERRA, was developed to analyze severe accidents in present-technology reactor systems. We attempted to: (1) evaluate the SIERRA code suite for its current applicability to the analysis of next generation nuclear reactors, and the feasibility of implementing MELCOR models into the SIERRA suite, (2) examine the possibility of augmenting ASCI codes or alternatives by coupling to the MELCOR code, or portions thereof, to address physics particular to nuclear reactor issues, especially those facing next generation reactor designs, and (3) apply the coupled code set to a demonstration problem involving a nuclear reactor system. We were successful in completing the first two in sufficient detail to determine that an extensive demonstration problem was not feasible at this time. In the future, completion of this research would demonstrate the feasibility of performing high fidelity and rapid analyses of safety and design issues needed to support the development of next generation power reactor systems

  1. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    The IAEA Safety Guide on Safety Assessment and Verification defines that the aim of the safety analysis should be by means of appropriate analytical tools to establish and confirm the design basis for the items important to safety, and to ensure that the overall plant design is capable of meeting the prescribed and acceptable limits for radiation doses and releases for each plant condition category. Practical guidance on how to perform accident analyses of nuclear power plants (NPPs) is provided by the IAEA Safety Report on Accident Analysis for Nuclear Power Plants. The safety analyses are performed both in the form of deterministic and probabilistic analyses for NPPs. It is customary to refer to deterministic safety analyses as accident analyses. This report discusses the aspects of using the advanced accident analysis methods to carry out accident analyses in order to introduce them into the Safety Analysis Reports (SARs). In relation to the SAR, purposes of deterministic safety analysis can be further specified as (1) to demonstrate compliance with specific regulatory acceptance criteria; (2) to complement other analyses and evaluations in defining a complete set of design and operating requirements; (3) to identify and quantify limiting safety system set points and limiting conditions for operation to be used in the NPP limits and conditions; (4) to justify appropriateness of the technical solutions employed in the fulfillment of predetermined safety requirements. The essential parts of accident analyses are performed by applying sophisticated computer code packages, which have been specifically developed for this purpose. These code packages include mainly thermal-hydraulic system codes and reactor dynamics codes meant for the transient and accident analyses. There are also specific codes such as those for the containment thermal-hydraulics, for the radiological consequences and for severe accident analyses. In some cases, codes of a more general nature such

  2. Advanced time-series analysis of MEG data as a method to explore olfactory function in healthy controls and Parkinson's disease patients

    NARCIS (Netherlands)

    Boesveldt, S.; Knol, D.L.; Verbunt, J.P.A.; Berendse, H.W.

    2009-01-01

    Objectives: To determine whether time-series analysis of magnetoencephalography (MEG) data is a suitable method to study brain activity related to olfactory information processing, and to detect differences in odor-induced brain activity between patients with Parkinson's disease (PD) and controls.

  3. Advanced calculus a transition to analysis

    CERN Document Server

    Dence, Thomas P

    2010-01-01

    Designed for a one-semester advanced calculus course, Advanced Calculus explores the theory of calculus and highlights the connections between calculus and real analysis -- providing a mathematically sophisticated introduction to functional analytical concepts. The text is interesting to read and includes many illustrative worked-out examples and instructive exercises, and precise historical notes to aid in further exploration of calculus. Ancillary list: * Companion website, Ebook- http://www.elsevierdirect.com/product.jsp?isbn=9780123749550 * Student Solutions Manual- To come * Instructor

  4. Advances on geometric flux optical design method

    Science.gov (United States)

    García-Botella, Ángel; Fernández-Balbuena, Antonio Álvarez; Vázquez, Daniel

    2017-09-01

    Nonimaging optics is focused on the study of methods to design concentrator or illuminator systems. It belongs to the area of photometry and radiometry and is governed by the laws of geometrical optics. The field vector method, which starts from the definition of the irradiance vector E, is one of the techniques used in nonimaging optics. Called the "geometrical flux vector" method, it has provided ideal designs. The main property of this model is its ability to describe how radiant energy is transferred by the optical system, through the concepts of field line, flux tube and pseudopotential surface, overcoming traditional raytrace methods. Nevertheless, this model has so far been developed only at an academic level, where the characteristic optical parameters are ideal rather than real and the geometries studied are simple. The main objective of the present paper is the application of the vector field method to the analysis and design of real concentration and illumination systems. We propose the development of a calculation tool for optical simulation by vector field, using algorithms based on Fermat's principle, as an alternative to traditional raytrace simulation tools based on the laws of reflection and refraction. This new tool first provides traditional simulation results (efficiency, illuminance/irradiance calculations, angular distribution of light) with lower computation time: the photometric information requires only a few tens of field lines, compared with the millions of rays needed today. In addition, the tool provides new information in the form of vector field maps produced by the system, composed of field lines and quasipotential surfaces. We show our first results with the vector field simulation tool.
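
    The abstract does not restate the defining relations of the geometrical flux vector; the equations below summarize the standard construction assumed here: the irradiance (flux) vector as the first angular moment of the radiance, field lines as its integral curves, and flux conservation through a flux tube in source-free regions.

```latex
% Irradiance (geometrical flux) vector as the first angular moment of the radiance L:
\mathbf{E}(\mathbf{r}) \;=\; \int_{4\pi} L(\mathbf{r},\boldsymbol{\omega})\,
    \boldsymbol{\omega}\,\mathrm{d}\Omega ,
\qquad
% field lines are the integral curves of E:
\frac{\mathrm{d}\mathbf{r}}{\mathrm{d}s} \;=\; \frac{\mathbf{E}}{\lVert \mathbf{E}\rVert},
\qquad
% in source-free regions the flux through any cross-section of a flux tube is conserved:
\nabla\cdot\mathbf{E} = 0
\;\Longrightarrow\;
\int_{A_1}\mathbf{E}\cdot\mathrm{d}\mathbf{A} \;=\; \int_{A_2}\mathbf{E}\cdot\mathrm{d}\mathbf{A}.
```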

  5. Advances in Moessbauer data analysis

    International Nuclear Information System (INIS)

    Souza, Paulo A. de

    1998-01-01

    The Moessbauer community has generated a huge amount of data in several fields of human knowledge since the first publication of Rudolf Moessbauer. Interlaboratory measurements of the same substance may result in minor differences in the Moessbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. A conventional data bank of published MP is therefore of limited help in the identification of substances: an exact-match data bank search cannot distinguish Moessbauer parameter values that differ within the experimental error (e.g., IS = 0.22 mm/s from IS = 0.23 mm/s), even though physically both values may be considered the same. An artificial neural network (ANN), by contrast, is able to identify a substance and its crystalline structure from the measured MP, and slight variations of these parameters are not an obstacle to the ANN identification. A barrier to the popularization of Moessbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Moessbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting process of a Moessbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Moessbauer data analysis system. The developed system will be presented
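
    The network architecture and training data of the described ANN are not given in the record; the sketch below shows the general idea with a small scikit-learn classifier that assigns a phase label from an (isomer shift, quadrupole splitting, hyperfine field) triple. The phase parameter values and noise levels are fabricated for illustration only.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Fabricated (illustrative) Moessbauer parameters: (IS mm/s, QS mm/s, Bhf T).
phases = {
    "phase A": (0.37, -0.20, 51.5),
    "phase B": (0.28,  0.00, 49.0),
    "phase C": (0.31,  0.61,  0.0),
}

X, y = [], []
for label, (is_, qs, bhf) in phases.items():
    for _ in range(200):
        # jitter mimics interlaboratory scatter in the reported parameters
        X.append([is_ + rng.normal(0, 0.02),
                  qs + rng.normal(0, 0.03),
                  bhf + rng.normal(0, 0.5)])
        y.append(label)

clf = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0))
clf.fit(np.array(X), np.array(y))

# A measurement slightly off the tabulated values is still recognized.
print(clf.predict([[0.32, 0.63, 0.2]]))
```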

  6. Advanced microtechnologies for cytogenetic analysis

    DEFF Research Database (Denmark)

    Kwasny, Dorota; Vedarethinam, Indumathi; Shah, Pranjul Jaykumar

    2012-01-01

    Cytogenetic and molecular cytogenetic analyses, which aim to detect chromosome abnormalities, are routinely performed in cytogenetic laboratories all over the world. Traditional cytogenetic studies are performed by analyzing the banding pattern of chromosomes, and are complemented by molecular...... cytogenetic techniques such as fluorescent in situ hybridization (FISH). To improve FISH application in cytogenetic analysis the issues with long experimental time, high volumes of expensive reagents and requirement for trained technicians need to be addressed. The protocol has recently evolved towards...... to introduce automation in the cytogenetic laboratories at a microscale. We have developed membrane based micro perfusion systems capable of expansion of lymphocytes in a shorter time and at a smaller scale. The simulated and experimental results show very efficient exchange of the growth medium...

  7. Damped time advance methods for particles and EM fields

    International Nuclear Information System (INIS)

    Friedman, A.; Ambrosiano, J.J.; Boyd, J.K.; Brandon, S.T.; Nielsen, D.E. Jr.; Rambo, P.W.

    1990-01-01

    Recent developments in the application of damped time advance methods to plasma simulations include the synthesis of implicit and explicit ''adjustably damped'' second-order accurate methods for particle motion and electromagnetic field propagation. This paper discusses this method.
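
    The specific adjustably damped particle and field schemes are not described in the record; as a textbook stand-in for the general idea, the sketch below advances a harmonic oscillator with a one-parameter implicit/explicit blend, where theta = 0.5 gives the undamped second-order scheme and larger theta introduces adjustable numerical damping. This is an assumed illustration, not the authors' algorithm.

```python
import numpy as np

def theta_advance(omega, dt, theta, n_steps, x0=1.0, v0=0.0):
    """Advance x'' = -omega^2 x with a theta-blended scheme: forces are mixed
    between old-time (explicit) and new-time (implicit) evaluations.
    theta = 0 is explicit, theta = 1 fully implicit, theta = 0.5 trapezoidal
    (second order, no numerical damping); theta > 0.5 damps the oscillation."""
    x, v = x0, v0
    out = [x]
    for _ in range(n_steps):
        # Solve the 2x2 linear system for (x_new, v_new):
        #   x_new = x + dt * ((1-theta) * v + theta * v_new)
        #   v_new = v - dt * omega**2 * ((1-theta) * x + theta * x_new)
        a = np.array([[1.0, -dt * theta],
                      [dt * theta * omega**2, 1.0]])
        b = np.array([x + dt * (1 - theta) * v,
                      v - dt * (1 - theta) * omega**2 * x])
        x, v = np.linalg.solve(a, b)
        out.append(x)
    return np.array(out)

# The amplitude envelope after many periods shows the adjustable damping.
for theta in (0.5, 0.6, 0.8):
    x = theta_advance(omega=1.0, dt=0.2, theta=theta, n_steps=600)
    print(f"theta = {theta}: final amplitude envelope ~ {np.max(np.abs(x[-60:])):.3f}")
```

    Choosing theta slightly above 0.5 is the usual way such blends trade a small amount of accuracy for suppression of unresolved high-frequency oscillations.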

  8. Advanced Aqueous Phase Catalyst Development using Combinatorial Methods, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Combinatorial methods are proposed to develop advanced Aqueous Oxidation Catalysts (AOCs) with the capability to mineralize organic contaminants present in effluents...

  9. Observer variation factor on advanced method for accurate, robust, and efficient spectral fitting of java based magnetic resonance user interface for MRS data analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Suk Jun [Dept. of Biomedical Laboratory Science, College of Health Science, Cheongju University, Cheongju (Korea, Republic of); Yu, Seung Man [Dept. of Radiological Science, College of Health Science, Gimcheon University, Gimcheon (Korea, Republic of)

    2016-06-15

    The purpose of this study was to examine the sources of measurement error in the AMARES method of the jMRUI package for quantitative magnetic resonance spectroscopy (MRS) analysis, comparing skilled and unskilled observers, and to identify the reasons for differences between independent observers. A point-resolved spectroscopy sequence was used to acquire MRS data from the liver of 10-week-old male Sprague-Dawley rats. The ratio of the methylene protons ((-CH2-)n) at 1.3 ppm to the water protons (H2O) at 4.7 ppm was calculated with the LCModel software and used as the reference value. Seven unskilled observers calculated the total lipid ratio (methylene/water) using the jMRUI AMARES technique twice, one week apart, and intraclass correlation coefficient (ICC) statistics were computed with the SPSS software. The inter-observer reliability, expressed as Cronbach's alpha, was less than 0.1. The seven observers' average total lipid value (0.096±0.038) was 50% higher than the LCModel reference value. The jMRUI AMARES analysis needs to minimize the contribution of residual metabolites, by identifying the metabolite MRS profile, in order to obtain the same results as LCModel.
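
    As a minimal illustration of the reliability statistic reported above, the sketch below computes Cronbach's alpha for a measurements-by-observers matrix of total lipid ratios; the data are fabricated, and SPSS's ICC options are not reproduced.

```python
import numpy as np

def cronbach_alpha(ratings):
    """Cronbach's alpha for a (measurements x raters) matrix:
        alpha = k/(k-1) * (1 - sum of per-rater variances / variance of the row totals)."""
    ratings = np.asarray(ratings, dtype=float)
    _, k = ratings.shape
    item_var = ratings.var(axis=0, ddof=1).sum()
    total_var = ratings.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Fabricated total lipid ratios (methylene/water): rows = repeated measurements,
# columns = 7 observers; large independent scatter yields a very low alpha.
rng = np.random.default_rng(3)
obs = 0.064 + rng.normal(0.03, 0.04, size=(10, 7))
print(f"Cronbach's alpha = {cronbach_alpha(obs):.3f}")
```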

  10. The surface analysis methods

    International Nuclear Information System (INIS)

    Deville, J.P.

    1998-01-01

    Nowadays there are many surface analysis methods, each with its own specificity, qualities, constraints (for instance vacuum) and limits. Costly in time and investment, these methods have to be used deliberately. This article is addressed to non-specialists. It gives some elements of choice according to the information sought, the sensitivity, the constraints of use or the answer to a precise question. After recalling the fundamental principles that govern these analysis methods, based on the interaction of radiation (ultraviolet, X-rays) or particles (ions, electrons) with matter, two methods are described in more detail: Auger electron spectroscopy (AES) and X-ray photoemission spectroscopy (ESCA or XPS). They are indeed the most widespread methods in laboratories, the easiest to use and probably the most productive for the surface analysis of industrial materials or samples subjected to treatments in aggressive media. (O.M.)

  11. Advancement of compressible multiphase flows and sodium-water reaction analysis program SERAPHIM. Validation of a numerical method for the simulation of highly underexpanded jets

    International Nuclear Information System (INIS)

    Uchibori, Akihiro; Ohshima, Hiroyuki; Watanabe, Akira

    2010-01-01

    SERAPHIM is a computer program for the simulation of compressible multiphase flow involving the sodium-water chemical reaction under a tube failure accident in a steam generator of sodium-cooled fast reactors. In this study, the numerical analysis of highly underexpanded air jets into air or into water was performed as part of the validation of the SERAPHIM program. The multi-fluid model, the second-order TVD scheme and the HSMAC method considering compressibility were used in this analysis. Combining these numerical methods makes it possible to calculate multiphase flow including supersonic gaseous jets. In the case of the air jet into air, the calculated pressure, the shape of the jet and the location of the Mach disk agreed with the existing experimental results. The effect of the difference scheme and the mesh resolution on the prediction accuracy was clarified through these analyses. The behavior of the air jet into water was also reproduced successfully by the proposed numerical method. (author)

  12. Lecture notes for Advanced Time Series Analysis

    DEFF Research Database (Denmark)

    Madsen, Henrik; Holst, Jan

    1997-01-01

    A first version of these notes was used for the lectures in Grenoble; they have since been extended and improved (together with Jan Holst) and used in Ph.D. courses on Advanced Time Series Analysis at IMM and at the Department of Mathematical Statistics, University of Lund, 1994, 1997, ...

  13. NATO Advanced Study Institute on Advances in Microlocal Analysis

    CERN Document Server

    1986-01-01

    The 1985 Castelvecchio-Pascoli NATO Advanced Study Institute was aimed at completing the trilogy with the two former institutes I organized: "Boundary Value Problem for Evolution Partial Differential Operators", Liege, 1976, and "Singularities in Boundary Value Problems", Maratea, 1980. It was indeed necessary to record the considerable progress realized in the field of the propagation of singularities of Schwartz Distributions, which recently led to the birth of a new branch of Mathematical Analysis called Microlocal Analysis. Most of this theory was mainly built to be applied to distribution solutions of linear partial differential problems. A large part of this institute still went in this direction. But, on the other hand, it was also time to explore the new trend to use microlocal analysis in nonlinear differential problems. I hope that the Castelvecchio NATO ASI reached its purposes with the help of the most renowned authorities in the field. The meeting was held in Tuscany (Italy) at Castelvecchio-P...

  14. Advances in Risk Analysis with Big Data.

    Science.gov (United States)

    Choi, Tsan-Ming; Lambert, James H

    2017-08-01

    With cloud computing, Internet-of-things, wireless sensors, social media, fast storage and retrieval, etc., organizations and enterprises have access to unprecedented amounts and varieties of data. Current risk analysis methodology and applications are experiencing related advances and breakthroughs. For example, highway operations data are readily available, and making use of them reduces risks of traffic crashes and travel delays. Massive data of financial and enterprise systems support decision making under risk by individuals, industries, regulators, etc. In this introductory article, we first discuss the meaning of big data for risk analysis. We then examine recent advances in risk analysis with big data in several topic areas. For each area, we identify and introduce the relevant articles that are featured in the special issue. We conclude with a discussion on future research opportunities. © 2017 Society for Risk Analysis.

  15. Methods of Multivariate Analysis

    CERN Document Server

    Rencher, Alvin C

    2012-01-01

    Praise for the Second Edition "This book is a systematic, well-written, well-organized text on multivariate analysis packed with intuition and insight . . . There is much practical wisdom in this book that is hard to find elsewhere."-IIE Transactions Filled with new and timely content, Methods of Multivariate Analysis, Third Edition provides examples and exercises based on more than sixty real data sets from a wide variety of scientific fields. It takes a "methods" approach to the subject, placing an emphasis on how students and practitioners can employ multivariate analysis in real-life sit

  16. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide the participants state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  17. Advanced Method of the Elastomagnetic Sensors Calibration

    Directory of Open Access Journals (Sweden)

    Mikulas Prascak

    2004-01-01

    Full Text Available The elastomagnetic (EM) method is a highly sensitive non-contact evaluation method for measuring tensile and compressive stress in steel. The latest development of measuring devices and EM sensors has shown that the thermomagnetic phenomenon has a strong influence on the accuracy of EM sensor calibration. To eliminate the influence of this effect, a two-dimensional regression method is presented.
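
    The form of the two-dimensional regression is not given in the record; the sketch below fits a hypothetical calibration surface, sensor output as a function of applied stress and temperature, by least squares and then inverts it to obtain a temperature-compensated stress estimate. All coefficients and data are illustrative assumptions.

```python
import numpy as np

# Fabricated calibration data: EM sensor output as a function of stress (MPa)
# and temperature (deg C), with a temperature-dependent drift.
rng = np.random.default_rng(7)
stress = rng.uniform(0, 400, 200)
temp = rng.uniform(-10, 50, 200)
output = (0.8 + 0.012 * stress - 0.004 * temp - 1.5e-5 * stress * temp
          + rng.normal(0, 0.01, 200))

# Two-dimensional regression: output ~ 1 + stress + temp + stress*temp.
A = np.column_stack([np.ones_like(stress), stress, temp, stress * temp])
coef, *_ = np.linalg.lstsq(A, output, rcond=None)
print("fitted coefficients:", np.round(coef, 6))

def stress_from_output(u, t, c):
    """Invert the fitted surface at a known temperature t to get a
    temperature-compensated stress estimate from a sensor reading u."""
    return (u - c[0] - c[2] * t) / (c[1] + c[3] * t)

print("estimated stress at 35 degC for output 2.5:",
      round(stress_from_output(2.5, 35.0, coef), 1), "MPa")
```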

  18. Recent advances in boundary element methods

    CERN Document Server

    Manolis, GD

    2009-01-01

    Addresses the needs of the computational mechanics research community in terms of information on boundary integral equation-based methods and techniques applied to a variety of fields. This book collects both original and review articles on contemporary Boundary Element Methods (BEM) as well as on the Mesh Reduction Methods (MRM).

  19. Advanced Interval Management: A Benefit Analysis

    Science.gov (United States)

    Timer, Sebastian; Peters, Mark

    2016-01-01

    This document is the final report for the NASA Langley Research Center (LaRC)- sponsored task order 'Possible Benefits for Advanced Interval Management Operations.' Under this research project, Architecture Technology Corporation performed an analysis to determine the maximum potential benefit to be gained if specific Advanced Interval Management (AIM) operations were implemented in the National Airspace System (NAS). The motivation for this research is to guide NASA decision-making on which Interval Management (IM) applications offer the most potential benefit and warrant further research.

  20. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  1. Advanced Excel for scientific data analysis

    CERN Document Server

    De Levie, Robert

    2004-01-01

    Excel is by far the most widely distributed data analysis software but few users are aware of its full powers. Advanced Excel For Scientific Data Analysis takes off from where most books dealing with scientific applications of Excel end. It focuses on three areas-least squares, Fourier transformation, and digital simulation-and illustrates these with extensive examples, often taken from the literature. It also includes and describes a number of sample macros and functions to facilitate common data analysis tasks. These macros and functions are provided in uncompiled, computer-readable, easily

  2. Advanced finite element method in structural engineering

    CERN Document Server

    Long, Yu-Qiu; Long, Zhi-Fei

    2009-01-01

    This book systematically introduces the research work on the Finite Element Method completed over the past 25 years. Original theoretical achievements and their applications in the fields of structural engineering and computational mechanics are discussed.

  3. Advanced repair methods for enhanced reactor safety

    International Nuclear Information System (INIS)

    Kornfeldt, H.

    1993-01-01

    A few innovative concepts are described of the ABB Atom Service Division for repair and mitigation techniques for primary systems in nuclear power plants. The concepts are based on Shape Memory Alloy (SMA) technology. A basic feature of all methods is that welding and component replacement is being avoided and the radiation dose imposed on maintenance personnel reduced. The SMA-based repair methods give plant operators new ways to meet increased safety standards and rising maintenance costs. (Z.S.) 4 figs

  4. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  5. Advanced Computational Methods in Bio-Mechanics.

    Science.gov (United States)

    Al Qahtani, Waleed M S; El-Anwar, Mohamed I

    2018-04-15

    A novel partnership between surgeons and machines, made possible by advances in computing and engineering technology, could overcome many of the limitations of traditional surgery. By extending surgeons' ability to plan and carry out surgical interventions more accurately and with fewer traumas, computer-integrated surgery (CIS) systems could help to improve clinical outcomes and the efficiency of healthcare delivery. CIS systems could have a similar impact on surgery to that long since realised in computer-integrated manufacturing. Mathematical modelling and computer simulation have proved tremendously successful in engineering. Computational mechanics has enabled technological developments in virtually every area of our lives. One of the greatest challenges for mechanists is to extend the success of computational mechanics to fields outside traditional engineering, in particular to biology, the biomedical sciences, and medicine. Biomechanics has significant potential for applications in the orthopaedic industry and the performing arts, since the skills needed for these activities are visibly related to the human musculoskeletal and nervous systems. Although biomechanics is widely used nowadays in the orthopaedic industry to design orthopaedic implants for human joints, dental parts, external fixations and other medical purposes, numerous research efforts funded by billions of dollars are still under way to build a new future for sports and human healthcare in what is called the biomechanics era.

  6. Core design methods for advanced LMFBRs

    International Nuclear Information System (INIS)

    Chandler, J.C.; Marr, D.R.; McCurry, D.C.; Cantley, D.A.

    1977-05-01

    The multidiscipline approach to advanced LMFBR core design requires an iterative design procedure to obtain a closely-coupled design. HEDL's philosophy requires that the designs be coupled to the extent that the lifetimes of the design-limiting fuel pin, the design-limiting duct and the core reactivity are all equal and equal to the fuel residence time. The design procedure consists of an iterative loop involving three stages of the design sequence. Stage 1 consists of general mechanical design and reactor physics scoping calculations to arrive at an initial core layout. Stage 2 consists of detailed reactor physics calculations for the core configuration arrived at in Stage 1. Based upon the detailed reactor physics results, a decision is made either to alter the design (Stage 1) or to go to Stage 3. Stage 3 consists of core orificing and detailed component mechanical design calculations. At this point, an assessment is made regarding design adequacy. If the design is inadequate, the entire procedure is repeated until the design is acceptable.

  7. Methods for RNA Analysis

    DEFF Research Database (Denmark)

    Olivarius, Signe

    of the transcriptome, 5’ end capture of RNA is combined with next-generation sequencing for high-throughput quantitative assessment of transcription start sites by two different methods. The methods presented here allow for functional investigation of coding as well as noncoding RNA and contribute to future...... RNAs rely on interactions with proteins, the establishment of protein-binding profiles is essential for the characterization of RNAs. Aiming to facilitate RNA analysis, this thesis introduces proteomics- as well as transcriptomics-based methods for the functional characterization of RNA. First, RNA...

  8. System Level Analysis of LTE-Advanced

    DEFF Research Database (Denmark)

    Wang, Yuanye

    This PhD thesis focuses on system level analysis of Multi-Component Carrier (CC) management for Long Term Evolution (LTE)-Advanced. Cases where multiple CCs are aggregated to form a larger bandwidth are studied. The analysis is performed for both local area and wide area networks. In local area...... reduction. Compared to the case of reuse-1, they achieve a gain of 50∼500% in cell edge user throughput, with small or no loss in average cell throughput. For the wide area network, effort is devoted to the downlink of LTE-Advanced. Such a system is assumed to be backwards compatible to LTE release 8, i...... scheme is recommended. It reduces the CQI by 94% at low load, and 79∼93% at medium to high load, with reasonable loss in downlink performance. To reduce the ACK/NACK feedback, multiple ACK/NACKs can be bundled, with slightly degraded downlink throughput....

  9. Recent advances in coupled-cluster methods

    CERN Document Server

    Bartlett, Rodney J

    1997-01-01

    Today, coupled-cluster (CC) theory has emerged as the most accurate, widely applicable approach for the correlation problem in molecules. Furthermore, the correct scaling of the energy and wavefunction with size (i.e. extensivity) recommends it for studies of polymers and crystals as well as molecules. CC methods have also paid dividends for nuclei, and for certain strongly correlated systems of interest in field theory.In order for CC methods to have achieved this distinction, it has been necessary to formulate new, theoretical approaches for the treatment of a variety of essential quantities

  10. Advanced method for making vitreous waste forms

    International Nuclear Information System (INIS)

    Pope, J.M.; Harrison, D.E.

    1980-01-01

    A process is described for making waste glass that circumvents the problems of dissolving nuclear waste in molten glass at high temperatures. Because the reactive mixing process is independent of the inherent viscosity of the melt, any glass composition can be prepared with equal facility. Separation of the mixing and melting operations permits novel glass fabrication methods to be employed

  11. Advanced Testing Method for Ground Thermal Conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaobing [ORNL; Clemenzi, Rick [Geothermal Design Center Inc.; Liu, Su [University of Tennessee (UT)

    2017-04-01

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
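
    For context, a minimal sketch of the conventional infinite line-source evaluation that the report improves on: after the early, borehole-dominated period, the mean fluid temperature rises approximately linearly with the logarithm of time, and the effective ground conductivity follows from the slope of that line. The function name, the 10 h cut-off and the data layout are illustrative assumptions, not the new method described above.

```python
# Hedged sketch of the *conventional* line-source evaluation of a thermal
# response test (the baseline, not the advanced method described above).
import numpy as np

def ground_conductivity(time_h, mean_fluid_temp_c, heat_rate_w, bore_length_m, t_min_h=10.0):
    """Effective ground thermal conductivity [W/(m*K)] from TRT data.

    Late-time data follow T ~ (q / (4*pi*k)) * ln(t) + const, so the slope of
    temperature versus ln(time) gives k = q / (4*pi*slope), with q = Q / L.
    """
    mask = time_h >= t_min_h                  # discard early, borehole-dominated data
    slope, _ = np.polyfit(np.log(time_h[mask]), mean_fluid_temp_c[mask], 1)
    q_per_m = heat_rate_w / bore_length_m     # heat injection rate per metre of borehole
    return q_per_m / (4.0 * np.pi * slope)
```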

  12. An advanced method of heterogeneous reactor theory

    International Nuclear Information System (INIS)

    Kochurov, B.P.

    1994-08-01

    Recent approaches to heterogeneous reactor theory for numerical applications were presented in the course of 8 lectures given at JAERI. The limitations of the initial theory, known since the First Conference on Peaceful Uses of Atomic Energy held in Geneva in 1955 as the Galanine-Feinberg heterogeneous theory - the matrix form of the equations and the lack of a consistent theory for the heterogeneous parameters of a reactor cell - were overcome by a transformation of the heterogeneous reactor equations to a difference form and by the development of a consistent theory for the characteristics of a reactor cell based on detailed space-energy calculations. General few-group (G = number of groups) heterogeneous reactor equations in the dipole approximation are formulated, with the extension of the two-dimensional problem to three dimensions by a finite Fourier expansion of the axial dependence of the neutron fluxes. A transformation of the initial matrix reactor equations to a difference form is presented. The methods for calculating the heterogeneous reactor cell characteristics, giving the relation between vector-flux and vector-current on a cell boundary, are based on a set of detailed space-energy neutron flux distribution calculations with zero current across the cell boundary and G calculations with linearly independent currents across the cell boundary. The equations for reaction rate matrices are formulated. Specific methods were developed for the description of neutron migration in the axial and radial directions, and for a resonance-level treatment of the numerous high-energy resonances. On the basis of these approaches, the theory, methods and computer codes were developed for 3D space-time reactor problems, including simulation of slow processes with fuel burn-up, control rod movements, Xe poisoning, and fast transients depending on prompt and delayed neutrons. As a result, reactors with several thousands of channels having non-uniform axial structure can be feasibly treated. (author)

  13. Advanced Fuel Cycle Economic Sensitivity Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Shropshire; Kent Williams; J.D. Smith; Brent Boore

    2006-12-01

    A fuel cycle economic analysis was performed on four fuel cycles to provide a baseline for initial cost comparison using the Gen IV Economic Modeling Work Group G4 ECON spreadsheet model, Decision Programming Language software, the 2006 Advanced Fuel Cycle Cost Basis report, industry cost data, international papers, the nuclear power related cost study from MIT, Harvard, and the University of Chicago. The analysis developed and compared the fuel cycle cost component of the total cost of energy for a wide range of fuel cycles including: once through, thermal with fast recycle, continuous fast recycle, and thermal recycle.

  14. Advances in statistical models for data analysis

    CERN Document Server

    Minerva, Tommaso; Vichi, Maurizio

    2015-01-01

    This edited volume focuses on recent research results in classification, multivariate statistics and machine learning and highlights advances in statistical models for data analysis. The volume provides both methodological developments and contributions to a wide range of application areas such as economics, marketing, education, social sciences and environment. The papers in this volume were first presented at the 9th biannual meeting of the Classification and Data Analysis Group (CLADAG) of the Italian Statistical Society, held in September 2013 at the University of Modena and Reggio Emilia, Italy.

  15. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2015-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities...

  16. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2014-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities...

  17. Advanced photon counting applications, methods, instrumentation

    CERN Document Server

    Kapusta, Peter; Erdmann, Rainer

    2015-01-01

    This volume focuses on Time-Correlated Single Photon Counting (TCSPC), a powerful tool allowing luminescence lifetime measurements to be made with high temporal resolution, even on single molecules. Combining spectrum and lifetime provides a "fingerprint" for identifying such molecules in the presence of a background. Used together with confocal detection, this permits single-molecule spectroscopy and microscopy in addition to ensemble measurements, opening up an enormous range of hot life science applications such as fluorescence lifetime imaging (FLIM) and measurement of Förster Resonant Energy Transfer (FRET) for the investigation of protein folding and interaction. Several technology-related chapters present both the basics and current state-of-the-art, in particular of TCSPC electronics, photon detectors and lasers. The remaining chapters cover a broad range of applications and methodologies for experiments and data analysis, including the life sciences, defect centers in diamonds, super-resolution micr...

  18. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  19. Advanced methods of variance analysis in nuclear prospective scenarios; Metodos avanzados de analisis de varianza en escenarios de prospectiva nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Blazquez, J.; Montalvo, C.; Balbas, M.; Garcia-Berrocal, A.

    2011-07-01

    Traditional variance propagation techniques are not very reliable when relative uncertainties approach 100%. For this reason, less conventional methods are used instead, such as the Beta distribution, fuzzy logic and the Monte Carlo method.
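
    As an illustration of the last of these, a minimal Monte Carlo propagation sketch is given below: an input with large relative uncertainty is sampled and pushed through a toy model, and the output distribution is summarized by percentiles rather than by a propagated variance. The Beta(2, 5) input, the second Gaussian input and the model y = a*x**2 are illustrative assumptions, not the scenario analysed in the paper.

```python
# Hedged sketch: Monte Carlo propagation of large input uncertainties.
import numpy as np

rng = np.random.default_rng(seed=0)
n = 100_000

x = rng.beta(2.0, 5.0, size=n)      # skewed input on [0, 1], assumed for illustration
a = rng.normal(3.0, 1.5, size=n)    # second input with ~50 % relative uncertainty
y = a * x ** 2                      # toy model

print(f"mean = {y.mean():.3f}, std = {y.std():.3f}")
print("5th-95th percentile interval:", np.percentile(y, [5, 95]))
```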

  20. Polarization control method for UV writing of advanced Bragg gratings

    DEFF Research Database (Denmark)

    Deyerl, Hans-Jürgen; Plougmann, Nikolai; Jensen, Jesper Bo Damm

    2002-01-01

    We report the application of the polarization control method for the UV writing of advanced fiber Bragg gratings (FBG). We demonstrate the strength of the new method for different apodization profiles, including the Sinc-profile and two designs for dispersion-free square filters. The method has...

  1. Methods for Risk Analysis

    International Nuclear Information System (INIS)

    Alverbro, Karin

    2010-01-01

    Many decision-making situations today affect humans and the environment. In practice, many such decisions are made without an overall view and prioritise one or other of the two areas. Now and then these two areas of regulation come into conflict, e.g. the best alternative as regards environmental considerations is not always the best from a human safety perspective and vice versa. This report was prepared within a major project with the aim of developing a framework in which both the environmental aspects and the human safety aspects are integrated, and decisions can be made taking both fields into consideration. The safety risks have to be analysed in order to be successfully avoided and one way of doing this is to use different kinds of risk analysis methods. There is an abundance of existing methods to choose from and new methods are constantly being developed. This report describes some of the risk analysis methods currently available for analysing safety and examines the relationships between them. The focus here is mainly on human safety aspects

  2. Advances in independent component analysis and learning machines

    CERN Document Server

    Bingham, Ella; Laaksonen, Jorma; Lampinen, Jouko

    2015-01-01

    In honour of Professor Erkki Oja, one of the pioneers of Independent Component Analysis (ICA), this book reviews key advances in the theory and application of ICA, as well as its influence on signal processing, pattern recognition, machine learning, and data mining. Examples of topics which have developed from the advances of ICA, and which are covered in the book, are: a unifying probabilistic model for PCA and ICA; optimization methods for matrix decompositions; insights into the FastICA algorithm; unsupervised deep learning; machine vision and image retrieval; a review of developments in the t

  3. Method of signal analysis

    International Nuclear Information System (INIS)

    Berthomier, Charles

    1975-01-01

    A method capable of handling the amplitude and frequency time laws of a certain kind of geophysical signals is described here. This method is based upon the analytical signal idea of Gabor and Ville, which is constructed either in the time domain by adding an imaginary part to the real signal (the in-quadrature signal), or in the frequency domain by suppressing negative frequency components. The instantaneous frequency of the initial signal is then defined as the time derivative of the phase of the analytical signal, and its amplitude, or envelope, as the modulus of this complex signal. The method is applied to three types of magnetospheric signals: chorus, whistlers and pearls. The results obtained by analog and numerical calculations are compared to results obtained by classical systems using filters, i.e. based upon a different definition of the concept of frequency. The precision with which the frequency-time laws are determined then leads to an examination of the principle of the method, to a definition of the instantaneous power density spectrum attached to the signal, and to the first consequences of this definition. In this way, a two-dimensional representation of the signal is introduced which is less deformed by the properties of the analysis system than the usual representation, and which moreover has the advantage of being obtainable practically in real time [fr]
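
    A minimal sketch of this construction, assuming the analytic signal is obtained with a Hilbert transform: the envelope is its modulus and the instantaneous frequency is the time derivative of its unwrapped phase. The chirp test signal and the sampling rate are illustrative assumptions, not the magnetospheric data analysed in the paper.

```python
# Hedged sketch of the analytic-signal construction described above.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                      # sampling frequency [Hz], assumed
t = np.arange(0.0, 2.0, 1.0 / fs)
# Amplitude-modulated chirp: instantaneous frequency should be ~5 + 4*t Hz
signal = (1.0 + 0.5 * np.sin(2 * np.pi * 0.5 * t)) * np.sin(2 * np.pi * (5.0 * t + 2.0 * t ** 2))

z = hilbert(signal)                              # analytic signal: real + i * quadrature part
envelope = np.abs(z)                             # instantaneous amplitude
phase = np.unwrap(np.angle(z))                   # continuous phase
inst_freq = np.gradient(phase, t) / (2 * np.pi)  # instantaneous frequency [Hz]
```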

  4. Advances in the Surface Renewal Flux Measurement Method

    Science.gov (United States)

    Shapland, T. M.; McElrone, A.; Paw U, K. T.; Snyder, R. L.

    2011-12-01

    The measurement of ecosystem-scale energy and mass fluxes between the planetary surface and the atmosphere is crucial for understanding geophysical processes. Surface renewal is a flux measurement technique based on analyzing the turbulent coherent structures that interact with the surface. It is a less expensive technique because it does not require fast-response velocity measurements, but only a fast-response scalar measurement. It is therefore also a useful tool for the study of the global cycling of trace gases. Currently, surface renewal requires calibration against another flux measurement technique, such as eddy covariance, to account for the linear bias of its measurements. We present two advances in the surface renewal theory and methodology that bring the technique closer to becoming a fully independent flux measurement method. The first advance develops the theory of turbulent coherent structure transport associated with the different scales of coherent structures. A novel method was developed for identifying the scalar change rate within structures at different scales. Our results suggest that for canopies less than one meter in height, the second smallest coherent structure scale dominates the energy and mass flux process. Using the method for resolving the scalar exchange rate of the second smallest coherent structure scale, calibration is unnecessary for surface renewal measurements over short canopies. This study forms the foundation for analysis over more complex surfaces. The second advance is a sensor frequency response correction for measuring the sensible heat flux via surface renewal. Inexpensive fine-wire thermocouples are frequently used to record high frequency temperature data in the surface renewal technique. The sensible heat flux is used in conjunction with net radiation and ground heat flux measurements to determine the latent heat flux as the energy balance residual. The robust thermocouples commonly used in field experiments
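
    For reference, the classical surface renewal estimate that the calibration discussion refers to is commonly written H = alpha * rho * cp * z * a / (l + s), where a is the mean amplitude of the temperature ramps, (l + s) the mean ramp period, z the measurement height, and alpha the calibration factor that the scale-resolved analysis described above aims to make unnecessary over short canopies. The sketch and numbers below are illustrative assumptions, not results from the study.

```python
# Hedged sketch of the classical surface-renewal sensible heat flux estimate.
def surface_renewal_flux(ramp_amplitude_k, ramp_period_s, z_m,
                         alpha=0.5, rho=1.2, cp=1004.0):
    """Sensible heat flux [W m-2] from ramp characteristics of air temperature."""
    return alpha * rho * cp * z_m * ramp_amplitude_k / ramp_period_s

# Example: 0.4 K ramps every 30 s measured at 1 m above a short canopy
print(surface_renewal_flux(0.4, 30.0, 1.0))   # ~ 8 W m-2
```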

  5. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  6. Advanced methods of solid oxide fuel cell modeling

    CERN Document Server

    Milewski, Jaroslaw; Santarelli, Massimo; Leone, Pierluigi

    2011-01-01

    Fuel cells are widely regarded as the future of the power and transportation industries. Intensive research in this area now requires new methods of fuel cell operation modeling and cell design. Typical mathematical models are based on the physical process description of fuel cells and require a detailed knowledge of the microscopic properties that govern both chemical and electrochemical reactions. ""Advanced Methods of Solid Oxide Fuel Cell Modeling"" proposes the alternative methodology of generalized artificial neural networks (ANN) solid oxide fuel cell (SOFC) modeling. ""Advanced Methods

  7. Strategy to Promote Active Learning of an Advanced Research Method

    Science.gov (United States)

    McDermott, Hilary J.; Dovey, Terence M.

    2013-01-01

    Research methods courses aim to equip students with the knowledge and skills required for research yet seldom include practical aspects of assessment. This reflective practitioner report describes and evaluates an innovative approach to teaching and assessing advanced qualitative research methods to final-year psychology undergraduate students. An…

  8. Statistical methods of discrimination and classification advances in theory and applications

    CERN Document Server

    Choi, Sung C

    1986-01-01

    Statistical Methods of Discrimination and Classification: Advances in Theory and Applications is a collection of papers that tackles the multivariate problems of discriminating and classifying subjects into exclusive populations. The book presents 13 papers that cover advances in the statistical procedures of discrimination and classification. The studies in the text primarily focus on various methods of discriminating and classifying variables, such as multiple discriminant analysis in the presence of mixed continuous and categorical data; choice of the smoothing parameter and efficiency o

  9. An Innovative Three-Dimensional Heterogeneous Coarse-Mesh Transport Method for Advanced and Generation IV Reactor Core Analysis and Design

    International Nuclear Information System (INIS)

    Rahnema, Farzad

    2009-01-01

    This project has resulted in a highly efficient method that has been shown to provide accurate solutions to a variety of 2D and 3D reactor problems. The goals of this project were (1) to develop an accurate and efficient three-dimensional whole-core neutronics method with the following features: based solely on transport theory, not requiring cross-section homogenization, containing a highly accurate and self-consistent global flux reconstruction procedure, and applicable to large, heterogeneous reactor models; and (2) to create new numerical benchmark problems for code cross-comparison.

  10. Advanced approaches to failure mode and effect analysis (FMEA) applications

    Directory of Open Access Journals (Sweden)

    D. Vykydal

    2015-10-01

    Full Text Available The present paper explores advanced approaches to the FMEA method (Failure Mode and Effect Analysis) which take into account the costs associated with the occurrence of failures during the manufacture of a product. The different approaches are demonstrated using an example FMEA application to the production of drawn wire. Their purpose is to determine risk levels while taking account of the above-mentioned costs. Finally, the resulting priority levels are compared for the purpose of developing actions that mitigate the risks.

  11. Proceedings of the ANS/ASME/NRC international topical meeting on nuclear reactor thermal-hydraulics: LMFBR and HTGR advanced reactor concepts and analysis methods

    International Nuclear Information System (INIS)

    1980-01-01

    Separate abstracts are included for each of the papers presented concerning the thermal-hydraulics of LMFBR type reactors; mathematical methods in nuclear reactor thermal-hydraulics; heat transfer in gas-cooled reactors; and thermal-hydraulics of pebble-bed reactors. Two papers have been previously abstracted and input to the data base

  12. COMPUTER METHODS OF GENETIC ANALYSIS.

    Directory of Open Access Journals (Sweden)

    A. L. Osipov

    2017-02-01

    Full Text Available The paper covers the basic statistical methods used in conducting the genetic analysis of human traits: segregation analysis, linkage analysis and allelic association studies. Software supporting the implementation of these methods was developed.
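
    As a minimal illustration of the last of these analyses, an allelic association can be screened with a chi-square test on a 2x2 table of allele counts in cases versus controls. This is a generic sketch, not the software described in the paper, and the counts are invented.

```python
# Hedged sketch of a basic allelic-association test on invented counts.
from scipy.stats import chi2_contingency

#                 allele A   allele a
table = [[240, 160],   # cases
         [200, 200]]   # controls

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p_value:.4f}")
```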

  13. Advancing Usability Evaluation through Human Reliability Analysis

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman

    2005-01-01

    This paper introduces a novel augmentation to the current heuristic usability evaluation methodology. The SPAR-H human reliability analysis method was developed for categorizing human performance in nuclear power plants. Despite the specialized use of SPAR-H for safety critical scenarios, the method also holds promise for use in commercial off-the-shelf software usability evaluations. The SPAR-H method shares task analysis underpinnings with human-computer interaction, and it can be easily adapted to incorporate usability heuristics as performance shaping factors. By assigning probabilistic modifiers to heuristics, it is possible to arrive at the usability error probability (UEP). This UEP is not a literal probability of error but nonetheless provides a quantitative basis to heuristic evaluation. When combined with a consequence matrix for usability errors, this method affords ready prioritization of usability issues
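
    The quantification step described above can be sketched as follows: a nominal error probability is scaled by multipliers assigned to violated usability heuristics, treated like performance shaping factors, and a bounding adjustment keeps the result below 1. The heuristic names, multiplier values and nominal probability are invented for illustration; only the general shape of the calculation follows the SPAR-H style of quantification.

```python
# Hedged sketch of a heuristic-based usability error probability (UEP).
NOMINAL_ERROR_PROB = 0.01          # assumed nominal probability of a user error

# Hypothetical multipliers for violated usability heuristics
heuristic_multipliers = {
    "visibility_of_system_status": 2.0,
    "match_system_and_real_world": 5.0,
    "error_prevention": 10.0,
}

def usability_error_probability(violated):
    composite = 1.0
    for name in violated:
        composite *= heuristic_multipliers[name]
    nhep = NOMINAL_ERROR_PROB
    # Bounding adjustment (SPAR-H style) so the result never exceeds 1
    return (nhep * composite) / (nhep * (composite - 1.0) + 1.0)

print(usability_error_probability(["match_system_and_real_world", "error_prevention"]))
```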

  14. Methods for studying fuel management in advanced gas cooled reactors

    International Nuclear Information System (INIS)

    Buckler, A.N.; Griggs, C.F.; Tyror, J.G.

    1971-07-01

    The methods used for studying fuel and absorber management problems in AGRs are described. The basis of the method is the use of ARGOSY lattice data in reactor calculations performed at successive time steps. These reactor calculations may be quite crude but for advanced design calculations a detailed channel-by-channel representation of the whole core is required. The main emphasis of the paper is in describing such an advanced approach - the ODYSSEUS-6 code. This code evaluates reactor power distributions as a function of time and uses the information to select refuelling moves and determine controller positions. (author)

  15. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2015-01-01

    The Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H) have been applied to HRA in relation to NPP maintenance and operation. Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has been changed considerably by the adoption of new human-system interfaces such as computer-based soft controls. Among the many features of advanced MCRs, soft controls are an important feature because operator actions in NPP advanced MCRs are performed by soft controls. Consequently, those conventional methods may not sufficiently consider the features of soft control execution human errors. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested by performing a soft control task analysis and literature reviews regarding widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method in advanced MCRs should encompass are derived based on the literature review and soft control task analysis. Based on the derived factors, an execution HRA framework in advanced MCRs is developed, mainly focusing on the features of soft control. Moreover, since most current HRA databases deal with operation in conventional MCRs and are not explicitly designed to deal with digital HSI, an HRA database is developed under lab-scale simulation

  16. Searching the literature for proteins facilitates the identification of biological processes, if advanced methods of analysis are linked: a case study on microgravity-caused changes in cells.

    Science.gov (United States)

    Bauer, Johann; Bussen, Markus; Wise, Petra; Wehland, Markus; Schneider, Sabine; Grimm, Daniela

    2016-07-01

    More than one hundred reports were published about the characterization of cells from malignant and healthy tissues, as well as of endothelial cells and stem cells exposed to microgravity conditions. We retrieved publications about microgravity related studies on each type of cells, extracted the proteins mentioned therein and analyzed them aiming to identify biological processes affected by microgravity culture conditions. The analysis revealed 66 different biological processes, 19 of them were always detected when papers about the four types of cells were analyzed. Since a response to the removal of gravity is common to the different cell types, some of the 19 biological processes could play a role in cellular adaption to microgravity. Applying computer programs, to extract and analyze proteins and genes mentioned in publications becomes essential for scientists interested to get an overview of the rapidly growing fields of gravitational biology and space medicine.

  17. Advanced Technology Lifecycle Analysis System (ATLAS)

    Science.gov (United States)

    O'Neil, Daniel A.; Mankins, John C.

    2004-01-01

    Developing credible mass and cost estimates for space exploration and development architectures requires multidisciplinary analysis based on physics calculations and parametric estimates derived from historical systems. Within the National Aeronautics and Space Administration (NASA), concurrent engineering environment (CEE) activities integrate discipline-oriented analysis tools through a computer network and accumulate the results of a multidisciplinary analysis team via a centralized database or spreadsheet. Each minute of a design and analysis study within a concurrent engineering environment is expensive due to the size of the team and supporting equipment. The Advanced Technology Lifecycle Analysis System (ATLAS) reduces the cost of architecture analysis by capturing the knowledge of discipline experts into system-oriented spreadsheet models. A framework with a user interface presents a library of system models to an architecture analyst. The analyst selects models of launchers, in-space transportation systems, and excursion vehicles, as well as space and surface infrastructure such as propellant depots, habitats, and solar power satellites. After assembling the architecture from the selected models, the analyst can create a campaign comprised of missions spanning several years. The ATLAS controller passes analyst-specified parameters to the models and data among the models. An integrator workbook calls a history-based parametric analysis cost model to determine the costs. Also, the integrator estimates the flight rates, launched masses, and architecture benefits over the years of the campaign. An accumulator workbook presents the analytical results in a series of bar graphs. In no way does ATLAS compete with a CEE; instead, ATLAS complements a CEE by ensuring that the time of the experts is well spent. Using ATLAS, an architecture analyst can perform technology sensitivity analysis, study many scenarios, and see the impact of design decisions. When the analyst is

  18. Advances in Computational Stability Analysis of Composite Aerospace Structures

    International Nuclear Information System (INIS)

    Degenhardt, R.; Araujo, F. C. de

    2010-01-01

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, the main results of the completed EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  19. Method and Tools for Development of Advanced Instructional Systems

    NARCIS (Netherlands)

    Arend, J. van der; Riemersma, J.B.J.

    1994-01-01

    The application of advanced instructional systems (AISs), like computer-based training systems, intelligent tutoring systems and training simulators, is widely spread within the Royal Netherlands Army. As a consequence there is a growing interest in methods and tools to develop effective and

  20. Advances in the Analytical Methods for Determining the Antioxidant ...

    African Journals Online (AJOL)

    Advances in the Analytical Methods for Determining the Antioxidant Properties of Honey: A Review. M Moniruzzaman, MI Khalil, SA Sulaiman, SH Gan. Abstract. Free radicals and reactive oxygen species (ROS) have been implicated in contributing to the processes of aging and disease. In an effort to combat free radical ...

  1. COMPETITIVE INTELLIGENCE ANALYSIS - SCENARIOS METHOD

    Directory of Open Access Journals (Sweden)

    Ivan Valeriu

    2014-07-01

    Full Text Available Keeping a company among the top-performing players in the relevant market depends not only on its ability to develop continually, sustainably and in a balanced way, to the standards set by customers and competition, but also on its ability to protect its strategic information and to know in advance the strategic information of the competition. In addition, given that economic markets, regardless of their profile, enable interconnection not only among domestic companies, but also between domestic companies and foreign companies, the issue of economic competition moves from the national economies to the field of interest of regional and international economic organizations. The stake for each economic player is to keep ahead of the competition and to be always prepared to face market challenges. Therefore, it needs to know as early as possible how to react to others' strategy in terms of research, production and sales. If a competitor is planning to produce more and at lower cost, then the company must be prepared to counteract this move quickly. Competitive intelligence helps to evaluate the capabilities of competitors in the market, legally and ethically, and to develop response strategies. One of the main goals of competitive intelligence is to acknowledge the role of early warning and prevention of surprises that could have a major impact on the market share, reputation, turnover and profitability of a company in the medium and long term. This paper presents some aspects of competitive intelligence, mainly in terms of information analysis and intelligence generation. The presentation is theoretical and addresses a structured method of information analysis - the scenarios method - in a version that combines several types of analysis in order to reveal some interconnecting aspects of the factors governing the activity of a company.

  2. Methods of nonlinear analysis

    CERN Document Server

    Bellman, Richard Ernest

    1970-01-01

    In this book, we study theoretical and practical aspects of computing methods for mathematical modelling of nonlinear systems. A number of computing techniques are considered, such as methods of operator approximation with any given accuracy; operator interpolation techniques including a non-Lagrange interpolation; methods of system representation subject to constraints associated with concepts of causality, memory and stationarity; methods of system representation with an accuracy that is the best within a given class of models; methods of covariance matrix estimation; methods for low-rank mat

  3. Higher geometry an introduction to advanced methods in analytic geometry

    CERN Document Server

    Woods, Frederick S

    2005-01-01

    For students of mathematics with a sound background in analytic geometry and some knowledge of determinants, this volume has long been among the best available expositions of advanced work on projective and algebraic geometry. Developed from Professor Woods' lectures at the Massachusetts Institute of Technology, it bridges the gap between intermediate studies in the field and highly specialized works.With exceptional thoroughness, it presents the most important general concepts and methods of advanced algebraic geometry (as distinguished from differential geometry). It offers a thorough study

  4. Advanced Coal Wind Hybrid: Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Phadke, Amol; Goldman, Charles; Larson, Doug; Carr, Tom; Rath, Larry; Balash, Peter; Yih-Huei, Wan

    2008-11-28

    Growing concern over climate change is prompting new thinking about the technologies used to generate electricity. In the future, it is possible that new government policies on greenhouse gas emissions may favor electric generation technology options that release zero or low levels of carbon emissions. The Western U.S. has abundant wind and coal resources. In a world with carbon constraints, the future of coal for new electrical generation is likely to depend on the development and successful application of new clean coal technologies with near zero carbon emissions. This scoping study explores the economic and technical feasibility of combining wind farms with advanced coal generation facilities and operating them as a single generation complex in the Western US. The key questions examined are whether an advanced coal-wind hybrid (ACWH) facility provides sufficient advantages through improvements to the utilization of transmission lines and the capability to firm up variable wind generation for delivery to load centers to compete effectively with other supply-side alternatives in terms of project economics and emissions footprint. The study was conducted by an Analysis Team that consists of staff from the Lawrence Berkeley National Laboratory (LBNL), National Energy Technology Laboratory (NETL), National Renewable Energy Laboratory (NREL), and Western Interstate Energy Board (WIEB). We conducted a screening level analysis of the economic competitiveness and technical feasibility of ACWH generation options located in Wyoming that would supply electricity to load centers in California, Arizona or Nevada. Figure ES-1 is a simple stylized representation of the configuration of the ACWH options. The ACWH consists of a 3,000 MW coal gasification combined cycle power plant equipped with carbon capture and sequestration (G+CC+CCS plant), a fuel production or syngas storage facility, and a 1,500 MW wind plant. The ACWH project is connected to load centers by a 3,000 MW

  5. Classification methods for noise transients in advanced gravitational-wave detectors II: performance tests on Advanced LIGO data

    International Nuclear Information System (INIS)

    Powell, Jade; Heng, Ik Siong; Torres-Forné, Alejandro; Font, José A; Lynch, Ryan; Trifirò, Daniele; Cuoco, Elena; Cavaglià, Marco

    2017-01-01

    The data taken by the Advanced LIGO and Virgo gravitational-wave detectors contain short-duration noise transients that limit the significance of astrophysical detections and reduce the duty cycle of the instruments. As the advanced detectors are reaching sensitivity levels that allow for multiple detections of astrophysical gravitational-wave sources, it is crucial to achieve a fast and accurate characterization of non-astrophysical transient noise shortly after it occurs in the detectors. Previously we presented three methods for the classification of transient noise sources. They are Principal Component Analysis for Transients (PCAT), Principal Component LALInference Burst (PC-LIB) and Wavelet Detection Filter with Machine Learning (WDF-ML). In this study we carry out the first performance tests of these algorithms on gravitational-wave data from the Advanced LIGO detectors. We use the data taken between the 3rd of June 2015 and the 14th of June 2015 during the 7th engineering run (ER7), and outline the improvements made to increase the performance and lower the latency of the algorithms on real data. This work provides an important test for understanding the performance of these methods on real, non-stationary data in preparation for the second advanced gravitational-wave detector observation run, planned for later this year. We show that all methods can classify transients in non-stationary data with a high level of accuracy and show the benefits of using multiple classifiers. (paper)
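
    None of the three pipelines is reproduced here, but the shared first step - projecting fixed-length transient waveforms onto a few principal components and classifying in that reduced space - can be sketched generically. The synthetic sine-Gaussian and ring-down waveforms, the noise level and the random-forest classifier are all illustrative assumptions.

```python
# Hedged, generic sketch of PCA-based transient classification (not PCAT/PC-LIB/WDF-ML).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
t = np.linspace(-0.1, 0.1, 512)

def sine_gaussian(f, tau):
    return np.exp(-t**2 / tau**2) * np.sin(2 * np.pi * f * t)

def ring_down(f, tau):
    return np.where(t >= 0, np.exp(-t / tau) * np.sin(2 * np.pi * f * t), 0.0)

X = np.array([sine_gaussian(rng.uniform(50, 300), rng.uniform(0.005, 0.03)) for _ in range(200)]
             + [ring_down(rng.uniform(50, 300), rng.uniform(0.005, 0.03)) for _ in range(200)])
X += 0.1 * rng.standard_normal(X.shape)          # add detector-like noise
y = np.array([0] * 200 + [1] * 200)              # 0 = sine-Gaussian, 1 = ring-down

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
pca = PCA(n_components=10).fit(X_train)
clf = RandomForestClassifier(random_state=0).fit(pca.transform(X_train), y_train)
print("test accuracy:", clf.score(pca.transform(X_test), y_test))
```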

  6. Advances in Time Estimation Methods for Molecular Data.

    Science.gov (United States)

    Kumar, Sudhir; Hedges, S Blair

    2016-04-01

    Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of the molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of a speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data

  7. Advanced non-destructive methods for an efficient service performance

    International Nuclear Information System (INIS)

    Rauschenbach, H.; Clossen-von Lanken Schulz, M.; Oberlin, R.

    2015-01-01

    Due to the power generation industry's desire to decrease outage time and extend inspection intervals for highly stressed turbine parts, advanced and reliable Non-destructive methods were developed by Siemens Non-destructive laboratory. Effective outage performance requires the optimized planning of all outage activities as well as modern Non-destructive examination methods, in order to examine the highly stressed components (turbine rotor, casings, valves, generator rotor) reliably and in short periods of access. This paper describes the experience of Siemens Energy with an ultrasonic Phased Array inspection technique for the inspection of radial entry pinned turbine blade roots. The developed inspection technique allows the ultrasonic inspection of steam turbine blades without blade removal. Furthermore advanced Non-destructive examination methods for joint bolts will be described, which offer a significant reduction of outage duration in comparison to conventional inspection techniques. (authors)

  8. Advanced airflow distribution methods for reducing exposure of indoor pollution

    DEFF Research Database (Denmark)

    Cao, Guangyu; Nielsen, Peter Vilhelm; Melikov, Arsen

    2017-01-01

    The adverse effects of various indoor pollutants on occupants' health have been recognized. In public spaces flu viruses may spread from person to person by airflow generated by various traditional ventilation methods, like natural ventilation and mixing ventilation (MV). Personalized ventilation (PV) supplies clean air close to the occupant and directly into the breathing zone. Studies show that it improves the inhaled air quality and reduces the risk of airborne cross-infection in comparison with total volume (TV) ventilation. However, it is still challenging for PV and other advanced air distribution methods to reduce the exposure to gaseous and particulate pollutants under disturbed conditions and to ensure thermal comfort at the same time. The objective of this study is to analyse the performance of different advanced airflow distribution methods for protection of occupants from exposure to indoor...

  9. Advanced airflow distribution methods for reducing exposure of indoor pollution

    DEFF Research Database (Denmark)

    Cao, Guangyu; Nielsen, Peter Vilhelm; Melikov, Arsen Krikor

    methods to reduce the exposure to gaseous and particulate pollutants under disturbed conditions and to ensure thermal comfort at the same time. The objective of this study is to analyse the performance of different advanced airflow distribution methods for protection of occupants from exposure to indoor...... The adverse effects of various indoor pollutants on occupants' health have been recognized. In public spaces flu viruses may spread from person to person by airflow generated by various traditional ventilation methods, like natural ventilation and mixing ventilation (MV). Personalized ventilation (PV) supplies clean air close to the occupant and directly into the breathing zone. Studies show that it improves the inhaled air quality and reduces the risk of airborne cross-infection in comparison with total volume (TV) ventilation. However, it is still challenging for PV and other advanced air distribution...

  10. Recent advances in radial basis function collocation methods

    CERN Document Server

    Chen, Wen; Chen, C S

    2014-01-01

    This book surveys the latest advances in radial basis function (RBF) meshless collocation methods, with emphasis on recent novel kernel RBFs and new numerical schemes for solving partial differential equations. The RBF collocation methods are inherently free of integration and mesh, and avoid the tedious mesh generation involved in standard finite element and boundary element methods. This book focuses primarily on the numerical algorithms and engineering applications, and highlights a large class of novel boundary-type RBF meshless collocation methods. These methods have shown a clear edge over traditional numerical techniques, especially for problems involving infinite domains, moving boundaries, thin-walled structures, and inverse problems. Due to the rapid development in RBF meshless collocation methods, there is a need to summarize all these new materials so that they are available to scientists, engineers, and graduate students who are interested in applying these newly developed methods for solving real world's ...

  11. Tokamak advanced pump limiter experiments and analysis

    International Nuclear Information System (INIS)

    Conn, R.W.

    1983-06-01

    Experiments with pump limiter modules on several operating tokamaks establish such limiters as efficient collectors of particles and have demonstrated the importance of ballistic scattering as predicted theoretically. Plasma interaction with recycling neutral gas appears to become important as the plasma density increases and the effective ionization mean free path within the module decreases. In limiters with particle collection but without active internal pumping, the neutral gas pressure is found to vary nonlinearly with the edge plasma density at the highest densities studied. Both experiments and theory indicate that the energy spectrum of gas atoms in the pump ducting is non-thermal, consistent with the results of Monte Carlo neutral atom transport calculations. The distribution of plasma power over the front surface of such modules has been measured and appears to be consistent with the predictions of simple theory. Initial results from the latest experiment on the ISX-B tokamak with an actively pumped limiter module demonstrate that the core plasma density can be controlled with a pump limiter and that the scrape-off layer plasma can partially screen the core plasma from gas injection. The results from module pump limiter experiments and from the theory and design analysis of advanced pump limiters for reactors are used to suggest the major features of a definitive, axisymmetric, toroidal belt pump limiter experiment

  12. Advanced Markov chain Monte Carlo methods learning from past samples

    CERN Document Server

    Liang, Faming; Carrol, Raymond J

    2010-01-01

    This book provides comprehensive coverage of the simulation of complex systems using Monte Carlo methods. Developing algorithms that are immune to the local trap problem has long been considered one of the most important topics in MCMC research. Various advanced MCMC algorithms that address this problem have been developed, including the modified Gibbs sampler, methods based on auxiliary variables, and methods making use of past samples. The focus of this book is on the algorithms that make use of past samples, including the multicanonical algorithm, dynamic weighting, dynamically weight...
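
    The following toy sketch is not one of the book's algorithms, but it illustrates the general idea of letting a sampler learn from its past samples: a one-dimensional Metropolis sampler whose proposal scale is periodically re-tuned from the history of the chain. The target density, constants and tuning rule are assumptions made for this illustration.

```python
import numpy as np

# Toy adaptive Metropolis sampler: the proposal scale is periodically re-tuned from the
# history of past samples. Target, constants and tuning rule are illustrative only.
rng = np.random.default_rng(0)
log_target = lambda x: -0.5 * x**2             # standard normal target, up to a constant

n_iter, x, scale = 20_000, 0.0, 1.0
chain = np.empty(n_iter)
for i in range(n_iter):
    prop = x + scale * rng.normal()
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop                               # accept
    chain[i] = x
    if i > 500 and i % 100 == 0:               # learn the proposal scale from past samples
        scale = 2.38 * np.std(chain[:i]) + 1e-6

print("mean ~", chain[5000:].mean().round(3), " std ~", chain[5000:].std().round(3))
```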

  13. Development of advanced earthquake resistant performance verification on reinforced concrete underground structure. Pt. 2. Verification of the ground modeling methods applied to non-linear soil-structure interaction analysis

    International Nuclear Information System (INIS)

    Kawai, Tadashi; Kanatani, Mamoru; Ohtomo, Keizo; Matsui, Jun; Matsuo, Toyofumi

    2003-01-01

    In order to develop an advanced verification method for the earthquake resistant performance of reinforced concrete underground structures, the applicability of two different soil modeling methods in numerical analysis was verified through non-linear dynamic simulations of large shaking table tests conducted on a model comprising free-field soil and a reinforced concrete two-box culvert structure. In these simulations, the structure was modeled by beam-type elements with a tri-linear relation between curvature and flexural moment. The soil was modeled by the Ramberg-Osgood model as well as by an elasto-plastic constitutive model. The former model only accounts for the non-linearity of the shear modulus with strain and initial stress conditions, whereas the latter can express the non-linearity of the shear modulus caused by changes of mean effective stress during ground excitation and by the dilatancy of the soil. The elasto-plastic constitutive model could therefore precisely simulate the vertical acceleration and displacement response at the ground surface produced by soil dilation during horizontal base excitation in the model tests. In addition, the model can explain the distinctive dynamic earth pressure acting on the vertical walls of the structure, which was also confirmed to be related to soil dilation. However, since both modeling methods could reproduce the shear force on the upper slab surface of the model structure, which plays the predominant role in structural deformation, both were equally applicable to the evaluation of seismic performance for structures similar to the one modeled in this study. (author)

  14. New or improved computational methods and advanced reactor design

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Takeda, Toshikazu; Ushio, Tadashi

    1997-01-01

    Nuclear computational methods have been studied continuously as a fundamental technology supporting nuclear development. At present, research on computational methods based on new theories, and on calculation methods previously thought too difficult to put into practice, continues actively, taking advantage of the remarkable improvement in computer performance. In Japan, many light water reactors are now in operation, new computational methods are being introduced for nuclear design, and considerable effort is devoted to further improving economics and safety. In this paper, some new research results on nuclear computational methods and their application to reactor nuclear design are described to introduce recent trends: 1) advancement of computational methods, 2) core design and management of light water reactors, and 3) nuclear design of fast reactors. (G.K.)

  15. Development of a HRA method based on Human Factor Issues for advanced NPP

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Seong, Poong Hyun; Ha, Jun Su; Park, Jae Hyuk; Kim, Ja Kyung

    2010-01-01

    The design of instrumentation and control (I and C) systems for various plants, including nuclear power plants (NPPs), is rapidly moving toward fully digital I and C, and modern computer techniques have gradually been introduced into the design of the advanced main control room (MCR). In an advanced MCR, computer-based Human-System Interfaces (HSIs) such as CRT-based displays, large display panels (LDP), advanced information systems, soft controls and a computerized procedure system (CPS) are applied. Human operators in an advanced MCR still play an important role. However, research and experience from NPPs with advanced MCRs show that the characteristics of operators' tasks change due to the use of these unfamiliar HSIs. This has implications for the PSFs (Performance Shaping Factors) in HRA (Human Reliability Analysis). A PSF in HRA is an aspect of the human's individual characteristics, environment, organization, or task that specifically decrements or improves human performance, thereby increasing or decreasing the likelihood of human error. These PSFs have been suggested in various ways depending on the HRA method used. In most HRA methods, however, there is a lack of consistency in the derivation of the PSFs and a lack of consideration of how the changes implemented in an advanced MCR affect the operators' tasks. In this study, a framework for the derivation and evaluation of the PSFs to be used in HRA for advanced NPPs is suggested.

  16. Advanced microscopic methods for the detection of adhesion barriers in immunology in medical imaging

    Science.gov (United States)

    Lawrence, Shane

    2017-07-01

    Advanced methods of microscopy, and the advanced analysis techniques stemming from them, have developed greatly in the past few years. The use of single discrete methods has given way to the combination of methods, which means an increase in the data to be processed in order to progress to the analysis and diagnosis of ailments and diseases that can be viewed by each and any method. This presentation shows the combination of such methods, gives examples of the data that arise from each individual method and from the combined methodology, and suggests how such data can be streamlined to enable conclusions to be drawn about the particular biological and biochemical considerations that arise. In this particular project the subject of the methodology was human lactoferrin (hLF) and the relation of the adhesion properties of hLF to the overcoming of barriers to adhesion, mainly on the perimeter of the cellular unit, and how this affects the process of immunity in any particular case.

  17. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  18. Recent advances in statistical energy analysis

    Science.gov (United States)

    Heron, K. H.

    1992-01-01

    Statistical Energy Analysis (SEA) has traditionally been developed using a modal summation and averaging approach, which has led to the need for many restrictive SEA assumptions. The assumption of 'weak coupling' is particularly unacceptable when attempts are made to apply SEA to structural coupling. It is now believed that this assumption is more a consequence of the modal formulation than a necessary ingredient of SEA. The present analysis ignores this restriction and describes a wave approach to the calculation of plate-plate coupling loss factors. Predictions based on this method are compared with results obtained from experiments using point excitation on one side of an irregular six-sided box structure. The conclusions show that the use and calculation of infinite transmission coefficients is the way forward for the development of a purely predictive SEA code.
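
    For readers unfamiliar with how coupling loss factors enter SEA, the following sketch solves the basic two-subsystem SEA power balance; the wave-based calculation of the coupling loss factors themselves, which is the subject of the paper, is not reproduced, and all loss factors and the input power are illustrative assumptions.

```python
import numpy as np

# Two-subsystem SEA power balance: given damping and coupling loss factors and the
# injected power, solve for the subsystem energies. All numbers are illustrative.
omega = 2 * np.pi * 1000.0          # band centre frequency, rad/s
eta1, eta2 = 0.01, 0.02             # damping loss factors of plates 1 and 2
eta12, eta21 = 0.003, 0.001         # coupling loss factors 1->2 and 2->1
P_in = np.array([1.0, 0.0])         # 1 W injected into plate 1 only

A = omega * np.array([[eta1 + eta12, -eta21],
                      [-eta12, eta2 + eta21]])
E = np.linalg.solve(A, P_in)        # subsystem energies in joules
print("E1 = %.3e J, E2 = %.3e J, E2/E1 = %.3f" % (E[0], E[1], E[1] / E[0]))
```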

  19. Advanced event reweighting using multivariate analysis

    International Nuclear Information System (INIS)

    Martschei, D; Feindt, M; Honc, S; Wagner-Kuhr, J

    2012-01-01

    Multivariate analysis (MVA) methods, especially discrimination techniques such as neural networks, are key ingredients in modern data analysis and play an important role in high energy physics. They are usually trained on simulated Monte Carlo (MC) samples to discriminate so-called 'signal' from 'background' events and are then applied to data to select real events of signal type. Here we address procedures that improve this workflow: the enhancement of data/MC agreement by reweighting MC samples on a per-event basis; the training of MVAs on real data using the sPlot technique; and finally the construction of MVAs whose discriminator is independent of a certain control variable, i.e. cuts on this variable will not change the discriminator shape.
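
    A common way to reweight MC to data on a per-event basis, shown here purely as an illustration rather than as the exact procedure of the paper, is to train a classifier to separate data from MC and use the ratio p/(1-p) of its output probability as an event weight. The toy feature, sample sizes and classifier settings below are assumptions.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

# Density-ratio reweighting sketch: a classifier separates "data" from MC, and its
# output probability p gives a per-event weight p / (1 - p). Toy 1-D feature only.
rng = np.random.default_rng(1)
data = rng.normal(0.0, 1.0, size=(5000, 1))    # "real data"
mc = rng.normal(0.3, 1.1, size=(5000, 1))      # mismodelled simulation

X = np.vstack([data, mc])
y = np.concatenate([np.ones(len(data)), np.zeros(len(mc))])   # 1 = data, 0 = MC

clf = GradientBoostingClassifier(max_depth=3, n_estimators=200)
clf.fit(X, y)

p = np.clip(clf.predict_proba(mc)[:, 1], 1e-6, 1 - 1e-6)      # P(event is data-like)
weights = p / (1.0 - p)                                        # per-event density ratio
weights *= len(mc) / weights.sum()                             # preserve MC normalisation

print("MC mean before reweighting:", mc[:, 0].mean().round(3))
print("MC mean after  reweighting:", np.average(mc[:, 0], weights=weights).round(3))
```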

  20. Constructing an Intelligent Patent Network Analysis Method

    Directory of Open Access Journals (Sweden)

    Chao-Chan Wu

    2012-11-01

    Patent network analysis, an advanced method of patent analysis, is a useful tool for technology management. This method visually displays all the relationships among the patents and enables analysts to intuitively comprehend the overview of a set of patents in the field of the technology being studied. Although patent network analysis possesses relative advantages over traditional methods of patent analysis, it is subject to several crucial limitations. To overcome the drawbacks of the current method, this study proposes a novel patent analysis method, called the intelligent patent network analysis method, to make a visual network with great precision. Based on artificial intelligence techniques, the proposed method provides an automated procedure for searching patent documents, extracting patent keywords, and determining the weight of each patent keyword in order to generate a sophisticated visualization of the patent network. This study proposes a detailed procedure for generating an intelligent patent network that is helpful for improving the efficiency and quality of patent analysis. Furthermore, patents in the field of Carbon Nanotube Backlight Unit (CNT-BLU) were analyzed to verify the utility of the proposed method.

  1. Modeling and analysis of advanced binary cycles

    Energy Technology Data Exchange (ETDEWEB)

    Gawlik, K.

    1997-12-31

    A computer model (Cycle Analysis Simulation Tool, CAST) and a methodology have been developed to perform value analysis for small, low- to moderate-temperature binary geothermal power plants. The value analysis method allows for incremental changes in the levelized electricity cost (LEC) to be determined between a baseline plant and a modified plant. Thermodynamic cycle analyses and component sizing are carried out in the model followed by economic analysis which provides LEC results. The emphasis of the present work is on evaluating the effect of mixed working fluids instead of pure fluids on the LEC of a geothermal binary plant that uses a simple Organic Rankine Cycle. Four resources were studied spanning the range of 265°F to 375°F. A variety of isobutane and propane based mixtures, in addition to pure fluids, were used as working fluids. This study shows that the use of propane mixtures at a 265°F resource can reduce the LEC by 24% when compared to a base case value that utilizes commercial isobutane as its working fluid. The cost savings drop to 6% for a 375°F resource, where an isobutane mixture is favored. Supercritical cycles were found to have the lowest cost at all resources.
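
    A back-of-envelope sketch of how an incremental LEC comparison between a baseline and a modified plant can be set up is given below; the fixed charge rate, costs and annual generation are illustrative assumptions, not outputs of the CAST model.

```python
# Back-of-envelope levelized electricity cost (LEC) comparison between a baseline and a
# modified binary plant. All cost and generation figures are illustrative assumptions.
def lec(capital_usd, fixed_charge_rate, annual_om_usd, annual_mwh):
    """LEC in $/kWh: annualised capital plus O&M divided by annual generation."""
    return (capital_usd * fixed_charge_rate + annual_om_usd) / (annual_mwh * 1000.0)

baseline = lec(capital_usd=30e6, fixed_charge_rate=0.10, annual_om_usd=0.9e6, annual_mwh=38_000)
modified = lec(capital_usd=31e6, fixed_charge_rate=0.10, annual_om_usd=0.9e6, annual_mwh=47_000)

print(f"baseline LEC: {baseline:.3f} $/kWh")
print(f"modified LEC: {modified:.3f} $/kWh")
print(f"incremental change: {100 * (modified - baseline) / baseline:+.1f} %")
```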

  2. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    E. Blanford; E. Keldrauk; M. Laufer; M. Mieler; J. Wei; B. Stojadinovic; P.F. Peterson

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  3. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    International Nuclear Information System (INIS)

    Blanford, E.; Keldrauk, E.; Laufer, M.; Mieler, M.; Wei, J.; Stojadinovic, B.; Peterson, P.F.

    2010-01-01

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  4. Advanced soft computing diagnosis method for tumour grading.

    Science.gov (United States)

    Papageorgiou, E I; Spyridonos, P P; Stylios, C D; Ravazoula, P; Groumpos, P P; Nikiforidis, G N

    2006-01-01

    To develop an advanced diagnostic method for urinary bladder tumour grading. A novel soft computing modelling methodology based on the augmentation of fuzzy cognitive maps (FCMs) with the unsupervised active Hebbian learning (AHL) algorithm is applied. One hundred and twenty-eight cases of urinary bladder cancer were retrieved from the archives of the Department of Histopathology, University Hospital of Patras, Greece. All tumours had been characterized according to the classical World Health Organization (WHO) grading system. To design the FCM model for tumour grading, three expert histopathologists defined the main histopathological features (concepts) and their impact on grade characterization. The resulting FCM model consisted of nine concepts. Eight concepts represented the main histopathological features for tumour grading; the ninth concept represented the tumour grade. To increase the classification ability of the FCM model, the AHL algorithm was applied to adjust the weights of the FCM. The proposed FCM grading model achieved a classification accuracy of 72.5%, 74.42% and 95.55% for tumours of grades I, II and III, respectively. An advanced computerized method to support the tumour grade diagnosis decision was proposed and developed. The novelty of the method lies in employing the soft computing method of FCMs to represent specialized knowledge on histopathology and in augmenting the FCMs' ability using an unsupervised learning algorithm, the AHL. The proposed method performs with reasonably high accuracy compared to other existing methods and at the same time meets the physicians' requirements for transparency and explicability.
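
    To make the FCM mechanics concrete, the sketch below iterates concept activations through a sigmoid of the weighted influences and applies a simple Hebbian-style weight adjustment. The four concepts, the weight matrix, the learning rate and the update rule are hypothetical illustrations, not the nine-concept model or the AHL algorithm of the paper.

```python
import numpy as np

# Illustrative fuzzy cognitive map: concept activations are iterated through a sigmoid of
# the weighted influences, and a simple Hebbian-style rule nudges the non-zero weights.
# Concepts, weights, learning rate and update rule are hypothetical, not the paper's model.

def sigmoid(x, lam=1.0):
    return 1.0 / (1.0 + np.exp(-lam * x))

W = np.array([[0.0, 0.4, 0.0, 0.6],      # W[i, j]: influence of concept i on concept j
              [0.0, 0.0, 0.5, 0.3],
              [0.2, 0.0, 0.0, 0.7],
              [0.0, 0.0, 0.0, 0.0]])     # concept 4 plays the role of the output ("grade")
A = np.array([0.8, 0.5, 0.6, 0.0])       # initial activations (e.g. observed features)

eta = 0.05                                # learning rate of the Hebbian-style update
for _ in range(20):
    A_new = sigmoid(A @ W)                # FCM inference step
    for i in range(len(A)):               # strengthen weights between co-active concepts
        for j in range(len(A)):
            if W[i, j] != 0.0:
                W[i, j] += eta * A_new[j] * (A[i] - A_new[j] * W[i, j])
    A = A_new

print("output concept activation:", round(float(A[-1]), 3))
```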

  5. Multivariate analysis: models and method

    International Nuclear Information System (INIS)

    Sanz Perucha, J.

    1990-01-01

    Data treatment techniques are increasingly used as computer methods have become more widely accessible. Multivariate analysis consists of a group of statistical methods that are applied to study objects or samples characterized by multiple values. The final goal is decision making. The paper describes the models and methods of multivariate analysis.

  6. Multivariate analysis methods in physics

    International Nuclear Information System (INIS)

    Wolter, M.

    2007-01-01

    A review of multivariate methods based on statistical training is given. Several multivariate methods useful in high-energy physics analysis are discussed. Selected examples from current research in particle physics are presented, both from on-line trigger selection and from off-line analysis. Statistical training methods are also presented and some new applications are suggested.

  7. Methods in algorithmic analysis

    CERN Document Server

    Dobrushkin, Vladimir A

    2009-01-01

    …helpful to any mathematics student who wishes to acquire a background in classical probability and analysis … This is a remarkably beautiful book that would be a pleasure for a student to read, or for a teacher to make into a year's course.-Harvey Cohn, Computing Reviews, May 2010

  8. Communication Network Analysis Methods.

    Science.gov (United States)

    Farace, Richard V.; Mabee, Timothy

    This paper reviews a variety of analytic procedures that can be applied to network data, discussing the assumptions and usefulness of each procedure when applied to the complexity of human communication. Special attention is paid to the network properties measured or implied by each procedure. Factor analysis and multidimensional scaling are among…

  9. Complementing Gender Analysis Methods.

    Science.gov (United States)

    Kumar, Anant

    2016-01-01

    The existing gender analysis frameworks start with the premise that men and women are equal and should be treated equally. These frameworks place emphasis on the equal distribution of resources between men and women and assume that this will bring equality, which is not always true. Despite equal distribution of resources, women tend to suffer and experience discrimination in many areas of their lives, such as the power to control resources within social relationships, and the need for emotional security and reproductive rights within interpersonal relationships. These frameworks hold that patriarchy as an institution plays an important role in women's oppression and exploitation, and that it is a barrier to their empowerment and rights. Thus, some think that by ensuring equal distribution of resources and empowering women economically, institutions like patriarchy can be challenged. These frameworks are based on a proposed equality principle which puts men and women in competing roles; thus, real equality will never be achieved. Contrary to the existing gender analysis frameworks, the Complementing Gender Analysis framework proposed by the author provides a new approach toward gender analysis which not only recognizes the role of economic empowerment and equal distribution of resources but also suggests incorporating the concept and role of social capital, equity, and doing gender in gender analysis. It is based on a perceived equity principle, putting men and women in complementing roles that may lead to equality. In this article the author reviews the mainstream gender theories in development from the viewpoint of the complementary roles of gender. This alternative view is argued on the basis of existing literature and an anecdote of observations made by the author. While criticizing the equality theory, the author offers an equity theory for resolving the gender conflict by using the concept of social and psychological capital.

  10. Advanced method of double contrast examination of the stomach

    International Nuclear Information System (INIS)

    Vlasov, P.V.; Yakimenko, V.F.

    1981-01-01

    An advanced method of double contrast examination of the stomach with the use of a highly concentrated barium suspension is described. It is shown that the concentration of the barium suspension must be not less than 200 mass/volume per cent to obtain a sharp image of the mucosal microrelief. Six standard positions are recommended for the double contrast examination of all stomach walls. 200 patients with different digestive system diseases were examined with the help of the developed method. A sharp image of the mucosal microrelief was obtained in 70% of cases.

  11. Advanced Concept Architecture Design and Integrated Analysis (ACADIA)

    Science.gov (United States)

    2017-11-03

    Advanced Concept Architecture Design and Integrated Analysis (ACADIA). Research report for the period 20161001 - 20161030, submitted to the National Institute of Aerospace (NIA) under award W911NF-16-2-0229; authors include Cedric Justin and Youngjun ...

  12. Energy, Exergy and Advanced Exergy Analysis of a Milk Processing Factory

    DEFF Research Database (Denmark)

    Bühler, Fabian; Nguyen, Tuong-Van; Jensen, Jonas Kjær

    2016-01-01

    integration, an exergy analysis pinpoints the locations, causes and magnitudes of thermodynamic losses. The advanced exergy analysis further identifies the real potential for thermodynamic improvements of the system by splitting exergy destruction into its avoidable and unavoidable parts, which are related ... cream and milk powder. The results show the optimisation potential based on 1st and 2nd law analyses. An evaluation and comparison of the applicability of exergy methods, including advanced exergy methods, to the dairy industry is made. The comparison includes typical energy mappings conducted onsite ... and discusses the benefits and challenges of applying advanced thermodynamic methods to industrial processes.
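
    The split of exergy destruction into unavoidable and avoidable parts mentioned above can be illustrated with a small calculation; the component figures and the unavoidable exergy-destruction ratio below are assumptions made for the sketch, not results from the milk processing factory study.

```python
# Splitting a component's exergy destruction into unavoidable and avoidable parts.
# The fuel/product exergies and the unavoidable ratio are illustrative assumptions.
E_F, E_P = 250.0, 190.0                 # fuel and product exergy of the component, kW
E_D = E_F - E_P                         # total exergy destruction, kW

ed_ep_unavoidable = 0.18                # (E_D / E_P) ratio under "unavoidable" conditions
E_D_un = E_P * ed_ep_unavoidable        # destruction that would remain at best practice
E_D_av = E_D - E_D_un                   # the part worth targeting with design changes

print(f"E_D = {E_D:.1f} kW, unavoidable = {E_D_un:.1f} kW, avoidable = {E_D_av:.1f} kW")
```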

  13. STOCHASTIC METHODS IN RISK ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladimíra OSADSKÁ

    2017-06-01

    In this paper, we review basic stochastic methods which can be used to extend state-of-the-art deterministic analytical methods for risk analysis. We conclude that the standard deterministic analytical methods depend highly on the practical experience and knowledge of the evaluator, and therefore stochastic methods should be introduced. New risk analysis methods should consider the uncertainties in input values. We show how large the impact of these uncertainties on the results of the analysis can be by solving a practical FMECA example with uncertainties modelled using Monte Carlo sampling.
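
    A minimal sketch of the idea, assuming triangular expert distributions for the severity, occurrence and detection scores (the paper's actual case data are not reproduced), propagates the uncertainty to the Risk Priority Number with Monte Carlo sampling:

```python
import numpy as np

# Propagate uncertainty in FMECA scores to the Risk Priority Number (RPN = S * O * D)
# with Monte Carlo sampling. The triangular distributions below are illustrative.
rng = np.random.default_rng(42)
n = 100_000

severity   = rng.triangular(6, 7, 9, n)   # (min, most likely, max) instead of one value
occurrence = rng.triangular(2, 4, 6, n)
detection  = rng.triangular(3, 5, 8, n)

rpn = severity * occurrence * detection
print("deterministic RPN  :", 7 * 4 * 5)
print("mean RPN           :", rpn.mean().round(1))
print("5th-95th percentile:", np.percentile(rpn, [5, 95]).round(1))
print("P(RPN > 200)       :", (rpn > 200).mean().round(3))
```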

  14. Advanced Infantry Training: An Empirical Analysis Of (0341) Mortarman Success While Attending Advanced Mortarman Course

    Science.gov (United States)

    2017-12-01

    ... to advanced level training, specifically, the Advanced Mortarman Course (AMC). Prospective students' success is predicated on an effective command ... It is evident through survival analysis that increased levels of cognitive ability have significant impacts on a Marine's probability to ...

  15. NATO Advanced Research Workshop on Vectorization of Advanced Methods for Molecular Electronic Structure

    CERN Document Server

    1984-01-01

    That there have been remarkable advances in the field of molecular electronic structure during the last decade is clear not only to those working in the field but also to anyone else who has used quantum chemical results to guide their own investigations. The progress in calculating the electronic structures of molecules has occurred through the truly ingenious theoretical and methodological developments that have made computationally tractable the underlying physics of electron distributions around a collection of nuclei. At the same time there has been considerable benefit from the great advances in computer technology. The growing sophistication, declining costs and increasing accessibility of computers have let theorists apply their methods to problems in virtually all areas of molecular science. Consequently, each year witnesses calculations on larger molecules than in the year before and calculations with greater accuracy and more complete information on molecular properties. We can surely ...

  16. Basic methods of isotope analysis

    International Nuclear Information System (INIS)

    Ochkin, A.V.; Rozenkevich, M.B.

    2000-01-01

    The bases of the most widely applied methods of isotope analysis are briefly presented. The possibilities and analytical characteristics of mass-spectrometric, spectral, radiochemical and special methods of isotope analysis, including the application of magnetic resonance, chromatography and refractometry, are considered.

  17. Systemization of Design and Analysis Technology for Advanced Reactor

    International Nuclear Information System (INIS)

    Kim, Keung Koo; Lee, J.; Zee, S. K.

    2009-01-01

    The present study is performed to establish the basis for license application of the original technology through systemization and enhancement of the technology that is indispensable for the design and analysis of advanced reactors, including integral reactors. Technical reports and topical reports are prepared for this purpose on some important design and analysis topics: design and analysis computer programs, structural integrity evaluation of main components and structures, digital I and C systems, and man-machine interface design. The PPS design concept is complemented by reflecting typical safety analysis results, and test plans and requirements are developed for the verification of the advanced reactor technology. Moreover, studies are performed to draw up plans for applying the original technologies or base technologies, such as patents, computer programs, test results, and design concepts of the systems and components of the advanced reactors, to current or advanced power reactors. Finally, pending issues of the advanced reactors are studied to improve the economics and the realization of the technology.

  18. Advanced codes and methods supporting improved fuel cycle economics - 5493

    International Nuclear Information System (INIS)

    Curca-Tivig, F.; Maupin, K.; Thareau, S.

    2015-01-01

    AREVA's code development program was practically completed in 2014. The basic codes supporting a new generation of advanced methods are the following. GALILEO is a state-of-the-art fuel rod performance code for PWR and BWR applications; development is complete and implementation has started in France and the U.S.A. ARCADIA-1 is a state-of-the-art neutronics/thermal-hydraulics/thermal-mechanics code system for PWR applications; development is complete and implementation has started in Europe and in the U.S.A. The system thermal-hydraulic codes S-RELAP5 and CATHARE-2 are not really new but are still state-of-the-art in the domain. S-RELAP5 was completely restructured and re-coded so that its life cycle increases by further decades. CATHARE-2 will be replaced in the future by the new CATHARE-3. The new AREVA codes and methods are largely based on first-principles modeling with an extremely broad international verification and validation data base. This enables AREVA and its customers to access more predictable licensing processes in a fast-evolving regulatory environment (new safety criteria, requests for enlarged qualification databases, statistical applications, uncertainty propagation...). In this context, the advanced codes and methods and the associated verification and validation represent the key to avoiding penalties on products, on operational limits, or on the methodologies themselves.

  19. Advances in product family and product platform design methods & applications

    CERN Document Server

    Jiao, Jianxin; Siddique, Zahed; Hölttä-Otto, Katja

    2014-01-01

    Advances in Product Family and Product Platform Design: Methods & Applications highlights recent advances that have been made to support product family and product platform design and successful applications in industry. This book provides not only motivation for product family and product platform design—the “why” and “when” of platforming—but also methods and tools to support the design and development of families of products based on shared platforms—the “what”, “how”, and “where” of platforming. It begins with an overview of recent product family design research to introduce readers to the breadth of the topic and progresses to more detailed topics and design theory to help designers, engineers, and project managers plan, architect, and implement platform-based product development strategies in their companies. This book also: Presents state-of-the-art methods and tools for product family and product platform design Adopts an integrated, systems view on product family and pro...

  20. Combinatorial methods for advanced materials research and development

    Energy Technology Data Exchange (ETDEWEB)

    Cremer, R.; Dondorf, S.; Hauck, M.; Horbach, D.; Kaiser, M.; Krysta, S.; Kyrylov, O.; Muenstermann, E.; Philipps, M.; Reichert, K.; Strauch, G. [Rheinisch-Westfaelische Technische Hochschule Aachen (Germany). Lehrstuhl fuer Theoretische Huettenkunde

    2001-10-01

    The applicability of combinatorial methods in developing advanced materials is illustrated by presenting four examples of the deposition and characterization of one- and two-dimensionally laterally graded coatings, which were deposited by means of (reactive) magnetron sputtering and plasma-enhanced chemical vapor deposition. To emphasize the advantages of combinatorial approaches, metastable hard coatings such as (Ti,Al)N and (Ti,Al,Hf)N, as well as Ge-Sb-Te based films for rewritable optical data storage, were investigated with respect to the relations between structure, composition, and the desired materials properties. (orig.)

  1. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    Energy Technology Data Exchange (ETDEWEB)

    Harris, C.E.

    1994-09-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue. Separate abstracts have been indexed for articles from this report.

  2. Advanced spot quality analysis in two-colour microarray experiments

    Directory of Open Access Journals (Sweden)

    Vetter Guillaume

    2008-09-01

    Background: Image analysis of microarrays and, in particular, spot quantification and spot quality control, is one of the most important steps in the statistical analysis of microarray data. Recent methods of spot quality control are still at an early stage of development, often leading to underestimation of true positive microarray features and, consequently, to loss of important biological information. Therefore, improving and standardizing the statistical approaches of spot quality control is essential to facilitate the overall analysis of microarray data and the subsequent extraction of biological information. Findings: We evaluated the performance of two image analysis packages, MAIA and GenePix (GP), using two complementary experimental approaches with a focus on the statistical analysis of spot quality factors. First, we developed control microarrays with a priori known fluorescence ratios to verify the accuracy and precision of the ratio estimation of signal intensities. Next, we developed advanced semi-automatic protocols of spot quality evaluation in MAIA and GP and compared their performance with the available facilities for quantitative spot filtering in GP. We evaluated these algorithms for standardised spot quality analysis in a whole-genome microarray experiment assessing well-characterised transcriptional modifications induced by the transcription regulator SNAI1. Using a set of RT-PCR or qRT-PCR validated microarray data, we found that the semi-automatic protocol of spot quality control we developed with MAIA allowed recovering approximately 13% more spots and 38% more differentially expressed genes (at FDR = 5%) than GP with default spot filtering conditions. Conclusion: Careful control of spot quality characteristics with advanced spot quality evaluation can significantly increase the amount of confident and accurate data, resulting in more meaningful biological conclusions.

  3. Advanced data analysis in neuroscience integrating statistical and computational models

    CERN Document Server

    Durstewitz, Daniel

    2017-01-01

    This book is intended for use in advanced graduate courses in statistics / machine learning, as well as for all experimental neuroscientists seeking to understand statistical methods at a deeper level, and theoretical neuroscientists with a limited background in statistics. It reviews almost all areas of applied statistics, from basic statistical estimation and test theory, linear and nonlinear approaches for regression and classification, to model selection and methods for dimensionality reduction, density estimation and unsupervised clustering.  Its focus, however, is linear and nonlinear time series analysis from a dynamical systems perspective, based on which it aims to convey an understanding also of the dynamical mechanisms that could have generated observed time series. Further, it integrates computational modeling of behavioral and neural dynamics with statistical estimation and hypothesis testing. This way computational models in neuroscience are not only explanatory frameworks, but become powerfu...

  4. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    Science.gov (United States)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicate improved prediction accuracies (median of 10-50%), but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream
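
    The mechanics of hierarchical (partial-pooling) estimation can be illustrated far more simply than in the SPARROW application above. The sketch below is a minimal Gibbs sampler for a normal hierarchical model with fixed variances; the group structure, variances and sample sizes are assumptions made for the illustration only.

```python
import numpy as np

# Minimal Gibbs sampler for a normal hierarchical (partial pooling) model:
#   y_ij ~ N(theta_j, sigma^2),  theta_j ~ N(mu, tau^2),  flat prior on mu.
# sigma and tau are held fixed to keep the sketch short; all numbers are illustrative.
rng = np.random.default_rng(7)

J, n_j, sigma, tau = 8, 20, 1.0, 0.5
true_theta = rng.normal(2.0, tau, J)
y = rng.normal(true_theta[:, None], sigma, size=(J, n_j))   # observations per region
ybar = y.mean(axis=1)

n_iter = 5000
theta = ybar.copy()
mu_draws, theta_draws = np.empty(n_iter), np.empty((n_iter, J))
for t in range(n_iter):
    mu = rng.normal(theta.mean(), tau / np.sqrt(J))          # draw mu | theta
    prec = n_j / sigma**2 + 1.0 / tau**2                     # draw theta_j | y, mu
    mean = (n_j * ybar / sigma**2 + mu / tau**2) / prec
    theta = rng.normal(mean, 1.0 / np.sqrt(prec))
    mu_draws[t], theta_draws[t] = mu, theta

print("posterior mean of mu:", mu_draws[1000:].mean().round(2))
print("partially pooled regional means:", theta_draws[1000:].mean(axis=0).round(2))
```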

  5. Advanced compositional gradient and compartmentalization analysis

    Energy Technology Data Exchange (ETDEWEB)

    Canas, Jesus A.; Petti, Daniela; Mullins, Oliver [Schlumberger Servicos de Petroleo Ltda., Rio de Janeiro, RJ (Brazil)

    2008-07-01

    Acquisition of hydrocarbon samples from the reservoir prior to oil or gas production is essential in order to design production strategies and production facilities. In addition, reservoir compartmentalization and hydrocarbon compositional grading magnify the necessity to map fluid properties vertically and laterally in the reservoir prior to production. Formation testers supply a wealth of information for observing and predicting the state of fluids in hydrocarbon reservoirs through detailed pressure and fluid analysis measurements. With a correct understanding of the state of fluids in the reservoir, reserve calculations and adequate development plans can be prepared, and flow barriers may be revealed. This paper describes a new Downhole Fluid Analysis (DFA) technology for improved reservoir management. DFA is a unique process that combines new fluid identification sensors, which allow real-time monitoring of a wide range of parameters such as GOR, fluid density, viscosity, fluorescence and composition (CH4, C2-C5, C6+, CO2), free gas and liquid phase detection, saturation pressure, as well as WBM and OBM filtrate differentiation and pH. The process is not limited to light fluid evaluation and has been extended successfully to heavy oil (HO) reservoir analysis. The combination of DFA fluid profiling with pressure measurements has proven very effective for compartmentalization characterization. The ability of thin barriers to hold off large depletion pressures has been established, as has the gradual variation of hydrocarbon quality in biodegraded oils. In addition, heavy oils can show large compositional variation due to variations in source rock charging but without fluid mixing. Our findings indicate that steep gradients are common in gas condensates and volatile oils, and that biodegradation is more common in HO than in other hydrocarbons, generating fluid gradients and heavy-end tars near the OWC, limiting the aquifer activity and ...

  6. Advanced Fingerprint Analysis Project Fingerprint Constituents

    Energy Technology Data Exchange (ETDEWEB)

    GM Mong; CE Petersen; TRW Clauss

    1999-10-29

    The work described in this report was focused on generating fundamental data on fingerprint components which will be used to develop advanced forensic techniques to enhance fluorescent detection and visualization of latent fingerprints. Chemical components of sweat gland secretions are well documented in the medical literature and many chemical techniques are available to develop latent prints, but there have been no systematic forensic studies of fingerprint sweat components or of the chemical and physical changes these substances undergo over time.

  7. Advanced methods for fabrication of PHWR and LMFBR fuels

    International Nuclear Information System (INIS)

    Ganguly, C.

    1988-01-01

    For self-reliance in nuclear power, the Department of Atomic Energy (DAE), India is pursuing two specific reactor systems, namely the pressurised heavy water reactors (PHWR) and the liquid metal cooled fast breeder reactors (LMFBR). The reference fuel for PHWR is zircaloy-4 clad high density (≤ 96 per cent T.D.) natural UO2 pellet-pins. The advanced PHWR fuels are UO2-PuO2 (≤ 2 per cent), ThO2-PuO2 (≤ 4 per cent) and ThO2-233UO2 (≤ 2 per cent). Similarly, low density (≤ 85 per cent T.D.) (U,Pu)O2 pellets clad in SS 316 or D9 is the reference fuel for the first generation of prototype and commercial LMFBRs all over the world. However, (U,Pu)C and (U,Pu)N are considered as advanced fuels for LMFBRs mainly because of their shorter doubling time. The conventional method of fabrication of both high and low density oxide, carbide and nitride fuel pellets starting from UO2, PuO2 and ThO2 powders is 'powder metallurgy (P/M)'. The P/M route has, however, the disadvantage of generation and handling of fine powder particles of the fuel and the associated problem of 'radiotoxic dust hazard'. The present paper summarises the state-of-the-art of advanced methods of fabrication of oxide, carbide and nitride fuels and highlights the author's experience on the sol-gel-microsphere-pelletisation (SGMP) route for preparation of these materials. The SGMP process uses sol-gel derived, dust-free and free-flowing microspheres of oxide, carbide or nitride for direct pelletisation and sintering. Fuel pellets of both low and high density, excellent microhomogeneity and controlled 'open' or 'closed' porosity could be fabricated via the SGMP route. (author). 5 tables, 14 figs., 15 refs

  8. Advancing our thinking in presence-only and used-available analysis.

    Science.gov (United States)

    Warton, David; Aarts, Geert

    2013-11-01

    1. The problems of analysing used-available data and presence-only data are equivalent, and this paper uses this equivalence as a platform for exploring opportunities for advancing analysis methodology. 2. We suggest some potential methodological advances in used-available analysis, made possible via lessons learnt in the presence-only literature, for example, using modern methods to improve predictive performance. We also consider the converse - potential advances in presence-only analysis inspired by used-available methodology. 3. Notwithstanding these potential advances in methodology, perhaps a greater opportunity is in advancing our thinking about how to apply a given method to a particular data set. 4. It is shown by example that strikingly different results can be achieved for a single data set by applying a given method of analysis in different ways - hence having chosen a method of analysis, the next step of working out how to apply it is critical to performance. 5. We review some key issues to consider in deciding how to apply an analysis method: apply the method in a manner that reflects the study design; consider data properties; and use diagnostic tools to assess how reasonable a given analysis is for the data at hand. © 2013 The Authors. Journal of Animal Ecology © 2013 British Ecological Society.

  9. Some advanced parametric methods for assessing waveform distortion in a smart grid with renewable generation

    Science.gov (United States)

    Alfieri, Luisa

    2015-12-01

    Power quality (PQ) disturbances are becoming an important issue in smart grids (SGs) due to the significant economic consequences that they can generate for sensitive loads. However, SGs include several distributed energy resources (DERs) that can be interconnected to the grid with static converters, which leads to a reduction of the PQ levels. Among DERs, wind turbines and photovoltaic systems are expected to be used extensively due to the forecasted reduction in investment costs and other economic incentives. These systems can introduce significant time-varying voltage and current waveform distortions that require advanced spectral analysis methods. This paper provides an application of advanced parametric methods for assessing waveform distortions in SGs with dispersed generation. In particular, the standard International Electrotechnical Commission (IEC) method, some parametric methods (such as Prony and the Estimation of Signal Parameters via Rotational Invariance Techniques (ESPRIT)), and some hybrid methods are critically compared on the basis of their accuracy and the computational effort required.
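
    Of the parametric methods mentioned, Prony analysis is compact enough to sketch directly; the example below estimates the frequency, decay rate and amplitude of a synthetic two-component waveform. The model order, sampling rate and test signal are assumptions made for this sketch, and the ESPRIT and IEC grouping procedures are not reproduced.

```python
import numpy as np

# Prony sketch: estimate frequency, decay rate and amplitude of each mode in a short
# waveform record. Model order, sampling rate and the test signal are illustrative.
fs, N, p = 3200.0, 256, 4                      # sampling rate (Hz), samples, model order
t = np.arange(N) / fs
x = (1.0 * np.exp(-5 * t) * np.cos(2 * np.pi * 50 * t) +
     0.2 * np.exp(-20 * t) * np.cos(2 * np.pi * 250 * t))   # fundamental + decaying 5th harmonic

# 1) Linear prediction coefficients from the least-squares Prony equations.
A = np.column_stack([x[p - k - 1:N - k - 1] for k in range(p)])
a = np.linalg.lstsq(A, -x[p:N], rcond=None)[0]

# 2) Roots of the prediction polynomial give the damping and frequency of each mode.
z = np.roots(np.concatenate(([1.0], a)))
freq = np.angle(z) * fs / (2 * np.pi)          # Hz
sig = np.log(np.abs(z)) * fs                   # real part of the continuous-time pole, 1/s

# 3) Complex amplitudes from a Vandermonde least-squares fit.
V = z[None, :] ** np.arange(N)[:, None]
h = np.linalg.lstsq(V, x.astype(complex), rcond=None)[0]

for f_i, s_i, h_i in zip(freq, sig, h):
    if f_i > 0:                                # report each conjugate pair once
        print(f"f = {f_i:6.1f} Hz   sigma = {s_i:7.1f} 1/s   amplitude = {2 * np.abs(h_i):.3f}")
```

    Each conjugate pair of poles recovered this way corresponds to one damped sinusoidal component, which is what makes such parametric methods attractive for short, time-varying distortion records.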

  10. The application of advanced rotor (performance) methods for design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bussel, G.J.W. van [Delft Univ. of Technology, Inst. for Wind Energy, Delft (Netherlands)

    1997-08-01

    The calculation of loads and performance of wind turbine rotors has been a topic of research over the last century. The principles for the calculation of loads on rotor blades of a given geometry, as well as the development of optimally shaped rotor blades, were published in the decades in which significant aircraft development took place. Nowadays advanced computer codes are used for specific problems regarding modern aircraft, and they have occasionally been applied to wind turbine rotors as well. The engineers designing rotor blades for wind turbines still use methods based upon global principles developed at the beginning of the century. The question of what type of methods to expect in a design environment in the near future is addressed here. (EG) 14 refs.

  11. Methods and Systems for Advanced Spaceport Information Management

    Science.gov (United States)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  12. Advances in Photopletysmography Signal Analysis for Biomedical Applications

    Directory of Open Access Journals (Sweden)

    Jermana L. Moraes

    2018-06-01

    Heart Rate Variability (HRV) is an important tool for the analysis of a patient's physiological condition, as well as a method aiding the diagnosis of cardiopathies. Photoplethysmography (PPG) is an optical technique applied in the monitoring of HRV, and its adoption has been growing significantly compared to the most commonly used method in medicine, Electrocardiography (ECG). In this survey, definitions of these techniques are presented, the different types of sensors used are explained, and the methods for the study and analysis of the PPG signal (linear and nonlinear methods) are described. Moreover, the progress and the clinical and practical applicability of the PPG technique in the diagnosis of cardiovascular diseases are evaluated. In addition, the latest technologies utilized in the development of new tools for medical diagnosis are presented, such as the Internet of Things, the Internet of Health Things, genetic algorithms, artificial intelligence and biosensors, which result in personalized advances in e-health and health care. After the study of these technologies, it can be noted that PPG associated with them is an important tool for the diagnosis of some diseases, due to its simplicity, its cost-benefit ratio, the ease of signal acquisition, and especially because it is a non-invasive technique.
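
    As a small illustration of the linear time-domain analysis mentioned above, the sketch below detects pulse peaks in a synthetic PPG-like signal and computes two standard HRV indices (SDNN and RMSSD); the synthetic signal, sampling rate and detection thresholds are assumptions made for the sketch.

```python
import numpy as np
from scipy.signal import find_peaks

# Detect pulse peaks in a synthetic PPG-like signal and compute two time-domain HRV
# indices. Signal shape, sampling rate and detection thresholds are illustrative.
fs = 100.0                                                    # Hz
rng = np.random.default_rng(3)
beat_times = np.cumsum(rng.normal(0.85, 0.05, 70))            # irregular beat times (s)
t = np.arange(0, beat_times[-1], 1 / fs)
ppg = sum(np.exp(-((t - bt) ** 2) / (2 * 0.05 ** 2)) for bt in beat_times)  # pulse waves
ppg += 0.05 * rng.normal(size=t.size)                         # measurement noise

peaks, _ = find_peaks(ppg, height=0.5, distance=int(0.4 * fs))  # beats at least 0.4 s apart
rr = np.diff(peaks) / fs * 1000.0                             # beat-to-beat intervals, ms

sdnn = rr.std(ddof=1)                                         # overall variability
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))                    # short-term variability
print(f"{len(rr)} intervals, SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```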

  13. A Meta-Analysis of Advance-Organizer Studies.

    Science.gov (United States)

    Stone, Carol Leth

    Long-term studies of advance organizers (AOs) were analyzed with Glass's meta-analysis technique. AOs were defined as bridges from the reader's previous knowledge to what is to be learned. The results were compared with predictions from Ausubel's model of assimilative learning. The results of the study indicated that advance organizers were associated…

  14. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
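
    A brute-force Monte Carlo version of the eigenvalue criterion is sketched below for orientation; the paper's fast probability integration and adaptive importance sampling are not reproduced, and the two-degree-of-freedom rotor model with an uncertain cross-coupled stiffness is an illustrative assumption.

```python
import numpy as np

# Brute-force Monte Carlo estimate of the probability of instability for a 2-DOF rotor
# model with an uncertain cross-coupled stiffness q. All parameter values are illustrative,
# and the fast probability integration / importance sampling of the paper is not reproduced.
rng = np.random.default_rng(0)
m, c, k = 1.0, 40.0, 1.0e6                     # mass, direct damping, direct stiffness

def unstable(q):
    M = np.diag([m, m])
    C = np.diag([c, c])
    K = np.array([[k, q], [-q, k]])            # cross-coupled stiffness destabilises the rotor
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, C)]])
    return np.linalg.eigvals(A).real.max() > 0.0   # any eigenvalue in the right half-plane?

n = 5000
q_samples = rng.normal(35e3, 8e3, n)           # uncertain cross-coupling (N/m)
p_instab = np.mean([unstable(q) for q in q_samples])
print("estimated probability of instability:", p_instab)
```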

  15. An advanced human reliability analysis methodology focused on the analysis of cognitive errors

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    The conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing the cognitive errors that occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called 2nd-generation HRA method, including both qualitative analysis and quantitative assessment of cognitive errors, is being developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between the cognitive functions and the performance influencing factors. The application of the proposed method to two emergency operation tasks is presented.

  16. Calculation methods for advanced concept light water reactor lattices

    International Nuclear Information System (INIS)

    Carmona, S.

    1986-01-01

    In the last few years, several advanced concepts for fuel rod lattices have been studied. Improved fuel utilization is one of the major aims in the development of new fuel rod designs and lattice modifications. By these changes, better performance in fuel economics, fuel burnup and material endurance can be achieved within the framework of the well-known basic Light Water Reactor technology. Among the new concepts involved in these studies that have attracted serious attention are lattices consisting of arrays of annular rods, duplex pellet rods or tight multicells. These new designs of fuel rods and lattices present several computational problems. The treatment of resonance-shielded cross sections is a crucial point in the analyses of these advanced concepts. The purpose of this study was to assess adequate approximation methods for calculating, as accurately as possible, resonance shielding for these new lattices. Although detailed and exact computational methods for the evaluation of the resonance shielding in these lattices are possible, they are quite inefficient when used in lattice codes. The computer time and memory required for this kind of computation are too large for it to be used in an acceptable routine manner. In order to overcome these limitations and to make the analyses possible with reasonable use of computer resources, approximation methods are necessary. The usual approximation methods for the resonance energy regions used in routine lattice computer codes cannot adequately handle the evaluation of these new fuel rod lattices. The main contribution of the present work to advanced lattice concepts is the development of an equivalence principle for the calculation of resonance shielding in the annular fuel pellet zone of duplex pellets; the duplex pellet in this treatment consists of two fuel zones with the same absorber isotope in both regions. In the transition from a single duplex rod to an infinite array of this kind of fuel rod, the similarity of the

  17. Analysis of Precision of Activation Analysis Method

    DEFF Research Database (Denmark)

    Heydorn, Kaj; Nørgaard, K.

    1973-01-01

    The precision of an activation-analysis method prescribes the estimation of the precision of a single analytical result. The adequacy of these estimates to account for the observed variation between duplicate results from the analysis of different samples and materials is tested by the statistic T...

  18. Analysis apparatus and method of analysis

    International Nuclear Information System (INIS)

    1976-01-01

    A continuous streaming method developed for the execution of immunoassays is described in this patent. In addition, a suitable apparatus for the method was developed, whereby magnetic particles are automatically employed for the consecutive analysis of a series of liquid samples via the RIA technique.

  19. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  20. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    A practical guide to the methods in general use for the complete analysis of silicate rock material and for the determination of all those elements present in major, minor or trace amounts in silicate...

  1. Shielding analysis of the advanced voloxidation process

    Energy Technology Data Exchange (ETDEWEB)

    Park, Chang Je; Park, J. J.; Lee, J. W.; Shin, J. M.; Park, G. I.; Song, K. C

    2008-09-15

    This report describes how much of a shielding benefit can be obtained by the advanced voloxidation process. The calculation was performed with the MCNPX code, and a simple problem was modeled with a spent fuel source surrounded by a concrete wall. The source terms were estimated with the ORIGEN-ARP code, and the gamma and neutron spectra were also obtained. The required thickness of the concrete wall was estimated before and after the voloxidation process. From the results, the gamma source after the voloxidation process was estimated to be reduced by 67% compared with that before the voloxidation process, owing to the removal of several gamma-emitting elements such as cesium and rubidium. The MCNPX calculations showed that the thickness of an ordinary concrete wall could be reduced by 12% after the voloxidation process, while a heavy concrete wall allowed a 28% reduction in shielding the source term. This is explained by the fact that many gamma-emitting isotopes which are unaffected by the voloxidation process, such as Pu-241, Y-90 and Sr-90, still remain after the advanced voloxidation process.

  2. Advances in power system modelling, control and stability analysis

    CERN Document Server

    Milano, Federico

    2016-01-01

    Advances in Power System Modelling, Control and Stability Analysis captures the variety of new methodologies and technologies that are changing the way modern electric power systems are modelled, simulated and operated.

  3. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error and the construction of a technical basis for human reliability analysis. There are three main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as the OPERA-I/II, TACOM, standard communication protocol, K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  4. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs), as well as to develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error and the construction of a technical basis for human reliability analysis. There are three main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as the OPERA-I/II, TACOM, standard communication protocol, K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants

  5. Uncertainty analysis of LBLOCA for Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Lele, H.G.; Ghosh, A.K.; Kushwaha, H.S.

    2008-01-01

    Uncertainty analysis for the Large Break LOCA (200% Inlet Header Break) of the Advanced Heavy Water Reactor (AHWR) has been carried out. The uncertainty analysis was carried out for the peak cladding temperature (PCT), based on two different methods, i.e. the Wilks method and the response surface technique. The findings of the two methods have also been compared

  6. Human-system safety methods for development of advanced air traffic management systems

    International Nuclear Information System (INIS)

    Nelson, William R.

    1999-01-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems (author) (ml)

  7. Advanced concepts, analysis approaches and criteria for nuclear piping system design

    International Nuclear Information System (INIS)

    Tang, H.T.; Tagart, S.W. Jr.; Tang, Y.K.

    1992-01-01

    Recent research in piping system design and analysis has resulted in advancements in damping values, independent support motion (ISM), the static coefficient method, simplified inelastic methods and ASME code criteria changes. In the support area, passive types of supports such as energy-absorbing devices and gap stoppers have been developed. These advancements provide a basis for improved and cost-effective design of future nuclear piping systems. (author)

  8. Advanced Analysis Cognition: Improving the Cognition of Intelligence Analysis

    Science.gov (United States)

    2013-09-01

    Reviews, 3rd ed., Sage Publications, Thousand Oaks, CA, 1998. 5 Higgins, J.P.T. & Green , S. (eds) Cochrane Handbook for Systematic Reviews of...Structured Analytic Techniques for Intelligence Analysis, CQ Press, Washington, D.C., 2011. Higgins, J.P.T. & Green , S. (eds) Cochrane Handbook...RW 3989) Bleicher, J. Contemporary Hermeneutics: Hermeneutics as Method, Philosophy, and Critique, Routledge & Kegan Paul, London; Boston, 1980

  9. Advances in zymography techniques and patents regarding protease analysis.

    Science.gov (United States)

    Wilkesman, Jeff; Kurz, Liliana

    2012-08-01

    Detection of enzymatic activity in gel electrophoresis, namely zymography, is a technique that has received increasing attention in the last 10 years, judging by the number of articles published. A growing number of enzymes, mainly proteases, are now routinely detected by zymography. Detailed analytical studies are beginning to be published, and new patents have been filed. This article updates the information covered in our last review, condensing the recent publications dealing with the identification of proteolytic enzymes in electrophoretic gel supports and its variations. The new advances of this method are basically focused on two-dimensional zymography and transfer zymography. Though comparatively fewer patents have been published, they largely coincide in the study of matrix metalloproteases. The field is foreseen to be very productive in the area of zymoproteomics, combining electrophoresis and mass spectrometry for the analysis of proteases.

  10. Advanced analysis and design for fire safety of steel structures

    CERN Document Server

    Li, Guoqiang

    2013-01-01

    Advanced Analysis and Design for Fire Safety of Steel Structures systematically presents the latest findings on behaviours of steel structural components in a fire, such as the catenary actions of restrained steel beams, the design methods for restrained steel columns, and the membrane actions of concrete floor slabs with steel decks. Using a systematic description of structural fire safety engineering principles, the authors illustrate the important difference between behaviours of an isolated structural element and the restrained component in a complete structure under fire conditions. The book will be an essential resource for structural engineers who wish to improve their understanding of steel buildings exposed to fires. It is also an ideal textbook for introductory courses in fire safety for master’s degree programs in structural engineering, and is excellent reading material for final-year undergraduate students in civil engineering and fire safety engineering. Furthermore, it successfully bridges th...

  11. Advanced functional network analysis in the geosciences: The pyunicorn package

    Science.gov (United States)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

    Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.
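
    Editor's note: functional (climate) networks of the kind described above are typically built by thresholding a statistical similarity matrix between time series. The sketch below illustrates that idea with plain NumPy rather than the pyunicorn API itself; the synthetic data and the threshold value are assumptions made only for illustration.

```python
# Hedged sketch: building a functional network from time series by thresholding
# a correlation matrix. Plain NumPy is used here; pyunicorn wraps this kind of
# construction and adds many more network measures. Data and threshold are illustrative.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, n_time = 20, 500
# Synthetic "grid point" time series sharing a common signal
common = rng.normal(size=n_time)
series = 0.6 * common + rng.normal(size=(n_nodes, n_time))

corr = np.corrcoef(series)                     # similarity matrix between nodes
np.fill_diagonal(corr, 0.0)
adjacency = (np.abs(corr) >= 0.3).astype(int)  # keep only "strong" statistical links

degree = adjacency.sum(axis=1)                 # simple per-node network measure
density = adjacency.sum() / (n_nodes * (n_nodes - 1))
print("link density:", round(density, 3), "max degree:", int(degree.max()))
```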

  12. Seismic design and analysis methods

    International Nuclear Information System (INIS)

    Varpasuo, P.

    1993-01-01

    Seismic load is, in many areas of the world, the most important loading situation from the point of view of structural strength. Taking this into account, it is understandable that resources have been strongly allocated to seismic analysis during the past ten years. This study concentrates on three areas: (1) random vibrations; (2) soil-structure interaction; and (3) the methods for determining structural response. The solution of random vibration problems is clarified with the aid of applications in this study; for the mathematical treatment and formulations it is deemed sufficient to give the relevant sources. In the soil-structure interaction analysis the focus has been the significance of frequency-dependent impedance functions. As a result, it was found that describing the soil with frequency-dependent impedance functions decreases the structural response, and it is thus always preferred over more conservative analysis types. Of the methods for determining the structural response, the following four were tested: (1) the time history method; (2) the complex frequency-response method; (3) the response spectrum method; and (4) the equivalent static force method. The time history method proved to be the most accurate, and the complex frequency-response method had the widest area of application. (orig.). (14 refs., 35 figs.)
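
    Editor's note: of the four response methods listed in this record, the time history method was found to be the most accurate. A minimal sketch of that approach for a single-degree-of-freedom oscillator, integrated with the average-acceleration Newmark scheme and driven by a synthetic ground motion, is given below; all structural parameters and the excitation are invented for illustration.

```python
# Hedged sketch: time-history response of a SDOF oscillator to base excitation,
# integrated with the average-acceleration Newmark method (gamma=1/2, beta=1/4).
# Mass, stiffness, damping and the ground motion are illustrative assumptions.
import numpy as np

def newmark_sdof(ag, dt, m=1.0, k=400.0, zeta=0.05, gamma=0.5, beta=0.25):
    c = 2.0 * zeta * np.sqrt(k * m)                 # viscous damping coefficient
    u = v = 0.0
    a = (-m * ag[0] - c * v - k * u) / m            # initial relative acceleration
    u_hist = np.zeros(len(ag))
    meff = m + gamma * dt * c + beta * dt**2 * k    # effective "mass" of the scheme
    for i in range(len(ag) - 1):
        u_pred = u + dt * v + dt**2 * (0.5 - beta) * a
        v_pred = v + dt * (1.0 - gamma) * a
        a_new = (-m * ag[i + 1] - c * v_pred - k * u_pred) / meff
        u = u_pred + beta * dt**2 * a_new
        v = v_pred + gamma * dt * a_new
        a = a_new
        u_hist[i + 1] = u
    return u_hist

dt = 0.01
t = np.arange(0, 10, dt)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)  # synthetic quake
u = newmark_sdof(ag, dt)
print(f"peak relative displacement: {np.abs(u).max():.4f} m")
```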

  13. Advanced methods for the study of PWR cores

    International Nuclear Information System (INIS)

    Lambert, M.; Salvatores, St.; Ferrier, A.; Pelet, J.; Nicaise, N.; Pouliquen, J.Y.; Foret, F.; Chauliac, C.; Johner, J.; Cohen, Ch.

    2003-01-01

    This document gathers the transparencies presented at the 6. technical session of the French nuclear energy society (SFEN) in October 2003. The transparencies of the annual meeting are presented in the introductive part: 1 - status of the French nuclear park: nuclear energy results, management of an exceptional climatic situation: the heat wave of summer 2003 and the power generation (J.C. Barral); 2 - status of the research on controlled thermonuclear fusion (J. Johner). Then follows the technical session about the advanced methods for the study of PWR reactor cores: 1 - the evolution approach of study methodologies (M. Lambert, J. Pelet); 2 - the point of view of the nuclear safety authority (D. Brenot); 3 - the improved decoupled methodology for the steam pipe rupture (S. Salvatores, J.Y. Pouliquen); 4 - the MIR method for the pellet-clad interaction (renovated IPG methodology) (E. Baud, C. Royere); 5 - the improved fuel management (IFM) studies for Koeberg (C. Cohen); 6 - principle of the methods of accident study implemented for the European pressurized reactor (EPR) (F. Foret, A. Ferrier); 7 - accident studies with the EPR, steam pipe rupture (N. Nicaise, S. Salvatores); 8 - the co-development platform, a new generation of software tools for the new methodologies (C. Chauliac). (J.S.)

  14. Advancing Family Business Research Through Narrative Analysis

    DEFF Research Database (Denmark)

    Dawson, Alexandra; Hjorth, Daniel

    2012-01-01

    business. This interpretive perspective is appropriate for family business studies, which address multifaceted and complex social constructs that are performed by different actors in multiple contexts. The analysis highlights five key themes centering on leadership style and succession, trust...

  15. Correlation analysis in chemistry: recent advances

    National Research Council Canada - National Science Library

    Shorter, John; Chapman, Norman Bellamy

    1978-01-01

    ..., and applications of LFER to polycyclic arenes, heterocyclic compounds, and olefinic systems. Of particular interest is the extensive critical compilation of substituent constants and the numerous applications of correlation analysis to spectroscopy...

  16. A strategy for evaluating pathway analysis methods.

    Science.gov (United States)

    Yu, Chenggang; Woo, Hyung Jun; Yu, Xueping; Oyama, Tatsuya; Wallqvist, Anders; Reifman, Jaques

    2017-10-13

    Researchers have previously developed a multitude of methods designed to identify biological pathways associated with specific clinical or experimental conditions of interest, with the aim of facilitating biological interpretation of high-throughput data. Before practically applying such pathway analysis (PA) methods, we must first evaluate their performance and reliability, using datasets where the pathways perturbed by the conditions of interest have been well characterized in advance. However, such 'ground truths' (or gold standards) are often unavailable. Furthermore, previous evaluation strategies that have focused on defining 'true answers' are unable to systematically and objectively assess PA methods under a wide range of conditions. In this work, we propose a novel strategy for evaluating PA methods independently of any gold standard, either established or assumed. The strategy involves the use of two mutually complementary metrics, recall and discrimination. Recall measures the consistency between the perturbed pathways identified by applying a particular analysis method to an original large dataset and those identified by the same method applied to a sub-dataset of the original dataset. In contrast, discrimination measures specificity, i.e. the degree to which the perturbed pathways identified by a particular method applied to a dataset from one experiment differ from those identified by the same method applied to a dataset from a different experiment. We used these metrics and 24 datasets to evaluate six widely used PA methods. The results highlighted the common challenge in reliably identifying significant pathways from small datasets. Importantly, we confirmed the effectiveness of our proposed dual-metric strategy by showing that previous comparative studies corroborate the performance evaluations of the six methods obtained by our strategy. Unlike any previously proposed strategy for evaluating the performance of PA methods, our dual-metric strategy does not rely on any ground truth
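
    Editor's note: a minimal sketch of the two metrics as verbally described in this record, treating each method's output simply as a set of significant pathway identifiers. The exact formulas used by the authors may differ, so the definitions below are assumptions based on the abstract, and all pathway sets are invented.

```python
# Hedged sketch: overlap-based "recall" and "discrimination" for pathway analysis
# outputs, following the verbal definitions in the abstract. A PA method's result
# is represented as a set of significant pathway names; all sets are invented.
def recall(full_dataset_pathways, sub_dataset_pathways):
    """Consistency of hits between the full dataset and one of its sub-datasets."""
    if not full_dataset_pathways:
        return 0.0
    return len(full_dataset_pathways & sub_dataset_pathways) / len(full_dataset_pathways)

def discrimination(pathways_exp1, pathways_exp2):
    """Degree to which hits from two unrelated experiments differ (1 = disjoint)."""
    union = pathways_exp1 | pathways_exp2
    if not union:
        return 0.0
    return 1.0 - len(pathways_exp1 & pathways_exp2) / len(union)

full = {"apoptosis", "p53 signaling", "cell cycle", "DNA repair"}
sub = {"apoptosis", "cell cycle", "MAPK signaling"}
other_exp = {"oxidative phosphorylation", "cell cycle"}
print("recall:", recall(full, sub))                  # 0.5 for this toy example
print("discrimination:", discrimination(full, other_exp))
```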

  17. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  18. Applications of advances in nonlinear sensitivity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Werbos, P J

    1982-01-01

    The following paper summarizes the major properties and applications of a collection of algorithms involving differentiation and optimization at minimum cost. The areas of application include the sensitivity analysis of models, new work in statistical or econometric estimation, optimization, artificial intelligence and neuron modelling.

  19. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    Zhang, Yaoxin; Jia, Yafei

    2018-01-01

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluid Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsulas or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain in convex polygon shape in each level can be extracted in an advancing scheme. In this paper, several examples were used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.

  20. Advanced transport systems analysis, modeling, and evaluation of performances

    CERN Document Server

    Janić, Milan

    2014-01-01

    This book provides a systematic analysis, modeling and evaluation of the performance of advanced transport systems. It offers an innovative approach by presenting a multidimensional examination of the performance of advanced transport systems and transport modes, useful for both theoretical and practical purposes. Advanced transport systems for the twenty-first century are characterized by the superiority of one or several of their infrastructural, technical/technological, operational, economic, environmental, social, and policy performances as compared to their conventional counterparts. The advanced transport systems considered include: Bus Rapid Transit (BRT) and Personal Rapid Transit (PRT) systems in urban area(s), electric and fuel cell passenger cars, high speed tilting trains, High Speed Rail (HSR), Trans Rapid Maglev (TRM), Evacuated Tube Transport system (ETT), advanced commercial subsonic and Supersonic Transport Aircraft (STA), conventionally- and Liquid Hydrogen (LH2)-fuelled commercial air trans...

  1. Numerical methods and analysis of multiscale problems

    CERN Document Server

    Madureira, Alexandre L

    2017-01-01

    This book is about numerical modeling of multiscale problems, and introduces several asymptotic analysis and numerical techniques which are necessary for a proper approximation of equations that depend on different physical scales. Aimed at advanced undergraduate and graduate students in mathematics, engineering and physics – or researchers seeking a no-nonsense approach –, it discusses examples in their simplest possible settings, removing mathematical hurdles that might hinder a clear understanding of the methods. The problems considered are given by singular perturbed reaction advection diffusion equations in one and two-dimensional domains, partial differential equations in domains with rough boundaries, and equations with oscillatory coefficients. This work shows how asymptotic analysis can be used to develop and analyze models and numerical methods that are robust and work well for a wide range of parameters.

  2. Advance in research on aerosol deposition simulation methods

    International Nuclear Information System (INIS)

    Liu Keyang; Li Jingsong

    2011-01-01

    A comprehensive analysis of the health effects of inhaled toxic aerosols requires exact data on airway deposition. A knowledge of the effect of inhaled drugs is essential to the optimization of aerosol drug delivery. Sophisticated analytical deposition models can be used for the computation of total, regional and generation-specific deposition efficiencies. Continuously improving computing power seems to allow us to study particle transport and deposition in more and more realistic airway geometries with the help of computational fluid dynamics (CFD) simulation methods. In this article, trends in aerosol deposition models and lung models, and the methods for performing deposition simulations, are reviewed. (authors)

  3. Advanced CMOS Radiation Effects Testing and Analysis

    Science.gov (United States)

    Pellish, J. A.; Marshall, P. W.; Rodbell, K. P.; Gordon, M. S.; LaBel, K. A.; Schwank, J. R.; Dodds, N. A.; Castaneda, C. M.; Berg, M. D.; Kim, H. S.; hide

    2014-01-01

    Presentation at the annual NASA Electronic Parts and Packaging (NEPP) Program Electronic Technology Workshop (ETW). The material includes an update of progress in this NEPP task area over the past year, which includes testing, evaluation, and analysis of radiation effects data on the IBM 32 nm silicon-on-insulator (SOI) complementary metal oxide semiconductor (CMOS) process. The testing was conducted using test vehicles supplied directly by IBM.

  4. Advanced aircraft service life monitoring method via flight-by-flight load spectra

    Science.gov (United States)

    Lee, Hongchul

    This research is an effort to understand the current method and to propose an advanced method for Damage Tolerance Analysis (DTA) for the purpose of monitoring aircraft service life. As one of the tasks in the DTA, the current indirect Individual Aircraft Tracking (IAT) method for the F-16C/D Block 32 does not properly represent changes in flight usage severity affecting structural fatigue life. Therefore, an advanced aircraft service life monitoring method based on flight-by-flight load spectra is proposed and recommended for the IAT program to track consumed fatigue life, as an alternative to the current method based on the crack severity index (CSI) value. Damage Tolerance is one of the aircraft design philosophies that ensure aging aircraft satisfy structural reliability, in terms of fatigue failures, throughout their service periods. The IAT program, one of the most important tasks of DTA, is able to track potential structural crack growth at critical areas in the major airframe structural components of individual aircraft. The F-16C/D aircraft is equipped with a flight data recorder to monitor flight usage and provide the data to support structural load analysis. However, the limited memory of the flight data recorder allows the user to monitor individual aircraft fatigue usage only in terms of the vertical inertia (NzW) data used to calculate the Crack Severity Index (CSI) value, which defines the relative maneuver severity. The current IAT method for the F-16C/D Block 32, based on the CSI value calculated from NzW, is shown to be not accurate enough to monitor individual aircraft fatigue usage due to several problems. The proposed advanced aircraft service life monitoring method based on flight-by-flight load spectra is recommended as an improved method for the F-16C/D Block 32 aircraft. Flight-by-flight load spectra were generated from downloaded Crash Survival Flight Data Recorder (CSFDR) data by calculating loads for each time hack in selected flight data utilizing loads equations. From
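
    Editor's note: as a rough illustration of the kind of damage-tolerance bookkeeping involved in tracking consumed life from flight-by-flight load spectra (and not the author's actual F-16 IAT procedure), the sketch below integrates a Paris-law crack-growth model over a sequence of stress cycles. The material constants, geometry factor, initial crack size and spectrum are all placeholder assumptions.

```python
# Hedged sketch: Paris-law crack growth driven by a flight-by-flight stress spectrum.
# All numerical values (C, m, Y, a0, spectrum) are illustrative assumptions, not F-16 data.
import math

def crack_growth(stress_ranges_mpa, a0_m=0.5e-3, C=1e-11, m=3.0, Y=1.12):
    """Integrate da/dN = C * (dK)^m cycle by cycle.

    stress_ranges_mpa : iterable of stress ranges (MPa), one entry per load cycle
    a0_m              : initial crack depth in metres
    C, m              : Paris-law constants (units consistent with MPa*sqrt(m))
    Y                 : geometry factor in dK = Y * dS * sqrt(pi * a)
    Returns the crack length after the whole spectrum has been applied.
    """
    a = a0_m
    for ds in stress_ranges_mpa:
        dk = Y * ds * math.sqrt(math.pi * a)   # stress intensity factor range
        a += C * dk ** m                       # growth increment for this cycle
    return a

# Example: one synthetic "flight" of cycles repeated 1000 times
flight = [80.0, 120.0, 60.0, 150.0, 90.0]      # MPa, hypothetical spectrum
a = 0.5e-3
for _ in range(1000):
    a = crack_growth(flight, a0_m=a)
print(f"crack length after 1000 flights: {a * 1e3:.3f} mm")
```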

  5. Computational intelligence for big data analysis frontier advances and applications

    CERN Document Server

    Dehuri, Satchidananda; Sanyal, Sugata

    2015-01-01

    The work presented in this book is a combination of theoretical advancements of big data analysis, cloud computing, and their potential applications in scientific computing. The theoretical advancements are supported with illustrative examples and their applications in handling real-life problems. The applications are mostly drawn from real-life situations. The book discusses major issues pertaining to big data analysis using computational intelligence techniques and some issues of cloud computing. An elaborate bibliography is provided at the end of each chapter. The material in this book includes concepts, figures, graphs, and tables to guide researchers in the area of big data analysis and cloud computing.

  6. Holistic safety analysis for advanced nuclear power plants

    International Nuclear Information System (INIS)

    Alvarenga, M.A.B.; Guimaraes, A.C.F.

    1992-01-01

    This paper reviews the basic methodology of safety analysis used in the ANGRA-I and ANGRA-II nuclear power plants, its weaknesses, the problems with public acceptance of the risks, and the future of nuclear energy in Brazil, and recommends a new methodology, HOLISTIC SAFETY ANALYSIS, to be used both in the design and licensing phases for advanced reactors. (author)

  7. Monte Carlo simulations to advance characterisation of landmines by pulsed fast/thermal neutron analysis

    NARCIS (Netherlands)

    Maucec, M.; Rigollet, C.

    The performance of a detection system based on the pulsed fast/thermal neutron analysis technique was assessed using Monte Carlo simulations. The aim was to develop and implement simulation methods, to support and advance the data analysis techniques of the characteristic gamma-ray spectra,

  8. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  9. Molecular Analysis of Microbial Diversity in Advanced Caries

    OpenAIRE

    Chhour, Kim-Ly; Nadkarni, Mangala A.; Byun, Roy; Martin, F. Elizabeth; Jacques, Nicholas A.; Hunter, Neil

    2005-01-01

    Real-time PCR analysis of the total bacterial load in advanced carious lesions has shown that the total load exceeds the number of cultivable bacteria. This suggests that an unresolved complexity exists in bacteria associated with advanced caries. In this report, the profile of the microflora of carious dentine was explored by using DNA extracted from 10 lesions selected on the basis of comparable total microbial load and on the relative abundance of Prevotella spp. Using universal primers fo...

  10. Advanced calculus an introduction to classical analysis

    CERN Document Server

    Brand, Louis

    2006-01-01

    A course in analysis that focuses on the functions of a real variable, this text is geared toward upper-level undergraduate students. It introduces the basic concepts in their simplest setting and illustrates its teachings with numerous examples, practical theorems, and coherent proofs.Starting with the structure of the system of real and complex numbers, the text deals at length with the convergence of sequences and series and explores the functions of a real variable and of several variables. Subsequent chapters offer a brief and self-contained introduction to vectors that covers important a

  11. Basic methods of linear functional analysis

    CERN Document Server

    Pryce, John D

    2011-01-01

    Introduction to the themes of mathematical analysis, geared toward advanced undergraduate and graduate students. Topics include operators, function spaces, Hilbert spaces, and elementary Fourier analysis. Numerous exercises and worked examples.1973 edition.

  12. Advances in Mössbauer data analysis

    Science.gov (United States)

    de Souza, Paulo A.

    1998-08-01

    The whole Mössbauer community has generated a huge amount of data in several fields of human knowledge since the first publication of Rudolf Mössbauer. Interlaboratory measurements of the same substance may result in minor differences in the Mössbauer parameters (MP) of isomer shift, quadrupole splitting and internal magnetic field. Therefore, a conventional data bank of published MP will be of limited help in the identification of substances. A data bank search for exact values is unable to differentiate Mössbauer parameters that lie within the experimental errors (e.g., IS = 0.22 mm/s versus IS = 0.23 mm/s), although physically both values may be considered the same. An artificial neural network (ANN) is able to identify a substance and its crystalline structure from measured MP, and slight variations in the parameters do not represent an obstacle for the ANN identification. A barrier to the popularization of Mössbauer spectroscopy as an analytical technique is the absence of fully automated equipment, since the analysis of a Mössbauer spectrum is normally time-consuming and requires a specialist. In this work, the fitting process of a Mössbauer spectrum was completely automated through the use of genetic algorithms and fuzzy logic. Both software and hardware systems were implemented, resulting in a fully automated Mössbauer data analysis system. The developed system will be presented.
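
    Editor's note: the record describes automating Mössbauer spectrum fitting with genetic algorithms. The sketch below uses SciPy's differential evolution as a stand-in global optimizer to fit a Lorentzian doublet to a synthetic transmission spectrum; the model, parameter bounds and data are illustrative assumptions only.

```python
# Hedged sketch: global fitting of a Mössbauer-like Lorentzian doublet with an
# evolutionary optimizer (scipy's differential evolution stands in for the
# genetic algorithm described in the record). Data and bounds are invented.
import numpy as np
from scipy.optimize import differential_evolution

def lorentzian_doublet(v, baseline, depth, center, splitting, width):
    """Transmission spectrum with two absorption lines of equal depth and width."""
    lines = 0.0
    for c in (center - splitting / 2.0, center + splitting / 2.0):
        lines += depth * (width / 2.0) ** 2 / ((v - c) ** 2 + (width / 2.0) ** 2)
    return baseline - lines

def chi2(params, v, counts):
    return np.sum((counts - lorentzian_doublet(v, *params)) ** 2)

# Synthetic "measured" spectrum (velocity in mm/s, counts arbitrary)
v = np.linspace(-4, 4, 256)
rng = np.random.default_rng(0)
counts = lorentzian_doublet(v, 1e5, 8e3, 0.25, 1.1, 0.30) + rng.normal(0, 100, v.size)

bounds = [(9e4, 1.1e5),   # baseline
          (1e3, 2e4),     # line depth
          (-1.0, 1.0),    # isomer shift (doublet centre)
          (0.2, 3.0),     # quadrupole splitting
          (0.1, 1.0)]     # line width
result = differential_evolution(chi2, bounds, args=(v, counts), seed=1)
print("fitted [baseline, depth, IS, QS, width]:", np.round(result.x, 3))
```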

  13. Advanced quantitative magnetic nondestructive evaluation methods - Theory and experiment

    Science.gov (United States)

    Barton, J. R.; Kusenberger, F. N.; Beissner, R. E.; Matzkanin, G. A.

    1979-01-01

    The paper reviews the scale of fatigue crack phenomena in relation to the size detection capabilities of nondestructive evaluation methods. An assessment of several features of fatigue in relation to the inspection of ball and roller bearings suggested the use of magnetic methods; magnetic domain phenomena, including the interaction of domains and inclusions and the influence of stress and magnetic field on domains, are discussed. Experimental results indicate that simplified calculations can be used to predict many features of these results, although predictions from analytic models based on finite element computer analysis do not agree with respect to certain features. Experimental analyses of rod-type fatigue specimens, which relate magnetic measurements to the crack opening displacement, crack volume and crack depth, should provide methods for improved crack characterization in relation to fracture mechanics and life prediction.

  14. Advanced overlay analysis through design based metrology

    Science.gov (United States)

    Ji, Sunkeun; Yoo, Gyun; Jo, Gyoyeon; Kang, Hyunwoo; Park, Minwoo; Kim, Jungchan; Park, Chanha; Yang, Hyunjo; Yim, Donggyu; Maruyama, Kotaro; Park, Byungjun; Yamamoto, Masahiro

    2015-03-01

    As design rules shrink, overlay has become a critical factor in semiconductor manufacturing. However, the overlay error determined by a conventional measurement with an overlay mark, based on IBO and DBO, often does not represent the physical placement error in the cell area. The mismatch may arise from the size or pitch difference between the overlay mark and the cell pattern. Pattern distortion caused by etching or CMP can also be a source of the mismatch. In 2014, we demonstrated that measuring overlay in the cell area by using a DBM (Design Based Metrology) tool gives more accurate overlay values than the conventional method using an overlay mark. We verified the reproducibility by measuring repeatable patterns in the cell area, and also demonstrated the reliability by comparing with CD-SEM data. Until now we have focused on the overlay mismatch between the overlay mark and the cell area; furthermore, we have examined cell areas having different pattern density and etch loading. Different overlay values appear on cells with diverse patterning environments. In this paper, the overlay error was investigated from cell edge to center. For this experiment, we verified several critical layers in DRAM by using an improved (better resolution and speed) DBM tool, the NGR3520.

  15. Advanced analysis of forest fire clustering

    Science.gov (United States)

    Kanevski, Mikhail; Pereira, Mario; Golay, Jean

    2017-04-01

    Analysis of point pattern clustering is an important topic in spatial statistics and for many applications: biodiversity, epidemiology, natural hazards, geomarketing, etc. There are several fundamental approaches used to quantify spatial data clustering using topological, statistical and fractal measures. In the present research, the recently introduced multi-point Morisita index (mMI) is applied to study the spatial clustering of forest fires in Portugal. The data set consists of more than 30000 fire events covering the time period from 1975 to 2013. The distribution of forest fires is very complex and highly variable in space. mMI is a multi-point extension of the classical two-point Morisita index. In essence, mMI is estimated by covering the region under study by a grid and by computing how many times more likely it is that m points selected at random will be from the same grid cell than it would be in the case of a complete random Poisson process. By changing the number of grid cells (size of the grid cells), mMI characterizes the scaling properties of spatial clustering. From mMI, the data intrinsic dimension (fractal dimension) of the point distribution can be estimated as well. In this study, the mMI of forest fires is compared with the mMI of random patterns (RPs) generated within the validity domain defined as the forest area of Portugal. It turns out that the forest fires are highly clustered inside the validity domain in comparison with the RPs. Moreover, they demonstrate different scaling properties at different spatial scales. The results obtained from the mMI analysis are also compared with those of fractal measures of clustering - box counting and sand box counting approaches. REFERENCES Golay J., Kanevski M., Vega Orozco C., Leuenberger M., 2014: The multipoint Morisita index for the analysis of spatial patterns. Physica A, 406, 191-202. Golay J., Kanevski M. 2015: A new estimator of intrinsic dimension based on the multipoint Morisita index
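
    Editor's note: a minimal sketch of the multi-point Morisita index computation as described above: the study region is covered by Q grid cells and the index compares the observed co-occurrence of m random points in one cell with the expectation under complete spatial randomness. The formula follows the cited reference as understood from the abstract, and the point patterns are synthetic, not the Portuguese forest-fire data.

```python
# Hedged sketch: multi-point Morisita index I_m on a square study area divided
# into Q = q*q grid cells; m = 2 recovers the classical Morisita index.
import numpy as np

def multipoint_morisita(points, q, m=2):
    """points: (N, 2) array of coordinates in [0, 1)^2; q: cells per axis."""
    counts, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                  bins=q, range=[[0, 1], [0, 1]])
    n = counts.ravel()
    N, Q = n.sum(), n.size
    num = sum(np.prod([ni - j for j in range(m)]) for ni in n)   # falling factorials per cell
    den = np.prod([N - j for j in range(m)])
    return Q ** (m - 1) * num / den

rng = np.random.default_rng(1)
random_pts = rng.random((2000, 2))                               # Poisson-like pattern
clustered = 0.05 * rng.standard_normal((2000, 2)) + rng.random((20, 2)).repeat(100, axis=0)
clustered = np.clip(clustered, 0, 0.999)

# I_2 stays near 1 for the random pattern and grows with grid resolution for the clustered one
for q in (5, 10, 20):
    print(q, round(multipoint_morisita(random_pts, q), 2),
             round(multipoint_morisita(clustered, q), 2))
```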

  16. Advances in Airborne and Ground Geophysical Methods for Uranium Exploration

    International Nuclear Information System (INIS)

    2013-01-01

    through the use of effective exploration techniques. Geophysical methods with the capability of mapping surface and subsurface parameters in relation to uranium deposition and accumulation are proving to be vital components of current exploration efforts around the world. There is continuous development and improvement of technical and scientific disciplines using measuring instruments and spatially referenced data processing techniques. Newly designed geophysical instruments and their applications in uranium exploration are contributing to an increased probability of successful discoveries. Dissemination of information on advances in geophysical techniques encourages new strategies and promotes new approaches toward uranium exploration. Meetings and conferences organized by the IAEA, collecting the experience of participating countries, as well as its publications and the International Nuclear Information System, play an important role in the dissemination of knowledge of all aspects of the nuclear fuel cycle. The purpose of this report is to highlight advances in airborne and ground geophysical techniques, succinctly describing modern geophysical methods and demonstrating the application of techniques through examples. The report also provides some basic concepts of radioactivity, nuclear radiation and interaction with matter.

  17. Linking advanced fracture models to structural analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chiesa, Matteo

    2001-07-01

    Shell structures with defects occur in many situations. The defects are usually introduced during the welding process necessary for joining different parts of the structure. Higher utilization of structural materials leads to a need for accurate numerical tools for reliable prediction of structural response. The direct discretization of the cracked shell structure with solid finite elements in order to perform an integrity assessment of the structure in question leads to large problems, and makes such analysis infeasible in structural applications. In this study a link between local material models and structural analysis is outlined. An "ad hoc" element formulation is used in order to connect complex material models to the finite element framework used for structural analysis. An improved elasto-plastic line spring finite element formulation, used in order to take cracks into account, is linked to shell elements which are further linked to beam elements. In this way one obtains a global model of the shell structure that also accounts for local flexibilities and fractures due to defects. An important advantage of such an approach is a direct fracture mechanics assessment, e.g. via the computed J-integral or CTOD. A recent development in this approach is the notion of two-parameter fracture assessment. This means that the crack tip stress tri-axiality (constraint) is employed in determining the corresponding fracture toughness, giving a much more realistic capacity of cracked structures. The present thesis is organized in six research articles and an introductory chapter that reviews important background literature related to this work. Papers I and II address the performance of shell and line spring finite elements as a cost-effective tool for performing the numerical calculation needed to perform a fracture assessment. In Paper II a failure assessment, based on the testing of a constraint-corrected fracture mechanics specimen under tension, is

  18. Value analysis for advanced technology products

    Science.gov (United States)

    Soulliere, Mark

    2011-03-01

    Technology by itself can be wondrous, but buyers of technology factor in the price they have to pay along with performance in their decisions. As a result, the "best" technology may not always win in the marketplace when "good enough" can be had at a lower price. Technology vendors often set pricing by "cost plus margin," or by competitors' offerings. What if the product is new (or has yet to be invented)? Value pricing is a methodology to price products based on the value generated (e.g. money saved) by using one product vs. the next best technical alternative. Value analysis can often clarify what product attributes generate the most value. It can also assist in identifying market forces outside of the control of the technology vendor that also influence pricing. These principles are illustrated with examples.
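
    Editor's note: as a toy illustration of the value-pricing logic described above (price anchored to the economic value created relative to the next best alternative, rather than to cost plus margin), with every number invented:

```python
# Hedged sketch: economic-value-to-the-customer style pricing arithmetic.
# All figures are invented for illustration.
next_best_price = 100.0          # price of the next best technical alternative ($/unit)
extra_savings = 40.0             # money the buyer saves by using the new product ($/unit)
value_share_to_customer = 0.5    # fraction of the extra value left with the buyer

max_justifiable_price = next_best_price + extra_savings
offer_price = next_best_price + (1 - value_share_to_customer) * extra_savings
print(f"value ceiling: ${max_justifiable_price:.0f}, proposed price: ${offer_price:.0f}")
```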

  19. Advanced exergetic analysis of five natural gas liquefaction processes

    International Nuclear Information System (INIS)

    Vatani, Ali; Mehrpooya, Mehdi; Palizdar, Ali

    2014-01-01

    Highlights: • Advanced exergetic analysis was investigated for five LNG processes. • Avoidable/unavoidable and endogenous/exogenous irreversibilities were calculated. • Advanced exergetic analysis identifies the potentials for improving the system. - Abstract: Conventional exergy analysis cannot identify the portion of inefficiencies that can be avoided. Nor can it calculate the portion of exergy destruction produced by the performance of a component alone. In this study, advanced exergetic analysis was performed for five mixed refrigerant LNG processes, and the four parts of the irreversibility (avoidable/unavoidable and endogenous/exogenous) were calculated for the components with high inefficiencies. The results showed that the portion of endogenous exergy destruction in the components is higher than the exogenous one; in fact, interactions among the components do not affect the inefficiencies significantly. The analysis also showed that structural optimization is not useful for decreasing the overall process irreversibilities. In compressors, a high portion of the exergy destruction is avoidable, so they have a high potential for improvement, whereas in multi-stream heat exchangers and air coolers the unavoidable inefficiencies were higher. Advanced exergetic analysis can identify the potentials and strategies to improve the thermodynamic performance of energy-intensive processes
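
    Editor's note: the four-way splitting described in this record amounts to a simple bookkeeping identity: a component's exergy destruction is split once into endogenous/exogenous parts and once into unavoidable/avoidable parts. A toy sketch of this arithmetic, with invented numbers rather than the LNG process data, is given below.

```python
# Hedged sketch: splitting a component's exergy destruction (kW) into
# endogenous/exogenous and unavoidable/avoidable parts. The endogenous value comes
# from a run with the component real and all others ideal; the unavoidable value
# from a run with the component at its best achievable performance. Values invented.
E_D_real        = 120.0   # exergy destruction in the actual plant
E_D_endogenous  = 85.0    # component real, rest of the plant ideal
E_D_unavoidable = 40.0    # component at unavoidable (best achievable) conditions

E_D_exogenous = E_D_real - E_D_endogenous     # caused by interactions with other components
E_D_avoidable = E_D_real - E_D_unavoidable    # the improvement potential

print(f"exogenous: {E_D_exogenous} kW, avoidable: {E_D_avoidable} kW")
```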

  20. Advanced methods of quality control in nuclear fuel fabrication

    International Nuclear Information System (INIS)

    Onoufriev, Vladimir

    2004-01-01

    Under the pressure of the current economic and electricity market situation, utilities implement more demanding fuel utilization schemes, including higher burnups and thermal rates, longer fuel cycles and the use of MOX fuel. Therefore, fuel vendors have recently initiated new R and D programmes aimed at improving fuel quality, design and materials to produce robust and reliable fuel. In the beginning of commercial fuel fabrication, emphasis was given to advancements in Quality Control/Quality Assurance related mainly to the product itself. During recent years, emphasis has shifted to improvements in process control and to the implementation of overall Total Quality Management (TQM) programmes. In the area of fuel quality control, statistical control methods are now widely implemented, replacing 100% inspection. This evolution, some practical examples and IAEA activities are described in the paper. The paper presents major findings of the latest IAEA Technical Meetings (TMs) and training courses in the area, with emphasis on information received at the TM and training course held in 1999 and other recent publications, to provide an overview of new developments in process/quality control, their implementation and the results obtained, including new approaches to QC
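
    Editor's note: the record mentions statistical control methods replacing 100% inspection. As a generic illustration (not taken from the IAEA report), the sketch below computes Shewhart x-bar chart limits for subgrouped pellet-diameter measurements; the target value, process sigma, subgroup size and data are invented.

```python
# Hedged sketch: Shewhart x-bar control chart limits for subgrouped measurements.
# The data, target and standard deviation are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(42)
target, sigma, n = 8.19, 0.01, 5            # nominal pellet diameter (mm), process sigma, subgroup size
subgroups = target + rng.normal(0, sigma, size=(30, n))   # 30 subgroups of n measurements

xbar = subgroups.mean(axis=1)               # subgroup means plotted on the chart
ucl = target + 3 * sigma / np.sqrt(n)       # 3-sigma control limits for the mean
lcl = target - 3 * sigma / np.sqrt(n)

out_of_control = np.where((xbar > ucl) | (xbar < lcl))[0]
print(f"UCL={ucl:.4f} mm, LCL={lcl:.4f} mm, out-of-control subgroups: {out_of_control}")
```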

  1. Underwater Photosynthesis of Submerged Plants – Recent Advances and Methods

    Science.gov (United States)

    Pedersen, Ole; Colmer, Timothy D.; Sand-Jensen, Kaj

    2013-01-01

    We describe the general background and the recent advances in research on underwater photosynthesis of leaf segments, whole communities, and plant dominated aquatic ecosystems and present contemporary methods tailor made to quantify photosynthesis and carbon fixation under water. The majority of studies of aquatic photosynthesis have been carried out with detached leaves or thalli and this selectiveness influences the perception of the regulation of aquatic photosynthesis. We thus recommend assessing the influence of inorganic carbon and temperature on natural aquatic communities of variable density in addition to studying detached leaves in the scenarios of rising CO2 and temperature. Moreover, a growing number of researchers are interested in tolerance of terrestrial plants during flooding as torrential rains sometimes result in overland floods that inundate terrestrial plants. We propose to undertake studies to elucidate the importance of leaf acclimation of terrestrial plants to facilitate gas exchange and light utilization under water as these acclimations influence underwater photosynthesis as well as internal aeration of plant tissues during submergence. PMID:23734154

  2. Striking against bioterrorism with advanced proteomics and reference methods.

    Science.gov (United States)

    Armengaud, Jean

    2017-01-01

    The intentional use by terrorists of biological toxins as weapons has been of great concern for many years. Among the numerous toxins produced by plants, animals, algae, fungi, and bacteria, ricin is one of the most scrutinized by the media because it has already been used in biocrimes and acts of bioterrorism. Improving the analytical toolbox of national authorities to monitor these potential bioweapons all at once is of the utmost interest. MS/MS allows their absolute quantitation and exhibits advantageous sensitivity, discriminative power, multiplexing possibilities, and speed. In this issue of Proteomics, Gilquin et al. (Proteomics 2017, 17, 1600357) present a robust multiplex assay to quantify a set of eight toxins in the presence of a complex food matrix. This MS/MS reference method is based on scheduled SRM and high-quality standards consisting of isotopically labeled versions of these toxins. Their results demonstrate robust reliability based on rather loose scheduling of SRM transitions and good sensitivity for the eight toxins, lower than their oral median lethal doses. In the face of an increased threat from terrorism, relevant reference assays based on advanced proteomics and high-quality companion toxin standards are reliable and firm answers. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. CFD Analysis for Advanced Integrated Head Assembly

    Energy Technology Data Exchange (ETDEWEB)

    Jo, Won Ho; Kang, Tae Kyo; Cho, Yeon Ho; Kim, Hyun Min [KEPCO Engineering and Construction Co., Daejeon (Korea, Republic of)

    2016-10-15

    The Integrated Head Assembly (IHA) is permanently installed on the reactor vessel closure head during normal plant operation and refueling operation. It consists of a number of systems and components such as the head lifting system, seismic support system, Control Element Drive Mechanism (CEDM) cooling system, cable support system and cooling shroud assemblies. Based on operating experience with the IHA, the need arose for design changes to the current APR1400 IHA to improve the seismic resistance and to accommodate convenient maintenance. In this paper, the effects of the design changes were rigorously studied for various sizes of the inlet openings to assure proper cooling of the CEDMs. In addition, the system pressure differentials and the required flow rate for the CEDM cooling fan were analyzed for various operating conditions in order to determine the capacity of the fan. As a part of the design process of the AIHA, the number of air inlets and baffle regions is reduced by simplifying the design of the APR1400 IHA. The design change of the baffle regions has been made such that the maximum possible space is occupied inside the IHA cooling shroud shell while avoiding interference with the CEDMs. Therefore, only the air inlet opening was studied for the design change to supply a sufficient cooling air flow for each CEDM. The size and location of the air inlets in the middle cooling shroud assembly were determined by the CFD analyses of the AIHA, and case CFD analyses were performed depending on the ambient air temperature and fan operating conditions. The size of the air inlet openings is increased in comparison with the initial AIHA design, and it is confirmed that the cooling air flow rate for each CEDM meets the design requirement of 800 SCFM ± 10% with the increased air inlets. In the initial analysis, the fan outlet flow rate was assumed to be 48.3 lbm/s, but the results revealed that a lower outflow rate at the fan is enough to meet the design requirement

  4. Advanced high conversion PWR: preliminary analysis

    International Nuclear Information System (INIS)

    Golfier, H.; Bellanger, V.; Bergeron, A.; Dolci, F.; Gastaldi, B.; Koberl, O.; Mignot, G.; Thevenot, C.

    2007-01-01

    In this paper, physical aspects of an HCPWR (High Conversion Light Water Reactor), an innovative PWR fuelled with mixed oxide and having a higher conversion ratio due to a lower moderation ratio, are discussed. Moderation ratios lower than unity are considered, which has led to low-moderation PWR fuel assembly designs. The objectives of this parametric study are to define a feasibility area with regard to the following neutronic aspects: moderation ratio, Pu loading, reactor spectrum, irradiation time, and neutronic coefficients. Important thermohydraulic parameters are the pressure drop, the critical heat flux, the maximum temperature in the fuel rod and the pumping power. The thermohydraulic analysis shows that a range of moderation ratios from 0.8 to 1.2 is technically possible. A compromise between improved fuel utilization and research and development effort has been found for a moderation ratio of about 1. The parametric study shows that there are two ranges of interest for the moderation ratio: (1) moderation ratios between 0.8 and 1.2 with reduced fissile heights (> 3 m), for which both hexagonal and square fuel assembly arrangements are possible; and (2) moderation ratios between 0.6 and 0.7 with a modification of the reactor operating conditions (reduction of the primary flow and of the thermal power), for which the fuel rods could be arranged inside a hexagonal fuel rod assembly. (A.C.)

  5. Advances in carbonate exploration and reservoir analysis

    Science.gov (United States)

    Garland, J.; Neilson, J.; Laubach, S.E.; Whidden, Katherine J.

    2012-01-01

    The development of innovative techniques and concepts, and the emergence of new plays in carbonate rocks are creating a resurgence of oil and gas discoveries worldwide. The maturity of a basin and the application of exploration concepts have a fundamental influence on exploration strategies. Exploration success often occurs in underexplored basins by applying existing established geological concepts. This approach is commonly undertaken when new basins ‘open up’ owing to previous political upheavals. The strategy of using new techniques in a proven mature area is particularly appropriate when dealing with unconventional resources (heavy oil, bitumen, stranded gas), while the application of new play concepts (such as lacustrine carbonates) to new areas (i.e. ultra-deep South Atlantic basins) epitomizes frontier exploration. Many low-matrix-porosity hydrocarbon reservoirs are productive because permeability is controlled by fractures and faults. Understanding basic fracture properties is critical in reducing geological risk and therefore reducing well costs and increasing well recovery. The advent of resource plays in carbonate rocks, and the long-standing recognition of naturally fractured carbonate reservoirs means that new fracture and fault analysis and prediction techniques and concepts are essential.

  6. Application of the Advanced Distillation Curve Method to Fuels for Advanced Combustion Engine Gasolines

    KAUST Repository

    Burger, Jessica L.

    2015-07-16

    © This article not subject to U.S. Copyright. Published 2015 by the American Chemical Society. Incremental but fundamental changes are currently being made to fuel composition and combustion strategies to diversify energy feedstocks, decrease pollution, and increase engine efficiency. The increase in parameter space (by having many variables in play simultaneously) makes it difficult at best to propose strategic changes to engine and fuel design by use of conventional build-and-test methodology. To make changes in the most time- and cost-effective manner, it is imperative that new computational tools and surrogate fuels are developed. Currently, sets of fuels are being characterized by industry groups, such as the Coordinating Research Council (CRC) and other entities, so that researchers in different laboratories have access to fuels with consistent properties. In this work, six gasolines (FACE A, C, F, G, I, and J) are characterized by the advanced distillation curve (ADC) method to determine the composition and enthalpy of combustion in various distillate volume fractions. Tracking the composition and enthalpy of distillate fractions provides valuable information for determining structure property relationships, and moreover, it provides the basis for the development of equations of state that can describe the thermodynamic properties of these complex mixtures and lead to development of surrogate fuels composed of major hydrocarbon classes found in target fuels.

  7. SWOT ANALYSIS ON SAMPLING METHOD

    Directory of Open Access Journals (Sweden)

    CHIS ANCA OANA

    2014-07-01

    Full Text Available Audit sampling involves the application of audit procedures to less than 100% of the items within an account balance or class of transactions. Our article aims to study audit sampling in the audit of financial statements. As an audit technique that is widely used, in both its statistical and non-statistical forms, the method is very important for auditors. It should be applied correctly to give a fair view of the financial statements and to satisfy the needs of all financial users. In order to be applied correctly, the method must be understood by all its users and mainly by auditors. Otherwise, the risk of applying it incorrectly would lead to loss of reputation and discredit, litigation and even prison. Since there is no unitary practice and methodology for applying the technique, the risk of applying it incorrectly is quite high. The SWOT analysis is a technique that shows the advantages, disadvantages, threats and opportunities. We applied SWOT analysis to the study of the sampling method from the perspective of three players: the audit company, the audited entity and the users of financial statements. The study shows that by applying the sampling method the audit company and the audited entity both save time, effort and money. The disadvantages of the method are the difficulty of applying it and of understanding its insights. Being widely used as an audit method and being a factor in a correct audit opinion, the sampling method’s advantages, disadvantages, threats and opportunities must be understood by auditors.

  8. Curing Characterisation of Spruce Tannin-based Foams using the Advanced Isoconversional Method

    Directory of Open Access Journals (Sweden)

    Matjaž Čop

    2014-06-01

    Full Text Available The curing kinetics of foam prepared from the tannin of spruce tree bark was investigated using differential scanning calorimetry (DSC and the advanced isoconversional method. An analysis of the formulations with differing amounts of components (furfuryl alcohol, glycerol, tannin, and a catalyst showed that curing was delayed with increasing proportions of glycerol or tannins. An optimum amount of the catalyst constituent was also found during the study. The curing of the foam system was accelerated with increasing temperatures. Finally, the advanced isoconversional method, based on the model-free kinetic algorithm developed by Vyazovkin, appeared to be an appropriate model for the characterisation of the curing kinetics of tannin-based foams.

  9. Advanced methods for image registration applied to JET videos

    Energy Technology Data Exchange (ETDEWEB)

    Craciunescu, Teddy, E-mail: teddy.craciunescu@jet.uk [EURATOM-MEdC Association, NILPRP, Bucharest (Romania); Murari, Andrea [Consorzio RFX, Associazione EURATOM-ENEA per la Fusione, Padova (Italy); Gelfusa, Michela [Associazione EURATOM-ENEA – University of Rome “Tor Vergata”, Roma (Italy); Tiseanu, Ion; Zoita, Vasile [EURATOM-MEdC Association, NILPRP, Bucharest (Romania); Arnoux, Gilles [EURATOM/CCFE Fusion Association, Culham Science Centre, Abingdon, Oxon (United Kingdom)

    2015-10-15

    Graphical abstract: - Highlights: • Development of an image registration method for JET IR and fast visible cameras. • Method based on SIFT descriptors and coherent point drift points set registration technique. • Method able to deal with extremely noisy images and very low luminosity images. • Computation time compatible with the inter-shot analysis. - Abstract: The last years have witnessed a significant increase in the use of digital cameras on JET. They are routinely applied for imaging in the IR and visible spectral regions. One of the main technical difficulties in interpreting the data of camera based diagnostics is the presence of movements of the field of view. Small movements occur due to machine shaking during normal pulses while large ones may arise during disruptions. Some cameras show a correlation of image movement with change of magnetic field strength. For deriving unaltered information from the videos and for allowing correct interpretation an image registration method, based on highly distinctive scale invariant feature transform (SIFT) descriptors and on the coherent point drift (CPD) points set registration technique, has been developed. The algorithm incorporates a complex procedure for rejecting outliers. The method has been applied for vibrations correction to videos collected by the JET wide angle infrared camera and for the correction of spurious rotations in the case of the JET fast visible camera (which is equipped with an image intensifier). The method has proved to be able to deal with the images provided by this camera frequently characterized by low contrast and a high level of blurring and noise.
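
    For orientation, here is a hedged, simplified sketch of the feature-based registration idea: SIFT keypoint matching followed by a robust transform estimate. A RANSAC-estimated affine transform is used here as a convenient stand-in for the coherent point drift step used in the paper, and the file names are hypothetical.

```python
import cv2
import numpy as np

# Simplified registration sketch: SIFT keypoints + ratio-test matching, then a
# robustly estimated affine transform (RANSAC) as a stand-in for the CPD step.
def register(reference_path, moving_path):
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)
    mov = cv2.imread(moving_path, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    kp_ref, des_ref = sift.detectAndCompute(ref, None)
    kp_mov, des_mov = sift.detectAndCompute(mov, None)

    # Lowe ratio test to reject ambiguous matches (helps with noisy frames).
    matcher = cv2.BFMatcher()
    matches = [m for m, n in matcher.knnMatch(des_mov, des_ref, k=2)
               if m.distance < 0.7 * n.distance]

    src = np.float32([kp_mov[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_ref[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

    # Robust affine estimate; outlying matches (e.g. moving plasma features) are rejected.
    M, inliers = cv2.estimateAffinePartial2D(src, dst, method=cv2.RANSAC)
    aligned = cv2.warpAffine(mov, M, (ref.shape[1], ref.shape[0]))
    return aligned, M

# Example call (hypothetical file names):
# aligned_frame, transform = register("frame_reference.png", "frame_shifted.png")
```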

  10. A CTSA Agenda to Advance Methods for Comparative Effectiveness Research

    Science.gov (United States)

    Helfand, Mark; Tunis, Sean; Whitlock, Evelyn P.; Pauker, Stephen G.; Basu, Anirban; Chilingerian, Jon; Harrell Jr., Frank E.; Meltzer, David O.; Montori, Victor M.; Shepard, Donald S.; Kent, David M.

    2011-01-01

    Abstract Clinical research needs to be more useful to patients, clinicians, and other decision makers. To meet this need, more research should focus on patient‐centered outcomes, compare viable alternatives, and be responsive to individual patients’ preferences, needs, pathobiology, settings, and values. These features, which make comparative effectiveness research (CER) fundamentally patient‐centered, challenge researchers to adopt or develop methods that improve the timeliness, relevance, and practical application of clinical studies. In this paper, we describe 10 priority areas that address 3 critical needs for research on patient‐centered outcomes (PCOR): (1) developing and testing trustworthy methods to identify and prioritize important questions for research; (2) improving the design, conduct, and analysis of clinical research studies; and (3) linking the process and outcomes of actual practice to priorities for research on patient‐centered outcomes. We argue that the National Institutes of Health, through its clinical and translational research program, should accelerate the development and refinement of methods for CER by linking a program of methods research to the broader portfolio of large, prospective clinical and health system studies it supports. Insights generated by this work should be of enormous value to PCORI and to the broad range of organizations that will be funding and implementing CER. Clin Trans Sci 2011; Volume 4: 188–198 PMID:21707950

  11. Advanced methods in evaluation of thermal power systems effectiveness

    International Nuclear Information System (INIS)

    Barnak, N.; Jakubcek, P.; Zadrazil, J.

    1993-01-01

    A universal method for evaluating process irreversibility in thermodynamic systems, based on the exergetic approach, is elaborated in this article. The method uses the basic property of exergy as an extensive state parameter - additivity. Division of the system into several hierarchic levels is considered and the relation between the exergetic characteristics of the system and of its parts is defined. System structure coefficients are expressed in a general form and analysed. Criteria for technical and economical optimization of the system using the expressed structure coefficients are defined. Common approaches for applying the method to nuclear power plant secondary circuits are defined, and the method is used for the analysis of the WWER-1000 nuclear power plant secondary circuit. For this purpose, individual exergetic characteristics of the secondary circuit and its parts are expressed and some of the secondary circuit parameters are optimized. Proposals for practical realisation of the results are stated in the conclusions of the article, mainly in the area of computerized evaluation of the technical and economical parameters of the nuclear power plant and the effectiveness of its operation
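
    As a minimal, hedged illustration of an exergetic evaluation of a single secondary-circuit component (generic textbook relations, not the article's structure coefficients), the specific flow exergy e = (h - h0) - T0(s - s0) can be used to form a second-law efficiency for a turbine stage; the state-point values below are approximate and purely illustrative.

```python
# Hedged sketch: exergetic (second-law) efficiency of a steam turbine stage.
# State values are approximate, illustrative steam-table numbers.
T0 = 298.15            # dead-state temperature [K]
h0, s0 = 104.9, 0.367  # liquid water near the dead state [kJ/kg, kJ/(kg*K)]

def flow_exergy(h, s):
    """Specific flow exergy relative to the dead state [kJ/kg]."""
    return (h - h0) - T0 * (s - s0)

h_in, s_in = 3330.0, 6.94    # ~4 MPa, 450 C (approximate)
h_out, s_out = 2580.0, 7.10  # ~0.1 MPa, wet steam (approximate)

work = h_in - h_out                                          # actual specific work [kJ/kg]
exergy_drop = flow_exergy(h_in, s_in) - flow_exergy(h_out, s_out)
print(f"second-law efficiency = {work / exergy_drop:.3f}")
```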

  12. Linking advanced biofuels policies with stakeholder interests: A method building on Quality Function Deployment

    International Nuclear Information System (INIS)

    Schillo, R. Sandra; Isabelle, Diane A.; Shakiba, Abtin

    2017-01-01

    The field of renewable energy policy is inherently complex due to the long-term impacts of its policies, the broad range of potential stakeholders, the intricacy of scientific, engineering and technological developments, and the interplay of complex policy mixes that may result in unintended consequences. Quality Function Deployment (QFD) provides a systematic consideration of all relevant stakeholders, a rigorous analysis of the needs of stakeholders, and a prioritization of design features based on stakeholders needs. We build on QFD combined with Analytical Hierarchy Process (AHP) to develop a novel method applied to the area of advanced biofuel policies. This Multi-Stakeholder Policy QFD (MSP QFD) provides a systematic approach to capture the voice of the stakeholders and align it with the broad range of potential advanced biofuels policies. To account for the policy environment, the MSP QFD utilizes a novel approach to stakeholder importance weights. This MSP QFD adds to the literature as it permits the analysis of the broad range of relevant national policies with regards to the development of advanced biofuels, as compared to more narrowly focused typical QFD applications. It also allows policy developers to gain additional insights into the perceived impacts of policies, as well as international comparisons. - Highlights: • Advanced biofuels are mostly still in research and early commercialization stages. • Government policies are expected to support biofuels stakeholders in market entry. • A Multi-Stakeholder Policy QFD (MSP QFD) links biofuels policies with stakeholders. • MSP QFD employs novel stakeholder weights method. • The case of advanced biofuels in Canada shows comparative importance of policies.
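
    Since the MSP QFD builds on AHP for stakeholder importance weights, a small sketch of how AHP priority weights can be derived from a pairwise comparison matrix may help; this is a generic AHP computation, not the authors' code, and the stakeholder groups are hypothetical.

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty scale) for three stakeholder groups:
# producers, policy makers, end users. A[i, j] = importance of i relative to j.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority vector: principal eigenvector of A, normalized to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency ratio (CR) guards against contradictory judgements; RI = 0.58 for n = 3.
lam_max = eigvals.real[k]
ci = (lam_max - len(A)) / (len(A) - 1)
cr = ci / 0.58
print("stakeholder weights:", weights.round(3), " CR:", round(cr, 3))
```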

  13. Advanced Methods for Direct Ink Write Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Compel, W. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lewicki, J. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2018-01-24

    Lawrence Livermore National Laboratory is one of the world’s premier labs for research and development of additive manufacturing processes. Of these many processes, direct ink write (DIW) is arguably one of the most relevant for the manufacture of architected polymeric materials, components and hardware. However, a bottleneck in this pipeline that has largely been ignored to date is the lack of advanced software implementation with respect to toolpath execution. A convenient, automated method to design and produce complex parts that is user-friendly and enables the realization of next-generation designs and structures has yet to be developed. For a material to be suitable as a DIW ink it must possess the appropriate rheological properties for this process. Most importantly, the material must exhibit shear-thinning in order to extrude through a print head and must show a rapid recovery of its static shear modulus. This makes it possible for the extrudate to be self-supporting upon exiting the print head. While this and other prerequisites narrow the scope of ‘off-the-shelf’ printable materials directly amenable to DIW, the process still tolerates a wide range of potential feedstock materials. These include metallic alloys, inorganic solvent-borne dispersions, polymeric melts, filler-stabilized monomer compositions, pre-elastomeric feedstocks and thermoset resins, each of which requires custom print conditions tailored to the individual ink. As such, an ink perfectly suited for DIW may be prematurely determined to be undesirable for the process if printed under the wrong conditions. Defining appropriate print conditions such as extrusion rate, layer height, and maximum bridge length is a vital first step in validating an ink’s DIW capability.
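
    As a hedged illustration of the shear-thinning requirement mentioned above (not taken from the report), apparent viscosity data can be fitted to the Ostwald-de Waele power-law model, eta = K * (shear rate)^(n-1); a flow index n < 1 indicates shear-thinning. The data points below are invented.

```python
import numpy as np

# Invented rheometer data: shear rate (1/s) and apparent viscosity (Pa*s).
shear_rate = np.array([0.1, 1.0, 10.0, 100.0, 1000.0])
viscosity  = np.array([950.0, 210.0, 48.0, 11.0, 2.5])

# Ostwald-de Waele model: eta = K * gamma_dot**(n - 1); linear in log-log space.
slope, intercept = np.polyfit(np.log(shear_rate), np.log(viscosity), 1)
n = slope + 1.0          # flow (power-law) index
K = np.exp(intercept)    # consistency index (Pa*s^n)

print(f"n = {n:.2f}, K = {K:.1f}" + ("  -> shear-thinning" if n < 1 else ""))
```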

  14. Flows method in global analysis

    International Nuclear Information System (INIS)

    Duong Minh Duc.

    1994-12-01

    We study the gradient flows method for W^{r,p}(M,N), where M and N are Riemannian manifolds and r may be less than m/p. We localize some global analysis problems by constructing gradient flows which only change the value of any u in W^{r,p}(M,N) in a local chart of M. (author). 24 refs

  15. Advanced stress analysis of PWR containments in the region of nozzles

    International Nuclear Information System (INIS)

    Schauer, G.

    1977-01-01

    As an example of the stress analysis of a nozzle in a PWR steel containment, an advanced stress analysis of a personnel lock is presented. In contrast to the calculations by means of numerical shell programs that were usual until now, this advanced stress analysis was carried out with the finite element method. Owing to their underlying theory, the shell programs compute mathematically exact results, but the notch stresses at the intersection of two shells cannot be analyzed well. A further disadvantage is that there is a great distance between the real critical region near the intersection line and the calculation point, which lies on the neutral axis of the shell

  16. The status of nuclear fuel cycle system analysis for the development of advanced nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kim, Seong Ki; Lee, Hyo Jik; Chang, Hong Rae; Kwon, Eun Ha; Lee, Yoon Hee; Gao, Fanxing [KAERI, Daejeon (Korea, Republic of)

    2011-11-15

    System analysis has been used with different systems and objectives in various fields. In the nuclear field, it can be applied to the whole chain from uranium mining to spent fuel reprocessing or disposal, which is called the nuclear fuel cycle. The analysis of the nuclear fuel cycle can provide a guideline for the development of an advanced fuel cycle by integrating and evaluating the technologies. For this purpose, an objective approach is essential, and modeling and simulation can be useful. In this report, several methods that are applicable to the development of an advanced nuclear fuel cycle, such as TRL, simulation and trade-off analysis, are explained with case studies

  17. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal, to build new algorithms or leverage existing ones that learn from training data and give accurate predictions or find patterns, particularly with new and unseen similar data, fits Non-Destructive Evaluation perfectly. The advantages of ML in NDE are obvious in such tasks as pattern recognition in acoustic signals or the automated processing of images from X-ray, ultrasonic or optical methods. Fraunhofer IKTS is using machine learning algorithms in acoustic signal analysis, and the approach has been applied to a variety of tasks in quality assessment. The principal approach is based on acoustic signal processing with a primary and a secondary analysis step, followed by a cognitive system that creates model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify the data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can later be recognized and classified automatically by the previously trained algorithms. Recently the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). The algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in the image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of the components are required. The automated testing can then be done by the machine. By integrating the test data of many components along the value chain, further optimization, including lifetime and durability assessment, becomes possible.
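
    To make the processing chain concrete, here is a hedged sketch (not the IKTS software) of the pattern the abstract describes: extract simple spectral features from acoustic signals, compress them with unsupervised PCA, then train a supervised classifier; the signals and labels below are synthetic.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def spectral_features(signal, n_bands=16):
    """Primary/secondary analysis stand-in: band energies of the magnitude spectrum."""
    spec = np.abs(np.fft.rfft(signal))
    bands = np.array_split(spec, n_bands)
    return np.array([b.mean() for b in bands])

# Synthetic "good" vs "defective" acoustic signals (different dominant frequency).
def make_signal(defective):
    t = np.linspace(0, 1, 2048)
    f = 180.0 if defective else 120.0
    return np.sin(2 * np.pi * f * t) + 0.5 * rng.standard_normal(t.size)

X = np.array([spectral_features(make_signal(d)) for d in ([0] * 60 + [1] * 60)])
y = np.array([0] * 60 + [1] * 60)

# Unsupervised simplification (PCA) followed by a supervised classifier.
model = make_pipeline(StandardScaler(), PCA(n_components=4), SVC())
model.fit(X[::2], y[::2])                 # train on every other sample
print("held-out accuracy:", model.score(X[1::2], y[1::2]))
```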

  18. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of the weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the third issue, presenting the introduction of continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)
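
    Since the issue reviews spectral methods in fluid dynamics, a minimal sketch of the idea (not from the article) is the pseudo-spectral solution of the 1D diffusion equation u_t = nu * u_xx on a periodic domain, where spatial derivatives become multiplications in Fourier space; the grid size and diffusivity below are arbitrary.

```python
import numpy as np

# Pseudo-spectral solution of u_t = nu * u_xx on [0, 2*pi) with periodic BCs.
N, nu, dt, steps = 128, 0.1, 1e-3, 5000
x = 2 * np.pi * np.arange(N) / N
k = np.fft.fftfreq(N, d=1.0 / N)          # integer wavenumbers
u = np.exp(-((x - np.pi) ** 2))           # initial Gaussian bump

u_hat = np.fft.fft(u)
for _ in range(steps):
    # Each Fourier mode decays independently; integrate exactly over one time step.
    u_hat *= np.exp(-nu * k**2 * dt)

u_final = np.real(np.fft.ifft(u_hat))
print("peak value decayed from", u.max().round(3), "to", u_final.max().round(3))
```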

  19. A Meta-Analysis of Advanced Organizer Studies.

    Science.gov (United States)

    Stone, Carol Leth

    1983-01-01

    Twenty-nine reports yielding 112 studies were analyzed with Glass's meta-analysis technique, and results were compared with predictions from Ausubel's model of assimilative learning. Overall, advance organizers were shown to be associated with increased learning and retention of material to be learned. (Author)

  20. Homotopy analysis method for neutron diffusion calculations

    International Nuclear Information System (INIS)

    Cavdar, S.

    2009-01-01

    The Homotopy Analysis Method (HAM), proposed in 1992 by Shi Jun Liao and developed since then, is based on a fundamental concept in differential geometry and topology, the homotopy. It has proved useful for problems involving algebraic, linear/non-linear, ordinary/partial differential and differential-integral equations, being an analytic, recursive method that provides a series-sum solution. It has the advantage of offering a certain freedom in the choice of its arguments, such as the initial guess, the auxiliary linear operator and the convergence-control parameter, and it allows us to effectively control the rate and region of convergence of the series solution. In this work, HAM is applied to the fixed-source neutron diffusion equation. This is part of our research motivated by the question of whether there exist methods for solving the neutron diffusion equation that yield straightforward expressions yet provide solutions of reasonable accuracy, so that we could avoid, on the one hand, analytic methods that are widely used but either fail to solve the problem or provide solutions through many intricate expressions likely to contain mistakes, and, on the other hand, numerical methods that require powerful computational resources and advanced programming skills because of their very nature or intricate mathematical foundations. A Fourier basis is employed for expressing the initial guess, owing to the structure of the problem and its boundary conditions. We present the results in comparison with other widely used methods, Adomian decomposition and separation of variables.
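
    For reference, the core construction of HAM (standard in the literature, not reproduced from this paper) is the zeroth-order deformation equation, in which an embedding parameter q deforms an initial guess u_0 into the solution u and a convergence-control parameter hbar tunes the series convergence; a sketch in LaTeX:

```latex
% Zeroth-order deformation equation of the homotopy analysis method (generic form).
% L  : auxiliary linear operator        N : nonlinear operator of the problem N[u] = 0
% u0 : initial guess                    hbar : convergence-control parameter
(1 - q)\,\mathcal{L}\bigl[\phi(x;q) - u_0(x)\bigr]
  = q\,\hbar\,\mathcal{N}\bigl[\phi(x;q)\bigr], \qquad q \in [0,1],
\qquad
u(x) = \phi(x;1) = u_0(x) + \sum_{m=1}^{\infty} u_m(x),
\quad u_m(x) = \frac{1}{m!}\,
  \frac{\partial^m \phi(x;q)}{\partial q^m}\Big|_{q=0}.
```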

  1. Recent Advances in Multidisciplinary Analysis and Optimization, part 3

    Science.gov (United States)

    Barthelemy, Jean-Francois M. (Editor)

    1989-01-01

    This three-part document contains a collection of technical papers presented at the Second NASA/Air Force Symposium on Recent Advances in Multidisciplinary Analysis and Optimization, held September 28-30, 1988 in Hampton, Virginia. The topics covered include: aircraft design, aeroelastic tailoring, control of aeroelastic structures, dynamics and control of flexible structures, structural design, design of large engineering systems, application of artificial intelligence, shape optimization, software development and implementation, and sensitivity analysis.

  2. Preliminary study of clinical staging of moderately advanced and advanced thoracic esophageal carcinoma treated by non-surgical methods

    International Nuclear Information System (INIS)

    Zhu Shuchai; Li Ren; Li Juan; Qiu Rong; Han Chun; Wan Jun

    2004-01-01

    Objective: To explore the clinical staging of moderately advanced and advanced thoracic esophageal carcinoma by evaluating the prognosis, and to provide criteria for individualized treatment. Methods: The authors retrospectively analyzed 500 patients with moderately advanced and advanced thoracic esophageal carcinoma treated by radiotherapy alone. According to the primary lesion length on barium meal X-ray films and the invasion range and relation between the lesion and the surrounding organs on CT scans, the disease was classified by a 6-stage method and a 4-stage method. With the primary lesion divided into T1, T2a, T2b, T3a, T3b and T4 and incorporating the locoregional lymph node metastasis, a 6-stage system (I, IIa, IIb, IIIa, IIIb and IV) was obtained. The results of this system were compared with those of the 4-stage system, and the following data were finally arrived at. Results: Among the 500 cases, there were 23 T1, 111 T2a, 157 T2b, 84 T3a, 82 T3b and 43 T4 lesions. The survival rates of these six categories showed significant differences (χ² = 63.32, χ² = 56.29, χ² = 94.29 and χ² = 83.48, respectively; P < 0.05). Conclusions: Both the 6-stage and 4-stage systems are suitable for predicting the prognosis of moderately advanced and advanced esophageal carcinoma treated by radiotherapy alone. For simplicity and convenience, the 4-stage classification is recommended. (authors)

  3. NATO Advanced Study Institute on Evolving Methods for Macromolecular Crystallography

    CERN Document Server

    Read, Randy J

    2007-01-01

    X-ray crystallography is the pre-eminent technique for visualizing the structures of macromolecules at atomic resolution. These structures are central to understanding the detailed mechanisms of biological processes, and to discovering novel therapeutics using a structure-based approach. As yet, structures are known for only a small fraction of the proteins encoded by human and pathogenic genomes. To counter the myriad modern threats of disease, there is an urgent need to determine the structures of the thousands of proteins whose structure and function remain unknown. This volume draws on the expertise of leaders in the field of macromolecular crystallography to illuminate the dramatic developments that are accelerating progress in structural biology. Their contributions span the range of techniques from crystallization through data collection, structure solution and analysis, and show how modern high-throughput methods are contributing to a deeper understanding of medical problems.

  4. Rasch Analysis of the Fullerton Advanced Balance (FAB) Scale

    Science.gov (United States)

    Fiedler, Roger C.; Rose, Debra J.

    2011-01-01

    ABSTRACT Purpose: This cross-sectional study explores the psychometric properties and dimensionality of the Fullerton Advanced Balance (FAB) Scale, a multi-item balance test for higher-functioning older adults. Methods: Participants (n=480) were community-dwelling adults able to ambulate independently. Data gathering consisted of survey and balance performance assessment. Psychometric properties were assessed using Rasch analysis. Results: Mean age of participants was 76.4 (SD=7.1) years. Mean FAB Scale scores were 24.7/40 (SD=7.5). Analyses for scale dimensionality showed that 9 of the 10 items fit a unidimensional measure of balance. Item 10 (Reactive Postural Control) did not fit the model. The reliability of the scale to separate persons was 0.81 out of 1.00; the reliability of the scale to separate items in terms of their difficulty was 0.99 out of 1.00. Cronbach's alpha for a 10-item model was 0.805. Items of differing difficulties formed a useful ordinal hierarchy for scaling patterns of expected balance ability scoring for a normative population. Conclusion: The FAB Scale appears to be a reliable and valid tool to assess balance function in higher-functioning older adults. The test was found to discriminate among participants of varying balance abilities. Further exploration of concurrent validity of Rasch-generated expected item scoring patterns should be undertaken to determine the test's diagnostic and prescriptive utility. PMID:22210989

  5. Nuclear Fuel Cycle Analysis Technology to Develop Advanced Nuclear Fuel Cycle

    Energy Technology Data Exchange (ETDEWEB)

    Park, Byung Heung [Chungju National University, Chungju (Korea, Republic of); Ko, Won IL [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2011-12-15

    The nuclear fuel cycle (NFC) analysis is a study to set an NFC policy and to promote systematic research by analyzing technologies and deriving requirements at each stage of a fuel cycle. System analysis techniques are utilized for the comparative analysis and assessment of options for the system under consideration. When the NFC is taken into consideration, various system analysis methods can be applied depending on the range of interest. This study presented NFC analysis strategies for the development of a domestic advanced NFC and analysis techniques applicable to different phases of the analysis. Strategically, NFC analysis necessitates linkage with technology analyses, domestic and international interests, and a national energy program. In this respect, a trade-off study is readily applicable since it includes various aspects of the NFC as metrics and then analyzes the considered NFC options according to the derived metrics. In this study, the trade-off study was identified as a method for NFC analysis along with the derived strategies, and it is expected to be used for the development of an advanced NFC. A technology readiness level (TRL) method and NFC simulation codes could be utilized to obtain the required metrics and data for assessment in the trade-off study. The methodologies would guide the direction of technology development by comparing and assessing technological, economical, environmental, and other aspects of the alternatives. Consequently, they would contribute to the systematic development and deployment of an appropriate advanced NFC.

  6. Nuclear Fuel Cycle Analysis Technology to Develop Advanced Nuclear Fuel Cycle

    International Nuclear Information System (INIS)

    Park, Byung Heung; Ko, Won IL

    2011-01-01

    The nuclear fuel cycle (NFC) analysis is a study to set an NFC policy and to promote systematic research by analyzing technologies and deriving requirements at each stage of a fuel cycle. System analysis techniques are utilized for the comparative analysis and assessment of options for the system under consideration. When the NFC is taken into consideration, various system analysis methods can be applied depending on the range of interest. This study presented NFC analysis strategies for the development of a domestic advanced NFC and analysis techniques applicable to different phases of the analysis. Strategically, NFC analysis necessitates linkage with technology analyses, domestic and international interests, and a national energy program. In this respect, a trade-off study is readily applicable since it includes various aspects of the NFC as metrics and then analyzes the considered NFC options according to the derived metrics. In this study, the trade-off study was identified as a method for NFC analysis along with the derived strategies, and it is expected to be used for the development of an advanced NFC. A technology readiness level (TRL) method and NFC simulation codes could be utilized to obtain the required metrics and data for assessment in the trade-off study. The methodologies would guide the direction of technology development by comparing and assessing technological, economical, environmental, and other aspects of the alternatives. Consequently, they would contribute to the systematic development and deployment of an appropriate advanced NFC.

  7. Recent advances of capillary electrophoresis in pharmaceutical analysis.

    Science.gov (United States)

    Suntornsuk, Leena

    2010-09-01

    This review covers recent advances of capillary electrophoresis (CE) in pharmaceutical analysis. The principle, instrumentation, and conventional modes of CE are briefly discussed. Advances in the different CE techniques (non-aqueous CE, microemulsion electrokinetic chromatography, capillary isotachophoresis, capillary electrochromatography, and immunoaffinity CE), detection techniques (mass spectrometry, light-emitting diode, fluorescence, chemiluminescence, and contactless conductivity), on-line sample pretreatment (flow injection) and chiral separation are described. Applications of CE to assay of active pharmaceutical ingredients (APIs), drug impurity testing, chiral drug separation, and determination of APIs in biological fluids published from 2008 to 2009 are tabulated.

  8. Advanced image based methods for structural integrity monitoring: Review and prospects

    Science.gov (United States)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and for characterizing the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics has brought about a paradigm change in how phenomena are sensed. Hence, several widely applicable optical approaches are playing a significant role in support of experiments. The current review describes advanced image-based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact, full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and they continue to evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
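
    As a minimal illustration of the idea behind DIC (not from the review), the displacement of a small subset between a reference and a deformed image can be estimated by normalized cross-correlation; the example below uses synthetic images and integer-pixel search only, whereas practical DIC adds sub-pixel interpolation and shape functions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic speckle pattern and a copy shifted by (dy, dx) = (3, 5) pixels.
ref = rng.random((200, 200))
true_dy, true_dx = 3, 5
deformed = np.roll(np.roll(ref, true_dy, axis=0), true_dx, axis=1)

# Reference subset centred in the image.
y0, x0, half = 100, 100, 15
subset = ref[y0 - half:y0 + half + 1, x0 - half:x0 + half + 1]

def zncc(a, b):
    """Zero-normalized cross-correlation of two equally sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

# Integer-pixel search over a +/- 10 pixel window in the deformed image.
best = (-2.0, 0, 0)
for dy in range(-10, 11):
    for dx in range(-10, 11):
        patch = deformed[y0 + dy - half:y0 + dy + half + 1,
                         x0 + dx - half:x0 + dx + half + 1]
        best = max(best, (zncc(subset, patch), dy, dx))

print("estimated displacement (dy, dx):", best[1], best[2])
```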

  9. Issues affecting advanced passive light-water reactor safety analysis

    International Nuclear Information System (INIS)

    Beelman, R.J.; Fletcher, C.D.; Modro, S.M.

    1992-01-01

    Next-generation commercial reactor designs emphasize enhanced safety through improved safety system reliability and performance by means of system simplification and reliance on immutable natural forces for system operation. Simulating the performance of these safety systems will be central to the analytical safety evaluation of advanced passive reactor designs. Yet the characteristically small driving forces of these safety systems pose challenging computational problems to current thermal-hydraulic systems analysis codes. Additionally, the safety systems generally interact closely with one another, requiring accurate, integrated simulation of the nuclear steam supply system, engineered safeguards and containment. Furthermore, numerical safety analysis of these advanced passive reactor designs will necessitate simulation of long-duration, slowly developing transients compared with current reactor designs. The composite effects of small computational inaccuracies on induced system interactions and perturbations over long periods may well lead to predicted results that are significantly different from what would otherwise be expected or might actually occur. Comparisons between the engineered safety features of competing US advanced light water reactor designs and analogous present-day reactor designs are examined relative to the adequacy of existing thermal-hydraulic safety codes in predicting the mechanisms of passive safety. Areas where existing codes might require modification, extension or assessment relative to passive safety designs are identified. Conclusions concerning the applicability of these codes to advanced passive light water reactor safety analysis are presented

  10. Data Analysis Methods for Paleogenomics

    DEFF Research Database (Denmark)

    Avila Arcos, Maria del Carmen

    The work presented in this thesis is the result of research carried out during a three-year PhD at the Centre for GeoGenetics, Natural History Museum of Denmark, University of Copenhagen, under the supervision of Professor Tom Gilbert. The PhD was funded by the Danish National Research Foundation (Danmarks Grundforskningsfond) 'Centre of Excellence in GeoGenetics' grant, with additional funding provided by the Danish Council for Independent Research 'Sapere Aude' programme. The thesis comprises five chapters, all of which represent different projects that involved the analysis of massive amounts of sequencing data, thanks to the introduction of NGS and the implementation of data analysis methods specific to each project. Chapters 1 to 3 have been published in peer-reviewed journals and Chapter 4 is currently in review. Chapter 5 consists of a manuscript describing initial results of an ongoing research project.

  11. Development of advanced spent fuel management process. System analysis of advanced spent fuel management process

    International Nuclear Information System (INIS)

    Ro, S.G.; Kang, D.S.; Seo, C.S.; Lee, H.H.; Shin, Y.J.; Park, S.W.

    1999-03-01

    The system analysis of an advanced spent fuel management process to establish a non-proliferation model for the long-term spent fuel management is performed by comparing the several dry processes, such as a salt transport process, a lithium process, the IFR process developed in America, and DDP developed in Russia. In our system analysis, the non-proliferation concept is focused on the separation factor between uranium and plutonium and decontamination factors of products in each process, and the non-proliferation model for the long-term spent fuel management has finally been introduced. (Author). 29 refs., 17 tabs., 12 figs

  12. A New Boron Analysis Method

    Energy Technology Data Exchange (ETDEWEB)

    Weitman, J; Daaverhoeg, N; Farvolden, S

    1970-07-01

    In connection with fast neutron (n, α) cross section measurements a novel boron analysis method has been developed. The boron concentration is inferred from the mass-spectrometrically determined number of helium atoms produced in the thermal and epithermal B-10 (n, α) reaction. The relation between helium amount and boron concentration is given, including corrections for self-shielding effects and background levels. Direct and diffusion losses of helium are calculated and losses due to gettering, adsorption and HF-ionization in the release stage are discussed. A series of boron determinations is described and the results are compared with those obtained by other methods, showing excellent agreement. The lower limit of boron concentration which can be measured varies with the type of sample. In e.g. steel, concentrations below 10⁻⁵ % boron in samples of 0.1-1 gram may be determined.

  13. Advancing multilevel thinking and methods in HRM research

    NARCIS (Netherlands)

    Renkema, Maarten; Meijerink, Jeroen Gerard; Bondarouk, Tatiana

    2016-01-01

    Purpose Despite the growing belief that multilevel research is necessary to advance HRM understanding, there remains a lack of multilevel thinking – the application of principles for multilevel theory building. The purpose of this paper is to propose a systematic approach for multilevel HRM

  14. Combination of retrograde superselective intra-arterial chemotherapy and Seldinger method in locally advanced oral cancer

    Directory of Open Access Journals (Sweden)

    Masataka Uehara

    2015-01-01

    Full Text Available Nonsurgical strategies for locally advanced oral cancer are desirable. Superselective intra-arterial infusion with radiotherapy was utilized for this purpose, and there are two types of superselective intra-arterial infusion methods: the Seldinger method and the retrograde superselective intra-arterial chemotherapy (HFT) method. In one case, the HFT method was applied to locally advanced tongue cancer, and the Seldinger method was used for additional administration of cisplatin (CDDP) to compensate for a lack of drug flow in the HFT method. In another case, the HFT method was applied to locally advanced lower gingival cancer, and the Seldinger method was applied to metastatic lymph nodes. In both cases, additional administration of CDDP using the Seldinger method resulted in a complete response. The combination of the HFT and Seldinger methods was useful for eradicating locally advanced oral cancer because each method compensated for the defects of the other.

  15. Advanced Post-Irradiation Examination Capabilities Alternatives Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Jeff Bryan; Bill Landman; Porter Hill

    2012-12-01

    An alternatives analysis was performed for the Advanced Post-Irradiation Examination Capabilities (APIEC) project in accordance with the U.S. Department of Energy (DOE) Order DOE O 413.3B, “Program and Project Management for the Acquisition of Capital Assets”. The alternatives analysis considered six major alternatives: (1) no action; (2) modify existing DOE facilities, with capabilities distributed among multiple locations; (3) modify existing DOE facilities, with capabilities consolidated at a few locations; (4) construct a new facility; (5) commercial partnership; and (6) international partnerships. Based on the alternatives analysis documented herein, it is recommended to DOE that the advanced post-irradiation examination capabilities be provided by a new facility constructed at the Materials and Fuels Complex at the Idaho National Laboratory.

  16. Statistical trend analysis methods for temporal phenomena

    International Nuclear Information System (INIS)

    Lehtinen, E.; Pulkkinen, U.; Poern, K.

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods
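
    As one hedged illustration of a non-parametric trend check of the kind surveyed here (generic code, not taken from the report), the Mann test statistic can be applied to the sequence of inter-event times: a significant monotonic decrease in the times between events suggests an increasing rate of occurrence. The event data below are synthetic.

```python
import math
import numpy as np

rng = np.random.default_rng(2)

# Synthetic event record with an increasing rate (inter-event times shrink over time).
inter_event = rng.exponential(scale=np.linspace(10.0, 2.0, 40))

def mann_kendall(x):
    """Mann(-Kendall) S statistic and normal-approximation Z for a monotonic trend."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

s, z = mann_kendall(inter_event)
print(f"S = {s:.0f}, Z = {z:.2f}  (Z < -1.96 suggests decreasing inter-event times)")
```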

  17. Statistical trend analysis methods for temporal phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Lehtinen, E.; Pulkkinen, U. [VTT Automation, (Finland); Poern, K. [Poern Consulting, Nykoeping (Sweden)

    1997-04-01

    We consider point events occurring in a random way in time. In many applications the pattern of occurrence is of intrinsic interest as indicating a trend or some other systematic feature in the rate of occurrence. The purpose of this report is to survey briefly different statistical trend analysis methods and illustrate their applicability to temporal phenomena in particular. The trend testing of point events is usually seen as the testing of the hypotheses concerning the intensity of the occurrence of events. When the intensity function is parametrized, the testing of trend is a typical parametric testing problem. In industrial applications the operational experience generally does not suggest any specified model and method in advance. Therefore, and particularly, if the Poisson process assumption is very questionable, it is desirable to apply tests that are valid for a wide variety of possible processes. The alternative approach for trend testing is to use some non-parametric procedure. In this report we have presented four non-parametric tests: The Cox-Stuart test, the Wilcoxon signed ranks test, the Mann test, and the exponential ordered scores test. In addition to the classical parametric and non-parametric approaches we have also considered the Bayesian trend analysis. First we discuss a Bayesian model, which is based on a power law intensity model. The Bayesian statistical inferences are based on the analysis of the posterior distribution of the trend parameters, and the probability of trend is immediately seen from these distributions. We applied some of the methods discussed in an example case. It should be noted, that this report is a feasibility study rather than a scientific evaluation of statistical methods, and the examples can only be seen as demonstrations of the methods. 14 refs, 10 figs.

  18. Advanced PWR technology development -Development of advanced PWR system analysis technology-

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Moon Heui; Hwang, Yung Dong; Kim, Sung Oh; Yoon, Joo Hyun; Jung, Bub Dong; Choi, Chul Jin; Lee, Yung Jin; Song, Jin Hoh [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    The primary scope of this study is to establish the analysis technology for the advanced reactor designed on the basis of the passive and inherent safety concepts. This study is extended to the application of these technology to the safety analysis of the passive reactor. The study was performed for the small and medium sized reactor and the large sized reactor by focusing on the development of the analysis technology for the passive components. Among the identified concepts the once-through steam generator, the natural circulation of the integral reactor, heat pipe for containment cooling, and hydraulic valve were selected as the high priority items to be developed and the related studies are being performed for these items. For the large sized passive reactor, the study plans to extend the applicability of the best estimate computer code RELAP5/MOD3 which is widely used for the safety analyses of the reactor system. The improvement and supplementation study of the analysis modeling and the methodology is planned to be carried out for these purpose. The newly developed technologies are expected to be applied to the domestic advanced reactor design and analysis and these technologies will play a key role in extending the domestic nuclear base technology and consolidating self-reliance in the essential nuclear technology. 72 figs, 15 tabs, 124 refs. (Author).

  19. Earthquake Hazard Analysis Methods: A Review

    Science.gov (United States)

    Sari, A. M.; Fakhrurrozi, A.

    2018-02-01

    One of the natural disasters with a significant impact in terms of risk and damage is the earthquake. Countries such as China, Japan, and Indonesia are located on active continental plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analyzing seismic zones and earthquake hazard micro-zonation, using the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and using remote sensing. In applying them, it is necessary to review the effectiveness of each technique in advance. Considering time efficiency and data accuracy, remote sensing is used here as a reference to assess earthquake hazard accurately and quickly, since only limited time is available for making the right decisions shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing such as GeoEye-1 provide added value and excellence in the use of remote sensing as one of the methods for the assessment of earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and can reduce the risk of natural disasters such as earthquakes in Indonesia.

  20. CHF predictor derived from a 3D thermal-hydraulic code and an advanced statistical method

    International Nuclear Information System (INIS)

    Banner, D.; Aubry, S.

    2004-01-01

    A rod bundle CHF predictor has been determined by using a 3D code (THYC) to compute local thermal-hydraulic conditions at the boiling crisis location. These local parameters have been correlated to the critical heat flux by using an advanced statistical method based on spline functions. The main characteristics of the predictor are presented in conjunction with a detailed analysis of predictions (P/M ratio) in order to prove that the usual safety methodology can be applied with such a predictor. A thermal-hydraulic design criterion is obtained (1.13) and the predictor is compared with the WRB-1 correlation. (author)
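
    As a hedged, generic illustration of correlating a computed local parameter with the critical heat flux via splines (the actual predictor, data and parameters of the paper are not reproduced here), a smoothing spline can be fitted to synthetic (local quality, CHF) pairs and used to form predicted-to-measured (P/M) ratios.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(3)

# Synthetic "local condition -> CHF" data (e.g. local quality vs CHF in kW/m^2).
quality = np.sort(rng.uniform(0.0, 0.4, 80))
chf_measured = 4000.0 - 6000.0 * quality + 150.0 * rng.standard_normal(quality.size)

# Smoothing spline as the statistical correlation between local parameter and CHF.
spline = UnivariateSpline(quality, chf_measured, k=3, s=len(quality) * 150.0**2)

pm_ratio = spline(quality) / chf_measured   # predicted-to-measured ratios
print("mean P/M:", pm_ratio.mean().round(3), " std:", pm_ratio.std().round(3))
```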

  1. Recent advances in quantitative high throughput and high content data analysis.

    Science.gov (United States)

    Moutsatsos, Ioannis K; Parker, Christian N

    2016-01-01

    High throughput screening has become a basic technique with which to explore biological systems. Advances in technology, including increased screening capacity, as well as methods that generate multiparametric readouts, are driving the need for improvements in the analysis of data sets derived from such screens. This article covers the recent advances in the analysis of high throughput screening data sets from arrayed samples, as well as the recent advances in the analysis of cell-by-cell data sets derived from image or flow cytometry application. Screening multiple genomic reagents targeting any given gene creates additional challenges and so methods that prioritize individual gene targets have been developed. The article reviews many of the open source data analysis methods that are now available and which are helping to define a consensus on the best practices to use when analyzing screening data. As data sets become larger, and more complex, the need for easily accessible data analysis tools will continue to grow. The presentation of such complex data sets, to facilitate quality control monitoring and interpretation of the results will require the development of novel visualizations. In addition, advanced statistical and machine learning algorithms that can help identify patterns, correlations and the best features in massive data sets will be required. The ease of use for these tools will be important, as they will need to be used iteratively by laboratory scientists to improve the outcomes of complex analyses.

  2. Analysis instruments for the performance of Advanced Practice Nursing.

    Science.gov (United States)

    Sevilla-Guerra, Sonia; Zabalegui, Adelaida

    2017-11-29

    Advanced Practice Nursing has been a reality in the international context for several decades, and recently new nursing profiles that follow this model have been developed in Spain as well. The consolidation of these advanced practice roles has also led to the creation of tools that attempt to define and evaluate their functions. This study aims to identify and explore the existing instruments that enable the domains of Advanced Practice Nursing to be defined. A review of existing international questionnaires and instruments was undertaken, including an analysis of the design process, the domains/dimensions defined, the main results and an exploration of clinimetric properties. Seven studies were analysed, but not all proved to be valid, stable or reliable tools. One included tool was able to differentiate between the functions of the general nurse and the advanced practice nurse by the level of activities undertaken within the five domains described. These tools are necessary to evaluate the scope of advanced practice in new nursing roles that correspond to other international models of competencies and practice domains. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  3. Analysis of live cell images: Methods, tools and opportunities.

    Science.gov (United States)

    Nketia, Thomas A; Sailem, Heba; Rohde, Gustavo; Machiraju, Raghu; Rittscher, Jens

    2017-02-15

    Advances in optical microscopy, biosensors and cell culturing technologies have transformed live cell imaging. Thanks to these advances, live cell imaging plays an increasingly important role in basic biology research as well as at all stages of drug development. Image analysis methods are needed to extract quantitative information from these vast and complex data sets. The aim of this review is to provide an overview of available image analysis methods for live cell imaging, in particular the required preprocessing, image segmentation, cell tracking and data visualisation methods. The potential opportunities provided by recent advances in machine learning, especially deep learning, and computer vision are discussed. This review includes an overview of the different available software packages and toolkits. Copyright © 2017. Published by Elsevier Inc.
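
    A hedged sketch of the segmentation step mentioned above (generic, not from the review): threshold a fluorescence-like image, label connected components and report per-cell measurements; the image below is synthetic.

```python
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(4)

# Synthetic "fluorescence" image: bright blobs (cells) on a noisy background.
img = rng.normal(0.1, 0.02, (256, 256))
yy, xx = np.mgrid[0:256, 0:256]
for cy, cx in [(60, 70), (150, 180), (200, 60)]:
    img += 0.8 * np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 12.0**2))

# Segmentation: Otsu threshold, small-object removal, connected-component labelling.
mask = img > filters.threshold_otsu(img)
mask = morphology.remove_small_objects(mask, min_size=50)
labels = measure.label(mask)

# Per-cell quantification (area and mean intensity).
for region in measure.regionprops(labels, intensity_image=img):
    print(f"cell {region.label}: area={region.area}, mean intensity={region.mean_intensity:.2f}")
```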

  4. Steam leak detection in advance reactors via acoustics method

    International Nuclear Information System (INIS)

    Singh, Raj Kumar; Rao, A. Rama

    2011-01-01

    Highlights: → A steam leak detection system is developed to detect any leak inside the reactor vault. → The technique uses the leak noise frequency spectrum for leak detection. → Testing of the system and a method to locate the leak are also developed and discussed in the present paper. - Abstract: Prediction of LOCA (loss of coolant accident) plays a very important role in the safety of a nuclear reactor. The coolant is responsible for heat transfer from the fuel bundles. Loss of coolant is an accident situation which requires immediate shutdown of the reactor. The fall in system pressure during a LOCA is the trip parameter used for initiating automatic reactor shutdown. However, in a primary heat transport system operating in the two-phase regime, detection of a small-break LOCA is not simple. Because of the very slow leak rates, the pressure falls significantly slowly. From the reactor safety point of view, it is extremely important to find a reliable and effective alternative for detecting the slow pressure drop in case of a small-break LOCA. One such technique is based on the acoustic signal caused by small-break LOCA. In boiling water reactors whose primary heat transport is to be driven by natural circulation, small-break LOCA detection is important. For prompt action following a small-break LOCA, a steam leak detection system has been developed to detect any leak inside the reactor vault. The detection technique is reliable and plays a very important role in ensuring the safety of the reactor. The methodology developed for steam leak detection is discussed in the present paper. The method to locate the leak, which is based on analysis of the signal, is also developed and discussed.
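
    To illustrate the kind of frequency-spectrum check the abstract describes (a generic sketch with synthetic signals, not the plant system), the energy in a chosen leak-noise band can be compared against a baseline threshold:

```python
import numpy as np

rng = np.random.default_rng(5)
fs = 50_000.0                      # sampling rate [Hz] (assumed)
t = np.arange(0, 0.2, 1.0 / fs)

def band_energy(signal, f_lo, f_hi):
    """Energy of the signal within the frequency band [f_lo, f_hi]."""
    spec = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    sel = (freqs >= f_lo) & (freqs <= f_hi)
    return spec[sel].sum()

background = rng.standard_normal(t.size)                       # plant background noise
leak = background + 3.0 * np.sin(2 * np.pi * 8_000.0 * t)      # leak adds a high-frequency tone

threshold = 5.0 * band_energy(background, 5_000, 15_000)       # margin over the baseline
for name, sig in [("background", background), ("leak", leak)]:
    e = band_energy(sig, 5_000, 15_000)
    print(f"{name}: band energy = {e:.3e}, leak alarm = {e > threshold}")
```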

  5. Seismically induced accident sequence analysis of the advanced test reactor

    International Nuclear Information System (INIS)

    Khericha, S.T.; Henry, D.M.; Ravindra, M.K.; Hashimoto, P.S.; Griffin, M.J.; Tong, W.H.; Nafday, A.M.

    1991-01-01

    A seismic probabilistic risk assessment (PRA) was performed for the Department of Energy (DOE) Advanced Test Reactor (ATR) as part of the external events analysis. The risk from seismic events to the fuel in the core and in the fuel storage canal was evaluated. The key elements of this paper are the integration of seismically induced internal flood and internal fire, and the modeling of human error rates as a function of the magnitude of earthquake. The systems analysis was performed by EG&G Idaho, Inc. and the fragility analysis and quantification were performed by EQE International, Inc. (EQE)

  6. The integrated code system CASCADE-3D for advanced core design and safety analysis

    International Nuclear Information System (INIS)

    Neufert, A.; Van de Velde, A.

    1999-01-01

    The new program system CASCADE-3D (Core Analysis and Safety Codes for Advanced Design Evaluation) links some of Siemens advanced code packages for in-core fuel management and accident analysis: SAV95, PANBOX/COBRA and RELAP5. Consequently by using CASCADE-3D the potential of modern fuel assemblies and in-core fuel management strategies can be much better utilized because safety margins which had been reduced due to conservative methods are now predicted more accurately. By this innovative code system the customers can now take full advantage of the recent progress in fuel assembly design and in-core fuel management.(author)

  7. Exploring biomolecular dynamics and interactions using advanced sampling methods

    International Nuclear Information System (INIS)

    Luitz, Manuel; Bomblies, Rainer; Ostermeir, Katja; Zacharias, Martin

    2015-01-01

    Molecular dynamics (MD) and Monte Carlo (MC) simulations have emerged as a valuable tool to investigate statistical mechanics and kinetics of biomolecules and synthetic soft matter materials. However, major limitations for routine applications are due to the accuracy of the molecular mechanics force field and due to the maximum simulation time that can be achieved in current simulations studies. For improving the sampling a number of advanced sampling approaches have been designed in recent years. In particular, variants of the parallel tempering replica-exchange methodology are widely used in many simulation studies. Recent methodological advancements and a discussion of specific aims and advantages are given. This includes improved free energy simulation approaches and conformational search applications. (topical review)
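
    A minimal sketch of the parallel-tempering (replica-exchange) idea mentioned in the review, using a toy one-dimensional double-well potential and Metropolis moves; this is generic illustrative code, not any production MD/MC package.

```python
import numpy as np

rng = np.random.default_rng(6)

def energy(x):
    """Toy double-well potential with minima near x = -1 and x = +1."""
    return (x**2 - 1.0) ** 2

temps = np.array([0.1, 0.3, 0.9, 2.7])      # temperature ladder (reduced units)
betas = 1.0 / temps
x = np.full(temps.size, -1.0)                # all replicas start in the left well

for step in range(20_000):
    # Metropolis move within each replica.
    for i in range(temps.size):
        trial = x[i] + 0.3 * rng.standard_normal()
        if rng.random() < np.exp(-betas[i] * (energy(trial) - energy(x[i]))):
            x[i] = trial
    # Attempt an exchange between a random pair of neighbouring temperatures.
    j = rng.integers(temps.size - 1)
    delta = (betas[j] - betas[j + 1]) * (energy(x[j + 1]) - energy(x[j]))
    if rng.random() < np.exp(-delta):
        x[j], x[j + 1] = x[j + 1], x[j]

print("final replica positions (coldest first):", x.round(2))
```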

  8. A Statistical-Probabilistic Pattern for Determination of Tunnel Advance Step by Quantitative Risk Analysis

    Directory of Open Access Journals (Sweden)

    sasan ghorbani

    2017-12-01

    Full Text Available One of the main challenges faced in the design and construction phases of tunneling projects is the determination of the maximum allowable advance step, so as to maximize the excavation rate and reduce project delivery time. Considering the complexity of determining this factor and the unexpected risks associated with determining it inappropriately, it is necessary to employ a method that is capable of accounting for the interactions among uncertain geotechnical parameters and the advance step. The main objective of the present research is to undertake optimization and risk management of the advance step length in the water diversion tunnel at Shahriar Dam based on the uncertainty of geotechnical parameters, following a statistical-probabilistic approach. In order to determine the optimum advance step for the excavation operation, two hybrid methods were used: strength reduction method / discrete element method / Monte Carlo simulation (SRM/DEM/MCS) and strength reduction method / discrete element method / point estimate method (SRM/DEM/PEM). Moreover, Taguchi analysis was used to investigate the sensitivity of the advance step to changes in the statistical distribution functions of the input parameters under three tunneling scenarios at sections of poor to good quality (as per the RMR classification system). The final results implied the optimality of the advance step defined in Scenario 2, where an advance of 2 m per excavation round was proposed, according to the shear strain criterion and SRM/DEM/MCS, with a minimum failure probability and risk of 8.05% and $75,281.56, respectively, at the 95% confidence level. Moreover, for each of the normal, lognormal, and gamma distributions, as the advance step increased from Scenario 1 to 2, the failure probability was observed to increase at a lower rate than when the advance step in Scenario 2 was increased to that in Scenario 3. In addition, the Taguchi tests were subjected to signal-to-noise analysis and the results indicated that, considering the three statistical
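
    As a generic, hedged sketch of the Monte Carlo simulation step in such a workflow (not the SRM/DEM models of the paper), the failure probability for a given advance step can be estimated by sampling uncertain strength parameters and checking a simple limit-state criterion; the distributions and the limit-state function below are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n_samples = 100_000

# Invented uncertain geotechnical inputs (lognormal cohesion, normal friction angle).
cohesion = rng.lognormal(mean=np.log(120.0), sigma=0.25, size=n_samples)   # kPa
friction = rng.normal(loc=32.0, scale=3.0, size=n_samples)                 # degrees

def induced_shear(advance_step):
    """Hypothetical induced shear demand growing with the unsupported span (kPa)."""
    return 60.0 + 45.0 * advance_step

for step in (1.0, 2.0, 3.0):   # candidate advance steps [m]
    capacity = cohesion + 200.0 * np.tan(np.radians(friction))   # toy strength model
    p_fail = np.mean(capacity < induced_shear(step))
    print(f"advance step {step:.1f} m: estimated failure probability = {p_fail:.4f}")
```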

  9. Gravimetric and titrimetric methods of analysis

    International Nuclear Information System (INIS)

    Rives, R.D.; Bruks, R.R.

    1983-01-01

    Gravimetric and titrimetric methods of analysis are considered. Methods of complexometric titration are mentioned, as well as methods of increasing sensitivity in titrimetry. Gravimetry and titrimetry are applied in the trace analysis of geological materials

  10. Development of the advanced PHWR technology -Design and analysis of CANDU advanced fuel-

    Energy Technology Data Exchange (ETDEWEB)

    Suk, Hoh Chun; Shim, Kee Sub; Byun, Taek Sang; Park, Kwang Suk; Kang, Heui Yung; Kim, Bong Kee; Jung, Chang Joon; Lee, Yung Wook; Bae, Chang Joon; Kwon, Oh Sun; Oh, Duk Joo; Im, Hong Sik; Ohn, Myung Ryong; Lee, Kang Moon; Park, Joo Hwan; Lee, Eui Joon [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1995-07-01

    This is the `94 annual report of the CANDU advanced fuel design and analysis project, and describes CANFLEX fuel design and mechanical integrity analysis, reactor physics analysis and safety analysis of the CANDU-6 with the CANFLEX-NU. The following is the R and D scope of this fiscal year : (1) Detailed design of CANFLEX-NU and detailed analysis of the fuel integrity, reactor physics and safety. (a) Detailed design and mechanical integrity analysis of the bundle (b) CANDU-6 refueling simulation, and analysis of the Xe transients and adjuster system capability (c) Licensing strategy establishment and safety analysis for the CANFLEX-NU demonstration irradiation in a commercial CANDU-6. (2) Production and revision of CANFLEX-NU fuel design documents (a) Production and approval of the CANFLEX-NU reference drawing, and revisions of the fuel design manual and technical specifications (b) Production of a draft physics design manual. (3) Basic research on CANFLEX-SEU fuel. 55 figs, 21 tabs, 45 refs. (Author).

  11. Using Statistical Analysis Software to Advance Nitro Plasticizer Wettability

    Energy Technology Data Exchange (ETDEWEB)

    Shear, Trevor Allan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-29

    Statistical analysis in science is an extremely powerful tool that is often underutilized. Additionally, it is frequently the case that data is misinterpreted or not used to its fullest extent. Utilizing the advanced software JMP®, many aspects of experimental design and data analysis can be evaluated and improved. This overview will detail the features of JMP® and how they were used to advance a project, resulting in time and cost savings, as well as the collection of scientifically sound data. The project analyzed in this report addresses the inability of a nitro plasticizer to coat a gold coated quartz crystal sensor used in a quartz crystal microbalance. Through the use of the JMP® software, the wettability of the nitro plasticizer was increased by over 200% using an atmospheric plasma pen, ensuring good sample preparation and reliable results.

  12. Structural weights analysis of advanced aerospace vehicles using finite element analysis

    Science.gov (United States)

    Bush, Lance B.; Lentz, Christopher A.; Rehder, John J.; Naftel, J. Chris; Cerro, Jeffrey A.

    1989-01-01

    A conceptual/preliminary level structural design system has been developed for structural integrity analysis and weight estimation of advanced space transportation vehicles. The system includes a three-dimensional interactive geometry modeler, a finite element pre- and post-processor, a finite element analyzer, and a structural sizing program. Inputs to the system include the geometry, surface temperature, material constants, construction methods, and aerodynamic and inertial loads. The results are a sized vehicle structure capable of withstanding the static loads incurred during assembly, transportation, operations, and missions, and a corresponding structural weight. An analysis of the Space Shuttle external tank is included in this paper as a validation and benchmark case of the system.

  13. Advanced gamma ray balloon experiment ground checkout and data analysis

    Science.gov (United States)

    Blackstone, M.

    1976-01-01

    A software programming package to be used in the ground checkout and handling of data from the advanced gamma ray balloon experiment is described. The Operator's Manual permits someone unfamiliar with the inner workings of the software system (called LEO) to operate on the experimental data as it comes from the Pulse Code Modulation interface, converting it to a form for later analysis, and monitoring the program of an experiment. A Programmer's Manual is included.

  14. Advances in oriental document analysis and recognition techniques

    CERN Document Server

    Lee, Seong-Whan

    1999-01-01

    In recent years, rapid progress has been made in computer processing of oriental languages, and the research developments in this area have resulted in tremendous changes in handwriting processing, printed oriental character recognition, document analysis and recognition, automatic input methodologies for oriental languages, etc. Advances in computer processing of oriental languages can also be seen in multimedia computing and the World Wide Web. Many of the results in those domains are presented in this book.

  15. Experiences from introduction of peer-to-peer teaching methods in Advanced Biochemistry E2010

    DEFF Research Database (Denmark)

    Brodersen, Ditlev; Etzerodt, Michael; Rasmussen, Jan Trige

    2012-01-01

    During the autumn semester 2010, we experimented with a range of active teaching methods on the course, Advanced Biochemistry, at the Department of Molecular Biology and Genetics.

  16. Advanced exergoeconomic analysis of the multistage mixed refrigerant systems

    International Nuclear Information System (INIS)

    Mehrpooya, Mehdi; Ansarinasab, Hojat

    2015-01-01

    Highlights: • Advanced exergoeconomic analysis is performed for mixed refrigerant systems. • Cost of investment is divided into avoidable/unavoidable and endogenous/exogenous. • Results show that interactions between the components are not considerable. - Abstract: Advanced exergoeconomic analysis is applied to three multistage mixed refrigerant liquefaction processes: propane precooled mixed refrigerant, dual mixed refrigerant and mixed fluid cascade. The costs of investment and exergy destruction for the components with high inefficiencies are divided into avoidable/unavoidable and endogenous/exogenous parts. According to the avoidable exergy destruction cost, the C-2 compressor with 455.5 ($/h) in the propane precooled mixed refrigerant process, the C-1 compressor with 510.8 ($/h) in the dual mixed refrigerant process, and the C-2/1 compressor with 338.8 ($/h) in the mixed fluid cascade process should be considered first. A comparison between the conventional and advanced exergoeconomic analyses is made using three important parameters: exergy efficiency, exergoeconomic factor and total costs. Results show that interactions between the process components are not considerable, because the costs of investment and exergy destruction in most of them are endogenous. The exergy destruction cost of the compressors is avoidable, while the destruction costs of the heat exchangers and air coolers are unavoidable. The investment costs of the heat exchangers and air coolers are avoidable, while those of the compressors are unavoidable
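    For reference, the two splittings applied above are conventionally written as follows; the notation is the one commonly used in advanced exergoeconomic analysis rather than anything specific to this paper.

```latex
\dot{E}_{D,k} \;=\; \dot{E}_{D,k}^{\mathrm{AV}} + \dot{E}_{D,k}^{\mathrm{UN}}
              \;=\; \dot{E}_{D,k}^{\mathrm{EN}} + \dot{E}_{D,k}^{\mathrm{EX}},
\qquad
\dot{Z}_{k}   \;=\; \dot{Z}_{k}^{\mathrm{AV}} + \dot{Z}_{k}^{\mathrm{UN}}
              \;=\; \dot{Z}_{k}^{\mathrm{EN}} + \dot{Z}_{k}^{\mathrm{EX}}

\dot{C}_{D,k} \;=\; c_{F,k}\,\dot{E}_{D,k},
\qquad
\dot{E}_{D,k}^{\mathrm{AV}} \;=\; \dot{E}_{D,k}^{\mathrm{AV,EN}} + \dot{E}_{D,k}^{\mathrm{AV,EX}}
```

    Here E_D,k denotes the exergy destruction rate of component k, Z_k its investment cost rate, and c_F,k the unit cost of its fuel exergy; the avoidable parts are the ones open to design improvement, which is why the compressors listed above are flagged first.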

  17. Advanced construction methods for new nuclear power plants

    International Nuclear Information System (INIS)

    Bilbao y Leon, Sama; Cleveland, John; Moon, Seong-Gyun; Tyobeka, Bismark

    2009-01-01

    The construction and commissioning phases of nuclear power plants have historically been longer than those of conventional fossil-fuelled plants, often with a record of delays and cost overruns resulting from several factors, including legal interventions and revisions of safety regulations. Recent nuclear construction projects, however, have shown that long construction periods for nuclear power plants are no longer the norm. While several inter-related factors influence the construction time, the use of advanced construction techniques has contributed significantly to reducing the construction length of recent nuclear projects. (author)

  18. Advanced 3D inverse method for designing turbomachine blades

    Energy Technology Data Exchange (ETDEWEB)

    Dang, T. [Syracuse Univ., NY (United States)

    1995-10-01

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  19. Advances in beam position monitoring methods at GSI synchrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Rahul; Reiter, Andreas; Forck, Peter; Kowina, Piotr; Lang, Kevin; Miedzik, Piotr [GSI, Darmstadt (Germany)

    2016-07-01

    At the GSI synchrotron facilities, capacitive beam pick-up signals for position evaluation are immediately digitized within the acquisition electronics, owing to the availability of reliable, fast and high-resolution ADCs. The signal processing aspects are therefore dealt with fully in the digital domain. Novel digital techniques for asynchronous and synchronous (bunch-by-bunch) beam position estimation have been developed at the GSI SIS-18 and CRYRING as part of the FAIR development program. This contribution highlights these advancements and their impact on the operational ease and high availability of the BPM systems.

  20. Nonlinear dynamics of rotating shallow water methods and advances

    CERN Document Server

    Zeitlin, Vladimir

    2007-01-01

    The rotating shallow water (RSW) model is of wide use as a conceptual tool in geophysical fluid dynamics (GFD), because, in spite of its simplicity, it contains all essential ingredients of atmosphere and ocean dynamics at the synoptic scale, especially in its two- (or multi-) layer version. The book describes recent advances in understanding (in the framework of RSW and related models) of some fundamental GFD problems, such as existence of the slow manifold, dynamical splitting of fast (inertia-gravity waves) and slow (vortices, Rossby waves) motions, nonlinear geostrophic adjustment and wa

  1. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  2. An evolutionary method for synthesizing technological planning and architectural advance

    Science.gov (United States)

    Cole, Bjorn Forstrom

    a graph-based representation of architecture, provides the rationale for choosing a given technology forecasting technique, and lays out the implementation of the optimization algorithm, named Sindri, within a commercial analysis code, Pacelab. The fifth chapter of the thesis then tests the Sindri code. The first test applied is a series of standardized combinatorial spaces, which are meant to be analogous to test problems traditionally posed to optimizers (e.g., Rosenbrock's valley function). The results from this test assess the value of various operators used to transform the architecture graph in the course of conducting a genetic search. Finally, this method is employed on a test case involving the transition of a miniature helicopter from glow engine to battery propulsion, and finally to a design where the battery functions as both structure and power source. The final two chapters develop conclusions based on the body of work conducted within this thesis and issue some prescriptions for future work. The future work primarily concerns improving the continuous optimization processes undertaken within Sindri and in further refining the graph-based structure for physical architectures.

  3. Advances in Probes and Methods for Clinical EPR Oximetry

    Science.gov (United States)

    Hou, Huagang; Khan, Nadeem; Jarvis, Lesley A.; Chen, Eunice Y.; Williams, Benjamin B.; Kuppusamy, Periannan

    2015-01-01

    EPR oximetry, which enables reliable, accurate, and repeated measurements of the partial pressure of oxygen in tissues, provides a unique opportunity to investigate the role of oxygen in the pathogenesis and treatment of several diseases including cancer, stroke, and heart failure. Building on significant advances in the in vivo application of EPR oximetry for small animal models of disease, we are developing suitable probes and instrumentation required for use in human subjects. Our laboratory has established the feasibility of clinical EPR oximetry in cancer patients using India ink, the only material presently approved for clinical use. We now are developing the next generation of probes, which are both superior in terms of oxygen sensitivity and biocompatibility including an excellent safety profile for use in humans. Further advances include the development of implantable oxygen sensors linked to an external coupling loop for measurements of deep-tissue oxygenations at any depth, overcoming the current limitation of 10 mm. This paper presents an overview of recent developments in our ability to make meaningful measurements of oxygen partial pressures in human subjects under clinical settings. PMID:24729217

  4. Advanced Signal Analysis for Forensic Applications of Ground Penetrating Radar

    Energy Technology Data Exchange (ETDEWEB)

    Steven Koppenjan; Matthew Streeton; Hua Lee; Michael Lee; Sashi Ono

    2004-06-01

    Ground penetrating radar (GPR) systems have traditionally been used to image subsurface objects. The main focus of this paper is to evaluate an advanced signal analysis technique. Instead of compiling spatial data for the analysis, this technique conducts object recognition procedures based on spectral statistics. The identification feature of an object type is formed from the training vectors by a singular-value decomposition procedure. To illustrate its capability, this procedure is applied to experimental data and compared to the performance of the neural-network approach.
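    The signature-subspace step described above, forming an identification feature from training vectors by singular-value decomposition and matching a measurement against it, can be sketched as follows. The spectral dimension, the training data and the two object classes are hypothetical placeholders and not the experimental GPR data.

```python
import numpy as np

def build_signature(training_vectors, rank=3):
    """Form an object-type signature subspace from spectral training vectors."""
    X = np.column_stack(training_vectors)        # columns = training spectra
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :rank]                           # dominant left singular vectors

def match_score(signature, test_vector):
    """Fraction of the test spectrum's energy captured by the signature subspace."""
    projection = signature @ (signature.T @ test_vector)
    return np.linalg.norm(projection) / np.linalg.norm(test_vector)

# Hypothetical 64-bin spectral statistics for two object classes
rng = np.random.default_rng(1)
bins = np.linspace(0, 4, 64)
pipe_training = [np.sin(bins) + 0.3 * rng.normal(size=64) for _ in range(10)]
void_training = [np.cos(bins) + 0.3 * rng.normal(size=64) for _ in range(10)]

pipe_sig = build_signature(pipe_training)
void_sig = build_signature(void_training)

unknown = np.sin(bins) + 0.3 * rng.normal(size=64)   # an unlabeled measurement
print("pipe score:", match_score(pipe_sig, unknown))
print("void score:", match_score(void_sig, unknown))
```

    The class whose subspace captures the larger fraction of the measurement's energy is reported as the recognized object type; the paper compares this kind of spectral-statistics classifier against a neural-network approach.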

  5. Development of design and analysis software for advanced nuclear system

    International Nuclear Information System (INIS)

    Wu Yican; Hu Liqin; Long Pengcheng; Luo Yuetong; Li Yazhou; Zeng Qin; Lu Lei; Zhang Junjun; Zou Jun; Xu Dezheng; Bai Yunqing; Zhou Tao; Chen Hongli; Peng Lei; Song Yong; Huang Qunying

    2010-01-01

    A series of professional codes, which are necessary software tools and data libraries for advanced nuclear system design and analysis, were developed by the FDS Team, including codes for automatic modeling, physics and engineering calculation, virtual simulation and visualization, system engineering and safety analysis, and the related database management, etc. The development of this software series was proposed as an exercise in the development of nuclear informatics. This paper introduces the main functions and key techniques of the software series, as well as some tests and practical applications. (authors)

  6. Advances in passive cooling design and performance analysis

    International Nuclear Information System (INIS)

    Woodcock, J.

    1994-01-01

    The Third International Conference on Containment Design and Operation continues the trend of rapidly extending the state of the art in containment methodology, joining other conferences, OECD-sponsored International Standard Problem exercises, and vendor licensing submittals. Methodology developed for use on plants with passive features is under increasing scrutiny for advanced designs, since the passive features are often the only deviation from existing operating base of the past 30 years of commercial nuclear power. This session, 'Containment Passive Safety Systems Design and Operation,' offers papers on a wide range of topics, with authors from six organizations from around the world, dealing with general passive containments, Westinghouse AP600, large (>1400 MWe) passive plants, and the AECL advanced CANDU reactor. This level and variety of participation underscores the high interest and accelerated methods development associated with advanced passive containment heat removal. The papers presented in this session demonstrate that significant contributions are being made to the advancement of technology necessary for building a new generation of safer, more economical nuclear plants. (author)

  7. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  8. Image analysis and modeling in medical image computing. Recent developments and advances.

    Science.gov (United States)

    Handels, H; Deserno, T M; Meinzer, H-P; Tolxdorff, T

    2012-01-01

    Medical image computing is of growing importance in medical diagnostics and image-guided therapy. Nowadays, image analysis systems integrating advanced image computing methods are used in practice e.g. to extract quantitative image parameters or to support the surgeon during a navigated intervention. However, the grade of automation, accuracy, reproducibility and robustness of medical image computing methods has to be increased to meet the requirements in clinical routine. In the focus theme, recent developments and advances in the field of modeling and model-based image analysis are described. The introduction of models in the image analysis process enables improvements of image analysis algorithms in terms of automation, accuracy, reproducibility and robustness. Furthermore, model-based image computing techniques open up new perspectives for prediction of organ changes and risk analysis of patients. Selected contributions are assembled to present latest advances in the field. The authors were invited to present their recent work and results based on their outstanding contributions to the Conference on Medical Image Computing BVM 2011 held at the University of Lübeck, Germany. All manuscripts had to pass a comprehensive peer review. Modeling approaches and model-based image analysis methods showing new trends and perspectives in model-based medical image computing are described. Complex models are used in different medical applications and medical images like radiographic images, dual-energy CT images, MR images, diffusion tensor images as well as microscopic images are analyzed. The applications emphasize the high potential and the wide application range of these methods. The use of model-based image analysis methods can improve segmentation quality as well as the accuracy and reproducibility of quantitative image analysis. Furthermore, image-based models enable new insights and can lead to a deeper understanding of complex dynamic mechanisms in the human body

  9. Advanced methods of microscope control using μManager software.

    Science.gov (United States)

    Edelstein, Arthur D; Tsuchida, Mark A; Amodaj, Nenad; Pinkard, Henry; Vale, Ronald D; Stuurman, Nico

    μManager is an open-source, cross-platform desktop application to control a wide variety of motorized microscopes, scientific cameras, stages, illuminators, and other microscope accessories. Since its inception in 2005, μManager has grown to support a wide range of microscopy hardware and is now used by thousands of researchers around the world. The application provides a mature graphical user interface and offers open programming interfaces to facilitate plugins and scripts. Here, we present a guide to using some of the recently added advanced μManager features, including hardware synchronization, simultaneous use of multiple cameras, projection of patterned light onto a specimen, live slide mapping, imaging with multi-well plates, particle localization and tracking, and high-speed imaging.

  10. Proceedings of national workshop on advanced methods for materials characterization

    International Nuclear Information System (INIS)

    2004-10-01

    During the past two decades there has been tremendous growth in the field of material science, and a variety of new materials with user-specific properties have been developed, such as smart shape memory alloys, hybrid materials like glass-ceramics, cermets, met-glasses, inorganic-organic composite layered structures, mixed oxides with negative thermal expansion, functional polymer materials etc. The study of nano-particles and of the materials assembled from such particles is another area of active research being pursued all over the world. Preparation and characterization of nano-sized materials is a challenge because of their dimensions and size-dependent properties. This has led to the emergence of a variety of advanced techniques, which need to be brought to the attention of researchers working in the field of material science, a field that requires expertise in physics, chemistry and process engineering. This volume deals with the above aspects, and papers relevant to INIS are indexed separately

  11. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include multitasking on Cray platforms running the UNICOS operating system, the Adjacent-cell Preconditioning acceleration scheme, and graphics codes for displaying computed quantities such as the flux. Further developments for TORT and its companion codes to enhance its present capabilities, as well as to expand its range of applications, are discussed. Speculation on the next generation of neutral particle transport codes at ORNL, especially regarding unstructured grids and high-order spatial approximations, is also mentioned

  12. Advanced methods of microscope control using μManager software

    Directory of Open Access Journals (Sweden)

    Arthur D Edelstein

    2014-07-01

    Full Text Available µManager is an open-source, cross-platform desktop application to control a wide variety of motorized microscopes, scientific cameras, stages, illuminators, and other microscope accessories. Since its inception in 2005, µManager has grown to support a wide range of microscopy hardware and is now used by thousands of researchers around the world. The application provides a mature graphical user interface and offers open programming interfaces to facilitate plugins and scripts. Here, we present a guide to using some of the recently added advanced µManager features, including hardware synchronization, simultaneous use of multiple cameras, projection of patterned light onto a specimen, live slide mapping, imaging with multi-well plates, particle localization and tracking, and high-speed imaging.

  13. Statistical data analysis using SAS intermediate statistical methods

    CERN Document Server

    Marasinghe, Mervyn G

    2018-01-01

    The aim of this textbook (previously titled SAS for Data Analytics) is to teach the use of SAS for statistical analysis of data for advanced undergraduate and graduate students in statistics, data science, and disciplines involving analyzing data. The book begins with an introduction beyond the basics of SAS, illustrated with non-trivial, real-world, worked examples. It proceeds to SAS programming and applications, SAS graphics, statistical analysis of regression models, analysis of variance models, analysis of variance with random and mixed effects models, and then takes the discussion beyond regression and analysis of variance to conclude. Pedagogically, the authors introduce theory and methodological basis topic by topic, present a problem as an application, followed by a SAS analysis of the data provided and a discussion of results. The text focuses on applied statistical problems and methods. Key features include: end of chapter exercises, downloadable SAS code and data sets, and advanced material suitab...

  14. Advances in surface wave methods: Cascaded MASW-SASW

    NARCIS (Netherlands)

    Westerhoff, R.S.; Brouwer, J.H.; Meekes, J.A.C.

    2005-01-01

    The application of the MASW method in areas that show strong lateral variations in subsurface properties is limited. Traditional SASW may yield a better lateral resolution but the dispersion curves (and thus the subsurface models) obtained with the method may be poor. The joint application of MASW

  15. Adherence to Scientific Method while Advancing Exposure Science

    Science.gov (United States)

    Paul Lioy was simultaneously a staunch adherent to the scientific method and an innovator of new ways to conduct science, particularly related to human exposure. Current challenges to science and the application of the scientific method are presented as they relate the approaches...

  16. Ultrasonic and advanced methods for nondestructive testing and material characterization

    National Research Council Canada - National Science Library

    Chen, C. H

    2007-01-01

    ... and physics among others. There are at least two dozen NDT methods in use. In fact any sensor that can examine the inside of material nondestructively is useful for NDT. However the ultrasonic methods are still most popular because of its capability, flexibility, and relative cost effectiveness. For this reason this book places a heavy emphasis...

  17. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

    This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss the new advances and challenges in these fields, using several methods and techniques to address, more efficiently, different and timely applications involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery, and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  18. Analysis methods (from 301 to 351)

    International Nuclear Information System (INIS)

    Analysis methods of materials used in the nuclear field (uranium, plutonium and their compounds, zirconium, magnesium, water...) and determination of impurities. Only reliable methods are selected [fr]

  19. Advanced RF-KO slow-extraction method for the reduction of spill ripple

    CERN Document Server

    Noda, K; Shibuya, S; Uesugi, T; Muramatsu, M; Kanazawa, M; Takada, E; Yamada, S

    2002-01-01

    Two advanced RF-knockout (RF-KO) slow-extraction methods have been developed at HIMAC in order to reduce the spill ripple for accurate heavy-ion cancer therapy: the dual frequency modulation (FM) method and the separated function method. As a result of simulations and experiments, it was verified that the spill ripple could be considerably reduced using these advanced methods compared with the ordinary RF-KO method. The dual FM method and the separated function method yield low spill ripple, with standard deviations of around 25% and 15%, respectively, during beam extraction over around 2 s, in good agreement with the simulation results.

  20. Weathering Patterns of Ignitable Liquids with the Advanced Distillation Curve Method.

    Science.gov (United States)

    Bruno, Thomas J; Allen, Samuel

    2013-01-01

    One can take advantage of the striking similarity of ignitable liquid vaporization (or weathering) patterns and the separation observed during distillation to predict the composition of residual compounds in fire debris. This is done with the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. Analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, Karl Fischer coulombic titrimetry, refractometry, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. We have applied this method on product streams such as finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oils streams (used automotive and transformer oils). In this paper, we present results on a variety of ignitable liquids that are not commodity fuels, chosen from the Ignitable Liquids Reference Collection (ILRC). These measurements are assembled into a preliminary database. From this selection, we discuss the significance and forensic application of the temperature data grid and the composition explicit data channel of the ADC.

  1. Introduction to the Special Issue on Advancing Methods for Analyzing Dialect Variation.

    Science.gov (United States)

    Clopper, Cynthia G

    2017-07-01

    Documenting and analyzing dialect variation is traditionally the domain of dialectology and sociolinguistics. However, modern approaches to acoustic analysis of dialect variation have their roots in Peterson and Barney's [(1952). J. Acoust. Soc. Am. 24, 175-184] foundational work on the acoustic analysis of vowels that was published in the Journal of the Acoustical Society of America (JASA) over 6 decades ago. Although Peterson and Barney (1952) were not primarily concerned with dialect variation, their methods laid the groundwork for the acoustic methods that are still used by scholars today to analyze vowel variation within and across languages. In more recent decades, a number of methodological advances in the study of vowel variation have been published in JASA, including work on acoustic vowel overlap and vowel normalization. The goal of this special issue was to honor that tradition by bringing together a set of papers describing the application of emerging acoustic, articulatory, and computational methods to the analysis of dialect variation in vowels and beyond.
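    One methodological staple in this literature is speaker-intrinsic vowel normalization. The sketch below implements Lobanov (z-score) normalization of F1/F2 measurements; it is a generic textbook procedure given for orientation, not code from any paper in the special issue, and the formant values are invented.

```python
import numpy as np

def lobanov_normalize(f1, f2):
    """Speaker-intrinsic z-score (Lobanov) normalization of vowel formants.

    f1, f2: arrays of first/second formant measurements (Hz) for all vowel
    tokens of a single speaker. Returns the normalized (unitless) values.
    """
    z1 = (f1 - f1.mean()) / f1.std()
    z2 = (f2 - f2.mean()) / f2.std()
    return z1, z2

# Invented tokens (Hz) for one speaker
f1 = np.array([310.0, 800.0, 650.0, 420.0, 700.0])
f2 = np.array([2200.0, 1200.0, 1700.0, 1900.0, 1100.0])
print(lobanov_normalize(f1, f2))
```

    Normalizing each speaker's formants to zero mean and unit variance removes much of the anatomical between-speaker variation, so that the remaining differences across speaker groups can more plausibly be attributed to dialect.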

  2. Weathering Patterns of Ignitable Liquids with the Advanced Distillation Curve Method

    Science.gov (United States)

    Bruno, Thomas J; Allen, Samuel

    2013-01-01

    One can take advantage of the striking similarity of ignitable liquid vaporization (or weathering) patterns and the separation observed during distillation to predict the composition of residual compounds in fire debris. This is done with the advanced distillation curve (ADC) metrology, which separates a complex fluid by distillation into fractions that are sampled, and for which thermodynamically consistent temperatures are measured at atmospheric pressure. The collected sample fractions can be analyzed by any method that is appropriate. Analytical methods we have applied include gas chromatography (with flame ionization, mass spectrometric and sulfur chemiluminescence detection), thin layer chromatography, FTIR, Karl Fischer coulombic titrimetry, refractometry, corrosivity analysis, neutron activation analysis and cold neutron prompt gamma activation analysis. We have applied this method on product streams such as finished fuels (gasoline, diesel fuels, aviation fuels, rocket propellants), crude oils (including a crude oil made from swine manure) and waste oils streams (used automotive and transformer oils). In this paper, we present results on a variety of ignitable liquids that are not commodity fuels, chosen from the Ignitable Liquids Reference Collection (ILRC). These measurements are assembled into a preliminary database. From this selection, we discuss the significance and forensic application of the temperature data grid and the composition explicit data channel of the ADC. PMID:26401423

  3. Method of advancing research and development of fast breeder reactors

    International Nuclear Information System (INIS)

    1988-01-01

    In the long term plan for atomic energy development and utilization, fast breeder reactors are to be developed as the mainstay of future nuclear power generation in Japan, and, as their development advances, the aim is to build up a plutonium utilization system based on FBRs that is superior to the uranium utilization system based on LWRs. It has also been decided that the development of FBRs requires sustained effort over a considerably long period under an appropriate system of cooperation between government and the private sector, and that its concrete development is to be deliberated in succession by the expert subcommittee on FBR development projects of the Atomic Energy Commission. The subcommittee was founded in May, 1986, to deliberate on the long term promotion measures for FBR development, the measures for promoting the research and development, the examination of the basic specification of a demonstration FBR, the measures for promoting international cooperation, and other important matters. As the results of the investigation, the situation surrounding FBR development, the fundamentals to be observed in promoting the research and development, the subjects of the research and development and so on are reported. (Kako, I.)

  4. Energy and advanced exergy analysis of an existing hydrocarbon recovery process

    International Nuclear Information System (INIS)

    Mehrpooya, Mehdi; Lazemzade, Roozbeh; Sadaghiani, Mirhadi S.; Parishani, Hossein

    2016-01-01

    Highlights: • Advanced exergoeconomic analysis is performed for propane refrigerant system. • Avoidable/unavoidable & endogenous/exogenous irreversibilities were calculated. • Advanced exergetic analysis identifies the potentials for improving the system. - Abstract: An advanced exergy analysis of the Ethane recovery plant in the South Pars gas field is presented. An industrial refrigeration cycle with propane refrigerant is investigated by the exergy analysis method. The equations of exergy destruction and exergetic efficiency for the main cycle units such as evaporators, condensers, compressors, and expansion valves are developed. Exergetic efficiency of the refrigeration cycle is determined to be 33.9% indicating a high potential for improvements. The simulation results reveal that the exergy loss and exergetic efficiencies of the air cooler and expansion sections respectively are the lowest among the compartments of the cycle. The coefficient of performance (COP) is obtained as 2.05. Four parts of irreversibility (avoidable/unavoidable) and (endogenous/exogenous) are calculated for the units with highest inefficiencies. The advanced exergy analysis reveals that the exergy destruction has two major contributors: (1) 59.61% of the exergy is lost in the unavoidable form in all units and (2) compressors contribute to 25.47% of the exergy destruction. So there is a high potential for improvement for these units, since 63.38% of this portion is avoidable.
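    The four-way decomposition reported above can be assembled from the total, unavoidable, endogenous and unavoidable-endogenous parts as in the following sketch. The function name and the numerical inputs are placeholders rather than plant values, but the bookkeeping is the standard one used in advanced exergy analysis.

```python
def split_exergy_destruction(E_D, E_D_UN, E_D_EN, E_D_UN_EN):
    """Split a component's exergy destruction rate (kW) into the four parts
    used in advanced exergy analysis.

    E_D       : total exergy destruction of the component
    E_D_UN    : unavoidable part (component at best achievable performance)
    E_D_EN    : endogenous part (component operating with an ideal remaining system)
    E_D_UN_EN : unavoidable-endogenous part
    """
    return {
        "avoidable":              E_D - E_D_UN,
        "unavoidable":            E_D_UN,
        "endogenous":             E_D_EN,
        "exogenous":              E_D - E_D_EN,
        "unavoidable-endogenous": E_D_UN_EN,
        "unavoidable-exogenous":  E_D_UN - E_D_UN_EN,
        "avoidable-endogenous":   E_D_EN - E_D_UN_EN,
        "avoidable-exogenous":    (E_D - E_D_UN) - (E_D_EN - E_D_UN_EN),
    }

# Placeholder numbers (kW) for a single compressor
for name, value in split_exergy_destruction(500.0, 300.0, 380.0, 240.0).items():
    print(f"{name:>24s}: {value:7.1f} kW")
```

    The avoidable-endogenous portion is the one a designer can actually influence by improving the component itself, which is why such an analysis singles out the compressors as the first candidates for improvement.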

  5. Advanced Steel Microstructural Classification by Deep Learning Methods.

    Science.gov (United States)

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called its microstructure. It stores the genesis of a material and determines all of its physical and chemical properties. While microstructural characterization is widespread and well known, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure can be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging and only a few prior studies exist. Prior works focused on features designed and engineered by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification in the examples of certain microstructural constituents of low carbon steel. This novel method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and, above all, objective approach to the difficult task of steel quality appreciation.
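    The max-voting step that turns the network's pixel-wise predictions into a single constituent label for each segmented object can be written compactly. In the sketch below the class names, the ignore label and the small prediction map are invented for illustration and are not taken from the paper.

```python
import numpy as np

CLASSES = ["ferrite/pearlite", "martensite", "bainite"]   # illustrative constituent labels

def max_vote(segmentation_map, n_classes=len(CLASSES), ignore_label=-1):
    """Assign one label to an object from per-pixel class predictions.

    segmentation_map: 2-D integer array of per-pixel class indices produced by a
    fully convolutional network; pixels outside the object carry ignore_label.
    """
    object_pixels = segmentation_map[segmentation_map != ignore_label]
    votes = np.bincount(object_pixels.ravel(), minlength=n_classes)
    return CLASSES[int(np.argmax(votes))], votes

# Invented 4x4 prediction for one segmented grain
pred = np.array([[ 1,  1,  0, -1],
                 [ 1,  1,  1, -1],
                 [ 1,  2,  1, -1],
                 [-1, -1, -1, -1]])
label, votes = max_vote(pred)
print(label, votes)   # -> martensite [1 7 1]
```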

  6. Development and application of advanced methods for electronic structure calculations

    DEFF Research Database (Denmark)

    Schmidt, Per Simmendefeldt

    This thesis relates to improvements and applications of beyond-DFT methods for electronic structure calculations that are applied in computational material science. The improvements are of both technical and principal character. The well-known GW approximation is optimized for accurate calculations of electronic excitations in two-dimensional materials by exploiting exact limits of the screened Coulomb potential. This approach reduces the computational time by an order of magnitude, enabling large scale applications. The GW method is further improved by including so-called vertex corrections. This turns... For this reason, part of this thesis relates to developing and applying a new method for constructing so-called norm-conserving PAW setups, that are applicable to GW calculations, by using a genetic algorithm. The effect of applying the new setups significantly affects the absolute band positions, both for bulk...

  7. Design and analysis of CANDU advanced fuel -Development of the advanced CANDU technology-

    International Nuclear Information System (INIS)

    Seok, Ho Cheon; Shim, Ki Seop; Byeon, Taek Sang; Park, Kwang Seok; Kim, Bong Ki; Lee, Yeong Uk; Jeong, Chang Joon; Oh, Deok Joo; Lee, Ui Joo; Park, Joo Hwan; Lee, Sang Yong; Jeong, Beop Dong; Choi, Han Rim; Lee, Yeong Jin; Choi, Cheol Jin; Choi, Jong Ho; Lee, Kwang Won; Cho, Cheon Hyi; On, Myeong Ryong; Kim, Taek Mo; Lim, Hong Sik; Lee, Kang Moon; Lee, Nam Ho; Lee, Kyu Hyeong

    1994-07-01

    It has been projected that a total of 5 pressurized heavy water reactors (PHWR), including Wolsong 1 under operation and Wolsong 2, 3 and 4 under construction, will be operated by 2006, so about 500 tons of natural uranium will be consumed every year and a large amount of spent fuel will be generated. Therefore, the ultimate goal of this R and D project is to develop a CANDU advanced fuel having the following capabilities compared with the existing standard fuel: (1) to reduce the linear heat generation rating by more than 15% (i.e., less than 50 kW/m), (2) to extend fuel burnup by more than 3 times (i.e., higher than 21,000 MWD/MTU), and (3) to increase critical channel power by more than 5%. Accordingly, the following were performed in this fiscal year: (1) Undertake the CANFLEX-NU design and thermal-mechanical performance analysis, and prepare design documents, (2) Establish the reactor physics analysis code system, and investigate the compatibility of the CANFLEX-NU fuel with the standard 37-element fuel in the CANDU-6 reactor, (3) Establish the safety analysis methodology assuming a CANFLEX-NU loaded CANDU-6 reactor, and perform preliminary thermalhydraulic and fuel behavior analyses for the selected DBA accidents, (4) Investigate the reactor physics analysis code system as a pre-study for CANFLEX-SEU loaded reactors

  8. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  9. Advanced evaluation method of SG TSP BEC hole blockage rate

    International Nuclear Information System (INIS)

    Izumida, Hiroyuki; Nagata, Yasuyuki; Harada, Yutaka; Murakami, Ryuji

    2003-01-01

    In spite of the control of the water chemistry of the SG secondary feed-water in PWR SGs, the SG TSP BEC holes, which form the flow path of the secondary water, often become clogged. In the past, the trending of the BEC hole blockage rate has been conducted by evaluating original ECT signals and by visual inspections. However, because the original ECT signals of deposits are diverse, it has become difficult to analyze them with the existing evaluation method based on the original ECT signals. In this regard, we have developed a secondary side visual inspection system, which enables high-accuracy evaluation of the BEC hole blockage rate, and a new ECT signal evaluation method. (author)

  10. Advanced FDTD methods parallelization, acceleration, and engineering applications

    CERN Document Server

    Yu, Wenhua

    2011-01-01

    The finite-difference time-domain (FDTD) method has revolutionized antenna design and electromagnetics engineering. Here's a cutting-edge book that focuses on the performance optimization and engineering applications of FDTD simulation systems. Covering the latest developments in this area, this unique resource offers expert advice on the FDTD method, hardware platforms, and network systems. Moreover, the book offers guidance in distinguishing between the many different electromagnetics software packages on the market today. You also find a complete chapter dedicated to large multi-scale pro

  11. Bayesian methods for data analysis

    CERN Document Server

    Carlin, Bradley P.

    2009-01-01

    Approaches for statistical inference: Introduction; Motivating Vignettes; Defining the Approaches; The Bayes-Frequentist Controversy; Some Basic Bayesian Models. The Bayes approach: Introduction; Prior Distributions; Bayesian Inference; Hierarchical Modeling; Model Assessment; Nonparametric Methods. Bayesian computation: Introduction; Asymptotic Methods; Noniterative Monte Carlo Methods; Markov Chain Monte Carlo Methods. Model criticism and selection: Bayesian Modeling; Bayesian Robustness; Model Assessment; Bayes Factors via Marginal Density Estimation; Bayes Factors
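    As a minimal taste of the Bayes approach covered in the early chapters, the following sketch performs conjugate Beta-Binomial updating; the prior hyperparameters and the data are invented for illustration and the code is not taken from the book.

```python
from scipy import stats

# Prior: theta ~ Beta(a, b); data: y successes in n Bernoulli trials.
a, b = 2.0, 2.0          # weakly informative prior (illustrative values)
y, n = 7, 10             # hypothetical data

posterior = stats.beta(a + y, b + n - y)   # conjugacy gives the posterior in closed form
print("posterior mean:", posterior.mean())
print("95% credible interval:", posterior.interval(0.95))
```

    Models without conjugate structure are handled with the asymptotic and Markov chain Monte Carlo machinery listed under the computation chapter.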

  12. Composite Structure Modeling and Analysis of Advanced Aircraft Fuselage Concepts

    Science.gov (United States)

    Mukhopadhyay, Vivek; Sorokach, Michael R.

    2015-01-01

    The NASA Environmentally Responsible Aviation (ERA) project and the Boeing Company are collaborating to advance the unitized damage-arresting composite airframe technology with application to the Hybrid-Wing-Body (HWB) aircraft. The testing of a HWB fuselage section with Pultruded Rod Stitched Efficient Unitized Structure (PRSEUS) construction is presently being conducted at NASA Langley. Based on lessons learned from previous HWB structural design studies, improved finite-element models (FEM) of the HWB multi-bay and bulkhead assembly are developed to evaluate the performance of the PRSEUS construction. In order to assess the comparative weight reduction benefits of the PRSEUS technology, conventional cylindrical skin-stringer-frame models of a cylindrical and a double-bubble section fuselage concept are developed. Stress analysis with the design cabin-pressure load and scenario-based case studies are conducted for design improvement in each case. Alternate analyses with stitched composite hat-stringers and C-frames are also presented, in addition to the foam-core sandwich frame and pultruded rod-stringer construction. The FEM structural stresses, strains and weights are computed and compared for relative weight/strength benefit assessment. The structural analysis and specific weight comparison of these stitched composite advanced aircraft fuselage concepts demonstrated that the pressurized HWB fuselage section assembly can be structurally as efficient as the conventional cylindrical fuselage section with composite stringer-frame and PRSEUS construction, and significantly better than the conventional aluminum construction and the double-bubble section concept.

  13. 20% inlet header break analysis of Advanced Heavy Water Reactor

    International Nuclear Information System (INIS)

    Srivastava, A.; Gupta, S.K.; Venkat Raj, V.; Singh, R.; Iyer, K.

    2001-01-01

    The proposed Advanced Heavy Water Reactor (AHWR) is a 750 MWt vertical pressure tube type boiling light water cooled and heavy water moderated reactor. A passive design feature of this reactor is that the heat removal is achieved through natural circulation of primary coolant at all power levels, with no primary coolant pumps. Loss of coolant due to failure of inlet header results in depressurization of primary heat transport (PHT) system and containment pressure rise. Depressurization activates various protective and engineered safety systems like reactor trip, isolation condenser and advanced accumulator, limiting the consequences of the event. This paper discusses the thermal hydraulic transient analysis for evaluating the safety of the reactor, following 20% inlet header break using RELAP5/MOD3.2. For the analysis, the system is discretized appropriately to simulate possible flow reversal in one of the core paths during the transient. Various modeling aspects are discussed in this paper and predictions are made for different parameters like pressure, temperature, steam quality and flow in different parts of the Primary Heat Transport (PHT) system. Flow and energy discharges into the containment are also estimated for use in containment analysis. (author)

  14. Geometrical methods for power network analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bellucci, Stefano; Tiwari, Bhupendra Nath [Istituto Nazioneale di Fisica Nucleare, Frascati, Rome (Italy). Lab. Nazionali di Frascati; Gupta, Neeraj [Indian Institute of Technology, Kanpur (India). Dept. of Electrical Engineering

    2013-02-01

    Uses advanced geometrical methods to analyse power networks. Provides a self-contained and tutorial introduction. Includes a fully worked-out example for the IEEE 5 bus system. This book is a short introduction to power system planning and operation using advanced geometrical methods. The approach is based on well-known insights and techniques developed in theoretical physics in the context of Riemannian manifolds. The proof of principle and robustness of this approach is examined in the context of the IEEE 5 bus system. This work addresses applied mathematicians, theoretical physicists and power engineers interested in novel mathematical approaches to power network theory.

  15. Substoichiometric method in the simple radiometric analysis

    International Nuclear Information System (INIS)

    Ikeda, N.; Noguchi, K.

    1979-01-01

    The substoichiometric method is applied to simple radiometric analysis. Two methods - the standard reagent method and the standard sample method - are proposed. The validity of the principle of the methods is verified experimentally in the determination of silver by the precipitation method, or of zinc by the ion-exchange or solvent-extraction method. The proposed methods are simple and rapid compared with the conventional superstoichiometric method. (author)

  16. Advances in computational methods for Quantum Field Theory calculations

    NARCIS (Netherlands)

    Ruijl, B.J.G.

    2017-01-01

    In this work we describe three methods to improve the performance of Quantum Field Theory calculations. First, we simplify large expressions to speed up numerical integrations. Second, we design Forcer, a program for the reduction of four-loop massless propagator integrals. Third, we extend the R*

  17. Origins, Methods and Advances in Qualitative Meta-Synthesis

    Science.gov (United States)

    Nye, Elizabeth; Melendez-Torres, G. J.; Bonell, Chris

    2016-01-01

    Qualitative research is a broad term encompassing many methods. Critiques of the field of qualitative research argue that while individual studies provide rich descriptions and insights, the absence of connections drawn between studies limits their usefulness. In response, qualitative meta-synthesis serves as a design to interpret and synthesise…

  18. Method of public support evaluation for advanced NPP deployment

    International Nuclear Information System (INIS)

    Zezula, L.; Hermansky, B.

    2005-01-01

    Public support for nuclear power can be fully recovered only if the public receives transparent information from the very beginning of the new power source selection process and is made part of an interactive dialogue. The presented method was developed with the objective of facilitating the complex process of interaction between the utilities and the public. Our method of public support evaluation allows designs of new nuclear power plants to be classified taking into consideration the public attitude towards continued nuclear power deployment in the Czech Republic as well as the preference for a certain plant design. The method is based on a model with a set of probabilistic input metrics, which permits the offered concepts to be compared with the reference one with a high degree of objectivity. This method is part of the more complex evaluation procedure applicable to the assessment of new designs that uses the computer code ''Potencial'' developed at the NRI Rez plc. The metrics of the established public support criteria are discussed. (author)

  19. New advances in alpha spectrometry by liquid scintillation methods

    International Nuclear Information System (INIS)

    McDowell, W.J.; Case, G.N.

    1979-01-01

    Although the ability to count alpha particles by liquid scintillation methods has long been recognized, limited use has been made of the method because of problems of high background and alpha energy identification. In recent years, new developments in the methods of introducing the alpha-emitting nuclide into the scintillator, in detector construction, and in the electronics for processing the energy-analog and time-analog signals from the detector have allowed significant alleviation of the problems of alpha spectrometry by liquid scintillation. Energy resolutions of 200 to 300 keV full peak width at half maximum and background rejection of 99% of all beta plus gamma interference are now possible. Alpha liquid scintillation spectrometry is now suitable for a wide range of applications, from the accurate quantitative determination of relatively large amounts of known nuclides in laboratory-generated samples to the detection and identification of very small, subpicocurie amounts of alpha emitters in environmental-type samples. Suitable nuclide separation procedures, sample preparation methods, and instrument configurations are available for a variety of analyses

  20. Structural analysis of advanced spent fuel conditioning process

    International Nuclear Information System (INIS)

    Gu, J. H.; Jung, W. M.; Jo, I. J.; Gug, D. H.; Yoo, K. S.

    2003-01-01

    An advanced spent fuel conditioning process (ACP) is being developed for the safe and effective management of the spent fuel arising from domestic nuclear power plants, and its demonstration facility is under design. This facility will be prepared by modifying IMEF's reserve hot cell facility, which had been reserved for future use, taking into account the characteristics of the ACP. This study presents a basic structural architecture design and analysis results for the ACP hot cell, including the modification of the IMEF. The results of this study will be used for the detailed design of the ACP demonstration facility and utilized as basic data for the licensing of the ACP facility

  1. Advanced Wireless Power Transfer Vehicle and Infrastructure Analysis (Presentation)

    Energy Technology Data Exchange (ETDEWEB)

    Gonder, J.; Brooker, A.; Burton, E.; Wang, J.; Konan, A.

    2014-06-01

    This presentation discusses current research at NREL on advanced wireless power transfer vehicle and infrastructure analysis. The potential benefits of E-roadway include more electrified driving miles from battery electric vehicles, plug-in hybrid electric vehicles, or even properly equipped hybrid electric vehicles (i.e., more electrified miles could be obtained from a given battery size, or electrified driving miles could be maintained while using smaller and less expensive batteries, thereby increasing cost competitiveness and potential market penetration). The system optimization aspect is key given the potential impact of this technology on the vehicles, the power grid and the road infrastructure.

  2. Creep analysis of fuel plates for the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Swinson, W.F.; Yahr, G.T.

    1994-11-01

    The reactor for the planned Advanced Neutron Source will use closely spaced arrays of fuel plates. The plates are thin and will have a core containing enriched uranium silicide fuel clad in aluminum. The heat load caused by the nuclear reactions within the fuel plates will be removed by flowing high-velocity heavy water through narrow channels between the plates. However, the plates will still be at elevated temperatures while in service, and the potential for excessive plate deformation because of creep must be considered. An analysis to include creep for deformation and stresses because of temperature over a given time span has been performed and is reported herein

  3. Structural Analysis of Advanced Refueling Machine of APR1400

    International Nuclear Information System (INIS)

    Cho, J. R.; Kim, Y. H.; Park, B. T.; Park, J. B.; Jung, J. H.

    2007-01-01

    The Refueling Machine (RM) consists of two structural parts, the bridge and the trolley. The bridge structure is approximately 8.5 m long and 5 m wide and is primarily composed of two deep wide flange sections spanning the reactor area at the operating level. The trolley is mounted on wheels that roll on the rails of the bridge. Vertical movements of the trolley and bridge are restricted by guide rollers. In this paper, dynamic and structural analyses based on the earthquake spectrum are carried out to verify the structural integrity of the advanced refueling machine. This is done by 3-dimensional finite element analysis using ANSYS software.

  4. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    International Nuclear Information System (INIS)

    Tsiflikas, Ilias; Biermann, Christina; Thomas, Christoph; Ketelsen, Dominik; Claussen, Claus D.; Heuschmid, Martin

    2012-01-01

    Objectives: The aim of this study was to evaluate time efficiency and diagnostic reproducibility of an advanced vessel analysis software for diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the now available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.
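
    As an aside, the five-point grading scale quoted above maps directly onto a simple threshold rule. The sketch below is only an illustration of that mapping; the function name and the handling of the category boundaries are inferred from the ranges listed in the abstract, not taken from the study protocol.

```python
def stenosis_category(percent_stenosis: float, occluded: bool = False) -> str:
    """Map a measured degree of carotid stenosis to the five grades quoted above.

    Illustrative only: boundary handling follows the ranges in the abstract
    (I: 0%, II: 1-50%, III: 51-69%, IV: 70-99%, V: total occlusion).
    """
    if occluded:
        return "V (total occlusion)"
    if percent_stenosis <= 0:
        return "I (0%)"
    if percent_stenosis <= 50:
        return "II (1-50%)"
    if percent_stenosis <= 69:
        return "III (51-69%)"
    return "IV (70-99%)"


print(stenosis_category(42))   # II (1-50%)
print(stenosis_category(85))   # IV (70-99%)
```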

  5. Carotid artery stenosis: Performance of advanced vessel analysis software in evaluating CTA

    Energy Technology Data Exchange (ETDEWEB)

    Tsiflikas, Ilias, E-mail: ilias.tsiflikas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Biermann, Christina, E-mail: christina.biermann@siemens.com [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Siemens AG, Siemens Healthcare Consulting, Allee am Röthelheimpark 3A, 91052 Erlangen (Germany); Thomas, Christoph, E-mail: christoph.thomas@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Ketelsen, Dominik, E-mail: dominik.ketelsen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Claussen, Claus D., E-mail: claus.claussen@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany); Heuschmid, Martin, E-mail: martin.heuschmid@med.uni-tuebingen.de [University Hospital of Tuebingen, Diagnostic and Interventional Radiology, Hoppe-Seyler-Str. 3, 72076 Tuebingen (Germany)

    2012-09-15

    Objectives: The aim of this study was to evaluate time efficiency and diagnostic reproducibility of an advanced vessel analysis software for diagnosis of carotid artery stenosis. Material and methods: 40 patients with suspected carotid artery stenosis received head and neck DE-CTA as part of their pre-interventional workup. Acquired data were evaluated by 2 independent radiologists. Stenosis grading was performed by MPR eyeballing with freely adjustable MPRs and with a preliminary prototype of the now available client-server and advanced visualization software syngo.via CT Vascular (Siemens Healthcare, Erlangen, Germany). Stenoses were graded according to the following 5 categories: I: 0%, II: 1–50%, III: 51–69%, IV: 70–99% and V: total occlusion. Furthermore, time to diagnosis for each carotid artery was recorded. Results: Both readers achieved very good specificity values and good and very good sensitivity values, respectively, without significant differences between the two reading methods. Furthermore, there was a very good correlation between both readers for both reading methods without significant differences (kappa value: standard image interpretation k = 0.809; advanced vessel analysis software k = 0.863). Using advanced vessel analysis software resulted in a significant time saving (p < 0.0001) for both readers. Time to diagnosis could be decreased by approximately 55%. Conclusions: The advanced vessel analysis application CT Vascular of the new imaging software syngo.via (Siemens Healthcare, Forchheim, Germany) provides a high rate of reproducibility in the assessment of carotid artery stenosis. Furthermore, a significant time saving in comparison to standard image interpretation is achievable.

  6. Investigation and Development of Advanced Surface Microanalysis Techniques and Methods

    Science.gov (United States)

    1983-04-01

    Recoverable citation fragments from this record include: P. K. Chu and S. L. Grube, Analytical Chemistry, 1985, 57; E. Silberg, T. Y. Chang, E. A. Caridi, C. A. Evans Jr. and C. J. Hitzman, in Gallium Arsenide and Related Compounds 1982 (10th International Symposium); and "Direct Lateral and In-Depth Distributional Analysis for Ionic Contaminants," P. K. Chu and S. L. Grube.

  7. Advanced hydraulic fracturing methods to create in situ reactive barriers

    International Nuclear Information System (INIS)

    Murdoch, L.

    1997-01-01

    This article describes the use of hydraulic fracturing to increase permeability in geologic formations where in-situ remedial action of contaminant plumes will be performed. Several in-situ treatment strategies are discussed, including the use of hydraulic fracturing to create in situ redox zones for treatment of organics and inorganics. Hydraulic fracturing methods offer a mechanism for the in-situ treatment of gently dipping layers of reactive compounds. Specialized methods using real-time monitoring and a high-energy jet during fracturing allow the form of the fracture to be influenced, such as the creation of asymmetric fractures beneath potential sources (i.e. tanks, pits, buildings) that should not be penetrated by boring. Some examples of field applications of this technique, such as creating fractures filled with zero-valent iron to reductively dechlorinate halogenated hydrocarbons and the use of granular activated carbon to adsorb compounds, are discussed.

  8. Advanced Numerical and Theoretical Methods for Photonic Crystals and Metamaterials

    Science.gov (United States)

    Felbacq, Didier

    2016-11-01

    This book provides a set of theoretical and numerical tools useful for the study of wave propagation in metamaterials and photonic crystals. While concentrating on electromagnetic waves, most of the material can be used for acoustic (or quantum) waves. For each presented numerical method, numerical code written in MATLAB® is presented. The codes are limited to 2D problems and can be easily translated into Python or Scilab, and used directly with Octave as well.

  9. Advanced methods for scattering amplitudes in gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Peraro, Tiziano

    2014-09-24

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.

  10. Advanced methods for scattering amplitudes in gauge theories

    International Nuclear Information System (INIS)

    Peraro, Tiziano

    2014-01-01

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.

  11. Advanced scoring method of eco-efficiency in European cities.

    Science.gov (United States)

    Moutinho, Victor; Madaleno, Mara; Robaina, Margarita; Villar, José

    2018-01-01

    This paper analyzes a set of selected German and French cities' performance in terms of the relative behavior of their eco-efficiencies, computed as the ratio of their gross domestic product (GDP) over their CO2 emissions. For this analysis, eco-efficiency scores of the selected cities are computed using the data envelopment analysis (DEA) technique, taking the eco-efficiencies as outputs, and the inputs being the energy consumption, the population density, the labor productivity, the resource productivity, and the patents per inhabitant. Once DEA results are analyzed, the Malmquist productivity indexes (MPI) are used to assess the time evolution of the technical efficiency, technological efficiency, and productivity of the cities over the window periods 2000 to 2005 and 2005 to 2008. Some of the main conclusions are that (1) most of the analyzed cities seem to have suboptimal scales, being one of the causes of their inefficiency; (2) there is evidence that high GDP over CO2 emissions does not imply high eco-efficiency scores, meaning that DEA-like approaches are useful to complement more simplistic ranking procedures, pointing out potential inefficiencies at the input levels; (3) efficiencies performed worse during the period 2000-2005 than during the period 2005-2008, suggesting the possibility of corrective actions taken during or at the end of the first period but impacting only on the second period, probably due to an increasing environmental awareness of policymakers and governors; and (4) MPI analysis shows a positive technological evolution of all cities, according to the general technological evolution of the reference cities, reflecting a generalized convergence of most cities to their technological frontier and therefore an evolution in the right direction.
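
    The eco-efficiency score described above comes from a data envelopment analysis in which each city's eco-efficiency (GDP over CO2 emissions) is the output and indicators such as energy consumption and population density are the inputs. The sketch below is not the authors' model; it is a minimal input-oriented CCR DEA envelopment program on invented data for four hypothetical cities, solved with scipy.optimize.linprog, showing how a per-city efficiency score between 0 and 1 is obtained.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: 4 cities (DMUs), 2 inputs, 1 output (eco-efficiency = GDP / CO2).
# None of these numbers come from the paper.
X = np.array([[120.0, 3.1],   # inputs: energy consumption, population density (arbitrary units)
              [ 95.0, 2.4],
              [140.0, 4.0],
              [110.0, 2.8]])
Y = np.array([[1.8],          # output: GDP / CO2 emissions (arbitrary units)
              [2.1],
              [1.5],
              [2.0]])

def ccr_input_oriented(X, Y, o):
    """Efficiency of DMU `o` from the CCR envelopment LP:
    min theta  s.t.  sum_j lam_j x_j <= theta * x_o,  sum_j lam_j y_j >= y_o,  lam >= 0."""
    n, m = X.shape            # number of DMUs, number of inputs
    _, s = Y.shape            # number of outputs
    c = np.zeros(1 + n)
    c[0] = 1.0                # decision vector z = [theta, lam_1..lam_n]; minimize theta
    A_ub, b_ub = [], []
    for i in range(m):        # input constraints: sum_j lam_j X[j,i] - theta*X[o,i] <= 0
        A_ub.append(np.concatenate(([-X[o, i]], X[:, i])))
        b_ub.append(0.0)
    for r in range(s):        # output constraints: -sum_j lam_j Y[j,r] <= -Y[o,r]
        A_ub.append(np.concatenate(([0.0], -Y[:, r])))
        b_ub.append(-Y[o, r])
    bounds = [(0.0, None)] * (1 + n)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

for o in range(X.shape[0]):
    print(f"city {o}: efficiency score = {ccr_input_oriented(X, Y, o):.3f}")
```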

  12. Chemical methods of rock analysis

    National Research Council Canada - National Science Library

    Jeffery, P. G; Hutchison, D

    1981-01-01

    .... Such methods include those based upon spectrophotometry, flame emission spectrometry and atomic absorption spectroscopy, as well as gravimetry, titrimetry and the use of ion-selective electrodes...

  13. Advanced Computational Methods for Thermal Radiative Heat Transfer

    Energy Technology Data Exchange (ETDEWEB)

    Tencer, John; Carlberg, Kevin Thomas; Larsen, Marvin E.; Hogan, Roy E.,

    2016-10-01

    Participating media radiation (PMR) calculations in weapon safety analyses for abnormal thermal environments are too costly to perform routinely. This cost may be substantially reduced by applying reduced order modeling (ROM) techniques. The application of ROM to PMR is a new and unique approach for this class of problems. This approach was investigated by the authors and shown to provide significant reductions in the computational expense associated with typical PMR simulations. Once this technology is migrated into production heat transfer analysis codes, this capability will enable the routine use of PMR heat transfer in higher-fidelity simulations of weapon response in fire environments.

  14. Aquatic ecosystem protection and restoration: Advances in methods for assessment and evaluation

    Science.gov (United States)

    Bain, M.B.; Harig, A.L.; Loucks, D.P.; Goforth, R.R.; Mills, K.E.

    2000-01-01

    Many methods and criteria are available to assess aquatic ecosystems, and this review focuses on a set that demonstrates advancements from community analyses to methods spanning large spatial and temporal scales. Basic methods have been extended by incorporating taxa sensitivity to different forms of stress, adding measures linked to system function, synthesizing multiple faunal groups, integrating biological and physical attributes, spanning large spatial scales, and enabling simulations through time. These tools can be customized to meet the needs of a particular assessment and ecosystem. Two case studies are presented to show how new methods were applied at the ecosystem scale for achieving practical management goals. One case used an assessment of biotic structure to demonstrate how enhanced river flows can improve habitat conditions and restore a diverse fish fauna reflective of a healthy riverine ecosystem. In the second case, multitaxonomic integrity indicators were successful in distinguishing lake ecosystems that were disturbed, healthy, and in the process of restoration. Most methods strive to address the concept of biological integrity, and assessment effectiveness can often be impeded by the lack of more specific ecosystem management objectives. Scientific and policy explorations are needed to define new ways for designating a healthy system so as to allow specification of precise quality criteria that will promote further development of ecosystem analysis tools.

  15. Advanced scientific computational methods and their applications of nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)

    International Nuclear Information System (INIS)

    Oka, Yoshiaki; Okuda, Hiroshi

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the first issue, giving an overview of these methods and an introduction to continuum simulation methods. The finite element method is also reviewed as one of their applications. (T. Tanaka)

  16. New trends and advanced methods in interdisciplinary mathematical sciences

    CERN Document Server

    2017-01-01

    The latest of five multidisciplinary volumes, this book spans the STEAM-H (Science, Technology, Engineering, Agriculture, Mathematics, and Health) disciplines with the intent to generate meaningful interdisciplinary interaction and student interest. Emphasis is placed on important methods and applications within and beyond each field. Topics include geometric triple systems, image segmentation, pattern recognition in medicine, pricing barrier options, p-adic numbers distribution in geophysics data pattern, adelic physics, and evolutionary game theory. Contributions were by invitation only and peer-reviewed. Each chapter is reasonably self-contained and pedagogically presented for a multidisciplinary readership.

  17. Advanced Control Methods for Optimization of Arc Welding

    DEFF Research Database (Denmark)

    Thomsen, J. S.

    Gas Metal Arc Welding (GMAW) is a process used for joining pieces of metal. Probably, the GMAW process is the most successful and widely used welding method in the industry today. A key issue in welding is the quality of the welds produced. The quality of a weld is influenced by several factors...... in the overall welding process; one of these factors is the ability of the welding machine to control the process. The internal control algorithms in GMAW machines are the topic of this PhD project. Basically, the internal control includes an algorithm which is able to keep the electrode at a given distance...

  18. Recent Advances in Conotoxin Classification by Using Machine Learning Methods.

    Science.gov (United States)

    Dao, Fu-Ying; Yang, Hui; Su, Zhen-Dong; Yang, Wuritu; Wu, Yun; Hui, Ding; Chen, Wei; Tang, Hua; Lin, Hao

    2017-06-25

    Conotoxins are disulfide-rich small peptides, which are invaluable peptides that target ion channels and neuronal receptors. Conotoxins have been demonstrated as potent pharmaceuticals in the treatment of a series of diseases, such as Alzheimer's disease, Parkinson's disease, and epilepsy. In addition, conotoxins are also ideal molecular templates for the development of new drug lead compounds and play important roles in neurobiological research as well. Thus, the accurate identification of conotoxin types will provide key clues for biological research and clinical medicine. Generally, conotoxin types are confirmed when their sequence, structure, and function are experimentally validated. However, it is time-consuming and costly to acquire the structure and function information by using biochemical experiments. Therefore, it is important to develop computational tools for efficiently and effectively recognizing conotoxin types based on sequence information. In this work, we reviewed the current progress in computational identification of conotoxins in the following aspects: (i) construction of benchmark datasets; (ii) strategies for extracting sequence features; (iii) feature selection techniques; (iv) machine learning methods for classifying conotoxins; (v) the results obtained by these methods and the published tools; and (vi) future perspectives on conotoxin classification. The paper provides the basis for in-depth study of conotoxins and drug therapy research.
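
    The computational pipeline surveyed here (sequence feature extraction followed by a machine learning classifier) can be illustrated with a deliberately small sketch: amino acid composition features fed to a support vector machine via scikit-learn. The sequences, superfamily labels and model settings below are invented placeholders, not data or parameters from any of the published conotoxin tools.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq: str) -> np.ndarray:
    """Fraction of each of the 20 standard residues in a peptide sequence."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

# Toy peptide sequences and superfamily labels -- invented for illustration,
# not real conotoxin data.
sequences = ["GCCSDPRCAWRC", "CKGKGAKCSRLMYDCCTGSCRSGKC",
             "GCCSNPVCHLEHSNLC", "CRSSGSPCGVTSICCGR"]
labels    = ["A", "O", "A", "M"]

X = np.vstack([aa_composition(s) for s in sequences])
y = np.array(labels)

# Standardize features, then fit a multi-class SVM classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
clf.fit(X, y)

query = "GCCSHPACSVNHPELC"      # another made-up query sequence
print("predicted superfamily:", clf.predict([aa_composition(query)])[0])
```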

  19. Molecular analysis of microbial diversity in advanced caries.

    Science.gov (United States)

    Chhour, Kim-Ly; Nadkarni, Mangala A; Byun, Roy; Martin, F Elizabeth; Jacques, Nicholas A; Hunter, Neil

    2005-02-01

    Real-time PCR analysis of the total bacterial load in advanced carious lesions has shown that the total load exceeds the number of cultivable bacteria. This suggests that an unresolved complexity exists in bacteria associated with advanced caries. In this report, the profile of the microflora of carious dentine was explored by using DNA extracted from 10 lesions selected on the basis of comparable total microbial load and on the relative abundance of Prevotella spp. Using universal primers for the 16S rRNA gene, PCR amplicons were cloned, and approximately 100 transformants were processed for each lesion. Phylogenetic analysis of 942 edited sequences demonstrated the presence of 75 species or phylotypes in the 10 carious lesions. Up to 31 taxa were represented in each sample. A diverse array of lactobacilli were found to comprise 50% of the species, with prevotellae also abundant, comprising 15% of the species. Other taxa present in a number of lesions or occurring with high abundance included Selenomonas spp., Dialister spp., Fusobacterium nucleatum, Eubacterium spp., members of the Lachnospiraceae family, Olsenella spp., Bifidobacterium spp., Propionibacterium sp., and Pseudoramibacter alactolyticus. The mechanisms by which such diverse patterns of bacteria extend carious lesions, including the aspect of infection of the vital dental pulp, remain unclear.

  20. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-01

    research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing

  1. Advanced Materials Test Methods for Improved Life Prediction of Turbine Engine Components

    National Research Council Canada - National Science Library

    Stubbs, Jack

    2000-01-01

    Phase I final report developed under SBIR contract for Topic # AF00-149, "Durability of Turbine Engine Materials/Advanced Material Test Methods for Improved Use Prediction of Turbine Engine Components...

  2. Imaging spectroscopic analysis at the Advanced Light Source

    International Nuclear Information System (INIS)

    MacDowell, A. A.; Warwick, T.; Anders, S.; Lamble, G.M.; Martin, M.C.; McKinney, W.R.; Padmore, H.A.

    1999-01-01

    One of the major advances at high-brightness third-generation synchrotrons is the dramatic improvement in imaging capability. There is a large multi-disciplinary effort underway at the ALS to develop imaging X-ray, UV and infrared spectroscopic analysis on spatial scales from a few microns down to 10 nm. These developments make use of light that varies in energy from 6 meV to 15 keV. Imaging and spectroscopy are finding applications in surface science, bulk materials analysis, semiconductor structures, particulate contaminants, magnetic thin films, biology and environmental science. This article is an overview and status report from the developers of some of these techniques at the ALS. The following table lists all the currently available microscopes at the ALS. This article will describe some of the microscopes and some of the early applications.

  3. Advanced exergy analysis on a modified auto-cascade freezer cycle with an ejector

    International Nuclear Information System (INIS)

    Bai, Tao; Yu, Jianlin; Yan, Gang

    2016-01-01

    This paper presents a study on a modified ejector enhanced auto-cascade freezer cycle with conventional thermodynamic and advanced exergy analysis methods. The energetic analysis shows that the modified cycle exhibits better performance than the conventional auto-cascade freezer cycle, and the system COP and volumetric refrigeration capacity could be improved by 19.93% and 28.42%. Furthermore, advanced exergy analysis is adopted to better evaluate the performance of the proposed cycle. The exergy destruction within a system component is split into endogenous/exogenous and unavoidable/avoidable parts in the advanced exergy analysis. The results show that the compressor, with the largest avoidable endogenous exergy destruction, has the highest improvement priority, followed by the condenser, evaporator and ejector, which is different from the conclusion obtained from the conventional exergy analysis. The evaporator/condenser greatly affects the exogenous exergy destruction within the system components, and the compressor has a large impact on the exergy destruction within the condenser. Improving the efficiencies of the compressor and the ejector could effectively reduce the corresponding avoidable endogenous exergy destruction. The exergy destruction within the evaporator almost entirely belongs to the endogenous part, and reducing the temperature difference at the evaporator is the main approach to reducing its exergy destruction. - Highlights: • A modified ejector enhanced auto-cascade freezer cycle is proposed. • Conventional and advanced exergy analyses are performed in this study. • The compressor should be improved first, followed by the condenser and evaporator. • Interactions among the system components are assessed with advanced exergy analysis.
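
    The splitting used in the advanced exergy analysis above reduces to simple bookkeeping once the total, endogenous and unavoidable destruction of a component are known. The sketch below uses invented numbers for a single component and a simple proportional approximation for the combined (avoidable endogenous) part; the full methodology evaluates the unavoidable destruction per unit of product exergy rather than a straight proportion.

```python
# Advanced exergy analysis bookkeeping for a single component (e.g. the compressor).
# All figures are invented placeholders, not results from the cited study.
E_D_total       = 2.40   # kW, exergy destruction from the conventional analysis
E_D_endogenous  = 1.70   # kW, destruction when all other components operate ideally
E_D_unavoidable = 0.90   # kW, destruction remaining under best achievable conditions

E_D_exogenous = E_D_total - E_D_endogenous    # caused by interactions with other components
E_D_avoidable = E_D_total - E_D_unavoidable   # realistic improvement potential

# Combined split (simple proportional approximation; see caveat in the lead-in):
E_D_unavoidable_endogenous = E_D_endogenous * (E_D_unavoidable / E_D_total)
E_D_avoidable_endogenous   = E_D_endogenous - E_D_unavoidable_endogenous

print(f"exogenous:            {E_D_exogenous:.2f} kW")
print(f"avoidable:            {E_D_avoidable:.2f} kW")
print(f"avoidable endogenous: {E_D_avoidable_endogenous:.2f} kW  (highest improvement priority)")
```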

  4. Meshless methods in biomechanics bone tissue remodelling analysis

    CERN Document Server

    Belinha, Jorge

    2014-01-01

    This book presents the complete formulation of a new advanced discretization meshless technique: the Natural Neighbour Radial Point Interpolation Method (NNRPIM). In addition, two of the most popular meshless methods, the EFGM and the RPIM, are fully presented. Being a truly meshless method, the major advantages of the NNRPIM over the FEM, and other meshless methods, are the remeshing flexibility and the higher accuracy of the obtained variable field. Using the natural neighbour concept, the NNRPIM makes it possible to determine the influence-domain organically, resembling the natural behaviour of cellulae. This innovation permits the analysis of convex boundaries and extremely irregular meshes, which is an advantage in biomechanical analysis, with no extra computational effort associated. This volume shows how to extend the NNRPIM to bone tissue remodelling analysis, with the aim of contributing new numerical tools and strategies that permit a more efficient numerical biomechanical analysis.

  5. Advanced methods and algorithm for high precision astronomical imaging

    International Nuclear Information System (INIS)

    Ngole-Mboula, Fred-Maurice

    2016-01-01

    One of the biggest challenges of modern cosmology is to gain a more precise knowledge of the nature of dark energy and dark matter. Fortunately, dark matter can be traced directly through its gravitational effect on galaxy shapes. The European Space Agency Euclid mission will provide data for precisely this purpose. A critical step in analyzing these data will be to accurately model the instrument Point Spread Function (PSF), which is the focus of this thesis. We developed non-parametric methods to reliably estimate the PSFs across an instrument field-of-view, based on images of unresolved stars and accounting for noise, undersampling and the spatial variability of the PSFs. At the core of these contributions are modern mathematical tools and concepts such as sparsity. An important extension of this work will be to account for the wavelength dependency of the PSFs. (author) [fr

  6. Comparative Assessment of Advanced Gas Hydrate Production Methods

    Energy Technology Data Exchange (ETDEWEB)

    M. D. White; B. P. McGrail; S. K. Wurstner

    2009-06-30

    Displacing natural gas and petroleum with carbon dioxide is a proven technology for producing conventional geologic hydrocarbon reservoirs, and producing additional yields from abandoned or partially produced petroleum reservoirs. Extending this concept to natural gas hydrate production offers the potential to enhance gas hydrate recovery with concomitant permanent geologic sequestration. Numerical simulation was used to assess a suite of carbon dioxide injection techniques for producing gas hydrates from a variety of geologic deposit types. Secondary hydrate formation was found to inhibit contact of the injected CO2 regardless of injectate phase state, thus diminishing the exchange rate due to pore clogging and hydrate zone bypass of the injected fluids. Additional work is needed to develop methods of artificially introducing high-permeability pathways in gas hydrate zones if injection of CO2 in either gas, liquid, or micro-emulsion form is to be more effective in enhancing gas hydrate production rates.

  7. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    International Nuclear Information System (INIS)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun; Jung, Won Dea

    2014-01-01

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks

  8. A Conceptual Framework of Human Reliability Analysis for Execution Human Error in NPP Advanced MCRs

    Energy Technology Data Exchange (ETDEWEB)

    Jang, In Seok; Kim, Ar Ryum; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-08-15

    The operation environment of Main Control Rooms (MCRs) in Nuclear Power Plants (NPPs) has changed with the adoption of new human-system interfaces that are based on computer-based technologies. The MCRs that include these digital and computer technologies, such as large display panels, computerized procedures, and soft controls, are called Advanced MCRs. Among the many features of Advanced MCRs, soft controls are a particularly important feature because the operation action in NPP Advanced MCRs is performed by soft control. Using soft controls such as mouse control, and touch screens, operators can select a specific screen, then choose the controller, and finally manipulate the given devices. Due to the different interfaces between soft control and hardwired conventional type control, different human error probabilities and a new Human Reliability Analysis (HRA) framework should be considered in the HRA for advanced MCRs. In other words, new human error modes should be considered for interface management tasks such as navigation tasks, and icon (device) selection tasks in monitors and a new framework of HRA method taking these newly generated human error modes into account should be considered. In this paper, a conceptual framework for a HRA method for the evaluation of soft control execution human error in advanced MCRs is suggested by analyzing soft control tasks.

  9. Advanced hydraulic fracturing methods to create in situ reactive barriers

    International Nuclear Information System (INIS)

    Murdoch, L.; Siegrist, B.; Vesper, S.

    1997-01-01

    Many contaminated areas consist of a source area and a plume. In the source area, the contaminant moves vertically downward from a release point through the vadose zone to an underlying saturated region. Where contaminants are organic liquids, NAPL may accumulate on the water table, or it may continue to migrate downward through the saturated region. Early developments of permeable barrier technology have focused on intercepting horizontally moving plumes with vertical structures, such as trenches, filled with reactive material capable of immobilizing or degrading dissolved contaminants. This focus resulted in part from a need to economically treat the potentially large volumes of contaminated water in a plume, and in part from the availability of construction technology to create the vertical structures that could house reactive compounds. Contaminant source areas, however, have thus far remained largely excluded from the application of permeable barrier technology. One reason for this is the lack of conventional construction methods for creating suitable horizontal structures that would place reactive materials in the path of downward-moving contaminants. Methods of hydraulic fracturing have been widely used to create flat-lying to gently dipping layers of granular material in unconsolidated sediments. Most applications thus far have involved filling fractures with coarse-grained sand to create permeable layers that will increase the discharge of wells recovering contaminated water or vapor. However, it is possible to fill fractures with other compounds that alter the chemical composition of the subsurface. One early application involved the development and field testing of micro-encapsulated sodium percarbonate, a solid compound that releases oxygen and can create aerobic conditions suitable for biodegradation in the subsurface for several months.

  10. Advanced communication methods developed for nuclear data communication applications

    International Nuclear Information System (INIS)

    Tiwari, Akash; Tiwari, Railesha; Tiwari, S.S.; Panday, Lokesh; Suri, Nitin; Takle, Tarun Rao; Jain, Sanjeev; Gupta, Rishi; Sharma, Dipeeka; Takle, Rahul Rao; Gautam, Rajeev; Bhargava, Vishal; Arora, Himanshu; Agarwal, Ankur; Rupesh; Chawla, Mohit; Sethi, Amardeep Singh; Gupta, Mukesh; Gupta, Ankit; Verma, Neha; Sood, Nitin; Singh, Sunil; Agarwal, Chandresh

    2004-01-01

    We conducted various experiments and tested data communication methods that may be useful for various applications in the nuclear industry. We explored the following areas: (1) scientific data communication among scientists within the laboratory and inter-laboratory data exchange; (2) data from remote and wired sensors; (3) data from multiple sensors within a small zone; (4) data from single or multiple sensors at distances above 100 m and less than 10 km. No single data communication method was found to be the best solution for nuclear applications, and multiple modes of communication were found to be more advantageous than any single mode. A network of computers in the control room and between laboratories, connected with optical fiber or an isolated Ethernet coaxial LAN, was found to be optimum. Collecting information from multiple analog process sensors in smaller zones, such as the reactor building and laboratories, over an I2C LAN and a short-range wireless LAN was found to be advantageous. Within the laboratory, an I2C sensor data network was found to be cost effective, whereas a wireless LAN was comparatively expensive. Within a room, an infrared optical LAN and an FSK wireless LAN were found to be highly useful in freeing the sensors from wires. A direct sensor interface over an FSK wireless link was found to be fast, accurate and cost effective for long-distance data communication; such links are the only way to communicate with sea buoy and balloon hardware. A 1-Wire communication network (Dallas Semiconductor, USA) was used for weather station data communication. Computer-to-computer communication using optical LAN links has been tried; temperature, pressure, humidity, ionizing radiation, generator RPM and voltage, and various other analog signals were also transported over FSK optical and wireless links. Multiple sensors needed a dedicated data acquisition system and a wireless LAN for data telemetry. (author)

  11. MAESTRO: Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology

    Science.gov (United States)

    Barthe, Jean; Hugon, Régis; Nicolai, Jean Philippe

    2007-12-01

    The integrated project MAESTRO (Methods and Advanced Equipment for Simulation and Treatment in Radio-Oncology), under contract with the European Commission in life sciences FP6 (LSHC-CT-2004-503564), concerns innovative research to develop and validate, under clinical conditions, advanced methods and equipment needed in cancer treatment for new modalities in high-conformal external radiotherapy using high-energy electron, photon and proton beams.

  12. ANDREA: Advanced nodal diffusion code for reactor analysis

    International Nuclear Information System (INIS)

    Belac, J.; Josek, R.; Klecka, L.; Stary, V.; Vocka, R.

    2005-01-01

    A new macro code is being developed at NRI which will allow coupling of an advanced thermal-hydraulics model with neutronics calculations, as well as efficient use in the core loading pattern optimization process. This paper describes the current stage of the macro code development. The core simulator is based on the nodal expansion method; the Helios lattice code is used for the preparation of few-group libraries. Standard features such as pin-wise power reconstruction and feedback iterations on critical control rod position, boron concentration and reactor power are implemented. Special attention is paid to system and code modularity in order to enable flexible and easy implementation of new features in the future. The precision of the methods used in the macro code has been verified on available benchmarks. Testing against Temelin PWR operational data is under way (Authors)
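
    For readers unfamiliar with what a core simulator iterates on, the sketch below solves a one-group, one-dimensional finite-difference k-eigenvalue problem by power iteration. This is emphatically not the nodal expansion method used in ANDREA, and the cross sections are invented round numbers; it only illustrates the eigenvalue/flux iteration that sits at the heart of such codes.

```python
import numpy as np

# Minimal 1-D, one-group, finite-difference k-eigenvalue problem solved by power
# iteration.  Not the nodal expansion method -- only an illustration of the
# eigenvalue iteration inside a core simulator.  All data are invented.
N, L = 50, 100.0                         # mesh cells, slab width [cm]
h = L / N
D, sig_a, nu_sig_f = 1.2, 0.03, 0.035    # diffusion coeff., absorption, nu*fission [1/cm]

# Loss operator: -D d2/dx2 + sig_a, with zero-flux boundary conditions.
A = np.zeros((N, N))
for i in range(N):
    A[i, i] = 2.0 * D / h**2 + sig_a
    if i > 0:
        A[i, i - 1] = -D / h**2
    if i < N - 1:
        A[i, i + 1] = -D / h**2

phi = np.ones(N)
k = 1.0
for _ in range(200):                     # power (fission source) iteration: A phi = (1/k) F phi
    source = nu_sig_f * phi
    phi_new = np.linalg.solve(A, source / k)
    k *= phi_new.sum() / phi.sum()       # eigenvalue update from the flux ratio
    phi = phi_new / np.linalg.norm(phi_new)

print(f"k-effective ~ {k:.4f}")
```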

  13. Phase advance and β function measurements using model-independent analysis

    OpenAIRE

    Chun-xi Wang; Vadim Sajaev; Chih-Yuan Yao

    2003-01-01

    Phase advance and β function are basic lattice functions characterizing the linear properties of an accelerator lattice. Accurate and efficient measurements of these quantities are important for commissioning and operating a machine. For rings with little coupling, we report a new method to measure these lattice functions based on the model-independent analysis technique, which uses beam histories of excited betatron oscillations measured simultaneously at a large number of beam position moni...
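
    The core of the model-independent analysis referred to above is a singular value decomposition of the turn-by-turn beam-history matrix: the two dominant spatial modes carry the betatron amplitude and phase at each BPM. The toy reconstruction below uses synthetic, noise-added data for an assumed uncoupled lattice; the β functions are recovered only up to a common scale and the phases up to a common offset and overall sign, so only phase advances between BPMs are compared.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic uncoupled lattice: 8 BPMs with assumed beta functions and phases.
n_bpm, n_turn = 8, 2048
beta = np.array([10., 4., 12., 6., 9., 3., 11., 5.])           # [m]
psi  = np.cumsum([0.0, 0.7, 1.1, 0.6, 0.9, 1.3, 0.5, 0.8])     # betatron phase [rad]
tune = 0.314                                                    # fractional tune

turns = np.arange(n_turn)
# Beam-history matrix B (turns x BPMs): coherent betatron oscillation + BPM noise.
B = (np.sqrt(beta)[None, :] *
     np.cos(2 * np.pi * tune * turns[:, None] + psi[None, :]))
B += 0.01 * rng.standard_normal(B.shape)
B -= B.mean(axis=0)

# SVD: the two leading spatial vectors span the betatron motion.
U, S, Vt = np.linalg.svd(B, full_matrices=False)
v1, v2 = S[0] * Vt[0], S[1] * Vt[1]

beta_meas = v1**2 + v2**2                  # proportional to beta at each BPM
psi_meas  = np.unwrap(np.arctan2(v2, v1))  # phase, up to a common offset and overall sign

# Phase advances between adjacent BPMs: offset drops out, sign handled by abs().
print("true  advances:", np.round(np.diff(psi), 3))
print("meas. advances:", np.round(np.abs(np.diff(psi_meas)), 3))
print("beta ratio (meas/true), ~constant:", np.round(beta_meas / beta, 2))
```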

  14. Adaptive Modeling, Engineering Analysis and Design of Advanced Aerospace Vehicles

    Science.gov (United States)

    Mukhopadhyay, Vivek; Hsu, Su-Yuen; Mason, Brian H.; Hicks, Mike D.; Jones, William T.; Sleight, David W.; Chun, Julio; Spangler, Jan L.; Kamhawi, Hilmi; Dahl, Jorgen L.

    2006-01-01

    This paper describes initial progress towards the development and enhancement of a set of software tools for rapid adaptive modeling and conceptual design of advanced aerospace vehicle concepts. With demanding structural and aerodynamic performance requirements, these high fidelity geometry based modeling tools are essential for rapid and accurate engineering analysis at the early concept development stage. This adaptive modeling tool was used for generating vehicle parametric geometry, outer mold line and detailed internal structural layout of wing, fuselage, skin, spars, ribs, control surfaces, frames, bulkheads, floors, etc., that facilitated rapid finite element analysis, sizing study and weight optimization. The high quality outer mold line enabled rapid aerodynamic analysis in order to provide reliable design data at critical flight conditions. Example applications for the structural design of a conventional aircraft and a high altitude long endurance vehicle configuration are presented. This work was performed under the Conceptual Design Shop sub-project within the Efficient Aerodynamic Shape and Integration project, under the former Vehicle Systems Program. The project objective was to design and assess unconventional atmospheric vehicle concepts efficiently and confidently. The implementation may also dramatically facilitate physics-based systems analysis for the NASA Fundamental Aeronautics Mission. In addition to providing technology for design and development of unconventional aircraft, the techniques for generation of accurate geometry and internal sub-structure and the automated interface with the high fidelity analysis codes could also be applied towards the design of vehicles for the NASA Exploration and Space Science Mission projects.

  15. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  16. Advanced Energy Storage Devices: Basic Principles, Analytical Methods, and Rational Materials Design

    Science.gov (United States)

    Liu, Jilei; Wang, Jin; Xu, Chaohe; Li, Chunzhong; Lin, Jianyi

    2017-01-01

    Tremendous efforts have been dedicated to the development of high‐performance energy storage devices with nanoscale design and hybrid approaches. The boundary between the electrochemical capacitors and batteries becomes less distinctive. The same material may display capacitive or battery‐like behavior depending on the electrode design and the charge storage guest ions. Therefore, the underlying mechanisms and the electrochemical processes occurring upon charge storage may be confusing for researchers who are new to the field as well as some of the chemists and material scientists already in the field. This review provides fundamentals of the similarities and differences between electrochemical capacitors and batteries from a kinetic and material point of view. Basic techniques and analysis methods to distinguish the capacitive and battery‐like behavior are discussed. Furthermore, guidelines for material selection, the state‐of‐the‐art materials, and the electrode design rules for advanced electrodes are proposed. PMID:29375964

  17. Methods in quantitative image analysis.

    Science.gov (United States)

    Oberholzer, M; Ostreicher, M; Christen, H; Brühlmann, M

    1996-05-01

    The main steps of image analysis are image capturing, image storage (compression), correcting imaging defects (e.g. non-uniform illumination, electronic noise, glare effect), image enhancement, segmentation of objects in the image and image measurements. Digitisation is performed by a camera. The most modern types include a frame-grabber that converts the analog signal into digital (numerical) information. The numerical information consists of the grey values describing the brightness of every point within the image, named a pixel. The information is stored in bits. Eight bits are summarised in one byte. Therefore, a grey value can take one of 256 (2^8) values, from 0 to 255. The human eye seems to be quite content with a display of 6-bit images (corresponding to 64 different grey values). In a digitised image, the pixel grey values can vary within regions that are uniform in the original scene: the image is noisy. The noise is mainly manifested in the background of the image. For an optimal discrimination between different objects or features in an image, uniformity of illumination in the whole image is required. These defects can be minimised by shading correction [subtraction of a background (white) image from the original image, pixel per pixel, or division of the original image by the background image]. The brightness of an image represented by its grey values can be analysed for every single pixel or for a group of pixels. The most frequently used pixel-based image descriptors are optical density, integrated optical density, the histogram of the grey values, mean grey value and entropy. The distribution of the grey values existing within an image is one of the most important characteristics of the image. However, the histogram gives no information about the texture of the image. The simplest way to improve the contrast of an image is to expand the brightness scale by spreading the histogram out to the full available range. Rules for transforming the grey value
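
    The contrast-improvement step mentioned at the end of the abstract (spreading the grey-value histogram over the full available range) is a one-line rescaling. The sketch below applies it to a made-up low-contrast 8-bit image; the function name and test data are ours, not from the cited paper.

```python
import numpy as np

def stretch_contrast(img: np.ndarray) -> np.ndarray:
    """Linearly rescale grey values so the histogram spans the full 0..255 range."""
    lo, hi = img.min(), img.max()
    if hi == lo:                      # flat image: nothing to stretch
        return img.copy()
    stretched = (img.astype(float) - lo) * 255.0 / (hi - lo)
    return np.clip(np.round(stretched), 0, 255).astype(np.uint8)

# Toy low-contrast image: grey values confined to a narrow band around 120.
rng = np.random.default_rng(1)
img = rng.integers(110, 135, size=(64, 64), dtype=np.uint8)

out = stretch_contrast(img)
print("before:", img.min(), "-", img.max())   # 110 - 134
print("after: ", out.min(), "-", out.max())   # 0 - 255
```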

  18. Advanced stress analysis of PWR containments in the region of nozzles

    International Nuclear Information System (INIS)

    Schauer, G.

    1977-01-01

    As an example of the stress analysis of a nozzle in a PWR steel containment, an advanced stress analysis of a personnel lock is presented. In contrast to the calculations with numerical shell programs that have been usual until now, this advanced stress analysis was carried out with the finite element method. Owing to the underlying shell theory, the shell programs compute mathematically exact results, but the notch stresses at the intersection of two shells cannot be analyzed well. A further disadvantage is the great distance between the real critical region near the intersection line and the calculation point, which lies on the neutral axis of the shell. The study shows that the results obtained to date, which are based on shell theory and calculate stresses at a fictitious intersection line, can be improved, and that it is possible to obtain stress values adjacent to the real intersection line. (Auth.)

  19. Data analysis of asymmetric structures advanced approaches in computational statistics

    CERN Document Server

    Saito, Takayuki

    2004-01-01

    Data Analysis of Asymmetric Structures provides a comprehensive presentation of a variety of models and theories for the analysis of asymmetry and its applications and provides a wealth of new approaches in every section. It meets both the practical and theoretical needs of research professionals across a wide range of disciplines and  considers data analysis in fields such as psychology, sociology, social science, ecology, and marketing. In seven comprehensive chapters this guide details theories, methods, and models for the analysis of asymmetric structures in a variety of disciplines and presents future opportunities and challenges affecting research developments and business applications.

  20. Advanced cluster methods for correlated-electron systems

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andre

    2015-04-27

    In this thesis, quantum cluster methods are used to calculate electronic properties of correlated-electron systems. A special focus lies on the determination of the ground state properties of a 3/4 filled triangular lattice within the one-band Hubbard model. At this filling, the electronic density of states exhibits a so-called van Hove singularity and the Fermi surface becomes perfectly nested, causing an instability towards a variety of spin-density-wave (SDW) and superconducting states. While chiral d+id-wave superconductivity has been proposed as the ground state in the weak coupling limit, the situation at stronger interactions is unclear. Additionally, quantum cluster methods are used here to investigate the interplay of Coulomb interactions and symmetry-breaking mechanisms within the nematic phase of iron-pnictide superconductors. The transition from a tetragonal to an orthorhombic phase is accompanied by a significant change in electronic properties, while long-range magnetic order is not established yet. The driving force of this transition may not only be phonons but also magnetic or orbital fluctuations. The signatures of these scenarios are studied with quantum cluster methods to identify the most important effects. Here, cluster perturbation theory (CPT) and its variational extension, the variational cluster approach (VCA), are used to treat the respective systems on a level beyond mean-field theory. Short-range correlations are incorporated numerically exactly by exact diagonalization (ED). In the VCA, long-range interactions are included by variational optimization of a fictitious symmetry-breaking field based on a self-energy functional approach. Due to limitations of ED, cluster sizes are limited to a small number of degrees of freedom. For the 3/4 filled triangular lattice, the VCA is performed for different cluster symmetries. A strong symmetry dependence and finite-size effects make a comparison of the results from different clusters difficult.

  1. Advanced Monte Carlo methods for thermal radiation transport

    Science.gov (United States)

    Wollaber, Allan B.

    During the past 35 years, the Implicit Monte Carlo (IMC) method proposed by Fleck and Cummings has been the standard Monte Carlo approach to solving the thermal radiative transfer (TRT) equations. However, the IMC equations are known to have accuracy limitations that can produce unphysical solutions. In this thesis, we explicitly provide the IMC equations with a Monte Carlo interpretation by including particle weight as one of its arguments. We also develop and test a stability theory for the 1-D, gray IMC equations applied to a nonlinear problem. We demonstrate that the worst case occurs for 0-D problems, and we extend the results to a stability algorithm that may be used for general linearizations of the TRT equations. We derive gray, Quasidiffusion equations that may be deterministically solved in conjunction with IMC to obtain an inexpensive, accurate estimate of the temperature at the end of the time step. We then define an average temperature T* to evaluate the temperature-dependent problem data in IMC, and we demonstrate that using T* is more accurate than using the (traditional) beginning-of-time-step temperature. We also propose an accuracy enhancement to the IMC equations: the use of a time-dependent "Fleck factor". This Fleck factor can be considered an automatic tuning of the traditionally defined user parameter alpha, which generally provides more accurate solutions at an increased cost relative to traditional IMC. We also introduce a global weight window that is proportional to the forward scalar intensity calculated by the Quasidiffusion method. This weight window improves the efficiency of the IMC calculation while conserving energy. All of the proposed enhancements are tested in 1-D gray and frequency-dependent problems. These enhancements do not unconditionally eliminate the unphysical behavior that can be seen in the IMC calculations. However, for fixed spatial and temporal grids, they suppress them and clearly work to make the solution more
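
    For reference, the Fleck factor mentioned above is conventionally written, in the grey case, as

\[
  f \;=\; \frac{1}{1 + \alpha\,\beta\,c\,\sigma_P\,\Delta t},
  \qquad
  \beta \;=\; \frac{4 a T^{3}}{c_v},
\]

    where \(\sigma_P\) is the Planck-mean opacity, \(a\) the radiation constant, \(c_v\) the material heat capacity, \(\Delta t\) the time step, and \(\alpha\) the user-chosen time-centering parameter referred to in the abstract. This is the standard textbook form due to Fleck and Cummings, not necessarily the exact notation used in the thesis.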

  2. Advanced fabrication method for the preparation of MOF thin films: Liquid-phase epitaxy approach meets spin coating method.

    KAUST Repository

    Chernikova, Valeriya; Shekhah, Osama; Eddaoudi, Mohamed

    2016-01-01

    Here we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method

  3. Moyer's method of mixed dentition analysis: a meta-analysis ...

    African Journals Online (AJOL)

    The applicability of tables derived from the data Moyer used to other ethnic groups has ... This implies that Moyer's method of prediction may have population variations. ... Key Words: meta-analysis, mixed dentition analysis, Moyer's method

  4. Analysis and study on core power capability with margin method

    International Nuclear Information System (INIS)

    Liu Tongxian; Wu Lei; Yu Yingrui; Zhou Jinman

    2015-01-01

    Core power capability analysis focuses on the power distribution control of the reactor within a given mode of operation, for the purpose of defining the allowed normal operating space so that Condition Ⅰ maneuvering flexibility is maintained and Condition Ⅱ occurrences are adequately protected by the reactor protection system. Traditional core power capability analysis methods, such as the synthesis method or the advanced three-dimensional method, usually calculate the key safety parameters of the power distribution and then verify that these parameters meet the design criteria. For a PWR with an on-line power distribution monitoring system, core power capability analysis instead calculates the maximum power level that just meets the design criteria. On the basis of the Westinghouse 3D FAC method, the calculation model of core power capability analysis with the margin method is introduced to provide a reference for engineers. The core power capability analysis at a specific burnup of the Sanmen NPP is performed with the margin method. The results demonstrate the rationality of the margin method. The calculation model of the margin method not only helps engineers to master core power capability analysis for the AP1000, but also provides a reference for core power capability analysis of other PWRs with on-line power distribution monitoring systems. (authors)

  5. Applying critical analysis - main methods

    Directory of Open Access Journals (Sweden)

    Miguel Araujo Alonso

    2012-02-01

    What is the usefulness of critical appraisal of literature? Critical analysis is a fundamental condition for the correct interpretation of any study that is subject to review. In epidemiology, in order to learn how to read a publication, we must be able to analyze it critically. Critical analysis allows us to check whether a study fulfills certain previously established methodological inclusion and exclusion criteria. This is frequently used in conducting systematic reviews, although eligibility criteria are generally limited to the study design. Critical analysis of literature can be done implicitly while reading an article, as in reading for personal interest, or can be conducted in a structured manner, using explicit and previously established criteria. The latter is done when formally reviewing a topic.

  6. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    Science.gov (United States)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight Optimization System program) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain some insight as to the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.
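
    The probabilistic treatment of cost estimating relationships (CERs) described above amounts to sampling uncertain CER coefficients and propagating them through the cost equations. The sketch below does this for a single, entirely hypothetical weight-based airframe CER; neither the functional form nor the numbers come from ALCCA or FLOPS.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical CER: airframe cost = a * (empty weight)^b, with uncertain coefficients.
# Neither the form nor the numbers are from ALCCA/FLOPS; they are placeholders.
a = rng.normal(2.5e3, 0.3e3, n)                    # $ per lb^b
b = rng.normal(0.92, 0.04, n)                      # weight exponent
weight = rng.triangular(115e3, 120e3, 130e3, n)    # empty weight [lb], design uncertainty

cost = a * weight ** b                             # sampled cost distribution [$]

# Report a few percentiles of the resulting cost distribution.
p10, p50, p90 = np.percentile(cost, [10, 50, 90])
print(f"P10  ${p10/1e6:.1f} M")
print(f"P50  ${p50/1e6:.1f} M")
print(f"P90  ${p90/1e6:.1f} M")
```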

  7. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the fourth issue, giving an overview of scientific computational methods with an introduction to continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, covering approaches such as the binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)

  8. Advances in real and complex analysis with applications

    CERN Document Server

    Cho, Yeol; Agarwal, Praveen; Area, Iván

    2017-01-01

    This book discusses a variety of topics in mathematics and engineering as well as their applications, clearly explaining the mathematical concepts in the simplest possible way and illustrating them with a number of solved examples. The topics include real and complex analysis, special functions and analytic number theory, q-series, Ramanujan’s mathematics, fractional calculus, Clifford and harmonic analysis, graph theory, complex analysis, complex dynamical systems, complex function spaces and operator theory, geometric analysis of complex manifolds, geometric function theory, Riemannian surfaces, Teichmüller spaces and Kleinian groups, engineering applications of complex analytic methods, nonlinear analysis, inequality theory, potential theory, partial differential equations, numerical analysis , fixed-point theory, variational inequality, equilibrium problems, optimization problems, stability of functional equations, and mathematical physics.  It includes papers presented at the 24th International Confe...

  9. Advanced Extraction Methods for Actinide/Lanthanide Separations

    International Nuclear Information System (INIS)

    Scott, M.J.

    2005-01-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by extraction with TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanide and actinide ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methoxy, ethoxy, t-butoxy, methyl, octyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from

  10. Advanced Extraction Methods for Actinide/Lanthanide Separations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, M.J.

    2005-12-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by extraction with TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanide and actinide ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methoxy, ethoxy, t-butoxy, methyl, octyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from

  11. Trial Sequential Methods for Meta-Analysis

    Science.gov (United States)

    Kulinskaya, Elena; Wood, John

    2014-01-01

    Statistical methods for sequential meta-analysis have applications also for the design of new trials. Existing methods are based on group sequential methods developed for single trials and start with the calculation of a required information size. This works satisfactorily within the framework of fixed effects meta-analysis, but conceptual…
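
    The required information size that such sequential methods start from can be sketched as follows for a two-arm comparison with a continuous outcome, including the usual inflation for between-trial heterogeneity. The formula is the standard sample-size expression and the numerical inputs are illustrative only, not taken from the article.

        from scipy.stats import norm

        def required_information_size(delta, sigma, alpha=0.05, power=0.80, i_squared=0.0):
            """Total participants across both arms of a two-arm comparison,
            inflated for anticipated between-trial heterogeneity (I^2)."""
            z_alpha = norm.ppf(1.0 - alpha / 2.0)
            z_beta = norm.ppf(power)
            n_per_group = 2.0 * (z_alpha + z_beta) ** 2 * sigma ** 2 / delta ** 2
            return 2.0 * n_per_group / (1.0 - i_squared)

        # Example: detect a mean difference of 5 units (SD 20) at 5% two-sided alpha
        # and 80% power, with 25% anticipated heterogeneity.
        print(round(required_information_size(delta=5.0, sigma=20.0, i_squared=0.25)))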

  12. A modified captive bubble method for determining advancing and receding contact angles

    International Nuclear Information System (INIS)

    Xue, Jian; Shi, Pan; Zhu, Lin; Ding, Jianfu; Chen, Qingmin; Wang, Qingjun

    2014-01-01

    Graphical abstract: - Highlights: • A modified captive bubble method for determining advancing and receding contact angles is proposed. • A pressure chamber with a pressure control system was added to the original experimental setup. • The modified method overcomes the bubble deviation of the traditional captive bubble method. • The modified captive bubble method allows a smaller measurement error. - Abstract: In this work, a modification to the captive bubble method was proposed to measure the advancing and receding contact angle. The modification adds a pressure chamber with a pressure control system to the original experimental system, which is equipped with an optical angle meter with a high-speed CCD camera, a temperature control system and a computer. A series of samples with highly hydrophilic, hydrophilic, hydrophobic and superhydrophobic surfaces were prepared. The advancing and receding contact angles of the highly hydrophilic, hydrophilic and hydrophobic samples measured by the new method were comparable to the results obtained by the traditional sessile drop method. This shows that the method overcomes the limitation of the traditional captive bubble method and allows a smaller measurement error. However, due to the nature of the captive bubble technique, the method is only suitable for surfaces with advancing or receding contact angles below 130°

  13. A modified captive bubble method for determining advancing and receding contact angles

    Energy Technology Data Exchange (ETDEWEB)

    Xue, Jian; Shi, Pan; Zhu, Lin [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China); Ding, Jianfu [Security and Disruptive Technologies, National Research Council Canada, 1200 Montreal Road, Ottawa, K1A 0R6, Ontario (Canada); Chen, Qingmin [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China); Wang, Qingjun, E-mail: njuwqj@nju.edu.cn [Key Laboratory of High Performance Polymer Materials and Technology (Nanjing University), Ministry of Education, Nanjing 210093 (China)

    2014-03-01

    Graphical abstract: - Highlights: • A modified captive bubble method for determining advancing and receding contact angles is proposed. • A pressure chamber with a pressure control system was added to the original experimental setup. • The modified method overcomes the bubble deviation of the traditional captive bubble method. • The modified captive bubble method allows a smaller measurement error. - Abstract: In this work, a modification to the captive bubble method was proposed to measure the advancing and receding contact angle. The modification adds a pressure chamber with a pressure control system to the original experimental system, which is equipped with an optical angle meter with a high-speed CCD camera, a temperature control system and a computer. A series of samples with highly hydrophilic, hydrophilic, hydrophobic and superhydrophobic surfaces were prepared. The advancing and receding contact angles of the highly hydrophilic, hydrophilic and hydrophobic samples measured by the new method were comparable to the results obtained by the traditional sessile drop method. This shows that the method overcomes the limitation of the traditional captive bubble method and allows a smaller measurement error. However, due to the nature of the captive bubble technique, the method is only suitable for surfaces with advancing or receding contact angles below 130°.

  14. Numerical evaluation of fluid mixing phenomena in boiling water reactor using advanced interface tracking method

    International Nuclear Information System (INIS)

    Yoshida, Hiroyuki; Takase, Kazuyuki

    2008-01-01

    Thermal-hydraulic design of the current boiling water reactor (BWR) is performed with subchannel analysis codes which incorporate correlations based on empirical results, including actual-size tests. For the Innovative Water Reactor for Flexible Fuel Cycle (FLWR) core, an actual-size test of an embodiment of its design would therefore be required to confirm or modify such correlations. In this situation, development of a method that enables the thermal-hydraulic design of nuclear reactors without these actual-size tests is desired, because these tests take a long time and entail great cost. For this reason, we developed an advanced thermal-hydraulic design method for FLWRs using innovative two-phase flow simulation technology. In this study, a detailed Two-Phase Flow simulation code using an advanced Interface Tracking method, TPFIT, was developed to calculate detailed information on the two-phase flow. In this paper, we first verify the TPFIT code by comparing it with existing 2-channel air-water mixing experimental results. Secondly, the TPFIT code was applied to simulation of steam-water two-phase flow in a model of two subchannels of current BWR and FLWR rod bundles. Fluid mixing was observed at the gap between the subchannels. The existing two-phase flow correlation for fluid mixing was evaluated using detailed numerical simulation data. The data indicate that the pressure difference between fluid channels is responsible for the fluid mixing, and thus the effects of the time-averaged pressure difference and its fluctuations must be incorporated in the two-phase flow correlation for fluid mixing. When the inlet quality ratio of the subchannels is relatively large, the evaluation precision of the existing two-phase flow correlations for fluid mixing is seen to be relatively low. (author)

  15. Advances in meta-analysis: examples from internal medicine to neurology.

    Science.gov (United States)

    Copetti, Massimiliano; Fontana, Andrea; Graziano, Giusi; Veneziani, Federica; Siena, Federica; Scardapane, Marco; Lucisano, Giuseppe; Pellegrini, Fabio

    2014-01-01

    We review the state of the art in meta-analysis and data pooling following the evolution of the statistical models employed. Starting from a classic definition of meta-analysis of published data, a set of apparent antinomies which characterized the development of the meta-analytic tools are reconciled in dichotomies where the second term represents a possible generalization of the first one. Particular attention is given to generalized linear mixed models as an overall framework for meta-analysis. Bayesian meta-analysis is discussed as a further possibility of generalization for sensitivity analysis and the use of priors as a data augmentation approach. We provide relevant examples to underline how the need for adequate methods to solve practical issues in specific areas of research has guided the development of advanced methods in meta-analysis. We show how all the advances in meta-analysis naturally merge into the unified framework of generalized linear mixed models and reconcile apparently conflicting approaches. All these complex models can be easily implemented with the standard commercial software available. © 2013 S. Karger AG, Basel.
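
    The fixed-effect versus random-effects dichotomy discussed above can be made concrete with a small DerSimonian-Laird computation in plain NumPy; the study effect sizes and variances below are invented for illustration and are not taken from the review.

        import numpy as np

        # Hypothetical study effect sizes (e.g., mean differences) and their variances.
        y = np.array([0.30, 0.10, 0.55, 0.20, 0.45])
        v = np.array([0.04, 0.02, 0.09, 0.03, 0.06])

        # Fixed-effect (inverse-variance) pooling.
        w_fe = 1.0 / v
        mu_fe = np.sum(w_fe * y) / np.sum(w_fe)

        # DerSimonian-Laird estimate of the between-study variance tau^2.
        q = np.sum(w_fe * (y - mu_fe) ** 2)
        df = len(y) - 1
        c = np.sum(w_fe) - np.sum(w_fe ** 2) / np.sum(w_fe)
        tau2 = max(0.0, (q - df) / c)

        # Random-effects pooling with the heterogeneity folded into the weights.
        w_re = 1.0 / (v + tau2)
        mu_re = np.sum(w_re * y) / np.sum(w_re)
        se_re = np.sqrt(1.0 / np.sum(w_re))

        print(f"fixed-effect estimate  : {mu_fe:.3f}")
        print(f"tau^2                  : {tau2:.3f}")
        print(f"random-effects estimate: {mu_re:.3f} (SE {se_re:.3f})")

    The random-effects pooled estimate is the special case that generalized linear mixed models reproduce when the study effects are treated as normally distributed random effects.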

  16. Thermal hydraulics analysis of the Advanced High Temperature Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Dean, E-mail: Dean_Wang@uml.edu [University of Massachusetts Lowell, One University Avenue, Lowell, MA 01854 (United States); Yoder, Graydon L.; Pointer, David W.; Holcomb, David E. [Oak Ridge National Laboratory, 1 Bethel Valley RD #6167, Oak Ridge, TN 37831 (United States)

    2015-12-01

    Highlights: • The TRACE AHTR model was developed and used to define and size the DRACS and the PHX. • A LOFF transient was simulated to evaluate the reactor performance during the transient. • Some recommendations for modifying FHR reactor system component designs are discussed. - Abstract: The Advanced High Temperature Reactor (AHTR) is a liquid salt-cooled nuclear reactor design concept, featuring low-pressure molten fluoride salt coolant, a carbon composite fuel form with embedded coated particle fuel, passively triggered negative reactivity insertion mechanisms, and fully passive decay heat rejection. This paper describes an AHTR system model developed using the Nuclear Regulatory Commission (NRC) thermal hydraulic transient code TRAC/RELAP Advanced Computational Engine (TRACE). The TRACE model includes all of the primary components: the core, downcomer, hot legs, cold legs, pumps, direct reactor auxiliary cooling system (DRACS), the primary heat exchangers (PHXs), etc. The TRACE model was used to help define and size systems such as the DRACS and the PHX. A loss of flow transient was also simulated to evaluate the performance of the reactor during an anticipated transient event. Some initial recommendations for modifying system component designs are also discussed. The TRACE model will be used as the basis for developing more detailed designs and ultimately will be used to perform transient safety analysis for the reactor.

  17. Hybrid methods for cybersecurity analysis

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Warren Leon,; Dunlavy, Daniel M.

    2014-01-01

    Early 2010 saw a significant change in adversarial techniques aimed at network intrusion: a shift from malware delivered via email attachments toward the use of hidden, embedded hyperlinks to initiate sequences of downloads and interactions with web sites and network servers containing malicious software. Enterprise security groups were well poised and experienced in defending against the former attacks, but the new types of attacks were larger in number, more challenging to detect, dynamic in nature, and required the development of new technologies and analytic capabilities. The Hybrid LDRD project was aimed at delivering new capabilities in large-scale data modeling and analysis to enterprise security operators and analysts and understanding the challenges of detection and prevention of emerging cybersecurity threats. Leveraging previous LDRD research efforts and capabilities in large-scale relational data analysis, large-scale discrete data analysis and visualization, and streaming data analysis, new modeling and analysis capabilities were quickly brought to bear on the problems in email phishing and spear phishing attacks in the Sandia enterprise security operational groups at the onset of the Hybrid project. As part of this project, a software development and deployment framework was created within the security analyst workflow tool sets to facilitate the delivery and testing of new capabilities as they became available, and machine learning algorithms were developed to address the challenge of dynamic threats. Furthermore, researchers from the Hybrid project were embedded in the security analyst groups for almost a full year, engaged in daily operational activities and routines, creating an atmosphere of trust and collaboration between the researchers and security personnel. The Hybrid project has altered the way that research ideas can be incorporated into the production environments of Sandia's enterprise security groups, reducing time to deployment from months and

  18. Recent advances in the modeling of plasmas with the Particle-In-Cell methods

    Science.gov (United States)

    Vay, Jean-Luc; Lehe, Remi; Vincenti, Henri; Godfrey, Brendan; Lee, Patrick; Haber, Irv

    2015-11-01

    The Particle-In-Cell (PIC) approach is the method of choice for self-consistent simulations of plasmas from first principles. The fundamentals of the PIC method were established decades ago but improvements or variations are continuously being proposed. We report on several recent advances in PIC related algorithms, including: (a) detailed analysis of the numerical Cherenkov instability and its remediation, (b) analytic pseudo-spectral electromagnetic solvers in Cartesian and cylindrical (with azimuthal modes decomposition) geometries, (c) arbitrary-order finite-difference and generalized pseudo-spectral Maxwell solvers, (d) novel analysis of Maxwell's solvers' stencil variation and truncation, in application to domain decomposition strategies and implementation of Perfectly Matched Layers in high-order and pseudo-spectral solvers. Work supported by US-DOE Contracts DE-AC02-05CH11231 and the US-DOE SciDAC program ComPASS. Used resources of NERSC, supported by US-DOE Contract DE-AC02-05CH11231.
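
    As a minimal illustration of the particle-grid coupling at the heart of any PIC code, the sketch below deposits particle charge onto a periodic 1D grid with linear (cloud-in-cell) weighting and gathers a grid field back to the particles. This is a generic textbook kernel under simple assumptions, not the pseudo-spectral solvers or stability remedies discussed above.

        import numpy as np

        def deposit_charge(x, q, n_cells, dx):
            """Linear (cloud-in-cell) charge deposition on a periodic 1D grid."""
            rho = np.zeros(n_cells)
            cell = np.floor(x / dx).astype(int) % n_cells
            frac = x / dx - np.floor(x / dx)          # distance to the left node, in cells
            np.add.at(rho, cell, q * (1.0 - frac) / dx)
            np.add.at(rho, (cell + 1) % n_cells, q * frac / dx)
            return rho

        def gather_field(x, e_grid, dx):
            """Interpolate a grid field back to particle positions (same weighting)."""
            n_cells = e_grid.size
            cell = np.floor(x / dx).astype(int) % n_cells
            frac = x / dx - np.floor(x / dx)
            return e_grid[cell] * (1.0 - frac) + e_grid[(cell + 1) % n_cells] * frac

        # Tiny example: 4 particles of unit charge on a 16-cell periodic grid of length 1.
        n_cells, length = 16, 1.0
        dx = length / n_cells
        x = np.array([0.12, 0.37, 0.55, 0.91])
        q = np.ones_like(x)
        rho = deposit_charge(x, q, n_cells, dx)
        print("total deposited charge:", rho.sum() * dx)   # equals sum(q)

    Using the same weighting for deposition and gather keeps the scheme momentum-conserving, which is one reason this pairing is the standard choice in electrostatic PIC codes.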

  19. Microparticle analysis system and method

    Science.gov (United States)

    Morrison, Dennis R. (Inventor)

    2007-01-01

    A device for analyzing microparticles is provided which includes a chamber with an inlet and an outlet for respectively introducing and dispensing a flowing fluid comprising microparticles, a light source for providing light through the chamber and a photometer for measuring the intensity of light transmitted through individual microparticles. The device further includes an imaging system for acquiring images of the fluid. In some cases, the device may be configured to identify and determine a quantity of the microparticles within the fluid. Consequently, a method for identifying and tracking microparticles in motion is contemplated herein. The method involves flowing a fluid comprising microparticles in laminar motion through a chamber, transmitting light through the fluid, measuring the intensities of the light transmitted through the microparticles, imaging the fluid a plurality of times and comparing at least some of the intensities of light between different images of the fluid.

  20. Advanced Connectivity Analysis (ACA): a Large Scale Functional Connectivity Data Mining Environment.

    Science.gov (United States)

    Chen, Rong; Nixon, Erika; Herskovits, Edward

    2016-04-01

    Using resting-state functional magnetic resonance imaging (rs-fMRI) to study functional connectivity is of great importance to understand normal development and function as well as a host of neurological and psychiatric disorders. Seed-based analysis is one of the most widely used rs-fMRI analysis methods. Here we describe a freely available large scale functional connectivity data mining software package called Advanced Connectivity Analysis (ACA). ACA enables large-scale seed-based analysis and brain-behavior analysis. It can seamlessly examine a large number of seed regions with minimal user input. ACA has a brain-behavior analysis component to delineate associations among imaging biomarkers and one or more behavioral variables. We demonstrate applications of ACA to rs-fMRI data sets from a study of autism.
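
    A minimal sketch of what a seed-based connectivity analysis computes is given below, using a synthetic region-by-time matrix; it illustrates only the correlation-and-Fisher-z step and is not the ACA package's interface.

        import numpy as np

        def seed_connectivity(ts, seed_index):
            """Pearson correlation between one seed region's time series and all regions.

            ts: array of shape (n_timepoints, n_regions), one column per brain region.
            Returns a vector of Fisher z-transformed correlations, one per region."""
            z = (ts - ts.mean(axis=0)) / ts.std(axis=0)
            r = z.T @ z[:, seed_index] / ts.shape[0]
            r = np.clip(r, -0.999999, 0.999999)
            return np.arctanh(r)          # Fisher z, common before group statistics

        # Synthetic data: 200 time points, 6 regions, region 3 partly follows region 0.
        rng = np.random.default_rng(0)
        ts = rng.standard_normal((200, 6))
        ts[:, 3] += 0.8 * ts[:, 0]
        print(np.round(seed_connectivity(ts, seed_index=0), 2))

    Repeating this computation over a large list of seed regions, as ACA does at scale, yields one connectivity map per seed that can then be related to behavioral variables.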

  1. FAST: An advanced code system for fast reactor transient analysis

    International Nuclear Information System (INIS)

    Mikityuk, Konstantin; Pelloni, Sandro; Coddington, Paul; Bubelis, Evaldas; Chawla, Rakesh

    2005-01-01

    One of the main goals of the FAST project at PSI is to establish a unique analytical code capability for the core and safety analysis of advanced critical (and sub-critical) fast-spectrum systems for a wide range of different coolants. Both static and transient core physics, as well as the behaviour and safety of the power plant as a whole, are studied. The paper discusses the structure of the code system, including the organisation of the interfaces and data exchange. Examples of validation and application of the individual programs, as well as of the complete code system, are provided using studies carried out within the context of designs for experimental accelerator-driven, fast-spectrum systems

  2. Systems analysis and futuristic designs of advanced biofuel factory concepts.

    Energy Technology Data Exchange (ETDEWEB)

    Chianelli, Russ; Leathers, James; Thoma, Steven George; Celina, Mathias C.; Gupta, Vipin P.

    2007-10-01

    The U.S. is addicted to petroleum--a dependency that periodically shocks the economy, compromises national security, and adversely affects the environment. If liquid fuels remain the main energy source for U.S. transportation for the foreseeable future, the system solution is the production of new liquid fuels that can directly displace diesel and gasoline. This study focuses on advanced concepts for biofuel factory production, describing three design concepts: biopetroleum, biodiesel, and higher alcohols. A general schematic is illustrated for each concept with technical description and analysis for each factory design. Looking beyond current biofuel pursuits by industry, this study explores unconventional feedstocks (e.g., extremophiles), out-of-favor reaction processes (e.g., radiation-induced catalytic cracking), and production of new fuel sources traditionally deemed undesirable (e.g., fusel oils). These concepts lay the foundation and path for future basic science and applied engineering to displace petroleum as a transportation energy source for good.

  3. Beam Optics Analysis - An Advanced 3D Trajectory Code

    International Nuclear Information System (INIS)

    Ives, R. Lawrence; Bui, Thuc; Vogler, William; Neilson, Jeff; Read, Mike; Shephard, Mark; Bauer, Andrew; Datta, Dibyendu; Beal, Mark

    2006-01-01

    Calabazas Creek Research, Inc. has completed initial development of an advanced, 3D program for modeling electron trajectories in electromagnetic fields. The code is being used to design complex guns and collectors. Beam Optics Analysis (BOA) is a fully relativistic, charged particle code using adaptive, finite element meshing. Geometrical input is imported from CAD programs generating ACIS-formatted files. Parametric data is input using an intuitive, graphical user interface (GUI), which also provides control of convergence, accuracy, and post processing. The program includes a magnetic field solver, and magnetic information can be imported from Maxwell 2D/3D and other programs. The program supports thermionic emission and injected beams. Secondary electron emission is also supported, including multiple generations. Work on field emission is in progress as well as implementation of computer optimization of both the geometry and operating parameters. The principal features of the program and its capabilities are presented

  4. Thermodynamic analysis of the advanced zero emission power plant

    Directory of Open Access Journals (Sweden)

    Kotowicz Janusz

    2016-03-01

    Full Text Available The paper presents the structure and parameters of the advanced zero emission power plant (AZEP). This concept is based on the replacement of the combustion chamber in a gas turbine by a membrane reactor. The reactor has three basic functions: (i) oxygen separation from the air through the membrane, (ii) combustion of the fuel, and (iii) heat transfer to heat the oxygen-depleted air. In the discussed unit, hot depleted air is expanded in a turbine and further feeds a bottoming steam cycle (BSC) through the main heat recovery steam generator (HRSG). Flue gas leaving the membrane reactor feeds the second HRSG. The flue gas consists mainly of CO2 and water vapor; thus, CO2 separation involves only drying of the flue gas. Results of the thermodynamic analysis of the described power plant are presented.

  5. Recent Advances in the Analysis of Spiral Bevel Gears

    Science.gov (United States)

    Handschuh, Robert F.

    1997-01-01

    A review of recent progress for the analysis of spiral bevel gears will be described. The foundation of this work relies on the description of the gear geometry of face-milled spiral bevel gears via the approach developed by Litvin. This methodology was extended by combining the basic gear design data with the manufactured surfaces using a differential geometry approach, and provides the data necessary for assembling three-dimensional finite element models. The finite element models have been utilized to conduct thermal and structural analysis of the gear system. Examples of the methods developed for thermal and structural/contact analysis are presented.

  6. Advances in analysis and control of time-delayed dynamical systems

    CERN Document Server

    Sun, Jianqiao

    2013-01-01

    Analysis and control of time-delayed systems have been applied in a wide range of applications, ranging from mechanical, control, and economic to biological systems. Over the years, there has been a steady stream of interest in time-delayed dynamic systems; this book takes a snapshot of recent research from the world's leading experts in analysis and control of dynamic systems with time delay to provide a bird's eye view of its development. The topics covered in this book include solution methods, stability analysis and control of periodic dynamic systems with time delay, bifurcations, stochastic dy

  7. Evaluation of DNBR calculation methods for advanced digital core protection system

    International Nuclear Information System (INIS)

    Ihn, W. K.; Hwang, D. H.; Pak, Y. H.; Yoon, T. Y.

    2003-01-01

    This study evaluated the on-line DNBR calculation methods for an advanced digital core protection system in PWRs, i.e., subchannel analysis and group-channel analysis. The subchannel code MATRA and the four-channel codes CETOP-D and CETOP2 were used here. CETOP2 is the most simplified DNBR analysis code and is implemented in the core protection calculator of Korea standard nuclear power plants. The detailed subchannel code TORC was used as the reference DNBR calculation. The DNBR uncertainty and margin were compared using allowable operating conditions at Yonggwang nuclear units 3-4. The MATRA code using a nine lumped-channel model resulted in a smaller mean and larger standard deviation of the DNBR error distribution. CETOP-D and CETOP2 showed a conservatively biased mean and relatively smaller standard deviation of the DNBR error distribution. Relative to CETOP2, MATRA and CETOP-D showed a significant increase of the available DNBR margin at the normal operating condition. Taking account of the DNBR uncertainty, MATRA and CETOP-D were estimated to increase the net DNBR margin over CETOP2 by 2.5%-9.8% and 2.5%-3.3%, respectively

  8. Burnout prediction using advanced image analysis coal characterization techniques

    Energy Technology Data Exchange (ETDEWEB)

    Edward Lester; Dave Watts; Michael Cloke [University of Nottingham, Nottingham (United Kingdom). School of Chemical Environmental and Mining Engineering

    2003-07-01

    The link between petrographic composition and burnout has been investigated previously by the authors. However, these predictions were based on 'bulk' properties of the coal, including the proportion of each maceral or the reflectance of the macerals in the whole sample. Combustion studies relating burnout with microlithotype analysis, or similar, remain less common, partly because the technique is more complex than maceral analysis. Despite this, it is likely that any burnout prediction based on petrographic characteristics will become more accurate if it includes information about the maceral associations and the size of each particle. Chars from 13 coals, 106-125 micron size fractions, were prepared using a Drop Tube Furnace (DTF) at 1300°C, a 200 millisecond residence time and 1% oxygen. These chars were then refired in the DTF at 1300°C, 5% oxygen and residence times of 200, 400 and 600 milliseconds. The progressive burnout of each char was compared with the characteristics of the initial coals. This paper presents an extension of previous studies in that it relates combustion behaviour to coals that have been characterized on a particle by particle basis using advanced image analysis techniques. 13 refs., 7 figs.

  9. Infinitesimal methods of mathematical analysis

    CERN Document Server

    Pinto, J S

    2004-01-01

    This modern introduction to infinitesimal methods is a translation of the book Métodos Infinitesimais de Análise Matemática by José Sousa Pinto of the University of Aveiro, Portugal and is aimed at final year or graduate level students with a background in calculus. Surveying modern reformulations of the infinitesimal concept with a thoroughly comprehensive exposition of important and influential hyperreal numbers, the book includes previously unpublished material on the development of hyperfinite theory of Schwartz distributions and its application to generalised Fourier transforms and harmon

  10. Current status of methods for shielding analysis

    International Nuclear Information System (INIS)

    Engle, W.W.

    1980-01-01

    Current methods used in shielding analysis and recent improvements in those methods are discussed. The status of methods development is discussed based on needs cited at the 1977 International Conference on Reactor Shielding. Additional areas where methods development is needed are discussed

  11. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    Energy Technology Data Exchange (ETDEWEB)

    Harris, C.E.

    1994-09-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance. Separate articles from this report have been indexed into the database.

  12. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    Science.gov (United States)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  13. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    Science.gov (United States)

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike train (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input and the direct functional connection of two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, a visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious connections such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying functional connectivity of multiple spike trains. This method can identify accurately all the direct connections and can distinguish common-source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
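
    The pairwise cross-correlation step that the ACG builds on can be sketched as follows for two binned spike trains; the binning, lag range and synthetic trains are generic illustrative choices rather than the exact procedure of the paper.

        import numpy as np

        def cross_correlogram(a, b, max_lag):
            """Cross-correlation counts between two binned spike trains for lags
            -max_lag..+max_lag (in bins). Positive lag means b fires after a."""
            lags = np.arange(-max_lag, max_lag + 1)
            counts = np.array([np.sum(a[:len(a) - lag] * b[lag:]) if lag >= 0
                               else np.sum(a[-lag:] * b[:len(b) + lag]) for lag in lags])
            return lags, counts

        rng = np.random.default_rng(1)
        n_bins = 5000
        a = (rng.random(n_bins) < 0.05).astype(int)          # Poisson-like train A
        b = np.roll(a, 3) | (rng.random(n_bins) < 0.02)      # B partly follows A with a 3-bin delay

        lags, counts = cross_correlogram(a, b, max_lag=10)
        peak_lag = lags[np.argmax(counts)]
        print("peak lag (bins):", peak_lag)                  # expected to be near +3

    The peak height and its lag are exactly the two features that the ACG records for every pair before classifying the connection.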

  14. Second NASA Technical Interchange Meeting (TIM): Advanced Technology Lifecycle Analysis System (ATLAS) Technology Tool Box (TTB)

    Science.gov (United States)

    ONeil, D. A.; Mankins, J. C.; Christensen, C. B.; Gresham, E. C.

    2005-01-01

    The Advanced Technology Lifecycle Analysis System (ATLAS), a spreadsheet analysis tool suite, applies parametric equations for sizing and lifecycle cost estimation. Performance, operation, and programmatic data used by the equations come from a Technology Tool Box (TTB) database. In this second TTB Technical Interchange Meeting (TIM), technologists, system model developers, and architecture analysts discussed methods for modeling technology decisions in spreadsheet models, identified specific technology parameters, and defined detailed development requirements. This Conference Publication captures the consensus of the discussions and provides narrative explanations of the tool suite, the database, and applications of ATLAS within NASA's changing environment.

  15. Advanced airflow distribution methods for reduction of personal exposure to indoor pollutants

    DEFF Research Database (Denmark)

    Cao, Guangyu; Kosonen, Risto; Melikov, Arsen

    2016-01-01

    The main objective of this study is to recognize possible airflow distribution methods to protect the occupants from exposure to various indoor pollutants. The fact of the increasing exposure of occupants to various indoor pollutants shows that there is an urgent need to develop advanced airflow distribution methods to reduce indoor exposure to various indoor pollutants. This article presents some of the latest developments in advanced airflow distribution methods to reduce indoor exposure in various types of buildings.

  16. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  17. Analyzing Planck and low redshift data sets with advanced statistical methods

    Science.gov (United States)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004) have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool in cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly, the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi
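
    The Approximate Bayesian Computation approach mentioned above can be sketched in its simplest rejection form; the toy model (inferring a Gaussian mean from a sample-mean summary statistic), the uniform prior, and the tolerance are illustrative only and have nothing to do with the actual Planck likelihood.

        import numpy as np

        rng = np.random.default_rng(7)

        # "Observed" data summary: sample mean of 100 draws from an unknown Gaussian.
        true_mu = 0.7
        observed = rng.normal(true_mu, 1.0, size=100)
        s_obs = observed.mean()

        # ABC rejection: draw parameters from the prior, simulate the summary statistic,
        # keep draws whose simulated summary lies within a tolerance of the observed one.
        n_draws, tolerance = 200_000, 0.05
        mu_prior = rng.uniform(-3.0, 3.0, size=n_draws)
        s_sim = rng.normal(mu_prior, 1.0 / np.sqrt(100))     # distribution of a 100-sample mean
        accepted = mu_prior[np.abs(s_sim - s_obs) < tolerance]

        print("posterior mean ~", round(accepted.mean(), 3), "from", accepted.size, "accepted draws")

    No explicit likelihood is ever evaluated; the accepted parameter draws approximate the posterior, which is precisely the assumption-circumventing property referred to above.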

  18. 78 FR 16513 - Application of Advances in Nucleic Acid and Protein Based Detection Methods to Multiplex...

    Science.gov (United States)

    2013-03-15

    ... Methods to Multiplex Detection of Transfusion- Transmissible Agents and Blood Cell Antigens in Blood... Transfusion-Transmissible Agents and Blood Cell Antigens in Blood Donations; Public Workshop AGENCY: Food and... technological advances in gene based and protein based pathogen and blood cell antigen detection methods and to...

  19. Iterative Method of Regularization with Application of Advanced Technique for Detection of Contours

    International Nuclear Information System (INIS)

    Niedziela, T.; Stankiewicz, A.

    2000-01-01

    This paper proposes a novel iterative method of regularization with application of an advanced technique for detection of contours. To eliminate noise, the properties of convolution of functions are utilized. The method can be accomplished in a simple neural cellular network, which creates the possibility of extraction of contours by automatic image recognition equipment. (author)
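
    Contour extraction by convolution, of the kind the regularization method above builds on, can be sketched with a pair of Sobel kernels and a gradient-magnitude threshold; this generic filter is only an illustration and is not the authors' iterative regularization scheme or their cellular-network implementation.

        import numpy as np
        from scipy.ndimage import convolve

        def sobel_edges(image, threshold=0.25):
            """Gradient-magnitude edge map from a 2D grayscale image in [0, 1]."""
            kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
            ky = kx.T
            gx = convolve(image, kx, mode="reflect")
            gy = convolve(image, ky, mode="reflect")
            magnitude = np.hypot(gx, gy)
            max_val = magnitude.max()
            if max_val > 0:
                magnitude /= max_val
            return magnitude > threshold

        # Toy image: a bright square on a dark background; edges appear on its border.
        img = np.zeros((64, 64))
        img[20:44, 20:44] = 1.0
        edges = sobel_edges(img)
        print("edge pixels found:", int(edges.sum()))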

  20. System and method to control H2O2 level in advanced oxidation processes

    DEFF Research Database (Denmark)

    2016-01-01

    The present invention relates to a bio-electrochemical system (BES) and a method of in-situ production and removal of H2O2 using such a bio-electrochemical system (BES). Further, the invention relates to a method for in-situ control of H2O2 content in an aqueous system of advanced oxidation...

  1. Social network analysis: Presenting an underused method for nursing research.

    Science.gov (United States)

    Parnell, James Michael; Robinson, Jennifer C

    2018-06-01

    This paper introduces social network analysis as a versatile method with many applications in nursing research. Social networks have been studied for years in many social science fields. The methods continue to advance but remain unknown to most nursing scholars. Discussion paper. English-language and interpreted literature was searched from Ovid Healthstar, CINAHL, PubMed Central, Scopus and hard copy texts from 1965 - 2017. Social network analysis first emerged in the nursing literature in 1995 and appears minimally through the present day. To convey the versatility and applicability of social network analysis in nursing, hypothetical scenarios are presented. The scenarios are illustrative of three approaches to social network analysis and include key elements of social network research design. The methods of social network analysis are underused in nursing research, primarily because they are unknown to most scholars. However, there is methodological flexibility and epistemological versatility capable of supporting quantitative and qualitative research. The analytic techniques of social network analysis can add new insight into many areas of nursing inquiry, especially those influenced by cultural norms. Furthermore, visualization techniques associated with social network analysis can be used to generate new hypotheses. Social network analysis can potentially uncover findings not accessible through methods commonly used in nursing research. Social networks can be analysed based on individual-level attributes, whole networks and subgroups within networks. Computations derived from social network analysis may stand alone to answer a research question or be incorporated as variables into robust statistical models. © 2018 John Wiley & Sons Ltd.
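
    A minimal sketch of the individual-level and whole-network measures mentioned above, using the widely available networkx package on a made-up collaboration network; the edge list and names are purely illustrative.

        import networkx as nx

        # Hypothetical collaboration ties among staff on one unit (illustrative only).
        edges = [
            ("Ana", "Ben"), ("Ana", "Cara"), ("Ben", "Cara"),
            ("Cara", "Dev"), ("Dev", "Ena"), ("Ena", "Fay"), ("Dev", "Fay"),
        ]
        G = nx.Graph(edges)

        # Individual-level attributes that can feed later statistical models.
        degree = nx.degree_centrality(G)
        betweenness = nx.betweenness_centrality(G)
        for node in sorted(G.nodes):
            print(f"{node:5s} degree={degree[node]:.2f} betweenness={betweenness[node]:.2f}")

        # Whole-network summaries.
        print("network density :", round(nx.density(G), 2))
        print("clustering coeff:", round(nx.average_clustering(G), 2))

    Node-level scores such as these can stand alone or enter a regression model as covariates, which is the pattern described in the closing sentences above.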

  2. Advanced High Temperature Reactor Systems and Economic Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Holcomb, David Eugene [ORNL; Peretz, Fred J [ORNL; Qualls, A L [ORNL

    2011-09-01

    The Advanced High Temperature Reactor (AHTR) is a design concept for a large-output [3400 MW(t)] fluoride-salt-cooled high-temperature reactor (FHR). FHRs, by definition, feature low-pressure liquid fluoride salt cooling, coated-particle fuel, a high-temperature power cycle, and fully passive decay heat rejection. The AHTR's large thermal output enables direct comparison of its performance and requirements with other high output reactor concepts. As high-temperature plants, FHRs can support either high-efficiency electricity generation or industrial process heat production. The AHTR analysis presented in this report is limited to the electricity generation mission. FHRs, in principle, have the potential to be low-cost electricity producers while maintaining full passive safety. However, no FHR has been built, and no FHR design has reached the stage of maturity where realistic economic analysis can be performed. The system design effort described in this report represents early steps along the design path toward being able to predict the cost and performance characteristics of the AHTR as well as toward being able to identify the technology developments necessary to build an FHR power plant. While FHRs represent a distinct reactor class, they inherit desirable attributes from other thermal power plants whose characteristics can be studied to provide general guidance on plant configuration, anticipated performance, and costs. Molten salt reactors provide experience on the materials, procedures, and components necessary to use liquid fluoride salts. Liquid metal reactors provide design experience on using low-pressure liquid coolants, passive decay heat removal, and hot refueling. High temperature gas-cooled reactors provide experience with coated particle fuel and graphite components. Light water reactors (LWRs) show the potentials of transparent, high-heat capacity coolants with low chemical reactivity. Modern coal-fired power plants provide design experience

  3. Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture

    Science.gov (United States)

    West, Phillip B [Idaho Falls, ID; Novascone, Stephen R [Idaho Falls, ID; Wright, Jerry P [Idaho Falls, ID

    2011-09-27

    Earth analysis methods, subsurface feature detection methods, earth analysis devices, and articles of manufacture are described. According to one embodiment, an earth analysis method includes engaging a device with the earth, analyzing the earth in a single substantially lineal direction using the device during the engaging, and providing information regarding a subsurface feature of the earth using the analysis.

  4. Thermodynamic analysis of steam-injected advanced gas turbine cycles

    Science.gov (United States)

    Pandey, Devendra; Bade, Mukund H.

    2017-12-01

    This paper deals with the thermodynamic analysis of the steam-injected gas turbine (STIGT) cycle. To analyse the thermodynamic performance of STIGT cycles, a methodology based on pinch analysis is proposed. This graphical methodology is a systematic approach to the selection of a gas turbine with steam injection. The developed graphs are useful for selecting a steam-injected gas turbine for optimal operation and help the designer take appropriate decisions. The selection of a STIGT cycle can be made either at the minimum steam ratio (ratio of mass flow rate of steam to air) with maximum efficiency or at the maximum steam ratio with maximum net work, depending on the plant designer's objective. Operating the steam-injection-based advanced gas turbine plant at the minimum steam ratio improves efficiency, resulting in a reduction of the pollution caused by the emission of flue gases. On the other hand, operating the plant at the maximum steam ratio yields the maximum work output and hence higher available power.

  5. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances.

    Science.gov (United States)

    Alarifi, Abdulrahman; Al-Salman, AbdulMalik; Alsaleh, Mansour; Alnafessah, Ahmad; Al-Hadhrami, Suheer; Al-Ammar, Mai A; Al-Khalifa, Hend S

    2016-05-16

    In recent years, indoor positioning has emerged as a critical function in many end-user applications; including military, civilian, disaster relief and peacekeeping missions. In comparison with outdoor environments, sensing location information in indoor environments requires a higher precision and is a more challenging task in part because various objects reflect and disperse signals. Ultra WideBand (UWB) is an emerging technology in the field of indoor positioning that has shown better performance compared to others. In order to set the stage for this work, we provide a survey of the state-of-the-art technologies in indoor positioning, followed by a detailed comparative analysis of UWB positioning technologies. We also provide an analysis of strengths, weaknesses, opportunities, and threats (SWOT) to analyze the present state of UWB positioning technologies. While SWOT is not a quantitative approach, it helps in assessing the real status and in revealing the potential of UWB positioning to effectively address the indoor positioning problem. Unlike previous studies, this paper presents new taxonomies, reviews some major recent advances, and argues for further exploration by the research community of this challenging problem space.

  6. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances

    Science.gov (United States)

    Alarifi, Abdulrahman; Al-Salman, AbdulMalik; Alsaleh, Mansour; Alnafessah, Ahmad; Al-Hadhrami, Suheer; Al-Ammar, Mai A.; Al-Khalifa, Hend S.

    2016-01-01

    In recent years, indoor positioning has emerged as a critical function in many end-user applications; including military, civilian, disaster relief and peacekeeping missions. In comparison with outdoor environments, sensing location information in indoor environments requires a higher precision and is a more challenging task in part because various objects reflect and disperse signals. Ultra WideBand (UWB) is an emerging technology in the field of indoor positioning that has shown better performance compared to others. In order to set the stage for this work, we provide a survey of the state-of-the-art technologies in indoor positioning, followed by a detailed comparative analysis of UWB positioning technologies. We also provide an analysis of strengths, weaknesses, opportunities, and threats (SWOT) to analyze the present state of UWB positioning technologies. While SWOT is not a quantitative approach, it helps in assessing the real status and in revealing the potential of UWB positioning to effectively address the indoor positioning problem. Unlike previous studies, this paper presents new taxonomies, reviews some major recent advances, and argues for further exploration by the research community of this challenging problem space. PMID:27196906

  7. Ultra Wideband Indoor Positioning Technologies: Analysis and Recent Advances

    Directory of Open Access Journals (Sweden)

    Abdulrahman Alarifi

    2016-05-01

    Full Text Available In recent years, indoor positioning has emerged as a critical function in many end-user applications; including military, civilian, disaster relief and peacekeeping missions. In comparison with outdoor environments, sensing location information in indoor environments requires a higher precision and is a more challenging task in part because various objects reflect and disperse signals. Ultra WideBand (UWB) is an emerging technology in the field of indoor positioning that has shown better performance compared to others. In order to set the stage for this work, we provide a survey of the state-of-the-art technologies in indoor positioning, followed by a detailed comparative analysis of UWB positioning technologies. We also provide an analysis of strengths, weaknesses, opportunities, and threats (SWOT) to analyze the present state of UWB positioning technologies. While SWOT is not a quantitative approach, it helps in assessing the real status and in revealing the potential of UWB positioning to effectively address the indoor positioning problem. Unlike previous studies, this paper presents new taxonomies, reviews some major recent advances, and argues for further exploration by the research community of this challenging problem space.

  8. Hydration in advanced cancer: can bioelectrical impedance analysis improve the evidence base? A systematic review of the literature.

    Science.gov (United States)

    Nwosu, Amara Callistus; Mayland, Catriona R; Mason, Stephen R; Khodabukus, Andrew F; Varro, Andrea; Ellershaw, John E

    2013-09-01

    Decisions surrounding the administration of clinically assisted hydration to patients dying of cancer can be challenging because of the limited understanding of hydration in advanced cancer and a lack of evidence to guide health care professionals. Bioelectrical impedance analysis (BIA) has been used to assess hydration in various patient groupings, but evidence for its use in advanced cancer is limited. To critically appraise existing methods of hydration status assessment in advanced cancer and review the potential for BIA to assess hydration in advanced cancer. Searches were carried out in four electronic databases. A hand search of selected peer-reviewed journals and conference abstracts also was conducted. Studies reporting (de)hydration assessment (physical examination, biochemical measures, symptom assessment, and BIA) in patients with advanced cancer were included. The results highlight how clinical examination and biochemical tests are standard methods of assessing hydration, but limitations exist with these methods in advanced cancer. Furthermore, there is disagreement over the evidence for some commonly associated symptoms with dehydration in cancer. Although there are limitations with using BIA alone to assess hydration in advanced cancer, analysis of BIA raw measurements through the method of bioelectrical impedance vector analysis may have a role in this population. The benefits and burdens of providing clinically assisted hydration to patients dying of cancer are unclear. Bioelectrical impedance vector analysis shows promise as a hydration assessment tool but requires further study in advanced cancer. Innovative methodologies for research are required to add to the evidence base and ultimately improve the care for the dying. Copyright © 2013 U.S. Cancer Pain Relief Committee. Published by Elsevier Inc. All rights reserved.
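
    The bioelectrical impedance vector analysis mentioned above works directly on the raw resistance and reactance measurements; a minimal sketch of the height-normalized vector and phase angle it uses is given below, with purely illustrative measurement values rather than patient data from the review.

        import math

        def biva_point(resistance_ohm, reactance_ohm, height_m):
            """Height-normalized impedance vector (R/H, Xc/H) and phase angle,
            the raw quantities that bioelectrical impedance vector analysis plots."""
            r_h = resistance_ohm / height_m
            xc_h = reactance_ohm / height_m
            phase_angle_deg = math.degrees(math.atan2(reactance_ohm, resistance_ohm))
            return r_h, xc_h, phase_angle_deg

        # Illustrative 50 kHz whole-body values only, not patient data.
        print(biva_point(resistance_ohm=480.0, reactance_ohm=45.0, height_m=1.70))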

  9. Transport modelling and gyrokinetic analysis of advanced high performance discharges

    International Nuclear Information System (INIS)

    Kinsey, J.E.; Imbeaux, F.; Staebler, G.M.; Budny, R.; Bourdelle, C.; Fukuyama, A.; Garbet, X.; Tala, T.; Parail, V.

    2005-01-01

    Predictive transport modelling and gyrokinetic stability analyses of demonstration hybrid (HYBRID) and advanced tokamak (AT) discharges from the International Tokamak Physics Activity (ITPA) profile database are presented. Both regimes have exhibited enhanced core confinement (above the conventional ITER reference H-mode scenario) but differ in their current density profiles. Recent contributions to the ITPA database have facilitated an effort to study the underlying physics governing confinement in these advanced scenarios. In this paper, we assess the level of commonality of the turbulent transport physics and the relative roles of the transport suppression mechanisms (i.e. E x B shear and Shafranov shift (α) stabilization) using data for select HYBRID and AT discharges from the DIII-D, JET and AUG tokamaks. GLF23 transport modelling and gyrokinetic stability analysis indicate that E x B shear and Shafranov shift stabilization play essential roles in producing the improved core confinement in both HYBRID and AT discharges. Shafranov shift stabilization is found to be more important in AT discharges than in HYBRID discharges. We have also examined the competition between the stabilizing effects of E x B shear and Shafranov shift stabilization and the destabilizing effects of higher safety factors and parallel velocity shear. Linear and nonlinear gyrokinetic simulations of idealized low and high safety factor cases reveal some interesting consequences. A low safety factor (i.e. HYBRID relevant) is directly beneficial in reducing the transport, and E x B shear stabilization can dominate parallel velocity shear destabilization allowing the turbulence to be quenched. However, at low-q/high current, Shafranov shift stabilization plays less of a role. Higher safety factors (as found in AT discharges), on the other hand, have larger amounts of Shafranov shift stabilization, but parallel velocity shear destabilization can prevent E x B shear quenching of the turbulent

  10. Transport modeling and gyrokinetic analysis of advanced high performance discharges

    International Nuclear Information System (INIS)

    Kinsey, J.; Imbeaux, F.; Bourdelle, C.; Garbet, X.; Staebler, G.; Budny, R.; Fukuyama, A.; Tala, T.; Parail, V.

    2005-01-01

    Predictive transport modeling and gyrokinetic stability analyses of demonstration hybrid (HYBRID) and Advanced Tokamak (AT) discharges from the International Tokamak Physics Activity (ITPA) profile database are presented. Both regimes have exhibited enhanced core confinement (above the conventional ITER reference H-mode scenario) but differ in their current density profiles. Recent contributions to the ITPA database have facilitated an effort to study the underlying physics governing confinement in these advanced scenarios. In this paper, we assess the level of commonality of the turbulent transport physics and the relative roles of the transport suppression mechanisms (i.e. ExB shear and Shafranov shift (α) stabilization) using data for select HYBRID and AT discharges from the DIII-D, JET, and AUG tokamaks. GLF23 transport modeling and gyrokinetic stability analysis indicate that ExB shear and Shafranov shift stabilization play essential roles in producing the improved core confinement in both HYBRID and AT discharges. Shafranov shift stabilization is found to be more important in AT discharges than in HYBRID discharges. We have also examined the competition between the stabilizing effects of ExB shear and Shafranov shift stabilization and the destabilizing effects of higher safety factors and parallel velocity shear. Linear and nonlinear gyrokinetic simulations of idealized low and high safety factor cases reveal some interesting consequences. A low safety factor (i.e. HYBRID relevant) is directly beneficial in reducing the transport, and ExB shear stabilization can dominate parallel velocity shear destabilization allowing the turbulence to be quenched. However, at low-q/high current, Shafranov shift stabilization plays less of a role. Higher safety factors (as found in AT discharges), on the other hand, have larger amounts of Shafranov shift stabilization, but parallel velocity shear destabilization can prevent ExB shear quenching of the turbulent

  11. Parametric Methods for Order Tracking Analysis

    DEFF Research Database (Denmark)

    Nielsen, Jesper Kjær; Jensen, Tobias Lindstrøm

    2017-01-01

    Order tracking analysis is often used to find the critical speeds at which structural resonances are excited by a rotating machine. Typically, order tracking analysis is performed via non-parametric methods. In this report, however, we demonstrate some of the advantages of using a parametric method...

  12. Conventional and advanced exergoenvironmental analysis of an ammonia-water hybridabsorption-compression heat pump

    DEFF Research Database (Denmark)

    Jensen, Jonas Kjær; Markussen, Wiebke Brix; Reinholdt, Lars

    2015-01-01

    to allocate the initial and operational environmental impact to the system components, thus revealing the main sources of environmental impact. The application of the advanced exergoenvironmental analysis improves the level of detail attained. This is achieved by accounting for technological and economic...... constraints as well as component interdependencies. The advanced exergoenvironmental analysis shows that the highest avoidable environmental impact stems from the compressor, followed by the absorber. Further, it is found that the initial environmental impact of the HACHP is negligible compared...... of an advanced exergy-based analysis, comprised of both an advanced exergy, exergoeconomic and exergoenvironmental analysis. Recent studies have presented both the advanced exergy and advanced exergoeconomic analysis of the HACHP. An exergoenvironmental analysis combines exergy analysis with life cycle assessment....

  13. Chemical and physical analysis of core materials for advanced high temperature reactors with process heat applications

    International Nuclear Information System (INIS)

    Nickel, H.

    1985-08-01

    Various chemical and physical methods for the analysis of structural materials have been developed in the research programmes for advanced high temperature reactors. These methods are discussed using as examples the structural materials of the reactor core - the fuel elements consisting of coated particles in a graphite matrix and the structural graphite. Emphasis is given to the methods of chemical analysis. The composition of fuel kernels is investigated using chemical analysis methods to determine the heavy metals content (uranium, plutonium, thorium and metallic impurity elements) and the amount of non-metallic constituents. The properties of the pyrocarbon and silicon carbide coatings of fuel elements are investigated using specially developed physiochemical methods. Regarding the irradiation behaviour of coated particles and fuel elements, methods have been developed for examining specimens in hot cells following exposures under reactor operating conditions, to supplement the measurements of in-reactor performance. For the structural graphite, the determination of impurities is important because certain impurities may cause pitting corrosion during irradiation. The localized analysis of very low impurity concentrations is carried out using spectrochemical d.c. arc excitation, local laser and inductively coupled plasma methods. (orig.)

  14. Symptom Clusters in Advanced Cancer Patients: An Empirical Comparison of Statistical Methods and the Impact on Quality of Life.

    Science.gov (United States)

    Dong, Skye T; Costa, Daniel S J; Butow, Phyllis N; Lovell, Melanie R; Agar, Meera; Velikova, Galina; Teckle, Paulos; Tong, Allison; Tebbutt, Niall C; Clarke, Stephen J; van der Hoek, Kim; King, Madeleine T; Fayers, Peter M

    2016-01-01

    Symptom clusters in advanced cancer can influence patient outcomes. There is large heterogeneity in the methods used to identify symptom clusters. To investigate the consistency of symptom cluster composition in advanced cancer patients using different statistical methodologies for all patients across five primary cancer sites, and to examine which clusters predict functional status, a global assessment of health and global quality of life. Principal component analysis and exploratory factor analysis (with different rotation and factor selection methods) and hierarchical cluster analysis (with different linkage and similarity measures) were used on a data set of 1562 advanced cancer patients who completed the European Organization for the Research and Treatment of Cancer Quality of Life Questionnaire-Core 30. Four clusters consistently formed for many of the methods and cancer sites: tense-worry-irritable-depressed (emotional cluster), fatigue-pain, nausea-vomiting, and concentration-memory (cognitive cluster). The emotional cluster was a stronger predictor of overall quality of life than the other clusters. Fatigue-pain was a stronger predictor of overall health than the other clusters. The cognitive cluster and fatigue-pain predicted physical functioning, role functioning, and social functioning. The four identified symptom clusters were consistent across statistical methods and cancer types, although there were some noteworthy differences. Statistical derivation of symptom clusters is in need of greater methodological guidance. A psychosocial pathway in the management of symptom clusters may improve quality of life. Biological mechanisms underpinning symptom clusters need to be delineated by future research. A framework for evidence-based screening, assessment, treatment, and follow-up of symptom clusters in advanced cancer is essential. Copyright © 2016 American Academy of Hospice and Palliative Medicine. Published by Elsevier Inc. All rights reserved.
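
    As a hedged sketch of the kind of comparison the study performs (the symptom names match the clusters reported above, but the severity scores are randomly generated stand-ins, not patient data), principal component analysis and hierarchical clustering of the symptoms can be run side by side as follows:

    import numpy as np
    from sklearn.decomposition import PCA
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Hypothetical severity scores (0-3): rows = patients, columns = symptoms
    rng = np.random.default_rng(0)
    symptoms = ["tense", "worry", "irritable", "depressed", "fatigue", "pain",
                "nausea", "vomiting", "concentration", "memory"]
    X = rng.integers(0, 4, size=(200, len(symptoms))).astype(float)

    # Method 1: principal component analysis on standardized scores
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    pca = PCA(n_components=4).fit(Z)
    print("PCA loadings:\n", pca.components_.round(2))

    # Method 2: hierarchical clustering of the symptoms themselves,
    # using correlation distance and average linkage
    link = linkage(pdist(X.T, metric="correlation"), method="average")
    labels = fcluster(link, t=4, criterion="maxclust")
    print("symptom cluster labels:", dict(zip(symptoms, labels)))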

  15. Advanced Diagnostic and Prognostic Testbed (ADAPT) Testability Analysis Report

    Science.gov (United States)

    Ossenfort, John

    2008-01-01

    As system designs become more complex, determining the best locations to add sensors and test points for the purpose of testing and monitoring these designs becomes more difficult. Not only must the designer take into consideration all real and potential faults of the system, he or she must also find efficient ways of detecting and isolating those faults. Because sensors and cabling take up valuable space and weight on a system, and given constraints on bandwidth and power, it is even more difficult to add sensors into these complex designs after the design has been completed. As a result, a number of software tools have been developed to assist the system designer in proper placement of these sensors during the system design phase of a project. One of the key functions provided by many of these software programs is a testability analysis of the system: essentially, an evaluation of how observable the system behavior is using available tests. During the design phase, testability metrics can help guide the designer in improving the inherent testability of the design. This may include adding, removing, or modifying tests; breaking up feedback loops; or changing the system to reduce fault propagation. Given a set of test requirements, the analysis can also help to verify that the system will meet those requirements. Of course, a testability analysis requires that a software model of the physical system is available. For the analysis to be most effective in guiding system design, this model should ideally be constructed in parallel with these efforts. The purpose of this paper is to present the final testability results of the Advanced Diagnostic and Prognostic Testbed (ADAPT) after the system model was completed. The tool chosen to build the model and perform the testability analysis is the Testability Engineering and Maintenance System Designer (TEAMS-Designer). The TEAMS toolset is intended to be a solution to span all phases of the system, from design and
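
    A minimal illustration of what a testability analysis computes, assuming a simple fault-test dependency matrix (the matrix below is hypothetical and far smaller than anything TEAMS-Designer would handle): detection coverage is the fraction of faults observed by at least one test, and single-fault isolation requires a unique test signature.

    import numpy as np

    # Hypothetical dependency matrix D: rows = faults, columns = tests,
    # D[i, j] = 1 if test j can observe fault i.
    D = np.array([[1, 0, 0],
                  [1, 1, 0],
                  [0, 1, 0],
                  [0, 1, 0],    # same signature as the fault above -> not isolable
                  [0, 0, 0]])   # no test observes this fault -> undetectable

    detected = D.any(axis=1)
    signatures = [tuple(row) for row in D]
    isolable = [bool(detected[i]) and signatures.count(signatures[i]) == 1
                for i in range(len(signatures))]

    print(f"fault detection coverage: {detected.mean():.0%}")
    print(f"single-fault isolation coverage: {np.mean(isolable):.0%}")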

  16. Develop advanced nonlinear signal analysis topographical mapping system

    Science.gov (United States)

    1994-01-01

    The Space Shuttle Main Engine (SSME) has been undergoing extensive flight certification and developmental testing, which involves some 250 health monitoring measurements. Under the severe temperature, pressure, and dynamic environments sustained during operation, numerous major component failures have occurred, resulting in extensive engine hardware damage and scheduling losses. To enhance SSME safety and reliability, detailed analysis and evaluation of the measurement signals are mandatory to assess their dynamic characteristics and operational condition. Efficient and reliable signal detection techniques will reduce catastrophic system failure risks and expedite the evaluation of both flight and ground test data, and thereby reduce launch turn-around time. The basic objectives of this contract are threefold: (1) develop and validate a hierarchy of innovative signal analysis techniques for nonlinear and nonstationary time-frequency analysis. Performance evaluation will be carried out through detailed analysis of extensive SSME static firing and flight data. These techniques will be incorporated into a fully automated system; (2) develop an advanced nonlinear signal analysis topographical mapping system (ATMS) to generate a Compressed SSME TOPO Data Base (CSTDB). This ATMS system will convert a tremendous amount of complex vibration signals from the entire SSME test history into a bank of succinct image-like patterns while retaining all respective phase information. A high compression ratio can be achieved to allow a minimal storage requirement, while providing fast signature retrieval, pattern comparison, and identification capabilities; and (3) integrate the nonlinear correlation techniques into the CSTDB data base with compatible TOPO input data format. Such an integrated ATMS system will provide the large test archives necessary for quick signature comparison. This study will provide timely assessment of SSME component operational status, identify probable causes of
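
    The ATMS itself is not specified in this record; as a simplified stand-in for turning a vibration record into a compact image-like pattern (magnitude only, so unlike the ATMS it does not retain phase information, and the signal and sampling rate are invented), a short-time spectrogram can be used:

    import numpy as np
    from scipy.signal import spectrogram

    # Hypothetical accelerometer record: a steady 1 kHz tone plus a short 3 kHz
    # burst, sampled at 20 kHz, standing in for an engine vibration channel.
    fs = 20_000
    t = np.arange(0, 2.0, 1 / fs)
    x = np.sin(2 * np.pi * 1_000 * t)
    x[20_000:24_000] += 0.5 * np.sin(2 * np.pi * 3_000 * t[20_000:24_000])
    x += 0.1 * np.random.default_rng(1).standard_normal(t.size)

    # Each column of the time-frequency image is a short-time spectrum; storing
    # this compact, coarsely quantized image instead of the raw record is the
    # general idea behind image-like vibration signatures.
    f, frames, Sxx = spectrogram(x, fs=fs, nperseg=1024, noverlap=512)
    image_db = 10 * np.log10(Sxx + 1e-12)
    print(image_db.shape)   # (frequency bins, time frames)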

  17. The delayed neutron method of uranium analysis

    International Nuclear Information System (INIS)

    Wall, T.

    1989-01-01

    The technique of delayed neutron analysis (DNA) is discussed. The DNA rig installed on the MOATA reactor, the assay standards and the types of samples which have been assayed are described. Of the total sample throughput of about 55,000 units since the uranium analysis service began, some 78% has been concerned with analysis of uranium ore samples derived from mining and exploration. Delayed neutron analysis provides a high sensitivity, low cost uranium analysis method for both uranium exploration and other applications. It is particularly suitable for analysis of large batch samples and for non-destructive analysis over a wide range of matrices. 8 refs., 4 figs., 3 tabs
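
    A minimal sketch of how a delayed-neutron assay reduces to a calibration problem, assuming hypothetical counts from standards of known uranium mass (real assays also correct for background, irradiation and counting timing, and enrichment):

    import numpy as np

    # Hypothetical calibration of delayed-neutron counts against assay standards
    # of known uranium mass (background absorbed into the intercept).
    standard_mass_mg = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
    standard_counts = np.array([120.0, 1350.0, 2580.0, 6300.0, 12450.0])
    slope, intercept = np.polyfit(standard_mass_mg, standard_counts, 1)

    def assay_uranium_mg(sample_counts):
        # Invert the straight-line calibration to estimate uranium mass
        return (sample_counts - intercept) / slope

    print(f"estimated uranium content: {assay_uranium_mg(4200.0):.2f} mg")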

  18. Advanced AEM by Comprehensive Analysis and Modeling of System Drift

    Science.gov (United States)

    Schiller, Arnulf; Klune, Klaus; Schattauer, Ingrid

    2010-05-01

    The quality of the assessment of risks arising from environmental hazards strongly depends on the spatial and temporal distribution of the data collected in a survey area. Natural hazards generally emerge from wide areas, as is the case with volcanoes or landslides. Conventional surface measurements are restricted to a few lines or locations and often can't be conducted in difficult terrain. They therefore give only a spatially and temporally limited data set and so limit the reliability of risk analysis. Aero-geophysical measurements potentially provide a valuable tool for completing the data set, as they can be performed over a wide area, even above difficult terrain, within a short time. A most desirable capability in the course of such measurements is capturing the dynamics of these potentially hazardous environmental processes. This necessitates repeated and reproducible measurements. Current HEM systems can't accomplish this adequately due to their inherent system drift and, in some cases, poor signal-to-noise ratio. To develop comprehensive concepts for advancing state-of-the-art HEM systems into a valuable tool for data acquisition in risk assessment or hydrological problems, different studies have been undertaken; these form the contents of the presented work, conducted in the course of the project HIRISK (Helicopter Based Electromagnetic System for Advanced Environmental Risk Assessment - FWF L-354 N10, supported by the Austrian Science Fund). The methodology is based upon two paths: A - Comprehensive experimental testing on an existing HEM system serving as an experimental platform. B - The setup of a numerical model which is continuously refined according to the results of the experimental data. The model then serves to simulate the experimental as well as alternative configurations and to analyze them with respect to their drift behavior. Finally, concepts for minimizing the drift are derived and tested. Different test series - stationary on ground as well

  19. Radiochemistry and nuclear methods of analysis

    International Nuclear Information System (INIS)

    Ehmann, W.D.; Vance, D.

    1991-01-01

    This book provides both the fundamentals of radiochemistry and specific applications of nuclear techniques to analytical chemistry. It includes such areas of application as radioimmunoassay and activation techniques using very short-lived indicator radionuclides. It emphasizes the current nuclear methods of analysis such as neutron activation, PIXE, nuclear reaction analysis, Rutherford backscattering, isotope dilution analysis and others

  20. Computational methods in power system analysis

    CERN Document Server

    Idema, Reijer

    2014-01-01

    This book treats state-of-the-art computational methods for power flow studies and contingency analysis. In the first part the authors present the relevant computational methods and mathematical concepts. In the second part, power flow and contingency analysis are treated. Furthermore, traditional methods to solve such problems are compared to modern solvers, developed using the knowledge of the first part of the book. Finally, these solvers are analyzed both theoretically and experimentally, clearly showing the benefits of the modern approach.

  1. Advancing data management and analysis in different scientific disciplines

    Science.gov (United States)

    Fischer, M.; Gasthuber, M.; Giesler, A.; Hardt, M.; Meyer, J.; Prabhune, A.; Rigoll, F.; Schwarz, K.; Streit, A.

    2017-10-01

    Over the past several years, rapid growth of data has affected many fields of science. This has often resulted in the need for overhauling or exchanging the tools and approaches in the disciplines’ data life cycles. However, this allows the application of new data analysis methods and facilitates improved data sharing. The project Large-Scale Data Management and Analysis (LSDMA) of the German Helmholtz Association has been addressing both specific and generic requirements in its data life cycle successfully since 2012. Its data scientists work together with researchers from fields such as climatology, energy and neuroscience to improve the community-specific data life cycles, in several cases even covering all stages of the data life cycle, i.e. from data acquisition to data archival. LSDMA scientists also study methods and tools that are of importance to many communities, e.g. data repositories and authentication and authorization infrastructure.

  2. Advances in isotopic analysis for food authenticity testing

    DEFF Research Database (Denmark)

    Laursen, Kristian Holst; Bontempo, L.; Camin, Federica

    2016-01-01

    Stable isotope analysis has been used for food authenticity testing for more than 30 years and is today being utilized on a routine basis for a wide variety of food commodities. During the past decade, major analytical method developments have been made and the fundamental understanding...... authenticity testing is currently developing even further. In this chapter, we aim to provide an overview of the latest developments in stable isotope analysis for food authenticity testing. As several review articles and book chapters have recently addressed this topic, we will primarily focus on relevant...... literature from the past 5 years. We will focus on well-established methods for food authenticity testing using stable isotopes but will also include recent methodological developments, new applications, and current and future challenges....
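
    Stable isotope results in this field are normally reported in delta notation; a small worked example follows (the measured ratio is invented, and 0.011180 is used here only as an assumed value for the VPDB 13C/12C reference ratio):

    def delta_per_mil(r_sample, r_standard):
        # Standard delta notation: delta = (R_sample / R_standard - 1) * 1000 (per mil)
        return (r_sample / r_standard - 1.0) * 1000.0

    # Illustrative 13C/12C ratios; R_VPDB is an assumed reference value
    R_VPDB = 0.011180
    print(f"delta 13C = {delta_per_mil(0.011095, R_VPDB):+.2f} per mil")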

  3. Advances in methods of commercial FBR core characteristics analyses. Investigations of a treatment of the double-heterogeneity and a method to calculate homogenized control rod cross sections

    Energy Technology Data Exchange (ETDEWEB)

    Sugino, Kazuteru [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center; Iwai, Takehiko

    1998-07-01

    A standard data base for FBR core nuclear design is under development in order to improve the accuracy of FBR design calculation. As a part of the development, we investigated an improved treatment of double-heterogeneity and a method to calculate homogenized control rod cross sections in a commercial reactor geometry, to improve the analytical accuracy of commercial FBR core characteristics. As an improvement in the treatment of double-heterogeneity, we derived a new method (the direct method) and compared both this and conventional methods with continuous energy Monte-Carlo calculations. In addition, we investigated the applicability of the reaction rate ratio preservation method as an advanced method to calculate homogenized control rod cross sections. The present studies gave the following information: (1) An improved treatment of double-heterogeneity: for criticality the conventional method showed good agreement with the Monte-Carlo result within one standard deviation; the direct method was consistent with the conventional one. Preliminary evaluation of effects in core characteristics other than criticality showed that the effect of sodium void reactivity (coolant reactivity) due to the double-heterogeneity was large. (2) An advanced method to calculate homogenized control rod cross sections: for control rod worths the reaction rate ratio preservation method agreed with those produced by the calculations with the control rod heterogeneity included in the core geometry; in the Monju control rod worth analysis, the present method overestimated control rod worths by 1 to 2% compared with the conventional method, but these differences were caused by the more accurate model in the present method and it is considered that this method is more reliable than the conventional one. These two methods investigated in this study can be directly applied to core characteristics other than criticality or control rod worth. Thus it is concluded that these methods will

  4. Advanced uncertainty modelling for container port risk analysis.

    Science.gov (United States)

    Alyami, Hani; Yang, Zaili; Riahi, Ramin; Bonsall, Stephen; Wang, Jin

    2016-08-13

    Globalization has led to a rapid increase of container movements in seaports. Risks in seaports need to be appropriately addressed to ensure economic wealth, operational efficiency, and personnel safety. As a result, the safety performance of a Container Terminal Operational System (CTOS) plays a growing role in improving the efficiency of international trade. This paper proposes a novel method to facilitate the application of Failure Mode and Effects Analysis (FMEA) in assessing the safety performance of CTOS. The new approach is developed through incorporating a Fuzzy Rule-Based Bayesian Network (FRBN) with Evidential Reasoning (ER) in a complementary manner. The former provides a realistic and flexible method to describe input failure information for risk estimates of individual hazardous events (HEs) at the bottom level of a risk analysis hierarchy. The latter is used to aggregate HEs safety estimates collectively, allowing dynamic risk-based decision support in CTOS from a systematic perspective. The novel feature of the proposed method, compared to those in traditional port risk analysis lies in a dynamic model capable of dealing with continually changing operational conditions in ports. More importantly, a new sensitivity analysis method is developed and carried out to rank the HEs by taking into account their specific risk estimations (locally) and their Risk Influence (RI) to a port's safety system (globally). Due to its generality, the new approach can be tailored for a wide range of applications in different safety and reliability engineering and management systems, particularly when real time risk ranking is required to measure, predict, and improve the associated system safety performance. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Advanced study of transport analysis in bentonite (3)

    International Nuclear Information System (INIS)

    Kawamura, Katsuyuki

    2005-02-01

    Solute and radionuclide transport analysis in buffer material made of bentonite clay is essential in safety assessment of a geological disposal facility for high-level radioactive waste (HLW). It is keenly required to understand the true physical and chemical processes of the transport phenomena and to improve the reliability of the safety assessment, since conventional methods based on experimental models have difficulty establishing robustness for very long-term behavior. To address this difficulty we start with the molecular dynamics (MD) simulation method for understanding the molecular-based fundamental properties such as an ionic state and diffusion characteristics of hydrated smectite clay minerals, and we extend the microscale properties to the macroscale behaviors by applying the multiscale homogenization analysis (HA) method. In this year's study we improved the MD atomic model for the hydrated clay minerals and developed a new adsorption-diffusion analysis scheme based on the homogenization analysis (HA). In the MD simulation we precisely simulated the molecular behaviors of cations and H2O in the neighborhood of a clay mineral. In FY2002 the swelling property and diffusivity of interlayer cations, Cs and Ca, were calculated. In FY2003 the interatomic potential model was improved, and the diffusivities of several interlayer cations were calculated. In FY2004 the interatomic potential model was further improved, and the swelling and diffusive properties became more realistic. Then the coordination numbers of the cations were calculated. A microscopic image is important for specifying the micro/macro behavior of bentonite. In FY2002 we observed microstructures of bentonite by using a confocal laser scanning microscope (LSM). In FY2003, based on the knowledge of the local material properties obtained by MD and the microscopic observation, we simulated the micro-/macro-behavior of diffusion experiments of the bentonite which included the microscale adsorption
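
    A brief sketch of how a diffusion coefficient is typically extracted from MD output via the Einstein relation (the mean-squared-displacement data below are synthetic, not results from the study):

    import numpy as np

    def diffusion_coefficient(msd_nm2, time_ns, dim=3):
        # Einstein relation: MSD(t) ~ 2 * dim * D * t in the diffusive regime,
        # so D is taken from the slope of a straight-line fit of MSD vs time.
        slope, _ = np.polyfit(time_ns, msd_nm2, 1)
        return slope / (2 * dim)            # nm^2/ns

    # Hypothetical mean squared displacement of an interlayer cation from an MD run
    time_ns = np.linspace(0.0, 10.0, 101)
    msd_nm2 = 0.06 * time_ns + 0.002 * np.random.default_rng(2).standard_normal(101)

    D = diffusion_coefficient(msd_nm2, time_ns)
    print(f"D = {D:.4f} nm^2/ns = {D * 1e-5:.1e} cm^2/s")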

  6. Advanced concepts for gamma-ray isotopic analysis and instrumentation

    International Nuclear Information System (INIS)

    Buckley, W.M.; Carlson, J.B.

    1994-07-01

    The Safeguards Technology Program at the Lawrence Livermore National Laboratory is developing actinide isotopic analysis technologies in response to needs that address issues of flexibility of analysis, robustness of analysis, ease-of-use, automation and portability. Recent developments, such as the Intelligent Actinide Analysis System (IAAS), begin to address these issues. We are continuing to develop enhancements to this and other instruments that improve ease-of-use, automation and portability. Requests to analyze samples with unusual isotopics, contamination, or containers have made us aware of the need for more flexible and robust analysis. We have modified the MGA program to extend its plutonium isotopic analysis capability to samples with greater 241Am content or U isotopics. We are looking at methods for dealing with tantalum or lead contamination and contamination with high-energy gamma emitters, such as 233U. We are looking at ways to allow the program to use additional information about the sample to further extend the domain of analyzable samples. These unusual analyses will come from the domain of samples that need to be measured because of complex reconfiguration or environmental cleanup

  7. Setting health research priorities using the CHNRI method: IV. Key conceptual advances

    Directory of Open Access Journals (Sweden)

    Igor Rudan

    2016-06-01

    The Child Health and Nutrition Research Initiative (CHNRI) started as an initiative of the Global Forum for Health Research in Geneva, Switzerland. Its aim was to develop a method that could assist priority setting in health research investments. The first version of the CHNRI method was published in 2007–2008. The aim of this paper was to summarize the history of the development of the CHNRI method and its key conceptual advances.

  8. Advances in the analysis of pressure interference tests

    Energy Technology Data Exchange (ETDEWEB)

    Martinez R, N. [Petroleos Mexicanos, PEMEX, Mexico City (Mexico); Samaniego V, F. [Univ. Nacional Autonoma de Mexico (Mexico)

    2010-12-15

    This paper presented an extension for radial, linear, and spherical flow conditions of the El-Khatib method for analyzing pressure interference tests through utilization of the pressure derivative. Conventional analysis of interference tests considers only radial flow, but some reservoirs have physical field conditions in which linear or spherical flow conditions prevail. The INTERFERAN system, a friendly computer code for the automatic analysis of pressure interference tests, was also discussed and demonstrated by way of 2 field cases. INTERFERAN relies on the principle of superposition in time and space to interpret a test of several wells with variable histories of production or injection or both. The first field case addressed interference tests conducted in the naturally fractured geothermal field of Klamath Falls, and the second field case was conducted in a river-formed bed in which linear flow conditions are dominant. The analysis was deemed to be reliable. 13 refs., 1 tab., 7 figs.
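
    The El-Khatib extension itself is not reproduced here; as a baseline illustration of derivative-based interference analysis, the classical dimensionless line-source (Theis) solution and its logarithmic derivative can be evaluated as follows (the dimensionless times and distance are arbitrary example values):

    import numpy as np
    from scipy.special import exp1

    def line_source_pd(td, rd):
        # Dimensionless line-source (Theis) solution: p_D = 0.5 * E1(r_D^2 / (4 t_D))
        return 0.5 * exp1(rd ** 2 / (4.0 * td))

    def log_derivative_pd(td, rd):
        # dp_D / d ln(t_D) = 0.5 * exp(-r_D^2 / (4 t_D)); it levels off at 0.5
        # once infinite-acting radial flow is reached at the observation well.
        return 0.5 * np.exp(-rd ** 2 / (4.0 * td))

    td = np.logspace(1, 5, 5)       # dimensionless times
    print(line_source_pd(td, rd=10.0).round(4))
    print(log_derivative_pd(td, rd=10.0).round(4))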

  9. Advanced risk analysis of systems endangered by ESD

    International Nuclear Information System (INIS)

    Kiss, Istvan; Szedenik, Norbert; Nemeth, Balint; Gulyas, Attila; Berta, Istvan

    2008-01-01

    Evaluation of industrial processes to determine the risk of fire or explosion caused by electrostatic discharge (ESD) is, even nowadays, qualitative in most cases. Although qualitative analysis significantly helps to make an industrial process safer, it is based on a survey of the process and is strongly subjective, depending on the estimation of an expert. Fault tree analysis is a traditional method to quantify the risk; it helps to select optimal protection. However, determination of the top event, secondary events and basic events of the fault tree is difficult, especially the quantification of the probabilities of the basic events. In several cases no statistical information is available for most of the events. Using fuzzy membership functions instead of simple numbers for the quantification of probabilities makes it possible to take this uncertainty into consideration. Fuzzy logic based fault tree analyses of chemical processes were performed to determine the effect of basic events on the probability of the top event (explosion or fire) and its reliability.
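
    A minimal sketch of the fuzzy fault tree idea, assuming triangular fuzzy probabilities for three hypothetical basic events; propagating the three vertices through AND/OR gates is a common simplification (the product of triangular numbers is only approximately triangular), not the specific procedure of the paper:

    def prod(values):
        p = 1.0
        for v in values:
            p *= v
        return p

    def f_and(*events):
        # AND gate: probabilities multiply; applied to each vertex of the
        # triangular fuzzy numbers (low, mode, high).
        return tuple(prod(e[i] for e in events) for i in range(3))

    def f_or(*events):
        # OR gate: 1 - product of complements, again vertex by vertex.
        return tuple(1.0 - prod(1.0 - e[i] for e in events) for i in range(3))

    # Hypothetical basic events with triangular fuzzy probabilities (low, mode, high)
    charged_operator = (0.05, 0.10, 0.20)
    explosive_atmosphere = (0.01, 0.02, 0.05)
    discharge_above_mie = (0.30, 0.50, 0.70)

    top_event = f_and(charged_operator, explosive_atmosphere, discharge_above_mie)
    print("fuzzy probability of fire/explosion (low, mode, high):",
          tuple(round(p, 5) for p in top_event))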

  10. Text analysis methods, text analysis apparatuses, and articles of manufacture

    Science.gov (United States)

    Whitney, Paul D; Willse, Alan R; Lopresti, Charles A; White, Amanda M

    2014-10-28

    Text analysis methods, text analysis apparatuses, and articles of manufacture are described according to some aspects. In one aspect, a text analysis method includes accessing information indicative of data content of a collection of text comprising a plurality of different topics, using a computing device, analyzing the information indicative of the data content, and using results of the analysis, identifying a presence of a new topic in the collection of text.

  11. Nuclear analysis methods. Rudiments of radiation protection

    International Nuclear Information System (INIS)

    Roth, E.

    1998-01-01

    Nuclear analysis methods are generally used to analyse radioactive elements, but they can also be used for chemical analysis, in fields such as the analysis and characterization of traces. The principles of radiation protection (ALARA) are explained, the biological effects of ionizing radiations are given, and the quantities and units used in radiation protection are summarized in tables. Part of this article is devoted to how to apply radiation protection in a nuclear analysis laboratory. (N.C.)

  12. Rasch Analysis of the Fullerton Advanced Balance (FAB) Scale.

    Science.gov (United States)

    Klein, Penelope J; Fiedler, Roger C; Rose, Debra J

    2011-01-01

    This cross-sectional study explores the psychometric properties and dimensionality of the Fullerton Advanced Balance (FAB) Scale, a multi-item balance test for higher-functioning older adults. Participants (n=480) were community-dwelling adults able to ambulate independently. Data gathering consisted of survey and balance performance assessment. Psychometric properties were assessed using Rasch analysis. Mean age of participants was 76.4 (SD=7.1) years. Mean FAB Scale scores were 24.7/40 (SD=7.5). Analyses for scale dimensionality showed that 9 of the 10 items fit a unidimensional measure of balance. Item 10 (Reactive Postural Control) did not fit the model. The reliability of the scale to separate persons was 0.81 out of 1.00; the reliability of the scale to separate items in terms of their difficulty was 0.99 out of 1.00. Cronbach's alpha for a 10-item model was 0.805. Items of differing difficulties formed a useful ordinal hierarchy for scaling patterns of expected balance ability scoring for a normative population. The FAB Scale appears to be a reliable and valid tool to assess balance function in higher-functioning older adults. The test was found to discriminate among participants of varying balance abilities. Further exploration of concurrent validity of Rasch-generated expected item scoring patterns should be undertaken to determine the test's diagnostic and prescriptive utility.
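
    For reference, the dichotomous Rasch model underlying this kind of analysis is shown below; the FAB items are actually polytomous, so a rating-scale or partial-credit extension would be used in practice, and the ability and difficulty values here are illustrative only:

    import math

    def rasch_probability(theta, b):
        # Dichotomous Rasch model: P(success) = exp(theta - b) / (1 + exp(theta - b))
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    # Illustrative logit values for a person of average ability (theta = 0)
    for b in (-1.5, 0.0, 1.5):
        print(f"item difficulty {b:+.1f} logits -> P(success) = {rasch_probability(0.0, b):.2f}")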

  13. Stress analysis of HLW containers advanced test work Compas project

    International Nuclear Information System (INIS)

    Ove Arup and Partners

    1990-01-01

    The Compas project is concerned with the structural performance of metal overpacks which may be used to encapsulate vitrified high-level waste forms before disposal in deep geological repositories. This document describes the activities performed between June and August 1989 forming the advanced test work phase of this project. This is the culmination of two years' analysis and test work to demonstrate whether the analytical ability exists to model containers subjected to realistic loads. Three mild steel containers were designed and manufactured to be one-third scale models of a realistic HLW container, modified to represent the effect of anisotropic loading and to facilitate testing. The containers were tested under a uniform external pressure and all failed by buckling in the mid-body region. The outer surface of each container was comprehensively strain-gauged to provide strain history data at all positions of interest. In parallel with the test work, Compas project partners, from five different European countries, independently modelled the behaviour of each of the containers using their computer codes to predict the failure pressure and produce strain history data at a number of specified locations. The first axisymmetric container was well modelled but predictions for the remaining two non-axisymmetric containers were much more varied, with differences of up to 50% occurring between failure predictions and test data

  14. Lunar Advanced Volatile Analysis Subsystem: Pressure Transducer Trade Study

    Science.gov (United States)

    Kang, Edward Shinuk

    2017-01-01

    In Situ Resource Utilization (ISRU) is a key factor in paving the way for the future of human space exploration. The ability to harvest resources on foreign astronomical objects to produce consumables and propellant offers potential reduction in mission cost and risk. Through previous missions, the existence of water ice at the poles of the moon has been identified, however the feasibility of water extraction for resources remains unanswered. The Resource Prospector (RP) mission is currently in development to provide ground truth, and will enable us to characterize the distribution of water at one of the lunar poles. Regolith & Environment Science and Oxygen & Lunar Volatile Extraction (RESOLVE) is the primary payload on RP that will be used in conjunction with a rover. RESOLVE contains multiple instruments for systematically identifying the presence of water. The main process involves the use of two systems within RESOLVE: the Oxygen Volatile Extraction Node (OVEN) and Lunar Advanced Volatile Analysis (LAVA). Within the LAVA subsystem, there are multiple calculations that depend on accurate pressure readings. One of the most important instances where pressure transducers (PT) are used is for calculating the number of moles in a gas transfer from the OVEN subsystem. As a critical component of the main process, a mixture of custom and commercial off the shelf (COTS) PTs are currently being tested in the expected operating environment to eventually down select an option for integrated testing in the LAVA engineering test unit (ETU).
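
    The mole calculation the record refers to is, at its core, an ideal-gas computation in which the pressure transducer reading enters directly; a minimal sketch with hypothetical manifold volume, temperature and pressures follows (the actual LAVA procedure is not reproduced here):

    R = 8.314462618   # J / (mol K)

    def moles(pressure_pa, volume_m3, temperature_k):
        # Ideal-gas estimate of the gas inventory in a known manifold volume
        return pressure_pa * volume_m3 / (R * temperature_k)

    # Hypothetical manifold readings before and after a gas transfer: 0.5 L at 300 K
    n_before = moles(90_000.0, 0.0005, 300.0)
    n_after = moles(35_000.0, 0.0005, 300.0)
    print(f"moles transferred: {n_before - n_after:.5f} mol")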

  15. Advances in three-dimensional field analysis and evaluation of performance parameters of electrical machines

    Science.gov (United States)

    Sivasubramaniam, Kiruba

    This thesis makes advances in three dimensional finite element analysis of electrical machines and the quantification of their parameters and performance. The principal objectives of the thesis are: (1)the development of a stable and accurate method of nonlinear three-dimensional field computation and application to electrical machinery and devices; and (2)improvement in the accuracy of determination of performance parameters, particularly forces and torque computed from finite elements. Contributions are made in two general areas: a more efficient formulation for three dimensional finite element analysis which saves time and improves accuracy, and new post-processing techniques to calculate flux density values from a given finite element solution. A novel three-dimensional magnetostatic solution based on a modified scalar potential method is implemented. This method has significant advantages over the traditional total scalar, reduced scalar or vector potential methods. The new method is applied to a 3D geometry of an iron core inductor and a permanent magnet motor. The results obtained are compared with those obtained from traditional methods, in terms of accuracy and speed of computation. A technique which has been observed to improve force computation in two dimensional analysis using a local solution of Laplace's equation in the airgap of machines is investigated and a similar method is implemented in the three dimensional analysis of electromagnetic devices. A new integral formulation to improve force calculation from a smoother flux-density profile is also explored and implemented. Comparisons are made and conclusions drawn as to how much improvement is obtained and at what cost. This thesis also demonstrates the use of finite element analysis to analyze torque ripples due to rotor eccentricity in permanent magnet BLDC motors. A new method for analyzing torque harmonics based on data obtained from a time stepping finite element analysis of the machine is

  16. On the methods and examples of aircraft impact analysis

    International Nuclear Information System (INIS)

    Arros, J.

    2012-01-01

    Conclusions: Aircraft impact analysis can be performed today within feasible run times using PCs and available advanced commercial finite element software tools. Adequate element and material model technologies exist. Explicit time integration enables analysis of very large deformation Missile/Target impacts. Meshless/particle based methods may be beneficial for large deformation concrete “punching shear” analysis – potentially solves the “element erosion” problem associated with FE, but are not generally implemented yet in major commercial software. Verification of the complicated modeling technologies continues to be a challenge. Not much work has been done yet on ACI shock loading – redundant and physically separated safety trains key to success. Analysis approach and detail should be “balanced” - commensurate with the significant uncertainties - do not “over-do” details of some parts of the model (e.g., the plane) and the analysis

  17. Advanced numerical methods for uncertainty reduction when predicting heat exchanger dynamic stability limits: Review and perspectives

    International Nuclear Information System (INIS)

    Longatte, E.; Baj, F.; Hoarau, Y.; Braza, M.; Ruiz, D.; Canteneur, C.

    2013-01-01

    Highlights: ► Proposal of hybrid computational methods for investigating dynamical system stability. ► Modeling turbulence disequilibrium due to interaction with moving solid boundaries. ► Providing computational procedure for large size system solution approximation through model reduction. -- Abstract: This article proposes a review of recent and current developments in the modeling and advanced numerical methods used to simulate large-size systems involving multi-physics in the field of mechanics. It addresses the complex issue of stability analysis of dynamical systems submitted to external turbulent flows and aims to establish accurate stability maps applicable to heat exchanger design. The purpose is to provide dimensionless stability limit modeling that is suitable for a variety of configurations and is as accurate as possible in spite of the large scale of the systems to be considered. The challenge lies in predicting local effects that may impact global systems. A combination of several strategies that are suited concurrently to multi-physics, multi-scale and large-size system computation is therefore required. Based on empirical concepts, the heuristic models currently used in the framework of standard stability analysis suffer from a lack of predictive capabilities. On the other hand, numerical approaches based on fully-coupled fluid–solid dynamics system computation remain expensive due to the multi-physics patterns of physics and the large number of degrees of freedom involved. In this context, since experimentation cannot be achieved and numerical simulation is unavoidable but prohibitive, a hybrid strategy is proposed in order to take advantage of both numerical local solutions and empirical global solutions

  18. Advancing alternatives analysis: The role of predictive toxicology in selecting safer chemical products and processes.

    Science.gov (United States)

    Malloy, Timothy; Zaunbrecher, Virginia; Beryt, Elizabeth; Judson, Richard; Tice, Raymond; Allard, Patrick; Blake, Ann; Cote, Ila; Godwin, Hilary; Heine, Lauren; Kerzic, Patrick; Kostal, Jakub; Marchant, Gary; McPartland, Jennifer; Moran, Kelly; Nel, Andre; Ogunseitan, Oladele; Rossi, Mark; Thayer, Kristina; Tickner, Joel; Whittaker, Margaret; Zarker, Ken

    2017-09-01

    Alternatives analysis (AA) is a method used in regulation and product design to identify, assess, and evaluate the safety and viability of potential substitutes for hazardous chemicals. It requires toxicological data for the existing chemical and potential alternatives. Predictive toxicology uses in silico and in vitro approaches, computational models, and other tools to expedite toxicological data generation in a more cost-effective manner than traditional approaches. The present article briefly reviews the challenges associated with using predictive toxicology in regulatory AA, then presents 4 recommendations for its advancement. It recommends using case studies to advance the integration of predictive toxicology into AA, adopting a stepwise process to employing predictive toxicology in AA beginning with prioritization of chemicals of concern, leveraging existing resources to advance the integration of predictive toxicology into the practice of AA, and supporting transdisciplinary efforts. The further incorporation of predictive toxicology into AA would advance the ability of companies and regulators to select alternatives to harmful ingredients, and potentially increase the use of predictive toxicology in regulation more broadly. Integr Environ Assess Manag 2017;13:915-925. © 2017 SETAC.

  19. Harmonic Analysis on Torque Ripple of Brushless DC Motor Based on Advanced Commutation Control

    Directory of Open Access Journals (Sweden)

    Yanpeng Ji

    2018-01-01

    This paper investigates the relationship between current, back electromotive force (back-EMF), and torque for permanent-magnet brushless DC (PM BLDC) motors under advanced commutation control from the perspective of harmonics. Considering that the phase current is the influencing factor of both torque and torque ripple, this paper firstly analyzes the effects of advanced commutation on phase current and current harmonics. Then, based on the harmonics of the phase current and back-EMF, the torque harmonic expressions are deduced. The expressions reveal the relationship of harmonic order between the torque, phase current, and back-EMF and highlight the different contributions of individual torque harmonics to the total torque ripple. Finally, the proposed harmonic analysis method is verified by experiments under different speed and load conditions.
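
    A small numerical sketch of the underlying torque relation T = (e_a i_a + e_b i_b + e_c i_c) / omega_m, with invented harmonic amplitudes for the back-EMF and phase current (not the harmonic content derived in the paper); the interaction of the 5th and 7th harmonics with the fundamental produces the 6th-order torque ripple typically discussed for BLDC machines:

    import numpy as np

    theta = np.linspace(0.0, 2.0 * np.pi, 2000, endpoint=False)   # electrical angle
    omega_m = 100.0                                               # mechanical speed, rad/s

    def phase_wave(angle, harmonics):
        # Sum of harmonics {order: amplitude} for one phase quantity
        return sum(a * np.sin(n * angle) for n, a in harmonics.items())

    # Illustrative harmonic content of the back-EMF (V) and phase current (A)
    emf_harmonics = {1: 100.0, 5: 15.0, 7: 8.0}
    cur_harmonics = {1: 10.0, 5: 2.0, 7: 1.0}

    torque = np.zeros_like(theta)
    for shift in (0.0, -2.0 * np.pi / 3.0, 2.0 * np.pi / 3.0):    # phases a, b, c
        torque += phase_wave(theta + shift, emf_harmonics) * phase_wave(theta + shift, cur_harmonics)
    torque /= omega_m                                             # T = sum(e * i) / omega_m

    ripple = (torque.max() - torque.min()) / torque.mean()
    print(f"mean torque {torque.mean():.2f} N*m, peak-to-peak ripple {ripple:.1%}")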

  20. Thermodynamic and economic evaluations of a geothermal district heating system using advanced exergy-based methods

    International Nuclear Information System (INIS)

    Tan, Mehmet; Keçebaş, Ali

    2014-01-01

    Highlights: • Evaluation of a GDHS using advanced exergy-based methods. • Comparison of the results of the conventional and advanced exergy-based methods. • The modified exergetic efficiency and exergoeconomic factor are found as 45% and 13%. • Improvement and total cost-savings potentials are found to be 3% and 14%. • All the pumps have the highest improvement potential and total cost-savings potential. - Abstract: In this paper, a geothermal district heating system (GDHS) is comparatively evaluated in terms of thermodynamic and economic aspects using advanced exergy-based methods to identify the potential for improvement, the interactions among system components, and the direction and potential for energy savings. The actual operational data are taken from the Sarayköy GDHS, Turkey. In the advanced exergetic and exergoeconomic analyses, the exergy destruction and the total operating cost within each component of the system are split into endogenous/exogenous and unavoidable/avoidable parts. The advantages of these analyses over conventional ones are demonstrated. The results indicate that the advanced exergy-based method is a more meaningful and effective tool than the conventional one for system performance evaluation. The exergetic efficiency and the exergoeconomic factor of the overall system for the Sarayköy GDHS were determined to be 43.72% and 5.25% according to the conventional tools and 45.06% and 12.98% according to the advanced tools. The improvement potential and the total cost-savings potential of the overall system were also determined to be 2.98% and 14.05%, respectively. All of the pumps have the highest improvement potential and total cost-savings potential because the pumps were selected to have high power during installation at the Sarayköy GDHS