WorldWideScience

Sample records for neuromath advanced methods

  1. Advanced differential quadrature methods

    CERN Document Server

    Zong, Zhi

    2009-01-01

    Modern Tools to Perform Numerical Differentiation. The original direct differential quadrature (DQ) method has been known to fail for problems with strong nonlinearity and material discontinuity as well as for problems involving singularity, irregularity, and multiple scales. But now researchers in applied mathematics, computational mechanics, and engineering have developed a range of innovative DQ-based methods to overcome these shortcomings. Advanced Differential Quadrature Methods explores new DQ methods and uses these methods to solve problems beyond the capabilities of the direct DQ method. After a basic introduction to the direct DQ method, the book presents a number of DQ methods, including complex DQ, triangular DQ, multi-scale DQ, variable order DQ, multi-domain DQ, and localized DQ. It also provides a mathematical compendium that summarizes Gauss elimination, the Runge-Kutta method, complex analysis, and more. The final chapter contains three codes written in the FORTRAN language, enabling readers to q...
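
    To make the idea concrete, the sketch below shows the direct DQ approximation that the book departs from: the derivative at each grid point is written as a weighted sum of function values at all grid points, with weights obtained from differentiated Lagrange polynomials (Shu's explicit formula). This is a generic Python illustration, not one of the book's FORTRAN codes; the grid, test function and point count are arbitrary choices.

```python
import numpy as np

def dq_weight_matrix(x):
    """First-order differential quadrature (DQ) weight matrix on grid x.

    The derivative at x_i is approximated by sum_j a[i, j] * f(x_j), with the
    weights obtained by differentiating the Lagrange interpolation polynomials
    (Shu's explicit formula).
    """
    n = len(x)
    a = np.zeros((n, n))
    # P(x_i) = prod_{k != i} (x_i - x_k)
    p = np.array([np.prod([x[i] - x[k] for k in range(n) if k != i]) for i in range(n)])
    for i in range(n):
        for j in range(n):
            if i != j:
                a[i, j] = p[i] / ((x[i] - x[j]) * p[j])
        a[i, i] = -a[i].sum()  # row sum of the weight matrix must vanish
    return a

if __name__ == "__main__":
    # Chebyshev-Gauss-Lobatto points give a well-conditioned DQ grid on [0, 1].
    n = 15
    x = 0.5 * (1.0 - np.cos(np.pi * np.arange(n) / (n - 1)))
    a = dq_weight_matrix(x)
    f = np.sin(2.0 * np.pi * x)
    df_exact = 2.0 * np.pi * np.cos(2.0 * np.pi * x)
    print("max derivative error:", np.abs(a @ f - df_exact).max())
```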

  2. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature and we describe different strategies which facilitate the application of MLMC within these methods.
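
    The telescoping identity at the heart of MLMC, E[P_L] = E[P_0] + sum_{l=1}^{L} E[P_l - P_{l-1}], can be illustrated in a few lines when exact coupled sampling is possible (the classical setting this article extends). The sketch below estimates E[S_T] for a geometric Brownian motion under an Euler discretization; the model, payoff and per-level sample counts are illustrative choices, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: E[S_T] for dS = mu*S dt + sigma*S dW, Euler-Maruyama,
# level l using 2**l time steps (hypothetical setup).
mu, sigma, S0, T = 0.05, 0.2, 1.0, 1.0

def euler_pair(l, n_samples):
    """Return coupled samples (P_l, P_{l-1}) driven by the same Brownian increments."""
    nf = 2 ** l
    dtf = T / nf
    dW = rng.normal(0.0, np.sqrt(dtf), size=(n_samples, nf))
    Sf = np.full(n_samples, S0)
    for k in range(nf):
        Sf = Sf * (1.0 + mu * dtf + sigma * dW[:, k])
    if l == 0:
        return Sf, np.zeros(n_samples)
    # Coarse path: half as many steps, each driven by the sum of two fine increments.
    nc, dtc = nf // 2, 2.0 * dtf
    Sc = np.full(n_samples, S0)
    for k in range(nc):
        Sc = Sc * (1.0 + mu * dtc + sigma * (dW[:, 2 * k] + dW[:, 2 * k + 1]))
    return Sf, Sc

# Telescoping estimator: E[P_L] = E[P_0] + sum_l E[P_l - P_{l-1}]
L, n_per_level = 6, [40000, 20000, 10000, 5000, 2500, 1200, 600]
estimate = 0.0
for l in range(L + 1):
    fine, coarse = euler_pair(l, n_per_level[l])
    estimate += np.mean(fine - coarse)
print("MLMC estimate:", estimate, " exact:", S0 * np.exp(mu * T))
```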

  3. Advanced methods of fatigue assessment

    CERN Document Server

    Radaj, Dieter

    2013-01-01

    The book in hand presents advanced methods of brittle fracture and fatigue assessment. The Neuber concept of fictitious notch rounding is enhanced with regard to theory and application. The stress intensity factor concept for cracks is extended to pointed and rounded corner notches as well as to locally elastic-plastic material behaviour. The averaged strain energy density within a circular sector volume around the notch tip is shown to be suitable for strength assessments. Finally, the various implications of cyclic plasticity on fatigue crack growth are explained with emphasis being laid on the ΔJ-integral approach. This book continues the expositions of the authors’ well known reference work in German language ‘Ermüdungsfestigkeit – Grundlagen für Ingenieure’ (Fatigue strength – fundamentals for engineers).

  4. Advanced Fine Particulate Characterization Methods

    Energy Technology Data Exchange (ETDEWEB)

    Steven Benson; Lingbu Kong; Alexander Azenkeng; Jason Laumb; Robert Jensen; Edwin Olson; Jill MacKenzie; A.M. Rokanuzzaman

    2007-01-31

    The characterization and control of emissions from combustion sources are of significant importance in improving local and regional air quality. Such emissions include fine particulate matter, organic carbon compounds, and NOx and SO2 gases, along with mercury and other toxic metals. This project involved four activities: Further Development of Analytical Techniques for PM10 and PM2.5 Characterization and Source Apportionment and Management; Organic Carbonaceous Particulate and Metal Speciation for Source Apportionment Studies; Quantum Modeling; and High-Potassium Carbon Production with Biomass-Coal Blending. The key accomplishments included the development of improved automated methods to characterize the inorganic and organic components of particulate matter. The methods involved the use of scanning electron microscopy and x-ray microanalysis for the inorganic fraction and a combination of extractive methods combined with near-edge x-ray absorption fine structure to characterize the organic fraction. These methods have direct application for source apportionment studies of PM because they provide detailed inorganic analysis along with total organic and elemental carbon (OC/EC) quantification. Quantum modeling using density functional theory (DFT) calculations was used to further elucidate a recently developed mechanistic model for mercury speciation in coal combustion systems and interactions on activated carbon. Reaction energies, enthalpies, free energies and binding energies of Hg species to the prototype molecules were derived from the data obtained in these calculations. Bimolecular rate constants for the various elementary steps in the mechanism have been estimated using the hard-sphere collision theory approximation, and the results seem to indicate that extremely fast kinetics could be involved in these surface reactions. Activated carbon was produced from a blend of lignite coal from the Center Mine in North Dakota and
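
    As one concrete piece of the workflow described above, the hard-sphere collision-theory estimate of a bimolecular rate constant can be written down directly: k(T) = N_A * pi * d_AB^2 * sqrt(8 k_B T / (pi * mu)), optionally damped by an Arrhenius factor. The sketch below is a generic implementation with rough, illustrative inputs; the species, diameters and temperature are hypothetical and are not values from the report.

```python
import numpy as np

# Physical constants (SI)
K_B = 1.380649e-23      # Boltzmann constant, J/K
N_A = 6.02214076e23     # Avogadro's number, 1/mol
R   = 8.314462618       # gas constant, J/(mol K)

def hard_sphere_rate(d_a, d_b, m_a, m_b, temperature, e_act=0.0):
    """Hard-sphere collision-theory estimate of a bimolecular rate constant.

    d_a, d_b : collision diameters of the two species (m)
    m_a, m_b : molecular masses (kg)
    e_act    : optional activation energy (J/mol); 0 gives the collision limit
    Returns k in m^3 mol^-1 s^-1 (multiply by 1e3 for L mol^-1 s^-1).
    """
    d_ab = 0.5 * (d_a + d_b)                                  # mean collision diameter
    sigma = np.pi * d_ab ** 2                                 # collision cross-section, m^2
    mu = m_a * m_b / (m_a + m_b)                              # reduced mass, kg
    v_rel = np.sqrt(8.0 * K_B * temperature / (np.pi * mu))   # mean relative speed, m/s
    return N_A * sigma * v_rel * np.exp(-e_act / (R * temperature))

# Example with rough, illustrative numbers for a Hg atom colliding with Cl2.
k = hard_sphere_rate(d_a=3.0e-10, d_b=4.2e-10,
                     m_a=200.59 * 1.66054e-27, m_b=70.9 * 1.66054e-27,
                     temperature=400.0)
print(f"collision-limit k ~ {k * 1e3:.3e} L mol^-1 s^-1")
```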

  5. Advanced computational electromagnetic methods and applications

    CERN Document Server

    Li, Wenxing; Elsherbeni, Atef; Rahmat-Samii, Yahya

    2015-01-01

    This new resource covers the latest developments in computational electromagnetic methods, with emphasis on cutting-edge applications. This book is designed to extend existing literature to the latest development in computational electromagnetic methods, which are of interest to readers in both academic and industrial areas. The topics include advanced techniques in MoM, FEM and FDTD, spectral domain method, GPU and Phi hardware acceleration, metamaterials, frequency and time domain integral equations, and statistics methods in bio-electromagnetics.

  6. Development of advanced MCR task analysis methods

    International Nuclear Information System (INIS)

    Na, J. C.; Park, J. H.; Lee, S. K.; Kim, J. K.; Kim, E. S.; Cho, S. B.; Kang, J. S.

    2008-07-01

    This report describes a task analysis methodology for advanced HSI designs. Task analyses were performed using procedure-based hierarchical task analysis and task decomposition methods. The results from the task analysis were recorded in a database. Using the TA results, we developed a static prototype of the advanced HSI and human factors engineering verification and validation methods for an evaluation of the prototype. In addition to the procedure-based task analysis methods, workload estimation based on the analysis of task performance time, and analyses for the design of the information structure and interaction structures, will be necessary.

  7. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)]

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  8. Advances in structure research by diffraction methods

    CERN Document Server

    Brill, R

    1970-01-01

    Advances in Structure Research by Diffraction Methods reviews advances in the use of diffraction methods in structure research. Topics covered include the dynamical theory of X-ray diffraction, with emphasis on Ewald waves in theory and experiment; dynamical theory of electron diffraction; small angle scattering; and molecular packing. This book is comprised of four chapters and begins with an overview of the dynamical theory of X-ray diffraction, especially in terms of how it explains all the absorption and propagation properties of X-rays at the Bragg setting in a perfect crystal. The next

  9. Advances in structure research by diffraction methods

    CERN Document Server

    Hoppe, W

    1974-01-01

    Advances in Structure Research by Diffraction Methods: Volume 5 presents discussions on application of diffraction methods in structure research. The book provides the aspects of structure research using various diffraction methods. The text contains 2 chapters. Chapter 1 reviews the general theory and experimental methods used in the study of all types of amorphous solid, by both X-ray and neutron diffraction, and the detailed bibliography of work on inorganic glasses. The second chapter discusses electron diffraction, one of the major methods of determining the structures of molecules in the

  10. Advanced analysis methods in particle physics

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C.; /Fermilab

    2010-10-01

    Each generation of high energy physics experiments is grander in scale than the previous - more powerful, more complex and more demanding in terms of data handling and analysis. The spectacular performance of the Tevatron and the beginning of operations of the Large Hadron Collider have placed us at the threshold of a new era in particle physics. The discovery of the Higgs boson or another agent of electroweak symmetry breaking and evidence of new physics may be just around the corner. The greatest challenge in these pursuits is to extract the extremely rare signals, if any, from huge backgrounds arising from known physics processes. The use of advanced analysis techniques is crucial in achieving this goal. In this review, I discuss the concepts of optimal analysis, some important advanced analysis methods and a few examples. The judicious use of these advanced methods should enable new discoveries and produce results with better precision, robustness and clarity.

  11. Advanced methods in diagnosis and therapy

    International Nuclear Information System (INIS)

    1987-01-01

    This important meeting covers the following topics: use and optimization of monoclonal antibodies in oncology: - Tumor markers: clinical follow-up of patients through tumor marker serum determinations. - Cancer and medical imaging: the use of monoclonal antibodies in immunoscintigraphy. - Immunoradiotherapy: monoclonal antibodies as therapeutic vectors. Advanced methods in diagnosis: - Contribution of monoclonal antibodies in modern immunochemistry (RIA, EIA). - Interest of monoclonal antibodies in immunohistochemical pathology diagnosis. - In vitro diagnosis, future prospects: receptors and oncogenes. - Immunofluoroassay: a new sensitive immunoanalytical procedure with broad applications. Recent advances in brachytherapy: - Interest of computer processing. Blood products irradiation: - Interest in transfusion and bone marrow transplantations.

  12. Mathematics for natural scientists II advanced methods

    CERN Document Server

    Kantorovich, Lev

    2016-01-01

    This book covers the advanced mathematical techniques useful for physics and engineering students, presented in a form accessible to physics students, avoiding precise mathematical jargon and laborious proofs. Instead, all proofs are given in a simplified form that is clear and convincing for a physicist. Examples, where appropriate, are given from physics contexts. Both solved and unsolved problems are provided in each chapter. Mathematics for Natural Scientists II: Advanced Methods is the second of two volumes. It follows the first volume on Fundamentals and Basics.

  13. The advanced statistical methods in aerobiological studies

    Directory of Open Access Journals (Sweden)

    Agnieszka Grinn-Gofroń

    2012-12-01

    Pollen and spore forecasting has become an important aim in aerobiology. The main goal is to provide accurate information on biological particles in the air to sensitive users in order to help them optimize their treatment process. Many statistical methods of data analysis are based on the assumptions of linearity and normality that often cannot be fulfilled. The advanced statistical methods can be applied to the problems that cannot be solved in any other effective way, and are suited to predicting the concentration of airborne pollen or spores in relation to weather conditions. The purpose of the study was to review some advanced statistical methods that can be used in aerobiological studies.

  14. Advances of evolutionary computation methods and operators

    CERN Document Server

    Cuevas, Erik; Oliva Navarro, Diego Alberto

    2016-01-01

    The goal of this book is to present advances that discuss alternative Evolutionary Computation (EC) developments and non-conventional operators which have proved to be effective in the solution of several complex problems. The book has been structured so that each chapter can be read independently from the others. The book contains nine chapters with the following themes: 1) Introduction, 2) the Social Spider Optimization (SSO), 3) the States of Matter Search (SMS), 4) the collective animal behavior (CAB) algorithm, 5) the Allostatic Optimization (AO) method, 6) the Locust Search (LS) algorithm, 7) the Adaptive Population with Reduced Evaluations (APRE) method, 8) the multimodal CAB, 9) the constrained SSO method.

  15. Advanced statistical methods in data science

    CERN Document Server

    Chen, Jiahua; Lu, Xuewen; Yi, Grace; Yu, Hao

    2016-01-01

    This book gathers invited presentations from the 2nd Symposium of the ICSA-CANADA Chapter held at the University of Calgary from August 4-6, 2015. The aim of this Symposium was to promote advanced statistical methods in big-data sciences, to allow researchers to exchange ideas on statistics and data science, and to embrace the challenges and opportunities of statistics and data science in the modern world. It addresses diverse themes in advanced statistical analysis in big-data sciences, including methods for administrative data analysis, survival data analysis, missing data analysis, high-dimensional and genetic data analysis, longitudinal and functional data analysis, the design and analysis of studies with response-dependent and multi-phase designs, time series and robust statistics, and statistical inference based on likelihood, empirical likelihood and estimating functions. The editorial group selected 14 high-quality presentations from this successful symposium and invited the presenters to prepare a fu...

  16. Editorial: Latest methods and advances in biotechnology.

    Science.gov (United States)

    Lee, Sang Yup; Jungbauer, Alois

    2014-01-01

    The latest "Biotech Methods and Advances" special issue of Biotechnology Journal continues the BTJ tradition of featuring the latest breakthroughs in biotechnology. The special issue is edited by our Editors-in-Chief, Prof. Sang Yup Lee and Prof. Alois Jungbauer and covers a wide array of topics in biotechnology, including the perennial favorite workhorses of the biotech industry, Chinese hamster ovary (CHO) cell and Escherichia coli. Copyright © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Advances in Packaging Methods, Processes and Systems

    Directory of Open Access Journals (Sweden)

    Nitaigour Premchand Mahalik

    2014-10-01

    The food processing and packaging industry is becoming a multi-trillion dollar global business. The reason is that the recent increase in incomes in traditionally less economically developed countries has led to a rise in standards of living that includes a significantly higher consumption of packaged foods. As a result, food safety guidelines have been more stringent than ever. At the same time, the number of research and educational institutions—that is, the number of potential researchers and stakeholders—has increased in the recent past. This paper reviews recent developments in food processing and packaging (FPP, keeping in view the aforementioned advancements and bearing in mind that FPP is an interdisciplinary area in that materials, safety, systems, regulation, and supply chains play vital roles. In particular, the review covers processing and packaging principles, standards, interfaces, techniques, methods, and state-of-the-art technologies that are currently in use or in development. Recent advances such as smart packaging, non-destructive inspection methods, printing techniques, application of robotics and machineries, automation architecture, software systems and interfaces are reviewed.

  18. Advanced fault diagnosis methods in molecular networks.

    Science.gov (United States)

    Habibi, Iman; Emamian, Effat S; Abdi, Ali

    2014-01-01

    Analysis of the failure of cell signaling networks is an important topic in systems biology and has applications in target discovery and drug development. In this paper, some advanced methods for fault diagnosis in signaling networks are developed and then applied to a caspase network and an SHP2 network. The goal is to understand how, and to what extent, the dysfunction of molecules in a network contributes to the failure of the entire network. Network dysfunction (failure) is defined as failure to produce the expected outputs in response to the input signals. Vulnerability level of a molecule is defined as the probability of the network failure, when the molecule is dysfunctional. In this study, a method to calculate the vulnerability level of single molecules for different combinations of input signals is developed. Furthermore, a more complex yet biologically meaningful method for calculating the multi-fault vulnerability levels is suggested, in which two or more molecules are simultaneously dysfunctional. Finally, a method is developed for fault diagnosis of networks based on a ternary logic model, which considers three activity levels for a molecule instead of the previously published binary logic model, and provides equations for the vulnerabilities of molecules in a ternary framework. Multi-fault analysis shows that the pairs of molecules with high vulnerability typically include a highly vulnerable molecule identified by the single fault analysis. The ternary fault analysis for the caspase network shows that predictions obtained using the more complex ternary model are about the same as the predictions of the simpler binary approach. This study suggests that by increasing the number of activity levels the complexity of the model grows; however, the predictive power of the ternary model does not appear to be increased proportionally.
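
    The single-fault vulnerability defined above (the probability that the network output fails when one molecule is stuck in a dysfunctional state, taken over the input combinations) can be illustrated on a toy Boolean network. The sketch below uses a hypothetical four-node network, not the caspase or SHP2 models analyzed in the paper.

```python
from itertools import product

# Toy Boolean signaling network (hypothetical, not the paper's caspase/SHP2 models):
# inputs:  ligand L, stress S
# nodes:   A = L, B = S, C = A AND B, output O = C OR A
def simulate(l, s, faulty=None):
    """Evaluate the network; 'faulty' names one molecule forced (stuck) to 0."""
    val = {}
    val["A"] = 0 if faulty == "A" else l
    val["B"] = 0 if faulty == "B" else s
    val["C"] = 0 if faulty == "C" else (val["A"] and val["B"])
    val["O"] = 0 if faulty == "O" else (val["C"] or val["A"])
    return val["O"]

def vulnerability(molecule):
    """Fraction of input combinations for which the stuck-at-0 fault
    changes the network output (single-fault vulnerability level)."""
    inputs = list(product([0, 1], repeat=2))
    failures = sum(simulate(l, s) != simulate(l, s, faulty=molecule) for l, s in inputs)
    return failures / len(inputs)

for m in ["A", "B", "C", "O"]:
    print(m, vulnerability(m))
```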

  19. Advanced continuous cultivation methods for systems microbiology.

    Science.gov (United States)

    Adamberg, Kaarel; Valgepea, Kaspar; Vilu, Raivo

    2015-09-01

    Increasing the throughput of systems biology-based experimental characterization of in silico-designed strains has great potential for accelerating the development of cell factories. For this, analysis of metabolism in the steady state is essential as only this enables the unequivocal definition of the physiological state of cells, which is needed for the complete description and in silico reconstruction of their phenotypes. In this review, we show that for a systems microbiology approach, high-resolution characterization of metabolism in the steady state--growth space analysis (GSA)--can be achieved by using advanced continuous cultivation methods termed changestats. In changestats, an environmental parameter is continuously changed at a constant rate within one experiment whilst maintaining cells in the physiological steady state similar to chemostats. This increases the resolution and throughput of GSA compared with chemostats, and, moreover, enables following of the dynamics of metabolism and detection of metabolic switch-points and optimal growth conditions. We also describe the concept, challenge and necessary criteria of the systematic analysis of steady-state metabolism. Finally, we propose that such systematic characterization of the steady-state growth space of cells using changestats has value not only for fundamental studies of metabolism, but also for systems biology-based metabolic engineering of cell factories.

  20. Advancing UAS methods for monitoring coastal environments

    Science.gov (United States)

    Ridge, J.; Seymour, A.; Rodriguez, A. B.; Dale, J.; Newton, E.; Johnston, D. W.

    2017-12-01

    Utilizing fixed-wing Unmanned Aircraft Systems (UAS), we are working to improve coastal monitoring by increasing the accuracy, precision, temporal resolution, and spatial coverage of habitat distribution maps. Generally, multirotor aircraft are preferred for precision imaging, but recent advances in fixed-wing technology have greatly increased their capabilities and application for fine-scale (decimeter-centimeter) measurements. Present mapping methods employed by North Carolina coastal managers involve expensive, time consuming and localized observation of coastal environments, which often lack the necessary frequency to make timely management decisions. For example, it has taken several decades to fully map oyster reefs along the NC coast, making it nearly impossible to track trends in oyster reef populations responding to harvesting pressure and water quality degradation. It is difficult for the state to employ manned flights for collecting aerial imagery to monitor intertidal oyster reefs, because flights are usually conducted after seasonal increases in turbidity. In addition, post-storm monitoring of coastal erosion from manned platforms is often conducted days after the event and collects oblique aerial photographs which are difficult to use for accurately measuring change. Here, we describe how fixed wing UAS and standard RGB sensors can be used to rapidly quantify and assess critical coastal habitats (e.g., barrier islands, oyster reefs, etc.), providing for increased temporal frequency to isolate long-term and event-driven (storms, harvesting) impacts. Furthermore, drone-based approaches can accurately image intertidal habitats as well as resolve information such as vegetation density and bathymetry from shallow submerged areas. We obtain UAS imagery of a barrier island and oyster reefs under ideal conditions (low tide, turbidity, and sun angle) to create high resolution (cm scale) maps and digital elevation models to assess habitat condition

  1. Damped time advance methods for particles and EM fields

    International Nuclear Information System (INIS)

    Friedman, A.; Ambrosiano, J.J.; Boyd, J.K.; Brandon, S.T.; Nielsen, D.E. Jr.; Rambo, P.W.

    1990-01-01

    Recent developments in the application of damped time advance methods to plasma simulations include the synthesis of implicit and explicit "adjustably damped" second order accurate methods for particle motion and electromagnetic field propagation. This paper discusses this method.
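
    To show how numerical damping enters an implicit time advance, the sketch below integrates a harmonic oscillator with a theta-weighted implicit scheme: theta = 0.5 is the undamped, second-order centred update, while theta > 0.5 damps badly under-resolved oscillations at the cost of formal accuracy. This is a generic illustration of the trade-off, not the adjustably damped second-order scheme developed by the authors.

```python
import numpy as np

def theta_push(x, v, omega, dt, theta=0.6, steps=200):
    """Theta-weighted implicit update for dx/dt = v, dv/dt = -omega^2 x.

    theta = 0.5 is the time-centred (undamped, second-order) scheme;
    theta > 0.5 adds numerical damping of modes with omega*dt >> 1.
    Generic illustration only, not the paper's adjustably damped method.
    """
    hist = [x]
    for _ in range(steps):
        # Solve the 2x2 linear system for (x_new, v_new):
        #   x_new = x + dt*( theta*v_new + (1-theta)*v )
        #   v_new = v - dt*omega^2*( theta*x_new + (1-theta)*x )
        a = np.array([[1.0, -dt * theta],
                      [dt * omega**2 * theta, 1.0]])
        b = np.array([x + dt * (1 - theta) * v,
                      v - dt * omega**2 * (1 - theta) * x])
        x, v = np.linalg.solve(a, b)
        hist.append(x)
    return np.array(hist)

# A badly under-resolved mode (omega*dt = 10) decays instead of oscillating
# spuriously once theta > 0.5.
traj = theta_push(x=1.0, v=0.0, omega=10.0, dt=1.0, theta=0.6)
print("amplitude after 200 steps:", abs(traj[-1]))
```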

  2. Advanced Aqueous Phase Catalyst Development using Combinatorial Methods, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Combinatorial methods are proposed to develop advanced Aqueous Oxidation Catalysts (AOCs) with the capability to mineralize organic contaminants present in effluents...

  3. Advanced Source Deconvolution Methods for Compton Telescopes

    Science.gov (United States)

    Zoglauer, Andreas

    The next generation of space telescopes utilizing Compton scattering for astrophysical observations is destined to one day unravel the mysteries behind Galactic nucleosynthesis, to determine the origin of the positron annihilation excess near the Galactic center, and to uncover the hidden emission mechanisms behind gamma-ray bursts. Besides astrophysics, Compton telescopes are establishing themselves in heliophysics, planetary sciences, medical imaging, accelerator physics, and environmental monitoring. Since the COMPTEL days, great advances in the achievable energy and position resolution were possible, creating an extremely vast, but also extremely sparsely sampled data space. Unfortunately, the optimum way to analyze the data from the next generation of Compton telescopes has not yet been found, which can retrieve all source parameters (location, spectrum, polarization, flux) and achieves the best possible resolution and sensitivity at the same time. This is especially important for all science objectives looking at the inner Galaxy: the large amount of expected sources, the high background (internal and Galactic diffuse emission), and the limited angular resolution make it the most taxing case for data analysis. In general, two key challenges exist: First, what are the best data space representations to answer the specific science questions? Second, what is the best way to deconvolve the data to fully retrieve the source parameters? For modern Compton telescopes, the existing data space representations can either correctly reconstruct the absolute flux (binned mode) or achieve the best possible resolution (list mode), but not both at once. Here we propose to develop a two-stage hybrid reconstruction method which combines the best aspects of both. Using a proof-of-concept implementation we can for the first time show that it is possible to alternate during each deconvolution step between a binned-mode approach to get the flux right and a

  4. Advanced Methods of Biomedical Signal Processing

    CERN Document Server

    Cerutti, Sergio

    2011-01-01

    This book grew out of the IEEE-EMBS Summer Schools on Biomedical Signal Processing, which have been held annually since 2002 to provide the participants state-of-the-art knowledge on emerging areas in biomedical engineering. Prominent experts in the areas of biomedical signal processing, biomedical data treatment, medicine, signal processing, system biology, and applied physiology introduce novel techniques and algorithms as well as their clinical or physiological applications. The book provides an overview of a compelling group of advanced biomedical signal processing techniques, such as mult

  5. Recent advances in boundary element methods

    CERN Document Server

    Manolis, GD

    2009-01-01

    Addresses the needs of the computational mechanics research community in terms of information on boundary integral equation-based methods and techniques applied to a variety of fields. This book collects both original and review articles on contemporary Boundary Element Methods (BEM) as well as on the Mesh Reduction Methods (MRM).

  6. Advanced methods in teaching reactor physics

    International Nuclear Information System (INIS)

    Snoj, Luka; Kromar, Marjan; Zerovnik, Gasper; Ravnik, Matjaz

    2011-01-01

    Modern computer codes allow detailed neutron transport calculations. In combination with advanced 3D visualization software capable of treating large amounts of data in real time they form a powerful tool that can be used as a convenient modern educational tool for (nuclear power plant) operators, nuclear engineers, students and specialists involved in reactor operation and design. Visualization is applicable not only in education and training, but also as a tool for fuel management, core analysis and irradiation planning. The paper treats the visualization of neutron transport in different moderators, neutron flux and power distributions in two nuclear reactors (TRIGA type research reactor and typical PWR). The distributions are calculated with MCNP and CORD-2 computer codes and presented using Amira software.

  7. Advanced methods in teaching reactor physics

    Energy Technology Data Exchange (ETDEWEB)

    Snoj, Luka, E-mail: luka.snoj@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Kromar, Marjan, E-mail: marjan.kromar@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Zerovnik, Gasper, E-mail: gasper.zerovnik@ijs.s [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Ravnik, Matjaz [Jozef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia)

    2011-04-15

    Modern computer codes allow detailed neutron transport calculations. In combination with advanced 3D visualization software capable of treating large amounts of data in real time they form a powerful tool that can be used as a convenient modern educational tool for (nuclear power plant) operators, nuclear engineers, students and specialists involved in reactor operation and design. Visualization is applicable not only in education and training, but also as a tool for fuel management, core analysis and irradiation planning. The paper treats the visualization of neutron transport in different moderators, neutron flux and power distributions in two nuclear reactors (TRIGA type research reactor and typical PWR). The distributions are calculated with MCNP and CORD-2 computer codes and presented using Amira software.

  8. Advances in iterative methods for nonlinear equations

    CERN Document Server

    Busquier, Sonia

    2016-01-01

    This book focuses on the approximation of nonlinear equations using iterative methods. Nine contributions are presented on the construction and analysis of these methods, the coverage encompassing convergence, efficiency, robustness, dynamics, and applications. Many problems are stated in the form of nonlinear equations, using mathematical modeling. In particular, a wide range of problems in Applied Mathematics and in Engineering can be solved by finding the solutions to these equations. The book reveals the importance of studying convergence aspects in iterative methods and shows that selection of the most efficient and robust iterative method for a given problem is crucial to guaranteeing a good approximation. A number of sample criteria for selecting the optimal method are presented, including those regarding the order of convergence, the computational cost, and the stability, including the dynamics. This book will appeal to researchers whose field of interest is related to nonlinear problems and equations...

  9. Advanced finite element method in structural engineering

    CERN Document Server

    Long, Yu-Qiu; Long, Zhi-Fei

    2009-01-01

    This book systematically introduces the research work on the Finite Element Method completed over the past 25 years. Original theoretical achievements and their applications in the fields of structural engineering and computational mechanics are discussed.

  10. Core design methods for advanced LMFBRs

    International Nuclear Information System (INIS)

    Chandler, J.C.; Marr, D.R.; McCurry, D.C.; Cantley, D.A.

    1977-05-01

    The multidiscipline approach to advanced LMFBR core design requires an iterative design procedure to obtain a closely-coupled design. HEDL's philosophy requires that the designs should be coupled to the extent that the lifetimes of the design-limiting fuel pin, the design-limiting duct and the core reactivity should all be equal and should equal the fuel residence time. The design procedure consists of an iterative loop involving three stages of the design sequence. Stage 1 consists of general mechanical design and reactor physics scoping calculations to arrive at an initial core layout. Stage 2 consists of detailed reactor physics calculations for the core configuration arrived at in Stage 1. Based upon the detailed reactor physics results, a decision is made either to alter the design (Stage 1) or go to Stage 3. Stage 3 consists of core orificing and detailed component mechanical design calculations. At this point, an assessment is made regarding design adequacy. If the design is inadequate, the entire procedure is repeated until the design is acceptable.

  11. Advanced verification methods for OVI security ink

    Science.gov (United States)

    Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom

    2006-02-01

    OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.

  12. Recent advances in coupled-cluster methods

    CERN Document Server

    Bartlett, Rodney J

    1997-01-01

    Today, coupled-cluster (CC) theory has emerged as the most accurate, widely applicable approach for the correlation problem in molecules. Furthermore, the correct scaling of the energy and wavefunction with size (i.e. extensivity) recommends it for studies of polymers and crystals as well as molecules. CC methods have also paid dividends for nuclei, and for certain strongly correlated systems of interest in field theory. In order for CC methods to have achieved this distinction, it has been necessary to formulate new theoretical approaches for the treatment of a variety of essential quantities

  13. Advances in Chemical Mixtures Risk Methods

    Science.gov (United States)

    This presentation is an overview of emerging issues for dose addition in chemical mixtures risk assessment. It is intended to give the participants a perspective of recent developments in methods for dose addition. The workshop abstract is as follows: This problems-based, half-day...

  14. Advanced Testing Method for Ground Thermal Conductivity

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Xiaobing [ORNL]; Clemenzi, Rick [Geothermal Design Center Inc.]; Liu, Su [University of Tennessee (UT)]

    2017-04-01

    A new method is developed that can quickly and more accurately determine the effective ground thermal conductivity (GTC) based on thermal response test (TRT) results. Ground thermal conductivity is an important parameter for sizing ground heat exchangers (GHEXs) used by geothermal heat pump systems. The conventional GTC test method usually requires a TRT for 48 hours with a very stable electric power supply throughout the entire test. In contrast, the new method reduces the required test time by 40%–60% or more, and it can determine GTC even with an unstable or intermittent power supply. Consequently, it can significantly reduce the cost of GTC testing and increase its use, which will enable optimal design of geothermal heat pump systems. Further, this new method provides more information about the thermal properties of the GHEX and the ground than previous techniques. It can verify the installation quality of GHEXs and has the potential, if developed, to characterize the heterogeneous thermal properties of the ground formation surrounding the GHEXs.
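
    For context, the conventional evaluation that the new method improves on fits the late-time mean fluid temperature of a thermal response test to the infinite line-source model, T(t) ~ (q / (4*pi*k)) * ln(t) + C, so the slope against ln(t) yields the effective conductivity k = q / (4*pi*slope). The sketch below applies that baseline procedure to synthetic data; the borehole length, heat rate and noise level are made-up illustrative values, and this is not the accelerated method itself.

```python
import numpy as np

def line_source_conductivity(time_s, mean_fluid_temp, heat_rate_w, borehole_len_m,
                             skip_hours=10.0):
    """Conventional line-source evaluation of a thermal response test (TRT).

    Late-time mean fluid temperature behaves as T(t) ~ (q/(4*pi*k))*ln(t) + C,
    with q the heat injection per unit borehole length, so the slope of T
    versus ln(t) gives the effective ground thermal conductivity.
    """
    q = heat_rate_w / borehole_len_m                 # W per metre of borehole
    late = time_s > skip_hours * 3600.0              # discard early, borehole-dominated data
    slope, _ = np.polyfit(np.log(time_s[late]), mean_fluid_temp[late], 1)
    return q / (4.0 * np.pi * slope)

# Synthetic 48 h test with k = 2.5 W/(m K) hidden in the generated data.
t = np.linspace(600.0, 48 * 3600.0, 500)
q_per_m = 6000.0 / 150.0                             # 6 kW over a 150 m borehole
temps = (12.0 + q_per_m / (4.0 * np.pi * 2.5) * np.log(t)
         + np.random.default_rng(1).normal(0.0, 0.02, t.size))
print("estimated k:", line_source_conductivity(t, temps, 6000.0, 150.0))
```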

  15. Advances in organometallic synthesis with mechanochemical methods.

    Science.gov (United States)

    Rightmire, Nicholas R; Hanusa, Timothy P

    2016-02-14

    Solvent-based syntheses have long been normative in all areas of chemistry, although mechanochemical methods (specifically grinding and milling) have been used to good effect for decades in organic, and to a lesser but growing extent, inorganic coordination chemistry. Organometallic synthesis, in contrast, represents a relatively underdeveloped area for mechanochemical research, and the potential benefits are considerable. From access to new classes of unsolvated complexes, to control over stoichiometries that have not been observed in solution routes, mechanochemical (or 'M-chem') approaches have much to offer the synthetic chemist. It has already become clear that removing the solvent from an organometallic reaction can change reaction pathways considerably, so that prediction of the outcome is not always straightforward. This Perspective reviews recent developments in the field, and describes equipment that can be used in organometallic synthesis. Synthetic chemists are encouraged to add mechanochemical methods to their repertoire in the search for new and highly reactive metal complexes and novel types of organometallic transformations.

  16. Advanced Aqueous Phase Catalyst Development using Combinatorial Methods, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The use of combinatorial methods is proposed to rapidly screen catalyst formulations for the advanced development of aqueous phase oxidation catalysts with greater...

  17. Advanced Bayesian Methods for Lunar Surface Navigation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation of this project is the application of advanced Bayesian methods to integrate real-time dense stereo vision and high-speed optical flow with an...

  18. Advanced Bayesian Methods for Lunar Surface Navigation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation of this project will be the application of advanced Bayesian methods to integrate real-time dense stereo vision and high-speed optical flow with...

  19. Advanced Methods and Applications in Computational Intelligence

    CERN Document Server

    Nikodem, Jan; Jacak, Witold; Chaczko, Zenon; ACASE 2012

    2014-01-01

    This book offers an excellent presentation of intelligent engineering and informatics foundations for researchers in this field as well as many examples with industrial application. It contains extended versions of selected papers presented at the inaugural ACASE 2012 Conference dedicated to the Applications of Systems Engineering. This conference was held from the 6th to the 8th of February 2012, at the University of Technology, Sydney, Australia, organized by the University of Technology, Sydney (Australia), Wroclaw University of Technology (Poland) and the University of Applied Sciences in Hagenberg (Austria). The  book is organized into three main parts. Part I contains papers devoted to the heuristic approaches that are applicable in situations where the problem cannot be solved by exact methods, due to various characteristics or  dimensionality problems. Part II covers essential issues of the network management, presents intelligent models of the next generation of networks and distributed systems ...

  20. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2014-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities in topological optimization...

  1. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2015-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities in topological optimization...

  2. Polarization control method for UV writing of advanced bragg gratings

    DEFF Research Database (Denmark)

    Deyerl, Hans-Jürgen; Plougmann, Nikolai; Jensen, Jesper Bo Damm

    2002-01-01

    We report the application of the polarization control method for the UV writing of advanced fiber Bragg gratings (FBG). We demonstrate the strength of the new method for different apodization profiles, including the Sinc-profile and two designs for dispersion-free square filters. The method has...

  3. Advanced Fuzzy Potential Field Method for Mobile Robot Obstacle Avoidance

    Science.gov (United States)

    Park, Jong-Wook; Kwak, Hwan-Joo; Kang, Young-Chang; Kim, Dong W.

    2016-01-01

    An advanced fuzzy potential field method for mobile robot obstacle avoidance is proposed. The potential field method primarily deals with the repulsive forces surrounding obstacles, while fuzzy control logic focuses on fuzzy rules that handle linguistic variables and describe the knowledge of experts. The design of a fuzzy controller—advanced fuzzy potential field method (AFPFM)—that models and enhances the conventional potential field method is proposed and discussed. This study also examines the rule-explosion problem of conventional fuzzy logic and assesses the performance of our proposed AFPFM through simulations carried out using a mobile robot. PMID:27123001

  4. Advanced Fuzzy Potential Field Method for Mobile Robot Obstacle Avoidance.

    Science.gov (United States)

    Park, Jong-Wook; Kwak, Hwan-Joo; Kang, Young-Chang; Kim, Dong W

    2016-01-01

    An advanced fuzzy potential field method for mobile robot obstacle avoidance is proposed. The potential field method primarily deals with the repulsive forces surrounding obstacles, while fuzzy control logic focuses on fuzzy rules that handle linguistic variables and describe the knowledge of experts. The design of a fuzzy controller--advanced fuzzy potential field method (AFPFM)--that models and enhances the conventional potential field method is proposed and discussed. This study also examines the rule-explosion problem of conventional fuzzy logic and assesses the performance of our proposed AFPFM through simulations carried out using a mobile robot.
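
    Both records above build on the classical potential field planner, in which the robot follows the sum of an attractive force toward the goal and repulsive forces from obstacles inside an influence radius. The sketch below implements only that underlying crisp method; the gains and geometry are arbitrary, and the fuzzy-rule layer that defines the AFPFM is not reproduced here.

```python
import numpy as np

def potential_field_step(pos, goal, obstacles, k_att=1.0, k_rep=50.0,
                         influence=2.0, step=0.05):
    """One update of the classical attractive/repulsive potential field planner.

    The crisp gains used here (k_att, k_rep, influence) are exactly the
    quantities the AFPFM replaces with fuzzy rules; this is the underlying
    potential field method only, not the fuzzy controller.
    """
    force = k_att * (goal - pos)                      # attractive pull toward the goal
    for obs in obstacles:
        diff = pos - obs
        d = np.linalg.norm(diff)
        if 0.0 < d < influence:                       # repulsion only inside influence radius
            force += k_rep * (1.0 / d - 1.0 / influence) / d**2 * (diff / d)
    return pos + step * force / (np.linalg.norm(force) + 1e-9)

pos = np.array([0.0, 0.0])
goal = np.array([10.0, 10.0])
obstacles = [np.array([4.0, 4.2]), np.array([7.0, 6.5])]
for _ in range(400):
    if np.linalg.norm(goal - pos) < 0.1:              # stop once the goal is reached
        break
    pos = potential_field_step(pos, goal, obstacles)
print("final position:", pos)
```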

  5. Advanced methods of solid oxide fuel cell modeling

    CERN Document Server

    Milewski, Jaroslaw; Santarelli, Massimo; Leone, Pierluigi

    2011-01-01

    Fuel cells are widely regarded as the future of the power and transportation industries. Intensive research in this area now requires new methods of fuel cell operation modeling and cell design. Typical mathematical models are based on the physical process description of fuel cells and require a detailed knowledge of the microscopic properties that govern both chemical and electrochemical reactions. "Advanced Methods of Solid Oxide Fuel Cell Modeling" proposes the alternative methodology of generalized artificial neural networks (ANN) solid oxide fuel cell (SOFC) modeling. "Advanced Methods

  6. Strategy to Promote Active Learning of an Advanced Research Method

    Science.gov (United States)

    McDermott, Hilary J.; Dovey, Terence M.

    2013-01-01

    Research methods courses aim to equip students with the knowledge and skills required for research yet seldom include practical aspects of assessment. This reflective practitioner report describes and evaluates an innovative approach to teaching and assessing advanced qualitative research methods to final-year psychology undergraduate students. An…

  7. METHODS ADVANCEMENT FOR MILK ANALYSIS: THE MAMA STUDY

    Science.gov (United States)

    The Methods Advancement for Milk Analysis (MAMA) study was designed by US EPA and CDC investigators to provide data to support the technological and study design needs of the proposed National Children's Study (NCS). The NCS is a multi-Agency-sponsored study, authorized under the...

  8. Higher geometry an introduction to advanced methods in analytic geometry

    CERN Document Server

    Woods, Frederick S

    2005-01-01

    For students of mathematics with a sound background in analytic geometry and some knowledge of determinants, this volume has long been among the best available expositions of advanced work on projective and algebraic geometry. Developed from Professor Woods' lectures at the Massachusetts Institute of Technology, it bridges the gap between intermediate studies in the field and highly specialized works.With exceptional thoroughness, it presents the most important general concepts and methods of advanced algebraic geometry (as distinguished from differential geometry). It offers a thorough study

  9. Advanced airflow distribution methods for reducing exposure of indoor pollution

    DEFF Research Database (Denmark)

    Cao, Guangyu; Nielsen, Peter Vilhelm; Melikov, Arsen

    2017-01-01

    The adverse effect of various indoor pollutants on occupants’ health has been recognized. In public spaces flu viruses may spread from person to person by airflow generated by various traditional ventilation methods, like natural ventilation and mixing ventilation (MV). Personalized ventilation (PV) supplies clean air close to the occupant and directly into the breathing zone. Studies show that it improves the inhaled air quality and reduces the risk of airborne cross-infection in comparison with total volume (TV) ventilation. However, it is still challenging for PV and other advanced air distribution methods to reduce the exposure to gaseous and particulate pollutants under disturbed conditions and to ensure thermal comfort at the same time. The objective of this study is to analyse the performance of different advanced airflow distribution methods for protection of occupants from exposure to indoor...

  10. Advanced Markov chain Monte Carlo methods learning from past samples

    CERN Document Server

    Liang, Faming; Carrol, Raymond J

    2010-01-01

    This book provides comprehensive coverage of simulation of complex systems using Monte Carlo methods. Developing algorithms that are immune to the local trap problem has long been considered the most important topic in MCMC research. Various advanced MCMC algorithms which address this problem have been developed, including the modified Gibbs sampler, methods based on auxiliary variables, and methods making use of past samples. The focus of this book is on the algorithms that make use of past samples. This book includes the multicanonical algorithm, dynamic weighting, dynamically weight
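
    A minimal member of the "learning from past samples" family is adaptive Metropolis, in which the Gaussian proposal covariance is periodically re-estimated from the chain history. The sketch below shows that idea on a correlated Gaussian target; it is a generic textbook construction, not code or an algorithm reproduced from the book.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Log-density of a correlated 2-D Gaussian (stand-in for a hard target)."""
    cov = np.array([[1.0, 0.95], [0.95, 1.0]])
    return -0.5 * x @ np.linalg.solve(cov, x)

def adaptive_metropolis(n_iter=20000, d=2, eps=1e-6):
    """Adaptive Metropolis: the proposal covariance is re-estimated from the
    full chain history every 100 steps after a short burn-in (a minimal sketch
    of 'learning from past samples')."""
    sd = 2.4 ** 2 / d                      # standard dimension-dependent scaling
    chain = np.zeros((n_iter, d))
    x = np.zeros(d)
    cov = 0.1 * np.eye(d)
    for i in range(1, n_iter):
        prop = rng.multivariate_normal(x, cov)
        if np.log(rng.uniform()) < log_target(prop) - log_target(x):
            x = prop
        chain[i] = x
        if i > 500 and i % 100 == 0:       # adapt once some history exists
            cov = sd * (np.cov(chain[:i + 1].T) + eps * np.eye(d))
    return chain

samples = adaptive_metropolis()
print("empirical covariance:\n", np.cov(samples[5000:].T))
```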

  11. General advancing front packing algorithm for the discrete element method

    Science.gov (United States)

    Morfa, Carlos A. Recarey; Pérez Morales, Irvin Pablo; de Farias, Márcio Muniz; de Navarra, Eugenio Oñate Ibañez; Valera, Roberto Roselló; Casañas, Harold Díaz-Guzmán

    2018-01-01

    A generic formulation of a new method for packing particles is presented. It is based on a constructive advancing front method, and uses Monte Carlo techniques for the generation of particle dimensions. The method can be used to obtain virtual dense packings of particles with several geometrical shapes. It employs continuous, discrete, and empirical statistical distributions in order to generate the dimensions of particles. The packing algorithm is very flexible and allows alternatives for: 1—the direction of the advancing front (inwards or outwards), 2—the selection of the local advancing front, 3—the method for placing a mobile particle in contact with others, and 4—the overlap checks. The algorithm also allows obtaining highly porous media when it is slightly modified. The use of the algorithm to generate real particle packings from grain size distribution curves, in order to carry out engineering applications, is illustrated. Finally, basic applications of the algorithm, which prove its effectiveness in the generation of a large number of particles, are carried out.
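
    The core geometric step of such a constructive advancing-front packer is placing a new disc tangent to two already-placed discs, with its radius drawn by Monte Carlo from a size distribution and an overlap check before acceptance. The sketch below is a heavily simplified 2D outward variant with random pair selection and no real front bookkeeping; the box size, radius distribution and target count are arbitrary and the published algorithm is considerably richer.

```python
import numpy as np

rng = np.random.default_rng(2)

def tangent_centres(c1, r1, c2, r2, r):
    """Centres of a disc of radius r tangent to two existing discs (both sides)."""
    d = np.linalg.norm(c2 - c1)
    R1, R2 = r1 + r, r2 + r
    a = (R1**2 - R2**2 + d**2) / (2.0 * d)
    h2 = R1**2 - a**2
    if h2 < 0.0:
        return []                                     # discs too far apart for tangency
    u = (c2 - c1) / d
    perp = np.array([-u[1], u[0]])
    base = c1 + a * u
    h = np.sqrt(h2)
    return [base + h * perp, base - h * perp]

def advancing_front_packing(n_target=200, box=20.0):
    """Simplified outward advancing-front packing with Monte Carlo radii."""
    draw_radius = lambda: rng.uniform(0.2, 0.6)       # Monte Carlo particle size
    discs = [(np.array([box / 2, box / 2]), draw_radius())]
    r2 = draw_radius()
    discs.append((discs[0][0] + np.array([discs[0][1] + r2, 0.0]), r2))
    attempts = 0
    while len(discs) < n_target and attempts < 200000:
        attempts += 1
        i, j = rng.choice(len(discs), size=2, replace=False)
        r = draw_radius()
        for c in tangent_centres(discs[i][0], discs[i][1], discs[j][0], discs[j][1], r):
            inside = np.all(c > r) and np.all(c < box - r)
            overlap = any(np.linalg.norm(c - ck) < r + rk - 1e-9 for ck, rk in discs)
            if inside and not overlap:
                discs.append((c, r))
                break
    return discs

packing = advancing_front_packing()
print("packed", len(packing), "discs")
```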

  12. New or improved computational methods and advanced reactor design

    International Nuclear Information System (INIS)

    Nakagawa, Masayuki; Takeda, Toshikazu; Ushio, Tadashi

    1997-01-01

    Nuclear computational methods have been studied continuously as a fundamental technology supporting nuclear development. At present, research on computational methods based on new theory, including calculation approaches once thought impractical, continues actively and is finding new openings thanks to the remarkable improvement in computer performance. In Japan, many light water reactors are now in operation, new computational methods are being introduced for nuclear design, and considerable effort is devoted to further improving economics and safety. This paper describes some new research results on nuclear computational methods and their application to reactor nuclear design, in order to introduce recent trends in reactor design: 1) Advancement of the computational method, 2) Reactor core design and management of the light water reactor, and 3) Nuclear design of the fast reactor. (G.K.)

  13. NATO Advanced Study Institute on Methods in Computational Molecular Physics

    CERN Document Server

    Diercksen, Geerd

    1992-01-01

    This volume records the lectures given at a NATO Advanced Study Institute on Methods in Computational Molecular Physics held in Bad Windsheim, Germany, from 22nd July until 2nd August, 1991. This NATO Advanced Study Institute sought to bridge the quite considerable gap which exists between the presentation of molecular electronic structure theory found in contemporary monographs such as, for example, McWeeny's Methods of Molecular Quantum Mechanics (Academic Press, London, 1989) or Wilson's Electron correlation in molecules (Clarendon Press, Oxford, 1984) and the realization of the sophisticated computational algorithms required for their practical application. It sought to underline the relation between the electronic structure problem and the study of nuclear motion. Software for performing molecular electronic structure calculations is now being applied in an increasingly wide range of fields in both the academic and the commercial sectors. Numerous applications are reported in areas as diverse as catalysi...

  14. Advanced symbolic analysis for VLSI systems methods and applications

    CERN Document Server

    Shi, Guoyong; Tlelo Cuautle, Esteban

    2014-01-01

    This book provides comprehensive coverage of the recent advances in symbolic analysis techniques for design automation of nanometer VLSI systems. The presentation is organized in parts of fundamentals, basic implementation methods and applications for VLSI design. Topics emphasized include statistical timing and crosstalk analysis, statistical and parallel analysis, performance bound analysis and behavioral modeling for analog integrated circuits. Among the recent advances, the Binary Decision Diagram (BDD) based approaches are studied in depth. The BDD-based hierarchical symbolic analysis approaches have essentially broken the analog circuit size barrier. In particular, this book • Provides an overview of classical symbolic analysis methods and a comprehensive presentation on the modern BDD-based symbolic analysis techniques; • Describes detailed implementation strategies for BDD-based algorithms, including the principles of zero-suppression, variable ordering and canonical reduction; • Int...

  15. THE CHALLENGES OF ADVANCED MANAGEMENT METHODS FOR THE ROMANIAN ORGANISATIONS

    OpenAIRE

    Eduard EDELHAUSER

    2012-01-01

    The aim of the paper is to study the use of advanced management methods in Romania, through the evolution of Enterprise Resource Planning (ERP) and Business Intelligence (BI) systems. The study focuses on Romanian organizations which implemented SIVECO ERP and BI software, and the methodology used is both quantitative and qualitative. In the past few years I have attempted to point out certain essential elements of integrated information systems, used as decision and management in...

  16. Advanced Topology Optimization Methods for Conceptual Architectural Design

    DEFF Research Database (Denmark)

    Aage, Niels; Amir, Oded; Clausen, Anders

    2015-01-01

    This paper presents a series of new, advanced topology optimization methods, developed specifically for conceptual architectural design of structures. The proposed computational procedures are implemented as components in the framework of a Grasshopper plugin, providing novel capacities in topological optimization: interactive control and continuous visualization; embedding flexible voids within the design space; consideration of distinct tension/compression properties; and optimization of dual material systems. In extension, optimization procedures for skeletal structures such as trusses...

  17. Advanced applications of boundary-integral equation methods

    International Nuclear Information System (INIS)

    Cruse, T.A.; Wilson, R.B.

    1978-01-01

    Numerical analysis has become the basic tool for both design and research problems in solid mechanics. The need for accuracy and detail, plus the availability of the high-speed computer, has led to the development of many new modeling methods ranging from general purpose structural analysis finite element programs to special purpose research programs. The boundary-integral equation (BIE) method is based on classical mathematical techniques but is finding new life as a basic stress analysis tool for engineering applications. The paper summarizes some advanced elastic applications of fracture mechanics and three-dimensional stress analysis, while referencing some of the much broader developmental effort. Future emphasis is needed to exploit the BIE method in conjunction with other techniques such as the finite element method through the creation of hybrid stress analysis methods. (Auth.)

  18. Advanced method of double contrast examination of the stomach

    International Nuclear Information System (INIS)

    Vlasov, P.V.; Yakimenko, V.F.

    1981-01-01

    An advanced method of double contrast examination of the stomach using a highly concentrated barium suspension is described. It is shown that the concentration of the barium suspension must be not less than 200 mass/volume per cent to obtain a sharp image of the mucosal microrelief. Six standard positions are recommended for the double contrast examination of all stomach walls. 200 patients with different digestive system diseases were examined with the help of the developed method. A sharp image of the mucosal microrelief was obtained in 70% of cases.

  19. Digital spectral analysis parametric, non-parametric and advanced methods

    CERN Document Server

    Castanié, Francis

    2013-01-01

    Digital Spectral Analysis provides a single source that offers complete coverage of the spectral analysis domain. This self-contained work includes details on advanced topics that are usually presented in scattered sources throughout the literature. The theoretical principles necessary for the understanding of spectral analysis are discussed in the first four chapters: fundamentals, digital signal processing, estimation in spectral analysis, and time-series models. An entire chapter is devoted to the non-parametric methods most widely used in industry. High resolution methods a
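
    The two families the book contrasts can be illustrated side by side: a non-parametric Welch-averaged periodogram and a parametric autoregressive (Yule-Walker) spectrum estimate for the same noisy two-tone signal. The sketch below is a textbook construction assuming SciPy is available; the signal, sampling rate and AR order are arbitrary choices, not examples from the book.

```python
import numpy as np
from scipy import signal

# Synthetic test signal: two sinusoids in white noise.
fs, n = 1000.0, 4096
t = np.arange(n) / fs
rng = np.random.default_rng(3)
x = (np.sin(2 * np.pi * 120.0 * t) + 0.5 * np.sin(2 * np.pi * 123.0 * t)
     + rng.normal(0.0, 1.0, n))

# Non-parametric estimate: Welch-averaged periodogram.
f_welch, pxx_welch = signal.welch(x, fs=fs, nperseg=1024)

def ar_spectrum(x, order, freqs, fs):
    """Parametric AR power spectrum via the Yule-Walker equations (textbook sketch)."""
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)    # biased autocorrelation
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    a = np.linalg.solve(R, r[1:order + 1])                       # AR coefficients
    sigma2 = r[0] - a @ r[1:order + 1]                           # driving-noise variance
    z = np.exp(-2j * np.pi * freqs / fs)
    denom = np.abs(1 - np.array([(a * z_k ** np.arange(1, order + 1)).sum()
                                 for z_k in z])) ** 2
    return sigma2 / (fs * denom)

pxx_ar = ar_spectrum(x, order=30, freqs=f_welch, fs=fs)
print("Welch peak at", f_welch[np.argmax(pxx_welch)], "Hz;",
      "AR peak at", f_welch[np.argmax(pxx_ar)], "Hz")
```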

  20. Advances in Statistical Methods for Substance Abuse Prevention Research

    Science.gov (United States)

    MacKinnon, David P.; Lockwood, Chondra M.

    2010-01-01

    The paper describes advances in statistical methods for prevention research with a particular focus on substance abuse prevention. Standard analysis methods are extended to the typical research designs and characteristics of the data collected in prevention research. Prevention research often includes longitudinal measurement, clustering of data in units such as schools or clinics, missing data, and categorical as well as continuous outcome variables. Statistical methods to handle these features of prevention data are outlined. Developments in mediation, moderation, and implementation analysis allow for the extraction of more detailed information from a prevention study. Advancements in the interpretation of prevention research results include more widespread calculation of effect size and statistical power, the use of confidence intervals as well as hypothesis testing, detailed causal analysis of research findings, and meta-analysis. The increased availability of statistical software has contributed greatly to the use of new methods in prevention research. It is likely that the Internet will continue to stimulate the development and application of new methods. PMID:12940467
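
    One of the mediation techniques referred to above is the product-of-coefficients estimate: regress the mediator on treatment (the a path), regress the outcome on treatment and mediator (the b path), and report a*b with a first-order Sobel standard error. The sketch below runs that calculation on simulated prevention-style data; the variable names and effect sizes are invented for illustration and are not results from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated prevention-style data: program exposure X, mediator M (e.g. norms), outcome Y.
n = 500
x = rng.binomial(1, 0.5, n).astype(float)      # treatment / control indicator
m = 0.6 * x + rng.normal(0, 1, n)              # true a-path = 0.6
y = 0.5 * m + 0.1 * x + rng.normal(0, 1, n)    # true b-path = 0.5, direct effect 0.1

def ols(design, target):
    """Least-squares coefficients and their standard errors."""
    coef, _, _, _ = np.linalg.lstsq(design, target, rcond=None)
    resid = target - design @ coef
    sigma2 = resid @ resid / (len(target) - design.shape[1])
    se = np.sqrt(np.diag(sigma2 * np.linalg.inv(design.T @ design)))
    return coef, se

ones = np.ones(n)
a_coef, a_se = ols(np.column_stack([ones, x]), m)        # M ~ X  (a path)
b_coef, b_se = ols(np.column_stack([ones, x, m]), y)     # Y ~ X + M  (b path)
a, sa = a_coef[1], a_se[1]
b, sb = b_coef[2], b_se[2]
mediated = a * b
sobel_se = np.sqrt(a**2 * sb**2 + b**2 * sa**2)          # first-order Sobel standard error
print(f"mediated effect a*b = {mediated:.3f} +/- {1.96 * sobel_se:.3f}")
```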

  1. Three Methods of Assessing Values for Advance Care Planning

    Science.gov (United States)

    Karel, Michele J.; Moye, Jennifer; Bank, Adam; Azar, Armin R.

    2016-01-01

    Advance care planning ideally includes communication about values between patients, family members, and care providers. This study examined the utility of health care values assessment tools for older adults with and without dementia. Adults aged 60 and older, with and without dementia, completed three values assessment tools—open-ended, forced-choice, and rating scale questions—and named a preferred surrogate decision maker. Responses to forced-choice items were examined at 9-month retest. Adults with and without dementia appeared equally able to respond meaningfully to questions about values regarding quality of life and health care decisions. People with dementia were generally as able as controls to respond consistently after 9 months. Although values assessment methods show promise, further item and scale development work is needed. Older adults with dementia should be included in clarifying values for advance care planning to the extent that they desire and are able. PMID:17215205

  2. Methods and advances in the study of aeroelasticity with uncertainties

    Directory of Open Access Journals (Sweden)

    Dai Yuting

    2014-06-01

    Uncertainties denote the operators that describe data error, numerical error and model error in mathematical methods. The study of aeroelasticity with uncertainty embedded in the subsystems, such as uncertainty in the modeling of structures and aerodynamics, has been a hot topic in recent decades. In this paper, advances in the analysis and design of aeroelasticity with uncertainty are summarized in detail. According to whether the uncertainty is non-probabilistic or probabilistic, developments in theories, methods and experiments with application to robust and probabilistic aeroelasticity analysis, respectively, are presented. In addition, advances in aeroelastic design considering either probabilistic or non-probabilistic uncertainties are introduced along with aeroelastic analysis. This review focuses on robust aeroelasticity studies based on the structured singular value method, namely the μ method. It covers the numerical calculation algorithm of the structured singular value, uncertainty model construction, robust aeroelastic stability analysis algorithms, uncertainty level verification, robust flutter boundary prediction in flight test, etc. The key results and conclusions are explored. Finally, several promising problems on aeroelasticity with uncertainty are proposed for future investigation.
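
    For readers unfamiliar with the structured singular value underlying the μ method, the standard definition and robust stability test are recalled below for a nominal transfer matrix M in feedback with a structured uncertainty block Δ; this is the textbook statement, not the specific uncertainty model of the paper:

      \[
        \mu_{\boldsymbol{\Delta}}(M) \;=\;
        \Bigl[\,\min\bigl\{\bar{\sigma}(\Delta) : \Delta\in\boldsymbol{\Delta},\;
        \det(I - M\Delta)=0\bigr\}\Bigr]^{-1},
        \qquad
        \text{robust stability} \;\Longleftrightarrow\;
        \sup_{\omega}\,\mu_{\boldsymbol{\Delta}}\bigl(M(\mathrm{j}\omega)\bigr) < 1 ,
      \]

    with μ defined as zero when no admissible Δ makes I - MΔ singular.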

  3. NATO Advanced Research Workshop on Vectorization of Advanced Methods for Molecular Electronic Structure

    CERN Document Server

    1984-01-01

    That there have been remarkable advances in the field of molecular electronic structure during the last decade is clear not only to those working in the field but also to anyone else who has used quantum chemical results to guide their own investigations. The progress in calculating the electronic structures of molecules has occurred through the truly ingenious theoretical and methodological developments that have made computationally tractable the underlying physics of electron distributions around a collection of nuclei. At the same time there has been considerable benefit from the great advances in computer technology. The growing sophistication, declining costs and increasing accessibility of computers have let theorists apply their methods to problems in virtually all areas of molecular science. Consequently, each year witnesses calculations on larger molecules than in the year before and calculations with greater accuracy and more complete information on molecular properties. We can surel...

  4. Recent advances in B-cell epitope prediction methods

    Science.gov (United States)

    2010-01-01

    Identification of epitopes that invoke strong responses from B-cells is one of the key steps in designing effective vaccines against pathogens. Because experimental determination of epitopes is expensive in terms of cost, time, and effort involved, there is an urgent need for computational methods for reliable identification of B-cell epitopes. Although several computational tools for predicting B-cell epitopes have become available in recent years, the predictive performance of existing tools remains far from ideal. We review recent advances in computational methods for B-cell epitope prediction, identify some gaps in the current state of the art, and outline some promising directions for improving the reliability of such methods. PMID:21067544

  5. Advances in product family and product platform design methods & applications

    CERN Document Server

    Jiao, Jianxin; Siddique, Zahed; Hölttä-Otto, Katja

    2014-01-01

    Advances in Product Family and Product Platform Design: Methods & Applications highlights recent advances that have been made to support product family and product platform design and successful applications in industry. This book provides not only motivation for product family and product platform design—the “why” and “when” of platforming—but also methods and tools to support the design and development of families of products based on shared platforms—the “what”, “how”, and “where” of platforming. It begins with an overview of recent product family design research to introduce readers to the breadth of the topic and progresses to more detailed topics and design theory to help designers, engineers, and project managers plan, architect, and implement platform-based product development strategies in their companies. This book also: Presents state-of-the-art methods and tools for product family and product platform design Adopts an integrated, systems view on product family and pro...

  6. Advances in Applications of Hierarchical Bayesian Methods with Hydrological Models

    Science.gov (United States)

    Alexander, R. B.; Schwarz, G. E.; Boyer, E. W.

    2017-12-01

    Mechanistic and empirical watershed models are increasingly used to inform water resource decisions. Growing access to historical stream measurements and data from in-situ sensor technologies has increased the need for improved techniques for coupling models with hydrological measurements. Techniques that account for the intrinsic uncertainties of both models and measurements are especially needed. Hierarchical Bayesian methods provide an efficient modeling tool for quantifying model and prediction uncertainties, including those associated with measurements. Hierarchical methods can also be used to explore spatial and temporal variations in model parameters and uncertainties that are informed by hydrological measurements. We used hierarchical Bayesian methods to develop a hybrid (statistical-mechanistic) SPARROW (SPAtially Referenced Regression On Watershed attributes) model of long-term mean annual streamflow across diverse environmental and climatic drainages in 18 U.S. hydrological regions. Our application illustrates the use of a new generation of Bayesian methods that offer more advanced computational efficiencies than the prior generation. Evaluations of the effects of hierarchical (regional) variations in model coefficients and uncertainties on model accuracy indicates improved prediction accuracies (median of 10-50%) but primarily in humid eastern regions, where model uncertainties are one-third of those in arid western regions. Generally moderate regional variability is observed for most hierarchical coefficients. Accounting for measurement and structural uncertainties, using hierarchical state-space techniques, revealed the effects of spatially-heterogeneous, latent hydrological processes in the "localized" drainages between calibration sites; this improved model precision, with only minor changes in regional coefficients. Our study can inform advances in the use of hierarchical methods with hydrological models to improve their integration with stream
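
    The partial-pooling idea behind hierarchical (regional) coefficients can be illustrated with the standard normal-normal shrinkage formula: each regional estimate is pulled toward the common mean in proportion to its uncertainty. The sketch below uses synthetic numbers and fixed hyper-parameters purely for illustration; it is not the authors' SPARROW implementation.

      import numpy as np

      theta = np.array([0.8, 1.2, 0.5, 1.6])   # regional coefficient estimates (synthetic)
      se    = np.array([0.1, 0.4, 0.2, 0.6])   # their standard errors (synthetic)
      mu, tau = theta.mean(), 0.3              # hyper-parameters, assumed known here

      # weight on the regional estimate grows as its standard error shrinks
      w = (1.0 / se**2) / (1.0 / se**2 + 1.0 / tau**2)
      theta_pooled = w * theta + (1.0 - w) * mu
      print(theta_pooled)                      # precisely measured regions move the least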

  7. Advanced methods for fabrication of PHWR and LMFBR fuels

    International Nuclear Information System (INIS)

    Ganguly, C.

    1988-01-01

    For self-reliance in nuclear power, the Department of Atomic Energy (DAE), India is pursuing two specific reactor systems, namely the pressurised heavy water reactors (PHWR) and the liquid metal cooled fast breeder reactors (LMFBR). The reference fuel for PHWR is zircaloy-4 clad high density (≤ 96 per cent T.D.) natural UO2 pellet-pins. The advanced PHWR fuels are UO2-PuO2 (≤ 2 per cent), ThO2-PuO2 (≤ 4 per cent) and ThO2-233UO2 (≤ 2 per cent). Similarly, low density (≤ 85 per cent T.D.) (UPu)O2 pellets clad in SS 316 or D9 is the reference fuel for the first generation of prototype and commercial LMFBRs all over the world. However, (UPu)C and (UPu)N are considered as advanced fuels for LMFBRs mainly because of their shorter doubling time. The conventional method of fabrication of both high and low density oxide, carbide and nitride fuel pellets starting from UO2, PuO2 and ThO2 powders is 'powder metallurgy (P/M)'. The P/M route has, however, the disadvantage of generation and handling of fine powder particles of the fuel and the associated problem of 'radiotoxic dust hazard'. The present paper summarises the state-of-the-art of advanced methods of fabrication of oxide, carbide and nitride fuels and highlights the author's experience with the sol-gel-microsphere-pelletisation (SGMP) route for preparation of these materials. The SGMP process uses sol-gel derived, dust-free and free-flowing microspheres of oxides, carbide or nitride for direct pelletisation and sintering. Fuel pellets of both low and high density, excellent microhomogeneity and controlled 'open' or 'closed' porosity could be fabricated via the SGMP route. (author). 5 tables, 14 figs., 15 refs

  8. Methods and systems for advanced spaceport information management

    Science.gov (United States)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  9. The application of advanced rotor (performance) methods for design calculations

    Energy Technology Data Exchange (ETDEWEB)

    Bussel, G.J.W. van [Delft Univ. of Technology, Inst. for Wind Energy, Delft (Netherlands)

    1997-08-01

    The calculation of loads and performance of wind turbine rotors has been a topic of research over the last century. The principles for calculating loads on rotor blades with a given geometry, as well as for developing optimally shaped rotor blades, were published in the decades when significant aircraft development took place. Nowadays advanced computer codes are used for specific problems regarding modern aircraft, and they have occasionally been applied to wind turbine rotors as well. The engineers designing rotor blades for wind turbines still use methods based upon global principles developed at the beginning of the century. The question of what type of methods to expect in a design environment in the near future is addressed here. (EG) 14 refs.

  10. Methods and Systems for Advanced Spaceport Information Management

    Science.gov (United States)

    Fussell, Ronald M. (Inventor); Ely, Donald W. (Inventor); Meier, Gary M. (Inventor); Halpin, Paul C. (Inventor); Meade, Phillip T. (Inventor); Jacobson, Craig A. (Inventor); Blackwell-Thompson, Charlie (Inventor)

    2007-01-01

    Advanced spaceport information management methods and systems are disclosed. In one embodiment, a method includes coupling a test system to the payload and transmitting one or more test signals that emulate an anticipated condition from the test system to the payload. One or more responsive signals are received from the payload into the test system and are analyzed to determine whether one or more of the responsive signals comprises an anomalous signal. At least one of the steps of transmitting, receiving, analyzing and determining includes transmitting at least one of the test signals and the responsive signals via a communications link from a payload processing facility to a remotely located facility. In one particular embodiment, the communications link is an Internet link from a payload processing facility to a remotely located facility (e.g. a launch facility, university, etc.).

  11. New reconstruction method for the advanced compton camera

    International Nuclear Information System (INIS)

    Kurihara, Takashi; Ogawa, Koichi

    2007-01-01

    Conventional gamma cameras employ a mechanical collimator, which reduces the number of photons detected by such cameras. To address this issue, a Compton camera has been proposed to improve the efficiency of data acquisition by employing electronic collimation. Among Compton cameras, the advanced Compton camera (ACC) proposed by Tanimori et al. can restrict the source locations with the help of the recoil electrons that are emitted in the process of Compton scattering. However, the reconstruction methods employed in conventional Compton cameras are inefficient in reconstructing images from the data acquired with the ACC. In this paper, we propose a new reconstruction method that is designed specifically for the ACC. This method, which is an improved version of the source space tree algorithm (SSTA), permits the source distribution to be reconstructed accurately and efficiently. The SSTA is one of the reconstruction methods for conventional Compton cameras proposed by Rohe et al. Our proposed algorithm employs a set of lines that are defined at equiangular intervals in the reconstruction region and the specified voxels of interest that include the search points located on the above predefined lines at equally spaced intervals. The validity of our method is demonstrated by simulations involving the reconstruction of a point source and a disk source. (author)

  12. Advanced numerical methods in mesh generation and mesh adaptation

    Energy Technology Data Exchange (ETDEWEB)

    Lipnikov, Konstantine [Los Alamos National Laboratory; Danilov, A [MOSCOW, RUSSIA; Vassilevski, Y [MOSCOW, RUSSIA; Agonzal, A [UNIV OF LYON

    2010-01-01

    Numerical solution of partial differential equations requires appropriate meshes, efficient solvers, and robust and reliable error estimates. Generation of high-quality meshes for complex engineering models is a non-trivial task. This task is made more difficult when the mesh has to be adapted to a problem solution. This article is focused on a synergistic approach to mesh generation and mesh adaptation, where the best properties of various mesh generation methods are combined to build simplicial meshes efficiently. First, the advancing front technique (AFT) is combined with the incremental Delaunay triangulation (DT) to build an initial mesh. Second, the metric-based mesh adaptation (MBA) method is employed to improve the quality of the generated mesh and/or to adapt it to a problem solution. We demonstrate with numerical experiments that a combination of all three methods is required for robust meshing of complex engineering models. The key to successful mesh generation is the high quality of the triangles in the initial front. We use a black-box technique to improve surface meshes exported from an unattainable CAD system. The initial surface mesh is refined into a shape-regular triangulation which approximates the boundary with the same accuracy as the CAD mesh. The DT method adds robustness to the AFT. The resulting mesh is topologically correct but may contain a few slivers. The MBA uses seven local operations to modify the mesh topology. It improves the mesh quality significantly. The MBA method is also used to adapt the mesh to a problem solution to minimize the computational resources required for solving the problem. The MBA has a solid theoretical background. In the first two experiments, we consider the convection-diffusion and elasticity problems. We demonstrate the optimal reduction rate of the discretization error on a sequence of adaptive strongly anisotropic meshes. The key element of the MBA method is construction of a tensor metric from hierarchical edge
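
    The tensor metric at the core of metric-based adaptation is typically used through the metric length of mesh edges: the mesh is adapted until every edge has a length close to one when measured in the metric (the "unit mesh" criterion). The generic definition is recalled below; the report's exact construction of M from hierarchical edge data may differ.

      \[
        \ell_{M}(e) \;=\; \int_{0}^{1}
        \sqrt{\,\mathbf{e}^{\mathsf{T}}\, M\bigl(\mathbf{x}(t)\bigr)\,\mathbf{e}\,}\;dt
        \;\approx\; 1 ,
      \]

    where e is the edge vector and x(t) parameterizes the edge from one endpoint to the other.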

  13. Computational methods of the Advanced Fluid Dynamics Model

    International Nuclear Information System (INIS)

    Bohl, W.R.; Wilhelm, D.; Parker, F.R.

    1987-01-01

    To more accurately treat severe accidents in fast reactors, a program has been set up to investigate new computational models and approaches. The product of this effort is a computer code, the Advanced Fluid Dynamics Model (AFDM). This paper describes some of the basic features of the numerical algorithm used in AFDM. Aspects receiving particular emphasis are the fractional-step method of time integration, the semi-implicit pressure iteration, the virtual mass inertial terms, the use of three velocity fields, higher order differencing, convection of interfacial area with source and sink terms, multicomponent diffusion processes in heat and mass transfer, the SESAME equation of state, and vectorized programming. A calculated comparison with an isothermal tetralin/ammonia experiment is performed. We conclude that significant improvements are possible in reliably calculating the progression of severe accidents with further development
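
    For orientation, the fractional-step idea mentioned above can be summarized by the textbook projection scheme for a single incompressible velocity field; AFDM's actual algorithm, with three velocity fields and a semi-implicit pressure iteration, is considerably more elaborate:

      \[
        \mathbf{u}^{*} \;=\; \mathbf{u}^{n}
          + \Delta t\,\bigl[-(\mathbf{u}\cdot\nabla)\mathbf{u} + \nu\,\nabla^{2}\mathbf{u}\bigr]^{n},
        \qquad
        \nabla^{2} p^{\,n+1} \;=\; \frac{1}{\Delta t}\,\nabla\cdot\mathbf{u}^{*},
        \qquad
        \mathbf{u}^{n+1} \;=\; \mathbf{u}^{*} - \Delta t\,\nabla p^{\,n+1}.
      \]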

  14. Advanced methods for the study of PWR cores

    International Nuclear Information System (INIS)

    Lambert, M.; Salvatores, St.; Ferrier, A.; Pelet, J.; Nicaise, N.; Pouliquen, J.Y.; Foret, F.; Chauliac, C.; Johner, J.; Cohen, Ch.

    2003-01-01

    This document gathers the transparencies presented at the 6th technical session of the French nuclear energy society (SFEN) in October 2003. The transparencies of the annual meeting are presented in the introductory part: 1 - status of the French nuclear park: nuclear energy results, management of an exceptional climatic situation: the heat wave of summer 2003 and the power generation (J.C. Barral); 2 - status of the research on controlled thermonuclear fusion (J. Johner). Then follows the technical session about advanced methods for the study of PWR reactor cores: 1 - the evolutionary approach to study methodologies (M. Lambert, J. Pelet); 2 - the point of view of the nuclear safety authority (D. Brenot); 3 - the improved decoupled methodology for steam pipe rupture (S. Salvatores, J.Y. Pouliquen); 4 - the MIR method for pellet-clad interaction (renovated IPG methodology) (E. Baud, C. Royere); 5 - the improved fuel management (IFM) studies for Koeberg (C. Cohen); 6 - principle of the accident study methods implemented for the European pressurized reactor (EPR) (F. Foret, A. Ferrier); 7 - accident studies with the EPR, steam pipe rupture (N. Nicaise, S. Salvatores); 8 - the co-development platform, a new generation of software tools for the new methodologies (C. Chauliac). (J.S.)

  15. Comparison of advanced iterative reconstruction methods for SPECT/CT

    Energy Technology Data Exchange (ETDEWEB)

    Knoll, Peter; Koechle, Gunnar; Mirzaei, Siroos [Wilhelminenspital, Vienna (Austria). Dept. of Nuclear Medicine and PET Center; Kotalova, Daniela; Samal, Martin [Charles Univ. Prague, Prague (Czech Republic); Kuzelka, Ivan; Zadrazil, Ladislav [Hospital Havlickuv Brod (Czech Republic); Minear, Greg [Landesklinikum St. Poelten (Austria). Dept. of Internal Medicine II; Bergmann, Helmar [Medical Univ. of Vienna (Austria). Center for Medical Physics and Biomedical Engineering

    2012-07-01

    Aim: Corrective image reconstruction methods which produce reconstructed images with improved spatial resolution and a decreased noise level have recently become commercially available. In this work, we tested the performance of three new software packages, with reconstruction schemes recommended by the manufacturers, using physical phantoms simulating realistic clinical settings. Methods: A specially designed resolution phantom containing three 99mTc line sources and the NEMA NU-2 image quality phantom were acquired on three different SPECT/CT systems (General Electric Infinia, Philips BrightView and Siemens Symbia T6). Measurement of both phantoms was done with the trunk filled with a 99mTc-water solution. The projection data were reconstructed using GE's Evolution for Bone®, Philips' Astonish® and Siemens' Flash3D® software. The reconstruction parameters employed (number of iterations and subsets, the choice of post-filtering) followed the recommendations of each vendor. These results were compared with reference reconstructions using the ordered subset expectation maximization (OSEM) reconstruction scheme. Results: The best results (smallest value for resolution, highest percent contrast values) for all three packages were found for the scatter corrected data without applying any post-filtering. The advanced reconstruction methods improve the full width at half maximum (FWHM) of the line sources from 11.4 to 9.5 mm (GE), from 9.1 to 6.4 mm (Philips), and from 12.1 to 8.9 mm (Siemens) if no additional post-filter was applied. The total image quality control index measured for a concentration ratio of 8:1 improves from 147 to 189 for GE, from 179 to 325 for Philips and from 217 to 320 for Siemens using the reference method for comparison. The same trends can be observed for the 4:1 concentration ratio. The use of a post-filter reduces the background variability approximately by a factor of two, but
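
    The reference OSEM scheme used for comparison can be sketched generically as below: the image is updated subset by subset with the standard expectation-maximization ratio. This is a minimal illustration for an arbitrary system matrix, not any vendor's reconstruction package.

      import numpy as np

      def osem(y, A, n_iter=4, n_subsets=8):
          """Ordered-subset EM: y = measured projections, A = system matrix (bins x voxels)."""
          n_bins, n_vox = A.shape
          x = np.ones(n_vox)                                   # uniform initial image
          subsets = np.array_split(np.arange(n_bins), n_subsets)
          for _ in range(n_iter):
              for idx in subsets:
                  As = A[idx]                                  # projection rows of this subset
                  ratio = y[idx] / np.maximum(As @ x, 1e-12)   # measured / estimated projections
                  x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
          return x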

  16. Radiation Mitigation Methods for Advanced Readout Array, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA is interested in the development of advanced instruments and instrument components for planetary science missions. Specifically, an area of importance in...

  17. Advances in Airborne and Ground Geophysical Methods for Uranium Exploration

    International Nuclear Information System (INIS)

    2013-01-01

    through the use of effective exploration techniques. Geophysical methods with the capability of mapping surface and subsurface parameters in relation to uranium deposition and accumulation are proving to be vital components of current exploration efforts around the world. There is continuous development and improvement of technical and scientific disciplines using measuring instruments and spatially referenced data processing techniques. Newly designed geophysical instruments and their applications in uranium exploration are contributing to an increased probability of successful discoveries. Dissemination of information on advances in geophysical techniques encourages new strategies and promotes new approaches toward uranium exploration. Meetings and conferences organized by the IAEA, collecting the experience of participating countries, as well as its publications and the International Nuclear Information System, play an important role in the dissemination of knowledge of all aspects of the nuclear fuel cycle. The purpose of this report is to highlight advances in airborne and ground geophysical techniques, succinctly describing modern geophysical methods and demonstrating the application of techniques through examples. The report also provides some basic concepts of radioactivity, nuclear radiation and interaction with matter.

  18. Advances in the analysis of iminocyclitols: Methods, sources and bioavailability.

    Science.gov (United States)

    Amézqueta, Susana; Torres, Josep Lluís

    2016-05-01

    Iminocyclitols are chemically and metabolically stable, naturally occurring sugar mimetics. Their biological activities make them interesting and extremely promising as both drug leads and functional food ingredients. The first iminocyclitols were discovered using preparative isolation and purification methods followed by chemical characterization using nuclear magnetic resonance spectroscopy. In addition to this classical approach, gas and liquid chromatography coupled to mass spectrometry are increasingly used; they are highly sensitive techniques capable of detecting minute amounts of analytes in a broad spectrum of sources after only minimal sample preparation. These techniques have been applied to identify new iminocyclitols in plants, microorganisms and synthetic mixtures. The separation of iminocyclitol mixtures by chromatography is particularly difficult however, as the most commonly used matrices have very low selectivity for these highly hydrophilic structurally similar molecules. This review critically summarizes recent advances in the analysis of iminocyclitols from plant sources and findings regarding their quantification in dietary supplements and foodstuffs, as well as in biological fluids and organs, from bioavailability studies. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Underwater Photosynthesis of Submerged Plants – Recent Advances and Methods

    Science.gov (United States)

    Pedersen, Ole; Colmer, Timothy D.; Sand-Jensen, Kaj

    2013-01-01

    We describe the general background and the recent advances in research on underwater photosynthesis of leaf segments, whole communities, and plant dominated aquatic ecosystems and present contemporary methods tailor made to quantify photosynthesis and carbon fixation under water. The majority of studies of aquatic photosynthesis have been carried out with detached leaves or thalli and this selectiveness influences the perception of the regulation of aquatic photosynthesis. We thus recommend assessing the influence of inorganic carbon and temperature on natural aquatic communities of variable density in addition to studying detached leaves in the scenarios of rising CO2 and temperature. Moreover, a growing number of researchers are interested in tolerance of terrestrial plants during flooding as torrential rains sometimes result in overland floods that inundate terrestrial plants. We propose to undertake studies to elucidate the importance of leaf acclimation of terrestrial plants to facilitate gas exchange and light utilization under water as these acclimations influence underwater photosynthesis as well as internal aeration of plant tissues during submergence. PMID:23734154

  20. Regenerative medicine: advances in new methods and technologies.

    Science.gov (United States)

    Park, Dong-Hyuk; Eve, David J

    2009-11-01

    The articles published in the journal Cell Transplantation - The Regenerative Medicine Journal over the last two years reveal the recent and future cutting-edge research in the fields of regenerative and transplantation medicine. 437 articles were published from 2007 to 2008, a 17% increase compared to the 373 articles in 2006-2007. Neuroscience was still the most common section in both the number of articles and the percentage of all manuscripts published. The increasing interest and rapid advances in bioengineering technology are highlighted by tissue engineering and bioartificial organs being ranked second again. For a similar reason, the methods and new technologies section increased significantly compared to the previous period. Articles focusing on the transplantation of stem cell lineages encompassed almost 20% of all articles published. By contrast, the non-stem-cell transplantation group, which is made up primarily of islet cells, followed by biomaterials and fetal neural tissue, etc., comprised less than 15%. Transplantation of cells pre-treated with medicine or gene transfection to prolong graft survival or promote differentiation into the needed phenotype was prevalent in the transplantation articles, regardless of the kind of cells used. Meanwhile, the majority of non-transplantation-based articles were related to new devices for various purposes, characterization of unknown cells, medicines, cell preparation and/or optimization for transplantation (e.g. isolation and culture), and disease pathology.

  1. Application of the Advanced Distillation Curve Method to Fuels for Advanced Combustion Engine Gasolines

    KAUST Repository

    Burger, Jessica L.

    2015-07-16

    © This article not subject to U.S. Copyright. Published 2015 by the American Chemical Society. Incremental but fundamental changes are currently being made to fuel composition and combustion strategies to diversify energy feedstocks, decrease pollution, and increase engine efficiency. The increase in parameter space (by having many variables in play simultaneously) makes it difficult at best to propose strategic changes to engine and fuel design by use of conventional build-and-test methodology. To make changes in the most time- and cost-effective manner, it is imperative that new computational tools and surrogate fuels be developed. Currently, sets of fuels are being characterized by industry groups, such as the Coordinating Research Council (CRC) and other entities, so that researchers in different laboratories have access to fuels with consistent properties. In this work, six gasolines (FACE A, C, F, G, I, and J) are characterized by the advanced distillation curve (ADC) method to determine the composition and enthalpy of combustion in various distillate volume fractions. Tracking the composition and enthalpy of distillate fractions provides valuable information for determining structure-property relationships, and moreover, it provides the basis for the development of equations of state that can describe the thermodynamic properties of these complex mixtures and lead to the development of surrogate fuels composed of the major hydrocarbon classes found in target fuels.
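
    The idea of tracking the enthalpy of combustion of each distillate fraction can be illustrated with a mole-fraction-weighted sum over the fraction's composition; the component names and values below are illustrative placeholders, not measured FACE gasoline data.

      # mole fraction and molar enthalpy of combustion (kJ/mol); illustrative values only
      fraction = {
          "iso-octane": (0.55, -5460.0),
          "toluene":    (0.30, -3910.0),
          "n-heptane":  (0.15, -4817.0),
      }
      dH_mix = sum(x * dH for x, dH in fraction.values())
      print(f"composite enthalpy of combustion ~ {dH_mix:.0f} kJ/mol")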

  2. Advanced diagnostic methods in oral and maxillofacial pathology. Part II: immunohistochemical and immunofluorescent methods.

    Science.gov (United States)

    Jordan, Richard C K; Daniels, Troy E; Greenspan, John S; Regezi, Joseph A

    2002-01-01

    The practice of pathology is currently undergoing significant change, in large part due to advances in the analysis of DNA, RNA, and proteins in tissues. These advances have permitted improved biologic insights into many developmental, inflammatory, metabolic, infectious, and neoplastic diseases. Moreover, molecular analysis has also led to improvements in the accuracy of disease diagnosis and classification. It is likely that, in the future, these methods will increasingly enter into the day-to-day diagnosis and management of patients. The pathologist will continue to play a fundamental role in diagnosis and will likely be in a pivotal position to guide the implementation and interpretation of these tests as they move from the research laboratory into diagnostic pathology. The purpose of this 2-part series is to provide an overview of the principles and applications of current molecular biologic and immunologic tests. In Part I, the biologic fundamentals of DNA, RNA, and proteins and methods that are currently available or likely to become available to the pathologist in the next several years for their isolation and analysis in tissue biopsies were discussed. In Part II, advances in immunohistochemistry and immunofluorescence methods and their application to modern diagnostic pathology are reviewed.

  3. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    International Nuclear Information System (INIS)

    Blanford, E.; Keldrauk, E.; Laufer, M.; Mieler, M.; Wei, J.; Stojadinovic, B.; Peterson, P.F.

    2010-01-01

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of the next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  4. ADVANCED SEISMIC BASE ISOLATION METHODS FOR MODULAR REACTORS

    Energy Technology Data Exchange (ETDEWEB)

    E. Blanford; E. Keldrauk; M. Laufer; M. Mieler; J. Wei; B. Stojadinovic; P.F. Peterson

    2010-09-20

    Advanced technologies for structural design and construction have the potential for major impact not only on nuclear power plant construction time and cost, but also on the design process and on the safety, security and reliability of the next generation of nuclear power plants. In future Generation IV (Gen IV) reactors, structural and seismic design should be much more closely integrated with the design of nuclear and industrial safety systems, physical security systems, and international safeguards systems. Overall reliability will be increased, through the use of replaceable and modular equipment, and through design to facilitate on-line monitoring, in-service inspection, maintenance, replacement, and decommissioning. Economics will also receive high design priority, through integrated engineering efforts to optimize building arrangements to minimize building heights and footprints. Finally, the licensing approach will be transformed by becoming increasingly performance based and technology neutral, using best-estimate simulation methods with uncertainty and margin quantification. In this context, two structural engineering technologies, seismic base isolation and modular steel-plate/concrete composite structural walls, are investigated. These technologies have major potential to (1) enable standardized reactor designs to be deployed across a wider range of sites, (2) reduce the impact of uncertainties related to site-specific seismic conditions, and (3) alleviate reactor equipment qualification requirements. For Gen IV reactors the potential for deliberate crashes of large aircraft must also be considered in design. This report concludes that base-isolated structures should be decoupled from the reactor external event exclusion system. As an example, a scoping analysis is performed for a rectangular, decoupled external event shell designed as a grillage. This report also reviews modular construction technology, particularly steel-plate/concrete construction using

  5. Advanced Methods for Direct Ink Write Additive Manufacturing

    Energy Technology Data Exchange (ETDEWEB)

    Compel, W. S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Lewicki, J. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2018-01-24

    Lawrence Livermore National Laboratory is one of the world's premier labs for research and development of additive manufacturing processes. Among these many processes, direct ink write (DIW) is arguably one of the most relevant for the manufacture of architected polymeric materials, components and hardware. However, a bottleneck in this pipeline that has largely been ignored to date is the lack of advanced software implementation with respect to toolpath execution. There is not yet a convenient, automated method to design and produce complex parts that is user-friendly and enabling for the realization of next-generation designs and structures. For a material to be suitable as a DIW ink it must possess the appropriate rheological properties for this process. Most importantly, the material must exhibit shear-thinning in order to extrude through a print head, and it must show a rapid recovery of its static shear modulus. This makes it possible for the extrudate to be self-supporting upon exiting the print head. While this and other prerequisites narrow the scope of 'off-the-shelf' printable materials directly amenable to DIW, the process still tolerates a wide range of potential feedstock materials. These include metallic alloys, inorganic solvent-borne dispersions, polymeric melts, filler-stabilized monomer compositions, pre-elastomeric feedstocks and thermoset resins, each of which requires custom print conditions tailored to the individual ink. As such, an ink perfectly suited for DIW may be prematurely determined to be undesirable for the process if printed under the wrong conditions. Defining appropriate print conditions such as extrusion rate, layer height, and maximum bridge length is a vital first step in validating an ink's DIW capability.
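
    A common first screen for the shear-thinning behavior described above is a power-law model of apparent viscosity versus shear rate; the sketch below uses that generic model with illustrative parameters, not measured properties of any LLNL ink.

      def apparent_viscosity(shear_rate, K=300.0, n=0.35):
          """Power-law fluid: eta = K * gamma_dot**(n - 1); n < 1 indicates shear thinning."""
          return K * shear_rate ** (n - 1.0)

      for rate in (0.1, 1.0, 10.0, 100.0):   # 1/s, from near-rest to nozzle extrusion
          print(rate, apparent_viscosity(rate))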

  6. Advanced methods of continuum mechanics for materials and structures

    CERN Document Server

    Aßmus, Marcus

    2016-01-01

    This volume presents a collection of contributions on advanced approaches of continuum mechanics, which were written to celebrate the 60th birthday of Prof. Holm Altenbach. The contributions are on topics related to the theoretical foundations for the analysis of rods, shells and three-dimensional solids, formulation of constitutive models for advanced materials, as well as development of new approaches to the modeling of damage and fractures.

  7. Processing of alnico permanent magnets by advanced directional solidification methods

    Science.gov (United States)

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F.; Zhou, Lin; Kramer, Matthew J.

    2016-12-01

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic properties in the 7.5% Ti composition

  8. Processing of alnico permanent magnets by advanced directional solidification methods

    Energy Technology Data Exchange (ETDEWEB)

    Zou, Min; Johnson, Francis; Zhang, Wanming; Zhao, Qi; Rutkowski, Stephen F. [Ceramic and Metallurgy Technologies, General Electric Global Research, Niskayuna, NY (United States); Zhou, Lin; Kramer, Matthew J. [Ames Laboratory, Ames, IA (United States); Iowa State University, Ames, IA (United States)

    2016-12-15

    Advanced directional solidification methods have been used to produce large (>15 cm length) castings of Alnico permanent magnets with highly oriented columnar microstructures. In combination with subsequent thermomagnetic and draw thermal treatment, this method was used to enable the high coercivity, high-Titanium Alnico composition of 39% Co, 29.5% Fe, 14% Ni, 7.5% Ti, 7% Al, 3% Cu (wt%) to have an intrinsic coercivity (Hci) of 2.0 kOe, a remanence (Br) of 10.2 kG, and an energy product (BH)max of 10.9 MGOe. These properties compare favorably to typical properties for the commercial Alnico 9. Directional solidification of higher Ti compositions yielded anisotropic columnar grained microstructures if high heat extraction rates through the mold surface of at least 200 kW/m2 were attained. This was achieved through the use of a thin walled (5 mm thick) high thermal conductivity SiC shell mold extracted from a molten Sn bath at a withdrawal rate of at least 200 mm/h. However, higher Ti compositions did not result in further increases in magnet performance. Images of the microstructures collected by scanning electron microscopy (SEM) reveal a majority α phase with inclusions of secondary αγ phase. Transmission electron microscopy (TEM) reveals that the α phase has a spinodally decomposed microstructure of FeCo-rich needles in a NiAl-rich matrix. In the 7.5% Ti composition the diameter distribution of the FeCo needles was bimodal with the majority having diameters of approximately 50 nm with a small fraction having diameters of approximately 10 nm. The needles formed a mosaic pattern and were elongated along one 〈001〉 crystal direction (parallel to the field used during magnetic annealing). Cu precipitates were observed between the needles. Regions of abnormal spinodal morphology appeared to correlate with secondary phase precipitates. The presence of these abnormalities did not prevent the material from displaying superior magnetic

  9. Advanced scientific computational methods and their applications to nuclear technologies. (3) Introduction of continuum simulation methods and their applications (3)

    International Nuclear Information System (INIS)

    Satake, Shin-ichi; Kunugi, Tomoaki

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have served as the weft connecting the various realms of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies was therefore prepared in serial form. This third installment introduces continuum simulation methods and their applications. Spectral methods and multi-interface calculation methods in fluid dynamics are reviewed. (T. Tanaka)

  10. Preliminary study of clinical staging of moderately advanced and advanced thoracic esophageal carcinoma treated by non-surgical methods

    International Nuclear Information System (INIS)

    Zhu Shuchai; Li Ren; Li Juan; Qiu Rong; Han Chun; Wan Jun

    2004-01-01

    Objective: To explore the clinical staging of moderately advanced and advanced thoracic esophageal carcinoma by evaluating prognosis, and to provide criteria for individualized treatment. Methods: The authors retrospectively analyzed 500 patients with moderately advanced and advanced thoracic esophageal carcinoma treated by radiotherapy alone. Based on the primary lesion length from barium meal X-ray films and on the extent of invasion and the relation between the lesion and the surrounding organs from CT scans, the disease was classified with a 6-stage method and a 4-stage method. With the primary lesion divided into T1, T2a, T2b, T3a, T3b and T4 and locoregional lymph node metastasis incorporated, a 6-stage system was obtained: I, IIa, IIb, IIIa, IIIb and IV. Its results were compared with those of the 4-stage system, and the following data were finally arrived at. Results: Among the 500 cases, there were 23 T1, 111 T2a, 157 T2b, 84 T3a, 82 T3b and 43 T4 lesions. The survival rates of the six stage categories showed significant differences (χ2 = 63.32, 56.29, 94.29 and 83.48, all P < 0.05). Conclusions: Both the 6-stage and 4-stage systems are suitable for predicting the prognosis of moderately advanced and advanced esophageal carcinoma treated by radiotherapy alone. For simplicity and convenience, the 4-stage classification is recommended. (authors)

  11. Advanced diagnostic methods in oral and maxillofacial pathology. Part I: molecular methods.

    Science.gov (United States)

    Jordan, R C; Daniels, T E; Greenspan, J S; Regezi, J A

    2001-12-01

    The practice of pathology is currently undergoing significant change, in large part due to advances in the analysis of DNA, RNA, and proteins in tissues. These advances have permitted improved biologic insights into many developmental, inflammatory, metabolic, infectious, and neoplastic diseases. Moreover, molecular analysis has also led to improvements in accuracy of disease diagnosis and classification. It is likely that, in the future, these methods will increasingly enter into the day-to-day diagnosis and management of patients. The pathologist will continue to play a fundamental role in diagnosis and will likely be in a pivotal position to guide the implementation and interpretation of these tests as they move from the research laboratory into diagnostic pathology. The purpose of this 2-part series is to provide an overview of the principles and applications of current molecular biologic and immunologic tests. Part I will discuss the biologic fundamentals of DNA, RNA, and proteins and the methods that are currently available or likely to become available to the pathologist in the next several years for their isolation and analysis in tissue biopsies.

  12. Experiences from introduction of peer-to-peer teaching methods in Advanced Biochemistry E2010

    DEFF Research Database (Denmark)

    Brodersen, Ditlev; Etzerodt, Michael; Rasmussen, Jan Trige

    2012-01-01

    During the autumn semester 2010, we experimented with a range of active teaching methods on the course Advanced Biochemistry at the Department of Molecular Biology and Genetics.

  13. Advanced 3D inverse method for designing turbomachine blades

    Energy Technology Data Exchange (ETDEWEB)

    Dang, T. [Syracuse Univ., NY (United States). Dept. of Mechanical/Aerospace/Manufacturing Engineering

    1995-12-31

    To meet the goal of 60% plant-cycle efficiency or better set in the ATS Program for baseload utility scale power generation, several critical technologies need to be developed. One such need is the improvement of component efficiencies. This work addresses the issue of improving the performance of turbo-machine components in gas turbines through the development of an advanced three-dimensional and viscous blade design system. This technology is needed to replace some elements in current design systems that are based on outdated technology.

  14. Nonlinear dynamics of rotating shallow water methods and advances

    CERN Document Server

    Zeitlin, Vladimir

    2007-01-01

    The rotating shallow water (RSW) model is of wide use as a conceptual tool in geophysical fluid dynamics (GFD), because, in spite of its simplicity, it contains all essential ingredients of atmosphere and ocean dynamics at the synoptic scale, especially in its two- (or multi-) layer version. The book describes recent advances in understanding (in the framework of RSW and related models) of some fundamental GFD problems, such as existence of the slow manifold, dynamical splitting of fast (inertia-gravity waves) and slow (vortices, Rossby waves) motions, nonlinear geostrophic adjustment and wa
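
    For reference, the single-layer rotating shallow water equations that underlie the book's discussion read, in standard form (f the Coriolis parameter, g gravity, h the fluid depth, u the depth-averaged horizontal velocity):

      \[
        \partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
          + f\,\hat{\mathbf{z}}\times\mathbf{u} \;=\; -\,g\,\nabla h ,
        \qquad
        \partial_t h + \nabla\cdot(h\,\mathbf{u}) \;=\; 0 .
      \]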

  15. Advanced construction methods for new nuclear power plants

    International Nuclear Information System (INIS)

    Bilbao y Leon, Sama; Cleveland, John; Moon, Seong-Gyun; Tyobeka, Bismark

    2009-01-01

    The length of the construction and commissioning phases of nuclear power plants has historically been greater than for conventional fossil-fuelled plants, often with a record of delays and cost overruns resulting from several factors, including legal interventions and revisions of safety regulations. Recent nuclear construction projects, however, have shown that long construction periods for nuclear power plants are no longer the norm. While there are several inter-related factors that influence the construction time, the use of advanced construction techniques has contributed significantly to reducing the construction length of recent nuclear projects. (author)

  16. Classification methods for noise transients in advanced gravitational-wave detectors II: performance tests on Advanced LIGO data

    International Nuclear Information System (INIS)

    Powell, Jade; Heng, Ik Siong; Torres-Forné, Alejandro; Font, José A; Lynch, Ryan; Trifirò, Daniele; Cuoco, Elena; Cavaglià, Marco

    2017-01-01

    The data taken by the advanced LIGO and Virgo gravitational-wave detectors contain short-duration noise transients that limit the significance of astrophysical detections and reduce the duty cycle of the instruments. As the advanced detectors are reaching sensitivity levels that allow for multiple detections of astrophysical gravitational-wave sources, it is crucial to achieve a fast and accurate characterization of non-astrophysical transient noise shortly after it occurs in the detectors. Previously we presented three methods for the classification of transient noise sources: Principal Component Analysis for Transients (PCAT), Principal Component LALInference Burst (PC-LIB) and Wavelet Detection Filter with Machine Learning (WDF-ML). In this study we carry out the first performance tests of these algorithms on gravitational-wave data from the Advanced LIGO detectors. We use the data taken between the 3rd of June 2015 and the 14th of June 2015 during the 7th engineering run (ER7), and outline the improvements made to increase the performance and lower the latency of the algorithms on real data. This work provides an important test for understanding the performance of these methods on real, non-stationary data in preparation for the second advanced gravitational-wave detector observation run, planned for later this year. We show that all methods can classify transients in non-stationary data with a high level of accuracy and show the benefits of using multiple classifiers. (paper)
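
    The common thread of the classifiers named above can be sketched generically: project whitened transient waveforms onto a small set of principal components and assign each new transient to the nearest class in that reduced space. The code below is a toy illustration of that idea only, not the PCAT, PC-LIB or WDF-ML implementations.

      import numpy as np

      def fit_pca(X, n_components=10):
          """X: (n_transients, n_samples) training waveforms; returns components and mean."""
          mean = X.mean(axis=0)
          _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
          return Vt[:n_components], mean

      def classify(x, components, mean, centroids):
          """centroids: dict mapping class label -> mean PC coordinates of labeled transients."""
          z = components @ (x - mean)              # PC coordinates of one new waveform
          return min(centroids, key=lambda k: np.linalg.norm(z - centroids[k]))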

  17. Viscous-Inviscid Coupling Methods for Advanced Marine Propeller Applications

    OpenAIRE

    Greve, Martin; Wöckner-Kluwe, Katja; Abdel-Maksoud, Moustafa; Rung, Thomas

    2012-01-01

    The paper reports the development of coupling strategies between an inviscid direct panel method and a viscous RANS method and their application to complex propeller flows. The work is motivated by the prohibitive computational cost associated with unsteady viscous flow simulations using geometrically resolved propellers to analyse the dynamics of ships in seaways. The present effort aims to combine the advantages of the two baseline methods in order to reduce the numerical effort without comprom...

  18. Special issue on warnings: advances in delivery, application, and methods.

    Science.gov (United States)

    Mayhorn, Christopher B; Wogalter, Michael S; Laughery, Kenneth R

    2014-09-01

    This special issue of Applied Ergonomics concerns the topic of warnings, safety communications designed to decrease harm to people and property. The field has evolved over time, and with it there has been advancement in knowledge and application. The current special issue contains 14 articles that reflect three distinguishable areas within the warnings literature where such changes are taking place in the laboratories and workplaces of our international colleagues: (1) multimodality of warning delivery, (2) emerging application areas, and (3) new methodology. This special issue brings together a set of studies investigating various factors that might impact safety behavior in diverse settings and domains where warnings are likely to be encountered. It is our hope that the special issue will motivate the development and exploration of new ideas regarding warning design and their use in a variety of applications that improve safety. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  19. Recent advances in neutral particle transport methods and codes

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1996-01-01

    An overview of ORNL's three-dimensional neutral particle transport code, TORT, is presented. Special features of the code that make it invaluable for large applications are summarized for the prospective user. Advanced capabilities currently under development and installation in the production release of TORT are discussed; they include: multitasking on Cray platforms running the UNICOS operating system; the Adjacent-cell Preconditioning acceleration scheme; and graphics codes for displaying computed quantities such as the flux. Further developments for TORT and its companion codes to enhance its present capabilities, as well as to expand its range of applications, are discussed. Speculation on the next generation of neutral particle transport codes at ORNL, especially regarding unstructured grids and high order spatial approximations, is also mentioned

  20. Proceedings of national workshop on advanced methods for materials characterization

    International Nuclear Information System (INIS)

    2004-10-01

    During the past two decades there has been tremendous growth in the field of materials science, and a variety of new materials with user-specific properties have been developed, such as smart shape memory alloys, hybrid materials like glass-ceramics, cermets, met-glasses, inorganic-organic composite layered structures, mixed oxides with negative thermal expansion, functional polymer materials etc. The study of nano-particles and the materials assembled from such particles is another area of active research being pursued all over the world. Preparation and characterization of nano-sized materials is a challenge because of their dimensions and size-dependent properties. This has led to the emergence of a variety of advanced techniques, which need to be brought to the attention of researchers working in material science, a field that requires expertise in physics, chemistry and process engineering. This volume deals with the above aspects; papers relevant to INIS are indexed separately

  1. Adherence to Scientific Method while Advancing Exposure Science

    Science.gov (United States)

    Paul Lioy was simultaneously a staunch adherent to the scientific method and an innovator of new ways to conduct science, particularly related to human exposure. Current challenges to science and the application of the scientific method are presented as they relate to the approaches...

  2. Ultrasonic and advanced methods for nondestructive testing and material characterization

    National Research Council Canada - National Science Library

    Chen, C. H

    2007-01-01

    ... and physics among others. There are at least two dozen NDT methods in use. In fact, any sensor that can examine the inside of a material nondestructively is useful for NDT. However, ultrasonic methods are still the most popular because of their capability, flexibility, and relative cost-effectiveness. For this reason this book places a heavy emphasis...

  3. Whole genome amplification: Use of advanced isothermal method ...

    African Journals Online (AJOL)

    Whole genome amplification (WGA) is the name given to laboratory methods for amplifying genomic deoxyribonucleic acid (DNA) samples in order to generate a sufficient quantity of DNA for subsequent specific analysis. This method is the only way to increase the input material obtainable from a few cells or a limited DNA content.

  4. Research advancement of detection methods of Listeria monocytogenes

    Directory of Open Access Journals (Sweden)

    LI Yunxia

    2015-10-01

    Full Text Available Listeria monocytogenes is a food-borne pathogen of both human beings and animals, which has been widely recognized over the world. L. monocytogenes can be found in dairy products, vegetables, meat and other food, so it has caused great threat to human health. How to detect the trace of L. monocytogenes is a key link in the process of food-borne disease prevention and control. A wide variety of culture and alternative methods have been developed in order to detect this pathogen in food. The current article introduces the traditional detection method and all kinds of rapid methods such as immunoassay, biosensors, bacteriophage-based detection methods and molecular biological assays.

  5. Advanced Steel Microstructural Classification by Deep Learning Methods.

    Science.gov (United States)

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widely spread and well known, the microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure could be a combination of different phases or constituents with complex substructures its automatic classification is very challenging and only a few prior studies exist. Prior works focused on designed and engineered features by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification in the examples of certain microstructural constituents of low carbon steel. This novel method employs pixel-wise segmentation via Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method of 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and first of all objective way for the difficult task of steel quality appreciation.
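    As an illustration of the max-voting scheme mentioned in this record, the following minimal Python sketch assigns each segmented constituent the majority class of its pixels. The per-pixel class map and object ids are hypothetical stand-ins for the output of a pixel-wise segmentation network; this is not the authors' implementation.

```python
import numpy as np

def max_vote_object_labels(pixel_classes, object_ids):
    """Assign each segmented object the majority (max-voted) pixel class.

    pixel_classes : 2-D int array, per-pixel class predicted by a segmentation network
    object_ids    : 2-D int array, per-pixel object/constituent id (0 = background)
    Returns a dict {object_id: voted_class}.
    """
    labels = {}
    for obj in np.unique(object_ids):
        if obj == 0:                     # skip background
            continue
        votes = pixel_classes[object_ids == obj]
        labels[obj] = np.bincount(votes).argmax()   # majority vote over the object's pixels
    return labels

# toy example: one object covering the right half of a 4x4 prediction
pixel_classes = np.array([[0, 0, 2, 2],
                          [0, 1, 2, 2],
                          [0, 0, 2, 1],
                          [0, 0, 2, 2]])
object_ids = np.array([[0, 0, 1, 1],
                       [0, 0, 1, 1],
                       [0, 0, 1, 1],
                       [0, 0, 1, 1]])
print(max_vote_object_labels(pixel_classes, object_ids))  # {1: 2}
```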

  6. On the Radau pseudospectral method: theoretical and implementation advances

    Science.gov (United States)

    Sagliano, Marco; Theil, Stephan; Bergsma, Michiel; D'Onofrio, Vincenzo; Whittle, Lisa; Viavattene, Giulia

    2017-09-01

    In the last decades the theoretical development of more and more refined direct methods, together with a new generation of CPUs, led to a significant improvement of numerical approaches for solving optimal-control problems. One of the most promising classes of methods is based on pseudospectral optimal control. These methods not only provide an efficient algorithm to solve optimal-control problems, but also define a theoretical framework for linking the discrete numerical solution to the analytical one by virtue of the covector mapping theorem. However, several aspects of their implementation can be refined. In this framework SPARTAN, the first European tool based on the flipped-Radau pseudospectral method, has been developed. This paper illustrates the aspects implemented for SPARTAN, which can potentially be valid for any other transcription. The novelties included in this work consist specifically of a new hybridization of the Jacobian matrix computation made of four distinct parts. These contributions include a new analytical formulation for expressing the Lagrange cost function for open final-time problems, and the use of dual-number theory for ensuring exact differentiation. Moreover, a self-scaling strategy for primal and dual variables, which combines the projected-Jacobian rows normalization and the covector mapping, is described. Three concrete examples show the validity of the novelties introduced, and the quality of the results obtained with the proposed methods.
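    The dual-number idea mentioned above can be illustrated with a minimal forward-mode differentiation sketch in Python. It is a generic illustration of exact (non-finite-difference) derivatives, not SPARTAN's Jacobian machinery, and all names are hypothetical.

```python
class Dual:
    """Minimal dual number a + b*eps with eps**2 == 0 (forward-mode differentiation)."""
    def __init__(self, value, deriv=0.0):
        self.value, self.deriv = value, deriv

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.value + other.value, self.deriv + other.deriv)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # product rule carried by the eps component
        return Dual(self.value * other.value,
                    self.value * other.deriv + self.deriv * other.value)
    __rmul__ = __mul__

def derivative(f, x):
    """Derivative of f at x, exact up to machine precision (no finite differences)."""
    return f(Dual(x, 1.0)).deriv

# example: d/dx (x**2 + 3x) at x = 2  ->  2*2 + 3 = 7
f = lambda x: x * x + 3 * x
print(derivative(f, 2.0))  # 7.0
```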

  7. Advances in methods and applications of reliability and safety analysis

    International Nuclear Information System (INIS)

    Fieandt, J.; Hossi, H.; Laakso, K.; Lyytikaeinen, A.; Niemelae, I.; Pulkkinen, U.; Pulli, T.

    1986-01-01

    The know-how of the reliability and safety design and analysis techniques of VTT has been established over several years of analyzing reliability in the Finnish nuclear power plants Loviisa and Olkiluoto. This experience has later been applied and developed for use in the process industry, conventional power industry, automation and electronics. VTT develops and transfers methods and tools for reliability and safety analysis to the private and public sectors. The technology transfer takes place in joint development projects with potential users. Several computer-aided methods, such as RELVEC for reliability modelling and analysis, have been developed. The tools developed are today used by major Finnish companies in the fields of automation, nuclear power, shipbuilding and electronics. Development of computer-aided and other methods needed in the analysis of operating experience, reliability or safety is continuing in a number of research and development projects.

  8. Advanced FDTD methods parallelization, acceleration, and engineering applications

    CERN Document Server

    Yu, Wenhua

    2011-01-01

    The finite-difference time-domain (FDTD) method has revolutionized antenna design and electromagnetics engineering. Here's a cutting-edge book that focuses on the performance optimization and engineering applications of FDTD simulation systems. Covering the latest developments in this area, this unique resource offers you expert advice on the FDTD method, hardware platforms, and network systems. Moreover, the book offers guidance in distinguishing between the many different electromagnetics software packages on the market today. You also find a complete chapter dedicated to large multi-scale pro...
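    For orientation, the core of any FDTD solver is a leapfrog update of staggered electric and magnetic fields. The minimal one-dimensional, free-space sketch below is a generic textbook illustration (normalized units, illustrative grid sizes), not material from the book described above.

```python
import numpy as np

# Minimal 1-D FDTD (Yee) update in free space, normalized units (c = 1, dx = 1).
nx, nt = 200, 300
dt = 0.5                      # Courant number S = c*dt/dx = 0.5 <= 1 for stability
ez = np.zeros(nx)             # electric field on integer grid points
hy = np.zeros(nx - 1)         # magnetic field, staggered by half a cell

for n in range(nt):
    # leapfrog updates: H advanced at half time steps, E at whole time steps
    hy += dt * (ez[1:] - ez[:-1])
    ez[1:-1] += dt * (hy[1:] - hy[:-1])
    # soft Gaussian source injected at the grid centre
    ez[nx // 2] += np.exp(-0.5 * ((n - 30) / 10.0) ** 2)

print("peak |Ez| after propagation:", np.abs(ez).max())
```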

  9. Advanced evaluation method of SG TSP BEC hole blockage rate

    International Nuclear Information System (INIS)

    Izumida, Hiroyuki; Nagata, Yasuyuki; Harada, Yutaka; Murakami, Ryuji

    2003-01-01

    In spite of the control of the water chemistry of the secondary feed-water in PWR SGs, SG TSP BEC holes, which are the flow paths of the secondary water, are often clogged. In the past, the trending of the BEC hole blockage rate has been conducted by evaluating ECT original signals and by visual inspections. However, as the ECT original signals of deposits are diversified, it has become difficult to analyze them with the existing evaluation method using the ECT original signals. In this regard, we have developed a secondary side visual inspection system, which enables high-accuracy evaluation of the BEC hole blockage rate, and a new ECT signal evaluation method. (author)

  10. Methods to Determine Recommended Feeder-Wide Advanced Inverter Settings for Improving Distribution System Performance

    Energy Technology Data Exchange (ETDEWEB)

    Rylander, Matthew; Reno, Matthew J.; Quiroz, Jimmy E.; Ding, Fei; Li, Huijuan; Broderick, Robert J.; Mather, Barry; Smith, Jeff

    2016-11-21

    This paper describes methods that a distribution engineer could use to determine advanced inverter settings to improve distribution system performance. These settings are for fixed power factor, volt-var, and volt-watt functionality. Depending on the level of detail that is desired, different methods are proposed to determine single settings applicable for all advanced inverters on a feeder or unique settings for each individual inverter. Seven distinctly different utility distribution feeders are analyzed to simulate the potential benefit in terms of hosting capacity, system losses, and reactive power attained with each method to determine the advanced inverter settings.
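    The volt-var functionality named in this record can be summarized as a piecewise-linear reactive-power-versus-voltage curve. The sketch below is a generic illustration of such a curve; the breakpoints and deadband are placeholder values, not the settings recommended in the paper.

```python
import numpy as np

def volt_var_setpoint(v_pu, v_pts=(0.92, 0.98, 1.02, 1.08), q_pts=(0.44, 0.0, 0.0, -0.44)):
    """Piecewise-linear volt-var curve: reactive power (p.u. of rating) vs. voltage (p.u.).

    Positive q = injecting vars (supports low voltage), negative q = absorbing vars.
    Breakpoints here are illustrative placeholders, not recommended settings.
    """
    return float(np.interp(v_pu, v_pts, q_pts))

for v in (0.90, 0.95, 1.00, 1.05, 1.10):
    print(f"V = {v:.2f} p.u. -> Q = {volt_var_setpoint(v):+.3f} p.u.")
```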

  11. Method of public support evaluation for advanced NPP deployment

    International Nuclear Information System (INIS)

    Zezula, L.; Hermansky, B.

    2005-01-01

    Public support of nuclear power could be fully recovered only if the public, from the very beginning of the new power source selection process, received transparent information and was made a part of an interactive dialogue. The presented method was developed with the objective of facilitating the complex process of utility-public interaction. Our method of public support evaluation allows designs of new nuclear power plants to be classified taking into consideration the public attitude to continued nuclear power deployment in the Czech Republic as well as the preference for a certain plant design. The method is based on a model with a set of probabilistic input metrics, which permits the offered concepts to be compared with the reference one with a high degree of objectivity. This method is a part of the more complex evaluation procedure applicable to new design assessment that uses the computer code ''Potencial'' developed at the NRI Rez plc. The metrics of the established public support criteria are discussed. (author)

  12. Origins, Methods and Advances in Qualitative Meta-Synthesis

    Science.gov (United States)

    Nye, Elizabeth; Melendez-Torres, G. J.; Bonell, Chris

    2016-01-01

    Qualitative research is a broad term encompassing many methods. Critiques of the field of qualitative research argue that while individual studies provide rich descriptions and insights, the absence of connections drawn between studies limits their usefulness. In response, qualitative meta-synthesis serves as a design to interpret and synthesise…

  13. A CTSA Agenda to Advance Methods for Comparative Effectiveness Research

    Science.gov (United States)

    Helfand, Mark; Tunis, Sean; Whitlock, Evelyn P.; Pauker, Stephen G.; Basu, Anirban; Chilingerian, Jon; Harrell Jr., Frank E.; Meltzer, David O.; Montori, Victor M.; Shepard, Donald S.; Kent, David M.

    2011-01-01

    Abstract Clinical research needs to be more useful to patients, clinicians, and other decision makers. To meet this need, more research should focus on patient‐centered outcomes, compare viable alternatives, and be responsive to individual patients’ preferences, needs, pathobiology, settings, and values. These features, which make comparative effectiveness research (CER) fundamentally patient‐centered, challenge researchers to adopt or develop methods that improve the timeliness, relevance, and practical application of clinical studies. In this paper, we describe 10 priority areas that address 3 critical needs for research on patient‐centered outcomes (PCOR): (1) developing and testing trustworthy methods to identify and prioritize important questions for research; (2) improving the design, conduct, and analysis of clinical research studies; and (3) linking the process and outcomes of actual practice to priorities for research on patient‐centered outcomes. We argue that the National Institutes of Health, through its clinical and translational research program, should accelerate the development and refinement of methods for CER by linking a program of methods research to the broader portfolio of large, prospective clinical and health system studies it supports. Insights generated by this work should be of enormous value to PCORI and to the broad range of organizations that will be funding and implementing CER. Clin Trans Sci 2011; Volume 4: 188–198 PMID:21707950

  14. New advances in alpha spectrometry by liquid scintillation methods

    Energy Technology Data Exchange (ETDEWEB)

    McDowell, W.J.; Case, G.N.

    1979-01-01

    Although the ability to count alpha particles by liquid scintillation methods has long been recognized, limited use has been made of the method because of problems of high background and alpha energy identification. In recent years some new developments in methods of introducing the alpha-emitting nuclide into the scintillator, in detector construction, and in electronics for processing the energy analog and time analog signals from the detector have allowed significant alleviation of the problems of alpha spectrometry by liquid scintillation. Energy resolutions of 200 to 300 keV full peak width at half maximum and background counts of < 0.01 counts/min, with rejection of > 99% of all beta plus gamma interference, are now possible. Alpha liquid scintillation spectrometry is now suitable for a wide range of applications, from the accurate quantitative determination of relatively large amounts of known nuclides in laboratory-generated samples to the detection and identification of very small, subpicocurie amounts of alpha emitters in environmental-type samples. Suitable nuclide separation procedures, sample preparation methods, and instrument configurations are available for a variety of analyses.

  15. Advanced discretizations and multigrid methods for liquid crystal configurations

    Science.gov (United States)

    Emerson, David B.

    Liquid crystals are substances that possess mesophases with properties intermediate between liquids and crystals. Here, we consider nematic liquid crystals, which consist of rod-like molecules whose average pointwise orientation is represented by a unit-length vector, n( x, y, z) = (n1, n 2, n3)T. In addition to their self-structuring properties, nematics are dielectrically active and birefringent. These traits continue to lead to many important applications and discoveries. Numerical simulations of liquid crystal configurations are used to suggest the presence of new physical phenomena, analyze experiments, and optimize devices. This thesis develops a constrained energy-minimization finite-element method for the efficient computation of nematic liquid crystal equilibrium configurations based on a Lagrange multiplier formulation and the Frank-Oseen free-elastic energy model. First-order optimality conditions are derived and linearized via a Newton approach, yielding a linear system of equations. Due to the nonlinear unit-length constraint, novel well-posedness theory for the variational systems, as well as error analysis, is conducted. The approach is shown to constitute a convergent and well-posed approach, absent typical simplifying assumptions. Moreover, the energy-minimization method and well-posedness theory developed for the free-elastic case are extended to include the effects of applied electric fields and flexoelectricity. In the computational algorithm, nested iteration is applied and proves highly effective at reducing computational costs. Additionally, an alternative technique is studied, where the unit-length constraint is imposed by a penalty method. The performance of the penalty and Lagrange multiplier methods is compared. Furthermore, tailored trust-region strategies are introduced to improve robustness and efficiency. While both approaches yield effective algorithms, the Lagrange multiplier method demonstrates superior accuracy per unit cost. In
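    The penalty alternative mentioned in this abstract, enforcing the unit-length director constraint through an added energy term rather than a Lagrange multiplier, can be illustrated with a toy one-dimensional relaxation. The sketch below minimizes a discretized one-constant Frank-Oseen energy plus a quadratic penalty by gradient descent; grid size, penalty weight and boundary data are made up, and this is far simpler than the thesis's finite-element Newton solver.

```python
import numpy as np

# Toy 1-D director field n(z) in R^3 between fixed boundary orientations.
# Energy: sum |dn/dz|^2 (one-constant Frank-Oseen) + mu * sum (|n|^2 - 1)^2 (penalty).
nz, mu, step = 21, 100.0, 1e-3
z = np.linspace(0.0, 1.0, nz)
theta = 0.5 * np.pi * z                      # initial guess: a quarter twist
n = np.stack([np.cos(theta), np.sin(theta), np.zeros(nz)], axis=1)

def grad(n):
    g = np.zeros_like(n)
    # elastic term: discrete Laplacian at interior nodes (boundaries held fixed)
    g[1:-1] = -2.0 * (n[2:] - 2.0 * n[1:-1] + n[:-2])
    # penalty term pushing |n| back towards 1
    norm2 = np.sum(n * n, axis=1, keepdims=True)
    g += 4.0 * mu * (norm2 - 1.0) * n
    g[0] = g[-1] = 0.0                       # Dirichlet boundary conditions
    return g

for _ in range(5000):
    n -= step * grad(n)

print("max |n| deviation from 1:", np.abs(np.linalg.norm(n, axis=1) - 1.0).max())
```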

  16. Development and application of advanced methods for electronic structure calculations

    DEFF Research Database (Denmark)

    Schmidt, Per Simmendefeldt

    This thesis relates to improvements and applications of beyond-DFT methods for electronic structure calculations that are applied in computational material science. The improvements are of both technical and principal character. The well-known GW approximation is optimized for accurate calculations of electronic excitations in two-dimensional materials by exploiting exact limits of the screened Coulomb potential. This approach reduces the computational time by an order of magnitude, enabling large scale applications. The GW method is further improved by including so-called vertex corrections. This turns ... ground state energies, is used to calculate accurate adsorption energies for a wide range of reactions. The results are in good agreement with experimental values, where available. Additionally, a database consisting of 200 highly accurate adsorption energies is constructed to benchmark the accuracy ...

  17. Promising method advancement in palynology: a supplement to pollen analysis

    DEFF Research Database (Denmark)

    Enevold, Renée; Odgaard, Bent Vad

    2016-01-01

    and archaeological samples are often numerous in types as well as in abundance. The term encompasses a variety of organic remains from microscopic organisms including fungi, algae, insects and amoebae. Preparing these soil samples with standard methods based on acid digestion holds the potential of severe bias ... leaving the assemblages devoid of acid-vulnerable NPPs. In many cases it might be worth the effort to prepare the samples with as mild a preparation method as possible to obtain a representative NPP assemblage. We have mildly prepared samples from a small water hole, Tårup Lund, Denmark. The sediment from ... the water hole features environmental information from the last 6000 years, including a period of locally intense pastoral and/or agricultural activity during the Iron Age. We attempt to identify anthropogenic indicators from the recovered NPP assemblages by comparing to the environmental information derived...

  18. Advance of Therapeutic Methods for Malignant Pleural Effusion

    Directory of Open Access Journals (Sweden)

    Tao-tao XU

    2016-06-01

    Full Text Available Malignant pleural effusion (MPE) is a condition caused by primary malignant tumors in the pleura or by other malignant tumors metastasizing to the pleura. It is also one of the common serious complications of middle- and late-stage malignant tumors, which has a severe impact on quality of life and can even threaten the life of the patients. The selection of treatments for MPE depends on many factors, including the symptoms, performance status, primary tumor type, response to systemic therapy, and degree of lung recruitment (LRM) after drainage of pleural effusion. Generally, the treatment methods include thoracentesis, indwelling pleural catheter, pleurodesis, intrapleural injection of drugs, chemotherapy, radiotherapy, anti-angiogenesis therapy, surgery, and thermotherapy. With in-depth study of the pathogenesis of MPE, the treatments of MPE have continuously improved. This study mainly reviews the treatment methods for MPE so as to provide a basis for clinical practice in the future.

  19. Numerical modeling of spray combustion with an advanced VOF method

    Science.gov (United States)

    Chen, Yen-Sen; Shang, Huan-Min; Shih, Ming-Hsin; Liaw, Paul

    1995-01-01

    This paper summarizes the technical development and validation of a multiphase computational fluid dynamics (CFD) numerical method using the volume-of-fluid (VOF) model and a Lagrangian tracking model which can be employed to analyze general multiphase flow problems with free surface mechanism. The gas-liquid interface mass, momentum and energy conservation relationships are modeled by continuum surface mechanisms. A new solution method is developed such that the present VOF model can be applied for all-speed flow regimes. The objectives of the present study are to develop and verify the fractional volume-of-fluid cell partitioning approach into a predictor-corrector algorithm and to demonstrate the effectiveness of the present approach by simulating benchmark problems including laminar impinging jets, shear coaxial jet atomization and shear coaxial spray combustion flows.
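    The volume-of-fluid idea at the core of this record, tracking a bounded liquid volume fraction per cell, can be shown with a deliberately simple one-dimensional donor-cell transport sketch. It is only an illustration of fraction tracking under stated assumptions (uniform velocity, toy grid); the paper's fractional cell-partitioning predictor-corrector algorithm is far more elaborate.

```python
import numpy as np

# Minimal 1-D volume-of-fluid transport: advect a liquid volume fraction F in [0, 1]
# with a uniform positive velocity using a donor-cell (upwind) flux.
nx, nt = 100, 60
u, dx, dt = 1.0, 1.0, 0.5          # CFL = u*dt/dx = 0.5
F = np.zeros(nx)
F[10:30] = 1.0                     # initial slug of liquid

for _ in range(nt):
    flux = u * F                   # donor-cell: flux leaving each cell (u > 0)
    F[1:] += dt / dx * (flux[:-1] - flux[1:])
    F[0] -= dt / dx * flux[0]      # crude outflow treatment at the left boundary
    np.clip(F, 0.0, 1.0, out=F)    # keep the fraction bounded

print("first cell with F > 0.5 is now near index", int(np.argmax(F > 0.5)))
```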

  20. Research advances in control methods of wearable walking assist robots

    Directory of Open Access Journals (Sweden)

    Xia ZHANG

    2016-04-01

    Full Text Available As the proportion of the elderly in China increases, the need for robotic walking assistance is growing. The assisted-as-needed (AAN) property of a wearable walking assist robot matches a user's biological need and improves the flexibility, appetency and friendliness of a mechanical system. To realize AAN walking, and aiming at realizing flexible master/slave assistance, a new hybrid control method consisting of hip joint control based on central pattern generators and structured knee joint impedance control is proposed. The adaptation of a robot's master/slave motion mode to a user's physical function, the continuous switching method for structured knee joint impedance control and its stability, and the AAN effect of the hybrid control theory are studied, which provides a new approach for the development of wearable walking assist robots.
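    A joint impedance control law of the general kind described here renders a virtual spring-damper around a reference trajectory. The Python sketch below is a generic illustration with made-up gains, inertia and trajectory; it is not the paper's controller.

```python
import numpy as np

def impedance_torque(q, dq, q_ref, dq_ref, k=30.0, b=2.0):
    """Joint impedance control law: tau = k*(q_ref - q) + b*(dq_ref - dq).

    A softer k/b pair yields more compliant, assist-as-needed behaviour;
    the gains here are illustrative, not tuned values from the paper.
    """
    return k * (q_ref - q) + b * (dq_ref - dq)

# follow a slow sinusoidal knee reference with a simple inertia-only joint model
I, dt = 0.05, 0.002                      # inertia [kg m^2], time step [s]
q, dq = 0.0, 0.0
for i in range(1000):
    t = i * dt
    q_ref = 0.4 * np.sin(2 * np.pi * 0.5 * t)                   # reference angle [rad]
    dq_ref = 0.4 * 2 * np.pi * 0.5 * np.cos(2 * np.pi * 0.5 * t)
    tau = impedance_torque(q, dq, q_ref, dq_ref)
    dq += tau / I * dt                   # semi-implicit Euler integration
    q += dq * dt

print(f"tracking error after 2 s: {abs(q - q_ref):.4f} rad")
```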

  1. Advanced hydraulic fracturing methods to create in situ reactive barriers

    International Nuclear Information System (INIS)

    Murdoch, L.

    1997-01-01

    This article describes the use of hydraulic fracturing to increase permeability in geologic formations where in-situ remedial action of contaminant plumes will be performed. Several in-situ treatment strategies are discussed including the use of hydraulic fracturing to create in situ redox zones for treatment of organics and inorganics. Hydraulic fracturing methods offer a mechanism for the in-situ treatment of gently dipping layers of reactive compounds. Specialized methods using real-time monitoring and a high-energy jet during fracturing allow the form of the fracture to be influenced, such as creation of asymmetric fractures beneath potential sources (e.g., tanks, pits, buildings) that should not be penetrated by boring. Some examples of field applications of this technique, such as creating fractures filled with zero-valent iron to reductively dechlorinate halogenated hydrocarbons, and the use of granular activated carbon to adsorb compounds, are discussed.

  2. Development of advanced test methods for cleanroom garments

    International Nuclear Information System (INIS)

    Liu, B.Y.H.; Pui, D.Y.H.; Ye, Yan

    1993-01-01

    The performance of a cleanroom garment in the operating environment of a semiconductor fabrication facility is determined by the properties of the garment fabric and the garment design. Ideally, the garment fabric should be highly permeable to water vapor and essentially impermeable to particles. The design of the garment should minimize the discharge of particles generated by the wearer into the cleanroom air. This paper describes a three-part test method for cleanroom garments to determine water vapor transmission through the garment fabric, particle penetration through the garment fabric under specified conditions of pressure drop and velocity, and the reduction in particle generation and discharge into the cleanroom by the garment. The performance of some typical cleanroom garments as measured by this test method is described and compared

  3. Advanced methods for scattering amplitudes in gauge theories

    Energy Technology Data Exchange (ETDEWEB)

    Peraro, Tiziano

    2014-09-24

    We present new techniques for the evaluation of multi-loop scattering amplitudes and their application to gauge theories, with relevance to the Standard Model phenomenology. We define a mathematical framework for the multi-loop integrand reduction of arbitrary diagrams, and elaborate algebraic approaches, such as the Laurent expansion method, implemented in the software Ninja, and the multivariate polynomial division technique by means of Groebner bases.
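    The multivariate polynomial division by means of Groebner bases mentioned above can be illustrated with a small symbolic computation. The sketch below assumes sympy's groebner and reduced helpers and uses placeholder polynomials; it is unrelated to the Ninja implementation.

```python
from sympy import symbols, groebner, reduced

x, y = symbols('x y')

# Ideal generated by two polynomials (toy stand-ins, not actual cut conditions).
g1 = x**2 + y**2 - 1
g2 = x*y - 1

G = groebner([g1, g2], x, y, order='lex')

# Divide a "numerator" polynomial by the Groebner basis: f = sum(q_i * G_i) + r.
f = x**3 * y + x * y**2 + y
quotients, remainder = reduced(f, list(G), x, y, order='lex')

print("Groebner basis:", list(G))
print("remainder (irreducible part of f modulo the ideal):", remainder)
```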

  4. Advanced physical models and monitoring methods for in situ bioremediation

    Energy Technology Data Exchange (ETDEWEB)

    Simon, K.; Chalmer, P.

    1996-05-30

    Numerous reports have indicated that contamination at DOE facilities is widespread and pervasive. Existing technology is often too costly or ineffective in remediating these contamination problems. An effective method to address one class of contamination, petroleum hydrocarbons, is in situ bioremediation. This project was designed to provide tools and approaches for increasing the reliability of in situ bioremediation. An example of the recognition within DOE for developing these tools is in the FY-1995 Technology Development Needs Summary of the Office of Technology Development of the US DOE. This document identifies specific needs addressed by this research. For example, Section 3.3 Need Statement IS-3 identifies the need for a "Rapid method to detect in situ biodegradation products." Also, BW-I identifies the need to recognize boundaries between clean and contaminated materials and soils. Metabolic activity could identify these boundaries. Measuring rates of in situ microbial activity is critical to the fundamental understanding of subsurface microbiology and in selecting natural attenuation as a remediation option. Given the complexity and heterogeneity of subsurface environments, a significant cost incurred during bioremediation is the characterization of microbial activity, in part because so many intermediate end points (biomass, gene frequency, laboratory measurements of activity, etc.) must be used to infer in situ activity. A fast, accurate, real-time, and cost-effective method is needed to determine success of bioremediation at DOE sites.

  5. Advanced scientific computational methods and their applications to nuclear technologies. (1) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (1)

    International Nuclear Information System (INIS)

    Oka, Yoshiaki; Okuda, Hiroshi

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting each realm of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the first issue, showing their overview and an introduction of continuum simulation methods. The finite element method, as one of their applications, is also reviewed. (T. Tanaka)

  6. New trends and advanced methods in interdisciplinary mathematical sciences

    CERN Document Server

    2017-01-01

    The latest of five multidisciplinary volumes, this book spans the STEAM-H (Science, Technology, Engineering, Agriculture, Mathematics, and Health) disciplines with the intent to generate meaningful interdisciplinary interaction and student interest. Emphasis is placed on important methods and applications within and beyond each field. Topics include geometric triple systems, image segmentation, pattern recognition in medicine, pricing barrier options, p-adic numbers distribution in geophysics data pattern, adelic physics, and evolutionary game theory. Contributions were by invitation only and peer-reviewed. Each chapter is reasonably self-contained and pedagogically presented for a multidisciplinary readership.

  7. Advanced Control Methods for Optimization of Arc Welding

    DEFF Research Database (Denmark)

    Thomsen, J. S.

    Gas Metal Arc Welding (GMAW) is a process used for joining pieces of metal. Probably, the GMAW process is the most successful and widely used welding method in the industry today. A key issue in welding is the quality of the welds produced. The quality of a weld is influenced by several factors ... in the overall welding process; one of these factors is the ability of the welding machine to control the process. The internal control algorithms in GMAW machines are the topic of this PhD project. Basically, the internal control includes an algorithm which is able to keep the electrode at a given distance...

  8. Recent Advances in Conotoxin Classification by Using Machine Learning Methods.

    Science.gov (United States)

    Dao, Fu-Ying; Yang, Hui; Su, Zhen-Dong; Yang, Wuritu; Wu, Yun; Hui, Ding; Chen, Wei; Tang, Hua; Lin, Hao

    2017-06-25

    Conotoxins are disulfide-rich small peptides, which are invaluable peptides that target ion channels and neuronal receptors. Conotoxins have been demonstrated as potent pharmaceuticals in the treatment of a series of diseases, such as Alzheimer's disease, Parkinson's disease, and epilepsy. In addition, conotoxins are also ideal molecular templates for the development of new drug lead compounds and play important roles in neurobiological research as well. Thus, the accurate identification of conotoxin types will provide key clues for the biological research and clinical medicine. Generally, conotoxin types are confirmed when their sequence, structure, and function are experimentally validated. However, it is time-consuming and costly to acquire the structure and function information by using biochemical experiments. Therefore, it is important to develop computational tools for efficiently and effectively recognizing conotoxin types based on sequence information. In this work, we reviewed the current progress in computational identification of conotoxins in the following aspects: (i) construction of benchmark dataset; (ii) strategies for extracting sequence features; (iii) feature selection techniques; (iv) machine learning methods for classifying conotoxins; (v) the results obtained by these methods and the published tools; and (vi) future perspectives on conotoxin classification. The paper provides the basis for in-depth study of conotoxins and drug therapy research.
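    A common baseline in the sequence-feature-plus-classifier pipeline reviewed here is an amino-acid composition vector fed to a support vector machine. The Python sketch below uses scikit-learn with made-up toy sequences and labels purely for illustration; it is not one of the reviewed tools or benchmark datasets.

```python
import numpy as np
from sklearn.svm import SVC

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aa_composition(seq):
    """20-dimensional amino-acid composition feature vector (a common baseline)."""
    seq = seq.upper()
    return np.array([seq.count(a) / len(seq) for a in AMINO_ACIDS])

# toy, made-up peptide sequences with made-up class labels (0 / 1)
train_seqs = ["GCCSDPRCAWRC", "CKGKGAKCSRLMYDCCTGSCRSGKC",
              "GCCSLPPCALSNPDYC", "CRSSGSPCGVTSICCGR"]
train_labels = [0, 1, 0, 1]

X = np.vstack([aa_composition(s) for s in train_seqs])
clf = SVC(kernel="rbf", gamma="scale").fit(X, train_labels)

query = "GCCSHPACSVNNPDIC"
print("predicted class:", clf.predict(aa_composition(query).reshape(1, -1))[0])
```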

  9. Advanced methods for the study of PWR's cores

    International Nuclear Information System (INIS)

    Hemmerich, Ph.; Lambert, M.; Pelet, J.; Raymond, P.

    2004-01-01

    A new generation of calculation strategies, including methods and tools, is under way. The aim is to be more efficient (faster and more accurate) in the simulation of physical processes and to make coupling between neutron transport, thermal-hydraulic and mechanical codes easier. For the study of the basic design of the EPR (European pressurized reactor), innovative methods have been introduced; they rely on the latest progress made in the understanding of physical processes and in computing sciences and technology, which allows a generalized use of three-dimensional calculations. The main advantage drawn from the evolution of calculation strategies is to identify over-sized margins in the previous ones, in order to make room for new and ambitious strategies for the management of fuel in the reactor core without putting safety at risk. The achievement of a new calculation strategy can take 6 to 15 years from its definition to its full implementation; this period has to be reduced. (A.C.)

  10. Advanced Materials Test Methods for Improved Life Prediction of Turbine Engine Components

    National Research Council Canada - National Science Library

    Stubbs, Jack

    2000-01-01

    Phase I final report developed under SBIR contract for Topic # AF00-149, "Durability of Turbine Engine Materials/Advanced Material Test Methods for Improved Use Prediction of Turbine Engine Components...

  11. NATO Advanced Study Institute on Evolving Methods for Macromolecular Crystallography

    CERN Document Server

    Read, Randy J

    2007-01-01

    X-ray crystallography is the pre-eminent technique for visualizing the structures of macromolecules at atomic resolution. These structures are central to understanding the detailed mechanisms of biological processes, and to discovering novel therapeutics using a structure-based approach. As yet, structures are known for only a small fraction of the proteins encoded by human and pathogenic genomes. To counter the myriad modern threats of disease, there is an urgent need to determine the structures of the thousands of proteins whose structure and function remain unknown. This volume draws on the expertise of leaders in the field of macromolecular crystallography to illuminate the dramatic developments that are accelerating progress in structural biology. Their contributions span the range of techniques from crystallization through data collection, structure solution and analysis, and show how modern high-throughput methods are contributing to a deeper understanding of medical problems.

  12. Comparative Assessment of Advanced Gas Hydrate Production Methods

    Energy Technology Data Exchange (ETDEWEB)

    M. D. White; B. P. McGrail; S. K. Wurstner

    2009-06-30

    Displacing natural gas and petroleum with carbon dioxide is a proven technology for producing conventional geologic hydrocarbon reservoirs, and producing additional yields from abandoned or partially produced petroleum reservoirs. Extending this concept to natural gas hydrate production offers the potential to enhance gas hydrate recovery with concomitant permanent geologic sequestration. Numerical simulation was used to assess a suite of carbon dioxide injection techniques for producing gas hydrates from a variety of geologic deposit types. Secondary hydrate formation was found to inhibit contact of the injected CO{sub 2} regardless of injectate phase state, thus diminishing the exchange rate due to pore clogging and hydrate zone bypass of the injected fluids. Additional work is needed to develop methods of artificially introducing high-permeability pathways in gas hydrate zones if injection of CO{sub 2} in either gas, liquid, or micro-emulsion form is to be more effective in enhancing gas hydrate production rates.

  13. Advances in Modal Analysis Using a Robust and Multiscale Method

    Directory of Open Access Journals (Sweden)

    Frisson Christian

    2010-01-01

    Full Text Available Abstract This paper presents a new approach to modal synthesis for rendering sounds of virtual objects. We propose a generic method that preserves sound variety across the surface of an object at different scales of resolution and for a variety of complex geometries. The technique performs automatic voxelization of a surface model and automatic tuning of the parameters of hexahedral finite elements, based on the distribution of material in each cell. The voxelization is performed using a sparse regular grid embedding of the object, which permits the construction of plausible lower resolution approximations of the modal model. We can compute the audible impulse response of a variety of objects. Our solution is robust and can handle nonmanifold geometries that include both volumetric and surface parts. We present a system which allows us to manipulate and tune sounding objects in an appropriate way for games, training simulations, and other interactive virtual environments.
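    The rendering step of modal synthesis described in this record amounts to summing exponentially decaying sinusoids defined by modal frequencies, dampings and gains. The sketch below is a generic illustration with placeholder modal data, not output from the paper's voxelization and finite-element pipeline.

```python
import numpy as np

def modal_impulse_response(freqs_hz, dampings, gains, fs=44100, duration=1.0):
    """Sum of decaying sinusoids: y(t) = sum_i g_i * exp(-d_i t) * sin(2*pi*f_i t)."""
    t = np.arange(int(fs * duration)) / fs
    y = np.zeros_like(t)
    for f, d, g in zip(freqs_hz, dampings, gains):
        y += g * np.exp(-d * t) * np.sin(2 * np.pi * f * t)
    return y / np.abs(y).max()          # normalize to [-1, 1]

# placeholder modal data (frequencies in Hz, damping rates in 1/s, gains)
y = modal_impulse_response([220.0, 447.0, 910.0], [6.0, 9.0, 14.0], [1.0, 0.6, 0.3])
print("rendered", y.size, "samples; peak amplitude", float(np.abs(y).max()))
```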

  14. Advanced communication methods developed for nuclear data communication applications

    International Nuclear Information System (INIS)

    Tiwari, Akash; Tiwari, Railesha; Tiwari, S.S.; Panday, Lokesh; Suri, Nitin; Takle, Tarun Rao; Jain, Sanjeev; Gupta, Rishi; Sharma, Dipeeka; Takle, Rahul Rao; Gautam, Rajeev; Bhargava, Vishal; Arora, Himanshu; Agarwal, Ankur; Rupesh; Chawla, Mohit; Sethi, Amardeep Singh; Gupta, Mukesh; Gupta, Ankit; Verma, Neha; Sood, Nitin; Singh, Sunil; Agarwal, Chandresh

    2004-01-01

    We conducted various experiments and tested data communication methods that may be useful for various applications in the nuclear industry. We explored the following areas: 1. scientific data communication among scientists within the laboratory and inter-laboratory data exchange; 2. data from remote and wired sensors; 3. data from multiple sensors within a small zone; 4. data from single or multiple sensors over distances above 100 m and less than 10 km. No single data communication method was found to be the best solution for nuclear applications, and multiple modes of communication were found to be more advantageous than any single mode of data communication. A network of computers in the control room and between laboratories connected with optical fibre or an isolated Ethernet coaxial LAN was found to be optimum. Information from multiple analog process sensors in smaller zones such as the reactor building and laboratories, carried on an I2C LAN and a short-range wireless LAN, was found to be advantageous. Within the laboratory, a sensor data network based on I2C was found to be cost effective, while a wireless LAN was comparatively expensive. Within a room, an infrared optical LAN and an FSK wireless LAN were found to be highly useful in making the sensors free of wires. Direct sensor interfaces on FSK wireless links were found to be fast, accurate and cost effective for long-distance data communication. Such links are the only way to communicate with sea-buoy and balloon hardware. A 1-wire communication network of Dallas Semiconductor USA was used for weather station data communication. Computer-to-computer communication using optical LAN links has been tried; temperature, pressure, humidity, ionizing radiation, generator RPM and voltage and various other analog signals were also transported over FSK optical and wireless links. Multiple sensors needed a dedicated data acquisition system and a wireless LAN for data telemetry. (author)

  15. Steam leak detection in advanced reactors via acoustic method

    International Nuclear Information System (INIS)

    Singh, Raj Kumar; Rao, A. Rama

    2011-01-01

    Highlights: → A steam leak detection system is developed to detect any leak inside the reactor vault. → The technique uses the leak noise frequency spectrum for leak detection. → Testing of the system and a method to locate the leak are also developed and discussed in the present paper. - Abstract: Prediction of LOCA (loss of coolant accident) plays a very important role in the safety of a nuclear reactor. The coolant is responsible for heat transfer from the fuel bundles. Loss of coolant is an accident situation which requires immediate shutdown of the reactor. The fall in system pressure during a LOCA is the trip parameter used for initiating automatic reactor shutdown. However, in a primary heat transport system operating in a two-phase regime, detection of a small break LOCA is not simple. Because of the very slow leak rates, the fall in pressure is significantly slow. From the reactor safety point of view, it is extremely important to find a reliable and effective alternative for detecting the slow pressure drop in the case of a small break LOCA. One such technique is based on the acoustic signal caused by small break LOCAs. In boiling water reactors whose primary heat transport is driven by natural circulation, small break LOCA detection is important. For prompt action following a small break LOCA, a steam leak detection system has been developed to detect any leak inside the reactor vault. The detection technique is reliable and plays a very important role in ensuring the safety of the reactor. The methodology developed for steam leak detection is discussed in the present paper. The method to locate the leak, which is based on analysis of the signal, is also developed and discussed.
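    A spectrum-based leak check of the general kind described here can be reduced to comparing the energy in a leak-characteristic frequency band against a background reference. The Python sketch below uses synthetic signals, an arbitrary band and an arbitrary threshold purely for illustration; it is not the system reported in the paper.

```python
import numpy as np

def band_energy(signal, fs, f_lo, f_hi):
    """Energy of the signal spectrum between f_lo and f_hi (Hz)."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return spectrum[band].sum()

fs = 50_000                                     # sampling rate [Hz]
t = np.arange(fs) / fs                          # one second of data
rng = np.random.default_rng(0)
background = 0.1 * rng.standard_normal(fs)      # vault background noise
# hypothetical narrow-band leak noise represented by a 10 kHz tone
leak = background + 0.2 * np.sin(2 * np.pi * 10_000 * t)

e_ref = band_energy(background, fs, 8_000, 12_000)
e_now = band_energy(leak, fs, 8_000, 12_000)
print("leak alarm:", e_now > 5.0 * e_ref)       # threshold factor is illustrative
```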

  16. Advanced hydraulic fracturing methods to create in situ reactive barriers

    International Nuclear Information System (INIS)

    Murdoch, L.; Siegrist, B.; Vesper, S.

    1997-01-01

    Many contaminated areas consist of a source area and a plume. In the source area, the contaminant moves vertically downward from a release point through the vadose zone to an underlying saturated region. Where contaminants are organic liquids, NAPL may accumulate on the water table, or it may continue to migrate downward through the saturated region. Early developments of permeable barrier technology have focused on intercepting horizontally moving plumes with vertical structures, such as trenches, filled with reactive material capable of immobilizing or degrading dissolved contaminants. This focus resulted in part from a need to economically treat the potentially large volumes of contaminated water in a plume, and in part from the availability of construction technology to create the vertical structures that could house reactive compounds. Contaminant source areas, however, have thus far remained largely excluded from the application of permeable barrier technology. One reason for this is the lack of conventional construction methods for creating suitable horizontal structures that would place reactive materials in the path of downward-moving contaminants. Methods of hydraulic fracturing have been widely used to create flat-lying to gently dipping layers of granular material in unconsolidated sediments. Most applications thus far have involved filling fractures with coarse-grained sand to create permeable layers that will increase the discharge of wells recovering contaminated water or vapor. However, it is possible to fill fractures with other compounds that alter the chemical composition of the subsurface. One early application involved development and field testing of micro-encapsulated sodium percarbonate, a solid compound that releases oxygen and can create aerobic conditions suitable for biodegradation in the subsurface for several months.

  17. Statistical methods of discrimination and classification advances in theory and applications

    CERN Document Server

    Choi, Sung C

    1986-01-01

    Statistical Methods of Discrimination and Classification: Advances in Theory and Applications is a collection of papers that tackles the multivariate problems of discriminating and classifying subjects into exclusive populations. The book presents 13 papers that cover the advancement in statistical procedures for discriminating and classifying. The studies in the text primarily focus on various methods of discriminating and classifying variables, such as multiple discriminant analysis in the presence of mixed continuous and categorical data; choice of the smoothing parameter and efficiency o...

  18. Advanced cluster methods for correlated-electron systems

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Andre

    2015-04-27

    In this thesis, quantum cluster methods are used to calculate electronic properties of correlated-electron systems. A special focus lies in the determination of the ground state properties of a 3/4 filled triangular lattice within the one-band Hubbard model. At this filling, the electronic density of states exhibits a so-called van Hove singularity and the Fermi surface becomes perfectly nested, causing an instability towards a variety of spin-density-wave (SDW) and superconducting states. While chiral d+id-wave superconductivity has been proposed as the ground state in the weak coupling limit, the situation towards strong interactions is unclear. Additionally, quantum cluster methods are used here to investigate the interplay of Coulomb interactions and symmetry-breaking mechanisms within the nematic phase of iron-pnictide superconductors. The transition from a tetragonal to an orthorhombic phase is accompanied by a significant change in electronic properties, while long-range magnetic order is not established yet. The driving force of this transition may not only be phonons but also magnetic or orbital fluctuations. The signatures of these scenarios are studied with quantum cluster methods to identify the most important effects. Here, cluster perturbation theory (CPT) and its variational extension, the variational cluster approach (VCA), are used to treat the respective systems on a level beyond mean-field theory. Short-range correlations are incorporated numerically exactly by exact diagonalization (ED). In the VCA, long-range interactions are included by variational optimization of a fictitious symmetry-breaking field based on a self-energy functional approach. Due to limitations of ED, cluster sizes are limited to a small number of degrees of freedom. For the 3/4 filled triangular lattice, the VCA is performed for different cluster symmetries. A strong symmetry dependence and finite-size effects make a comparison of the results from different clusters difficult.

  19. An evolutionary method for synthesizing technological planning and architectural advance

    Science.gov (United States)

    Cole, Bjorn Forstrom

    genetic algorithm. The fourth chapter of the thesis walks through the formulation of a method to attack the problem of strategically choosing an architecture. This method is designed to find the optimum architecture under multiple conditions, which is required for the ability to play the "what if" games typically undertaken in strategic situations. The chapter walks through a graph-based representation of architecture, provides the rationale for choosing a given technology forecasting technique, and lays out the implementation of the optimization algorithm, named Sindri, within a commercial analysis code, Pacelab. The fifth chapter of the thesis then tests the Sindri code. The first test applied is a series of standardized combinatorial spaces, which are meant to be analogous to test problems traditionally posed to optimizers (e.g., Rosenbrock's valley function). The results from this test assess the value of various operators used to transform the architecture graph in the course of conducting a genetic search. Finally, this method is employed on a test case involving the transition of a miniature helicopter from glow engine to battery propulsion, and finally to a design where the battery functions as both structure and power source. The final two chapters develop conclusions based on the body of work conducted within this thesis and issue some prescriptions for future work. The future work primarily concerns improving the continuous optimization processes undertaken within Sindri and in further refining the graph-based structure for physical architectures.
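    Since this record centres on a genetic algorithm for architecture selection, a generic bit-string genetic-algorithm skeleton may help fix ideas. The sketch below, with a toy "one-max" fitness standing in for an architecture score, is only a conventional illustration of selection, crossover and mutation, not the thesis's graph-based Sindri implementation.

```python
import random

def genetic_search(fitness, n_bits=16, pop_size=30, generations=60,
                   p_cross=0.8, p_mut=0.02, seed=1):
    """Generic bit-string genetic algorithm: tournament selection, one-point
    crossover, bit-flip mutation. A toy skeleton, not the Sindri implementation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def tournament():
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            p1, p2 = tournament()[:], tournament()[:]
            if rng.random() < p_cross:                 # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (p1, p2):
                for i in range(n_bits):                # bit-flip mutation
                    if rng.random() < p_mut:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(pop, key=fitness)

# toy fitness: "one-max" (count of set bits) stands in for an architecture score
best = genetic_search(fitness=sum)
print("best individual:", best, "fitness:", sum(best))
```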

  20. Advanced methods in NDE using machine learning approaches

    Science.gov (United States)

    Wunderlich, Christian; Tschöpe, Constanze; Duckhorn, Frank

    2018-04-01

    Machine learning (ML) methods and algorithms have recently been applied with great success in quality control and predictive maintenance. Their goal, to build new and/or leverage existing algorithms that learn from training data and give accurate predictions, or that find patterns, particularly in new and unseen similar data, fits perfectly with Non-Destructive Evaluation. The advantages of ML in NDE are obvious in such tasks as pattern recognition in acoustic signals or automated processing of images from X-ray, ultrasonic or optical methods. Fraunhofer IKTS is using machine learning algorithms in acoustic signal analysis. The approach has been applied to a variety of tasks in quality assessment. The principal approach is based on acoustic signal processing with a primary and a secondary analysis step, followed by a cognitive system to create model data. Already in the secondary analysis step, unsupervised learning algorithms such as principal component analysis are used to simplify data structures. In the cognitive part of the software, further unsupervised and supervised learning algorithms are trained. The sensor signals from unknown samples can later be recognized and classified automatically by the previously trained algorithms. Recently the IKTS team was able to transfer the software for signal processing and pattern recognition to a small printed circuit board (PCB). The algorithms are still trained on an ordinary PC; however, the trained algorithms run on the digital signal processor and the FPGA chip. The identical approach will be used for pattern recognition in image analysis of OCT pictures. Some key requirements have to be fulfilled, however: a sufficiently large set of training data, a high signal-to-noise ratio, and an optimized and exact fixation of components are required. The automated testing can then be done by the machine. By integrating the test data of many components along the value chain, further optimization including lifetime and durability...
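    The two-stage idea described above, unsupervised dimensionality reduction followed by a supervised classifier, can be sketched with scikit-learn on synthetic feature vectors. The data, feature dimensions and classifier choice below are placeholders for illustration only; this is not the IKTS software.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.svm import SVC

rng = np.random.default_rng(42)

# synthetic stand-in for secondary-analysis features of acoustic signals:
# class 0 = "good part", class 1 = "defective part", separated along a few directions
n, dim = 200, 40
X_good = rng.normal(0.0, 1.0, (n, dim))
X_bad = rng.normal(0.0, 1.0, (n, dim)); X_bad[:, :5] += 2.0
X = np.vstack([X_good, X_bad])
y = np.array([0] * n + [1] * n)

# unsupervised dimensionality reduction (PCA) followed by a supervised classifier
model = make_pipeline(StandardScaler(), PCA(n_components=10), SVC(kernel="rbf"))
model.fit(X, y)
print("training accuracy:", model.score(X, y))
```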

  1. Advanced method for the characterization of polishing suspensions

    Science.gov (United States)

    Trum, Christian J.; Sitzberger, Sebastian; Rascher, Rolf

    2017-06-01

    The industrial production of components for applications in the area of precision optics has a long-standing tradition in Germany. As in almost all branches of industry, the external circumstances, processes and products have changed over time. Large lots are becoming less frequent and the demand for special components is growing. In order to meet these requirements, it is necessary to adapt the production processes quickly and flexibly. In the field of chemo-mechanical polishing (CMP), this means that in addition to the process parameters such as speed, pressure and feed, the task-specific adaptation of suspension and polishing pad carriers gains in importance. Along with these changes, it is becoming increasingly important to compare and evaluate the properties of the various polishing suspensions. The procedures according to DIN 58750-3 and DIN 58750-4 are suitable for this purpose. Due to the clearly defined procedures and the constant boundary conditions, different suspensions can be compared and evaluated. The study presented here shows that this method can also lead to misinterpretations. Known relationships, such as the influence of the polishing pad, the concentration of the suspension and the influence of the processed materials, play an important role. An extension of the procedure of DIN 58750-3 for the test of a polishing agent can help in a task-specific characterization of polishing slurries.

  2. Advances in calibration methods for micro- and nanoscale surfaces

    Science.gov (United States)

    Leach, R. K.; Giusca, C. L.; Coupland, J. M.

    2012-04-01

    Optical surface topography measuring instrument manufacturers often quote accuracies of the order of nanometres and claim that the instruments can reliably measure a range of surfaces with structures on the micro- to nanoscale. However, for many years there has been debate about the interpretation of the data from optical surface topography measuring instruments. Optical artefacts in the output data and a lack of a calibration infrastructure mean that it can be difficult to get optical instruments to agree with contact stylus instruments. In this paper, the current situation with areal surface topography measurements is discussed along with the ISO specification standards that are in draft form. An infrastructure is discussed whereby the ISO-defined metrological characteristics of optical instruments can be determined, but these characteristics do not allow the instrument to measure complex surfaces. Current research into methods for determining the transfer function of optical instruments is reviewed, which will allow the calibration of optical instruments to measure complex surfaces, at least in the case of weak scattering. The ability of some optical instruments to measure outside the spatial bandwidth limitation of the numerical aperture is presented and some general outlook for future work given.

  3. Advanced scientific computational methods and their applications to nuclear technologies. (4) Overview of scientific computational methods, introduction of continuum simulation methods and their applications (4)

    International Nuclear Information System (INIS)

    Sekimura, Naoto; Okita, Taira

    2006-01-01

    Scientific computational methods have advanced remarkably with the progress of nuclear development. They have played the role of a weft connecting each realm of nuclear engineering, and an introductory course on advanced scientific computational methods and their applications to nuclear technologies has therefore been prepared in serial form. This is the fourth issue, showing the overview of scientific computational methods with an introduction of continuum simulation methods and their applications. Simulation methods for physical radiation effects on materials are reviewed, based on processes such as the binary collision approximation, molecular dynamics, the kinetic Monte Carlo method, the reaction rate method and dislocation dynamics. (T. Tanaka)

  4. Advanced Extraction Methods for Actinide/Lanthanide Separations

    Energy Technology Data Exchange (ETDEWEB)

    Scott, M.J.

    2005-12-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by the extraction with the TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanides and actinides ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methoxy, ethoxy, t-butoxy, methyl, octyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from...

  5. Advanced Extraction Methods for Actinide/Lanthanide Separations

    International Nuclear Information System (INIS)

    Scott, M.J.

    2005-01-01

    The separation of An(III) ions from chemically similar Ln(III) ions is perhaps one of the most difficult problems encountered during the processing of nuclear waste. In the 3+ oxidation states, the metal ions have an identical charge and roughly the same ionic radius. They differ strictly in the relative energies of their f- and d-orbitals, and to separate these metal ions, ligands will need to be developed that take advantage of this small but important distinction. The extraction of uranium and plutonium from nitric acid solution can be performed quantitatively by the extraction with the TBP (tributyl phosphate). Commercially, this process has found wide use in the PUREX (plutonium uranium extraction) reprocessing method. The TRUEX (transuranium extraction) process is further used to coextract the trivalent lanthanides and actinides ions from HLLW generated during PUREX extraction. This method uses CMPO [(N, N-diisobutylcarbamoylmethyl) octylphenylphosphineoxide] intermixed with TBP as a synergistic agent. However, the final separation of trivalent actinides from trivalent lanthanides still remains a challenging task. In TRUEX nitric acid solution, the Am(III) ion is coordinated by three CMPO molecules and three nitrate anions. Taking inspiration from this data and previous work with calix[4]arene systems, researchers on this project have developed a C3-symmetric tris-CMPO ligand system using a triphenoxymethane platform as a base. The triphenoxymethane ligand systems have many advantages for the preparation of complex ligand systems. The compounds are very easy to prepare. The steric and solubility properties can be tuned through an extreme range by the inclusion of different alkoxy and alkyl groups such as methoxy, ethoxy, t-butoxy, methyl, octyl, or even t-pentyl at the ortho- and para-positions of the aryl rings. The triphenoxymethane ligand system shows promise as an improved extractant for both tetravalent and trivalent actinide recoveries from...

  6. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  7. Demonstration of Advanced Geophysics and Classification Methods on Munitions Response Sites - East Fork Valley Range Complex, Former Camp Hale

    Science.gov (United States)

    2016-04-01

    Final report documenting the demonstration of advanced geophysics and classification methods on munitions response sites at the East Fork Valley Range Complex, Former Camp Hale, prepared by AECOM (Distribution Statement A).

  8. Preliminary investigation on a passive method for parametric instability control in advanced gravitational wave detectors

    International Nuclear Information System (INIS)

    Gras, S; Zhao, C; Ju, L; Blair, D G

    2006-01-01

    In this paper, we propose a passive method to suppress parametric instabilities. This method requires the design of a test mass mirror with an appropriate loss distribution. We show that localized losses can significantly reduce the parametric gain without degrading the thermal noise for the advanced LIGO configuration. The method can be used individually or in conjunction with other active feedback methods. We present numerical analysis for both spherical and non-spherical mirrors.

  9. Recent Advances in Computational Methods for Nuclear Magnetic Resonance Data Processing

    KAUST Repository

    Gao, Xin

    2013-01-11

    Although three-dimensional protein structure determination using nuclear magnetic resonance (NMR) spectroscopy is a computationally costly and tedious process that would benefit from advanced computational techniques, it has not garnered much research attention from specialists in bioinformatics and computational biology. In this paper, we review recent advances in computational methods for NMR protein structure determination. We summarize the advantages of and bottlenecks in the existing methods and outline some open problems in the field. We also discuss current trends in NMR technology development and suggest directions for research on future computational methods for NMR.

  10. A Straightforward Method for Advance Estimation of User Charges for Information in Numeric Databases.

    Science.gov (United States)

    Jarvelin, Kalervo

    1986-01-01

    Describes a method for advance estimation of user charges for queries in relational data model-based numeric databases when charges are based on data retrieved. Use of this approach is demonstrated by sample queries to an imaginary marketing database. The principles and methods of this approach and its relevance are discussed. (MBR)

  11. 78 FR 16513 - Application of Advances in Nucleic Acid and Protein Based Detection Methods to Multiplex...

    Science.gov (United States)

    2013-03-15

    Notice of a public workshop, convened by the Food and Drug Administration, on the application of advances in nucleic acid based and protein based detection methods to the multiplex detection of transfusion-transmissible agents and blood cell antigens in blood donations. The workshop addresses recent technological advances in gene based and protein based pathogen and blood cell antigen detection methods.

  12. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluid Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have the hierarchical tree-like...

  13. System and method to control H2O2 level in advanced oxidation processes

    DEFF Research Database (Denmark)

    2016-01-01

    The present invention relates to a bio-electrochemical system (BES) and a method of in-situ production and removal of H2O2 using such a bio-electrochemical system (BES). Further, the invention relates to a method for in-situ control of H2O2 content in an aqueous system of advanced oxidation...

  14. Advanced microscopic methods for the detection of adhesion barriers in immunology in medical imaging

    Science.gov (United States)

    Lawrence, Shane

    2017-07-01

    Advanced methods of microscopy, and the advanced analysis techniques stemming from them, have developed greatly in the past few years. The use of single discrete methods has given way to combinations of methods, which means an increase in the data to be processed before progressing to the analysis and diagnosis of ailments and diseases that can be viewed by each and any method. This presentation shows the combination of such methods, gives examples of the data which arise from each individual method and from the combined methodology, and suggests how such data can be streamlined to enable conclusions to be drawn about the particular biological and biochemical considerations that arise. In this particular project the subject of the methodology was human lactoferrin and the relation of the adhesion properties of hLf to the overcoming of barriers to adhesion, mainly on the perimeter of the cellular unit, and how this affects the process of immunity in any particular case.

  15. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance

    Energy Technology Data Exchange (ETDEWEB)

    Harris, C.E.

    1994-09-01

    International technical experts in durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The symposium focused on the dissemination of new knowledge and the peer review of progress on the development of advanced methodologies. Papers were presented on: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and advanced approaches to resist corrosion and environmentally assisted fatigue. Separate abstracts have been indexed for articles from this report.

  16. Human-system safety methods for development of advanced air traffic management systems

    International Nuclear Information System (INIS)

    Nelson, William R.

    1999-01-01

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems (author) (ml)

  17. Human-System Safety Methods for Development of Advanced Air Traffic Management Systems

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, W.R.

    1999-05-24

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems.

  18. Human-system safety methods for development of advanced air traffic management systems

    Energy Technology Data Exchange (ETDEWEB)

    Nelson, William R. [Idaho National Engineering and Environmental Laboratory, Idaho Falls (United States)

    1999-05-15

    The Idaho National Engineering and Environmental Laboratory (INEEL) is supporting the National Aeronautics and Space Administration in the development of advanced air traffic management (ATM) systems as part of the Advanced Air Transportation Technologies program. As part of this program INEEL conducted a survey of human-system safety methods that have been applied to complex technical systems, to identify lessons learned from these applications and provide recommendations for the development of advanced ATM systems. The domains that were surveyed included offshore oil and gas, commercial nuclear power, commercial aviation, and military. The survey showed that widely different approaches are used in these industries, and that the methods used range from very high-level, qualitative approaches to very detailed quantitative methods such as human reliability analysis (HRA) and probabilistic safety assessment (PSA). In addition, the industries varied widely in how effectively they incorporate human-system safety assessment in the design, development, and testing of complex technical systems. In spite of the lack of uniformity in the approaches and methods used, it was found that methods are available that can be combined and adapted to support the development of advanced air traffic management systems (author) (ml)

  19. Thermodynamic and economic evaluations of a geothermal district heating system using advanced exergy-based methods

    International Nuclear Information System (INIS)

    Tan, Mehmet; Keçebaş, Ali

    2014-01-01

    Highlights: • Evaluation of a GDHS using advanced exergy-based methods. • Comparison of the results of the conventional and advanced exergy-based methods. • The modified exergetic efficiency and exergoeconomic factor are found to be 45% and 13%. • Improvement and total cost-savings potentials are found to be 3% and 14%. • All the pumps have the highest improvement potential and total cost-savings potential. - Abstract: In this paper, a geothermal district heating system (GDHS) is comparatively evaluated in terms of thermodynamic and economic aspects using advanced exergy-based methods to identify the potential for improvement, the interactions among system components, and the direction and potential for energy savings. The actual operational data are taken from the Sarayköy GDHS, Turkey. In the advanced exergetic and exergoeconomic analyses, the exergy destruction and the total operating cost within each component of the system are split into endogenous/exogenous and unavoidable/avoidable parts. The advantages of these analyses over conventional ones are demonstrated. The results indicate that the advanced exergy-based method is a more meaningful and effective tool than the conventional one for system performance evaluation. The exergetic efficiency and the exergoeconomic factor of the overall system for the Sarayköy GDHS were determined to be 43.72% and 5.25% according to the conventional tools and 45.06% and 12.98% according to the advanced tools. The improvement potential and the total cost-savings potential of the overall system were also determined to be 2.98% and 14.05%, respectively. All of the pumps have the highest improvement potential and total cost-savings potential because the pumps were selected to have high power during installation at the Sarayköy GDHS.

  20. [Isolation and identification methods of enterobacteria group and its technological advancement].

    Science.gov (United States)

    Furuta, Itaru

    2007-08-01

    In the last half-century, isolation and identification methods for the enterobacteria group have markedly improved through technological advancement. Clinical microbiology tests have changed over time from tube methods to commercial identification kits and automated identification. Tube methods are the original approach to the identification of the enterobacteria group and remain essential for understanding bacterial fermentation and biochemical principles. In this paper, traditional tube tests are discussed, such as the utilization of carbohydrates and the indole, methyl red, citrate and urease tests. Commercial identification kits and automated instruments based on computer analysis are also discussed as current methods; these provide rapidity and accuracy. Nonculture techniques, such as nucleic acid typing methods using PCR analysis and immunochemical methods using monoclonal antibodies, can be developed further.

  1. Launch Vehicle Design and Optimization Methods and Priority for the Advanced Engineering Environment

    Science.gov (United States)

    Rowell, Lawrence F.; Korte, John J.

    2003-01-01

    NASA's Advanced Engineering Environment (AEE) is a research and development program that will improve collaboration among design engineers for launch vehicle conceptual design and provide the infrastructure (methods and framework) necessary to enable that environment. In this paper, three major technical challenges facing the AEE program are identified, and three specific design problems are selected to demonstrate how advanced methods can improve current design activities. References are made to studies that demonstrate these design problems and methods, and these studies will provide the detailed information and check cases to support incorporation of these methods into the AEE. This paper provides background and terminology for discussing the launch vehicle conceptual design problem so that the diverse AEE user community can participate in prioritizing the AEE development effort.

  2. Advances in explosives analysis--part II: photon and neutron methods.

    Science.gov (United States)

    Brown, Kathryn E; Greenfield, Margo T; McGrane, Shawn D; Moore, David S

    2016-01-01

    The number and capability of explosives detection and analysis methods have increased dramatically since publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis [Moore DS, Goodpaster JV, Anal Bioanal Chem 395:245-246, 2009]. Here we review and critically evaluate the latest (the past five years) important advances in explosives detection, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. The review consists of two parts. Part I discussed methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. This part, Part II, will review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.

  3. Advances in explosives analysis--part I: animal, chemical, ion, and mechanical methods.

    Science.gov (United States)

    Brown, Kathryn E; Greenfield, Margo T; McGrane, Shawn D; Moore, David S

    2016-01-01

    The number and capability of explosives detection and analysis methods have increased substantially since the publication of the Analytical and Bioanalytical Chemistry special issue devoted to Explosives Analysis (Moore and Goodpaster, Anal Bioanal Chem 395(2):245-246, 2009). Here we review and critically evaluate the latest (the past five years) important advances in explosives detection, with details of the improvements over previous methods, and suggest possible avenues towards further advances in, e.g., stand-off distance, detection limit, selectivity, and penetration through camouflage or packaging. The review consists of two parts. This part, Part I, reviews methods based on animals, chemicals (including colorimetry, molecularly imprinted polymers, electrochemistry, and immunochemistry), ions (both ion-mobility spectrometry and mass spectrometry), and mechanical devices. Part II will review methods based on photons, from very energetic photons including X-rays and gamma rays down to the terahertz range, and neutrons.

  4. Advanced Semi-Implicit Method (ASIM) for hyperbolic two-fluid model

    International Nuclear Information System (INIS)

    Lee, Sung Jae; Chung, Moon Sun

    2003-01-01

    Introducing interfacial pressure jump terms based on the surface tension into the momentum equations of the two-phase two-fluid model turns the system of governing equations mathematically into a hyperbolic system. The eigenvalues of the equation system are then always real, representing the void wave and pressure wave propagation speeds, as shown in the previous manuscript. To solve the interfacial pressure jump terms with void fraction gradients implicitly, the conventional semi-implicit method is modified with an intermediate iteration for the void fraction at the fractional time step. This Advanced Semi-Implicit Method (ASIM) then becomes stable without conventional additive terms. As a consequence, with the interfacial pressure jump terms included, the numerical solutions of typical two-phase problems obtained with the advanced semi-implicit method can be more stable and sound than those calculated using additive terms such as virtual mass or artificial viscosity.

  5. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the frame of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing plant-specific fire risk analyses (fire PSA) that are as realistic as possible. The present Technical Report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the context of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study, it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering effects of stochastics (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided to be used within the PSA. (orig.) [de

  6. Elementary and advanced Lie algebraic methods with applications to accelerator design, electron microscopes, and light optics

    International Nuclear Information System (INIS)

    Dragt, A.J.

    1987-01-01

    A review is given of elementary Lie algebraic methods for treating Hamiltonian systems. This review is followed by a brief exposition of advanced Lie algebraic methods including resonance bases and conjugacy theorems. Finally, applications are made to the design of third-order achromats for use in accelerators, to the design of subangstroem resolution electron microscopes, and to the classification and study of high order aberrations in light optics. (orig.)

  7. Technology Alignment and Portfolio Prioritization (TAPP): Advanced Methods in Strategic Analysis, Technology Forecasting and Long Term Planning for Human Exploration and Operations, Advanced Exploration Systems and Advanced Concepts

    Science.gov (United States)

    Funaro, Gregory V.; Alexander, Reginald A.

    2015-01-01

    The Advanced Concepts Office (ACO) at NASA Marshall Space Flight Center is expanding its current technology assessment methodologies. ACO is developing a framework called TAPP that uses a variety of methods, such as association mining and rule learning from data mining, structure development using a Technological Innovation System (TIS), and social network modeling to measure structural relationships. The role of ACO is to 1) produce a broad spectrum of ideas and alternatives for a variety of NASA's missions, 2) determine mission architecture feasibility and appropriateness to NASA's strategic plans, and 3) define a project in enough detail to establish an initial baseline capable of meeting mission objectives. ACO's role supports the decision-making process associated with the maturation of concepts for traveling through, living in, and understanding space. ACO performs concept studies and technology assessments to determine the degree of alignment between mission objectives and new technologies. The first step in technology assessment is to identify the current technology maturity in terms of a technology readiness level (TRL). The second step is to determine the difficulty associated with advancing a technology from one state to the next. NASA has used TRLs since 1970 and ACO formalized them in 1995. The DoD, ESA, Oil & Gas, and DoE have adopted TRLs as a means to assess technology maturity. However, "with the emergence of more complex systems and system of systems, it has been increasingly recognized that TRL assessments have limitations, especially when considering [the] integration of complex systems." When performing the second step in a technology assessment, NASA requires that an Advancement Degree of Difficulty (AD2) method be utilized. NASA has developed or used a variety of methods to perform this step: Expert Opinion or Delphi Approach, Value Engineering or Value Stream, Analytical Hierarchy Process (AHP), Technique for the Order of

  8. Advanced aircraft service life monitoring method via flight-by-flight load spectra

    Science.gov (United States)

    Lee, Hongchul

    This research is an effort to understand the current method and to propose an advanced method for Damage Tolerance Analysis (DTA) for the purpose of monitoring aircraft service life. As one of the tasks in the DTA, the current indirect Individual Aircraft Tracking (IAT) method for the F-16C/D Block 32 does not properly represent changes in flight usage severity affecting structural fatigue life. Therefore, an advanced aircraft service life monitoring method based on flight-by-flight load spectra is proposed and recommended for the IAT program to track consumed fatigue life, as an alternative to the current method based on the crack severity index (CSI) value. Damage tolerance is one of the aircraft design philosophies used to ensure that aging aircraft satisfy structural reliability in terms of fatigue failures throughout their service periods. The IAT program, one of the most important tasks of DTA, tracks potential structural crack growth at critical areas in the major airframe structural components of individual aircraft. The F-16C/D aircraft is equipped with a flight data recorder to monitor flight usage and provide the data to support structural load analysis. However, the limited memory of the flight data recorder allows the user to monitor individual aircraft fatigue usage only in terms of the vertical inertia (NzW) data for calculating the crack severity index (CSI) value, which defines the relative maneuver severity. The current IAT method for the F-16C/D Block 32, based on the CSI value calculated from NzW, is shown to be not accurate enough to monitor individual aircraft fatigue usage due to several problems. The proposed advanced aircraft service life monitoring method based on flight-by-flight load spectra is recommended as an improved method for the F-16C/D Block 32 aircraft. Flight-by-flight load spectra were generated from downloaded Crash Survival Flight Data Recorder (CSFDR) data by calculating loads for each time hack in selected flight data utilizing loads equations. From
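
    As a rough illustration of tracking consumed fatigue life from flight-by-flight load spectra, the following sketch accumulates Miner's-rule damage per flight using a generic Basquin-type S-N curve. The S-N constants and the per-flight cycle amplitudes are illustrative assumptions only; they are not the F-16C/D loads equations or material data referenced in the record.

        import numpy as np

        def sn_cycles_to_failure(stress_amplitude, C=1.0e12, m=3.0):
            # Generic Basquin-type S-N curve N = C * S**(-m); C and m are
            # illustrative placeholders, not actual airframe material data.
            return C * stress_amplitude ** (-m)

        def flight_damage(cycle_amplitudes):
            # Miner's rule: damage for one flight is the sum over counted
            # cycles of 1 / N_f(amplitude).
            return sum(1.0 / sn_cycles_to_failure(s) for s in cycle_amplitudes)

        # Hypothetical flight-by-flight spectra (stress amplitudes in MPa per
        # counted cycle), standing in for loads derived from recorded data.
        flights = [
            np.array([120.0, 95.0, 180.0, 60.0]),
            np.array([200.0, 150.0, 90.0]),
        ]

        cumulative = np.cumsum([flight_damage(f) for f in flights])
        for i, d in enumerate(cumulative, start=1):
            print(f"after flight {i}: consumed fatigue life fraction = {d:.3e}")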

  9. The role of advanced MR methods in the diagnosis of cerebral amyloidoma.

    Science.gov (United States)

    Nossek, Erez; Bashat, Dafna Ben; Artzi, Moran; Rosenberg, Keren; Lichter, Irith; Shtern, Orit; Ami, Haim Ben; Aizenstein, Orna; Vlodavsky, Euvgeni; Constantinescu, Marius; Ram, Zvi

    2009-01-01

    Amyloidoma is a term referring to a tumor-like deposition of extracellular insoluble fibrillar protein. Tumor-like amyloid formation in the brain has been described in isolated cases; however, no advanced radiological studies characterizing these lesions have been reported. Here we describe a 59-year-old woman who presented, several months prior to diagnosis, with memory decline, dizziness, walking instability, and speech difficulties. MRI revealed a left basal ganglia lesion with an intraventricular component. The patient underwent a stereotactic biopsy, which confirmed the diagnosis of amyloidoma. An extensive radiographic characterization of the amyloidoma using advanced MR techniques was then performed, including magnetic resonance spectroscopy, dynamic susceptibility contrast, susceptibility weighted imaging (SWI), and magnetization transfer (MTR). All advanced MR techniques were able to characterize the amyloidoma as a non-neoplastic process. This is an example of how such methods can be used for the differential diagnosis of atypical brain lesions.

  10. 2D automatic body-fitted structured mesh generation using advancing extraction method

    Science.gov (United States)

    Zhang, Yaoxin; Jia, Yafei

    2018-01-01

    This paper presents an automatic mesh generation algorithm for body-fitted structured meshes in Computational Fluid Dynamics (CFD) analysis using the Advancing Extraction Method (AEM). The method is applicable to two-dimensional domains with complex geometries, which have a hierarchical tree-like topography with extrusion-like structures (i.e., branches or tributaries) and intrusion-like structures (i.e., peninsulas or dikes). With the AEM, the hierarchical levels of sub-domains can be identified, and the block boundary of each sub-domain, in convex polygon shape, can be extracted at each level in an advancing scheme. In this paper, several examples are used to illustrate the effectiveness and applicability of the proposed algorithm for automatic structured mesh generation, and the implementation of the method.

  11. GREY STATISTICS METHOD OF TECHNOLOGY SELECTION FOR ADVANCED PUBLIC TRANSPORTATION SYSTEMS

    Directory of Open Access Journals (Sweden)

    Chien Hung WEI

    2003-01-01

    Full Text Available Taiwan is involved in intelligent transportation systems planning and is now selecting its priority focus areas for investment and development. The high social and economic impact associated with the choice of intelligent transportation systems technology explains the efforts of various electronics and transportation corporations to develop intelligent transportation systems technology and expand their business opportunities. However, no detailed research has been conducted with regard to selecting technology for advanced public transportation systems in Taiwan. Thus, the present paper demonstrates a grey statistics method integrated with a scenario method for solving the problem of selecting advanced public transportation systems technology for Taiwan. A comprehensive questionnaire survey was conducted to demonstrate the effectiveness of the grey statistics method. The proposed approach indicated that contactless smart card technology is the appropriate technology for Taiwan to develop in the near future. Our results imply that the grey statistics method is an effective method for selecting advanced public transportation systems technologies. We feel this information will be beneficial to the private sector for developing an appropriate intelligent transportation systems technology strategy.

  12. Balancing of linkages and robot manipulators advanced methods with illustrative examples

    CERN Document Server

    Arakelian, Vigen

    2015-01-01

    In this book advanced balancing methods for planar and spatial linkages, hand operated and automatic robot manipulators are presented. It is organized into three main parts and eight chapters. The main parts are the introduction to balancing, the balancing of linkages and the balancing of robot manipulators. The review of state-of-the-art literature including more than 500 references discloses particularities of shaking force/moment balancing and gravity compensation methods. Then new methods for balancing of linkages are considered. Methods provided in the second part of the book deal with the partial and complete shaking force/moment balancing of various linkages. A new field for balancing methods applications is the design of mechanical systems for fast manipulation. Special attention is given to the shaking force/moment balancing of robot manipulators. Gravity balancing methods are also discussed. The suggested balancing methods are illustrated by numerous examples.

  13. Method Verification Requirements for an Advanced Imaging System for Microbial Plate Count Enumeration.

    Science.gov (United States)

    Jones, David; Cundell, Tony

    2018-01-01

    The Growth Direct™ System that automates the incubation and reading of membrane filtration microbial counts on soybean-casein digest, Sabouraud dextrose, and R2A agar differs only from the traditional method in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. LAY ABSTRACT: The Growth Direct™ System that automates the incubation and reading of microbial counts on membranes on solid agar differs only from the traditional method in that micro-colonies on the membrane are counted using an advanced imaging system up to 50% earlier in the incubation time. Based on the recommendations in USP Validation of New Microbiological Testing Methods, the system may be implemented in a microbiology laboratory after simple method verification and not a full method validation. © PDA, Inc. 2018.

  14. Study of thermodynamic and structural properties of a flexible homopolymer chain using advanced Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    Hammou Amine Bouziane

    2013-03-01

    Full Text Available We study the thermodynamic and structural properties of a flexible homopolymer chain using both the multicanonical Monte Carlo method and the Wang-Landau method. In this work, we focus on the coil-globule transition. Starting from a completely random chain, we have obtained a globule for different sizes of the chain. The implementation of these advanced Monte Carlo methods allowed us to obtain a flat histogram in energy space and calculate various thermodynamic quantities such as the density of states, the free energy and the specific heat. Structural quantities such as the radius of gyration were also calculated.
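
    As a minimal illustration of the flat-histogram idea behind the Wang-Landau method, the sketch below estimates the density of states of a small 1D Ising ring rather than the homopolymer model studied in the paper; the system size, sweep length, flatness criterion and modification-factor schedule are illustrative assumptions.

        import numpy as np
        from collections import defaultdict

        rng = np.random.default_rng(0)
        N = 12                                    # toy 1D Ising ring, a stand-in discrete-energy system
        spins = rng.choice([-1, 1], size=N)

        def energy(s):
            return int(-np.sum(s * np.roll(s, 1)))   # J = 1, periodic boundary

        log_g = defaultdict(float)                # running estimate of ln g(E)
        hist = defaultdict(int)                   # visit histogram
        ln_f = 1.0                                # Wang-Landau modification factor, ln f

        E = energy(spins)
        while ln_f > 1e-4:                        # stop when f is close enough to 1
            for _ in range(20000):
                i = rng.integers(N)
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
                E_new = E + dE
                # accept with probability min(1, g(E) / g(E_new))
                if np.log(rng.random()) < log_g[E] - log_g.get(E_new, 0.0):
                    spins[i] *= -1
                    E = E_new
                log_g[E] += ln_f
                hist[E] += 1
            counts = np.array(list(hist.values()))
            if counts.min() > 0.8 * counts.mean():    # flat-histogram check
                hist.clear()
                ln_f /= 2.0                       # refine the modification factor

        print(sorted(log_g.items()))              # unnormalized ln g(E) per energy level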

  15. Left ventricular flow analysis: recent advances in numerical methods and applications in cardiac ultrasound.

    Science.gov (United States)

    Borazjani, Iman; Westerdale, John; McMahon, Eileen M; Rajaraman, Prathish K; Heys, Jeffrey J; Belohlavek, Marek

    2013-01-01

    The left ventricle (LV) pumps oxygenated blood from the lungs to the rest of the body through systemic circulation. The efficiency of such a pumping function is dependent on blood flow within the LV chamber. It is therefore crucial to accurately characterize LV hemodynamics. Improved understanding of LV hemodynamics is expected to provide important clinical diagnostic and prognostic information. We review the recent advances in numerical and experimental methods for characterizing LV flows and focus on analysis of intraventricular flow fields by echocardiographic particle image velocimetry (echo-PIV), due to its potential for broad and practical utility. Future research directions to advance patient-specific LV simulations include development of methods capable of resolving heart valves, higher temporal resolution, automated generation of three-dimensional (3D) geometry, and incorporating actual flow measurements into the numerical solution of the 3D cardiovascular fluid dynamics.

  16. Left Ventricular Flow Analysis: Recent Advances in Numerical Methods and Applications in Cardiac Ultrasound

    Directory of Open Access Journals (Sweden)

    Iman Borazjani

    2013-01-01

    Full Text Available The left ventricle (LV) pumps oxygenated blood from the lungs to the rest of the body through systemic circulation. The efficiency of such a pumping function is dependent on blood flow within the LV chamber. It is therefore crucial to accurately characterize LV hemodynamics. Improved understanding of LV hemodynamics is expected to provide important clinical diagnostic and prognostic information. We review the recent advances in numerical and experimental methods for characterizing LV flows and focus on analysis of intraventricular flow fields by echocardiographic particle image velocimetry (echo-PIV), due to its potential for broad and practical utility. Future research directions to advance patient-specific LV simulations include development of methods capable of resolving heart valves, higher temporal resolution, automated generation of three-dimensional (3D) geometry, and incorporating actual flow measurements into the numerical solution of the 3D cardiovascular fluid dynamics.

  17. Application of advanced data reduction methods to gas turbine dynamic analysis

    International Nuclear Information System (INIS)

    Juhl, P.B.

    1978-01-01

    This paper discusses the application of advanced data reduction methods to the evaluation of dynamic data from gas turbines and turbine components. The use of the Fast Fourier Transform and of real-time spectrum analyzers is discussed. The use of power spectral density and probability density functions for analyzing random data is discussed. Examples of the application of these modern techniques to gas turbine testing are presented. The use of the computer to automate the data reduction procedures is discussed. (orig.) [de
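
    As a minimal sketch of the FFT-based reduction described here, the following example estimates a power spectral density with Welch's averaged-periodogram method on a synthetic vibration record; the sampling rate, tone frequency and noise level are illustrative assumptions rather than actual gas turbine test data.

        import numpy as np
        from scipy.signal import welch

        fs = 2048.0                                   # assumed sampling rate, Hz
        t = np.arange(0, 10.0, 1.0 / fs)
        # Synthetic tone at 400 Hz buried in broadband noise, standing in for
        # a recorded turbine vibration signal.
        x = (0.5 * np.sin(2 * np.pi * 400.0 * t)
             + np.random.default_rng(1).normal(0.0, 1.0, t.size))

        f, pxx = welch(x, fs=fs, nperseg=4096)        # Welch PSD estimate
        peak = f[np.argmax(pxx)]
        print(f"dominant spectral component near {peak:.1f} Hz")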

  18. Linking advanced biofuels policies with stakeholder interests: A method building on Quality Function Deployment

    International Nuclear Information System (INIS)

    Schillo, R. Sandra; Isabelle, Diane A.; Shakiba, Abtin

    2017-01-01

    The field of renewable energy policy is inherently complex due to the long-term impacts of its policies, the broad range of potential stakeholders, the intricacy of scientific, engineering and technological developments, and the interplay of complex policy mixes that may result in unintended consequences. Quality Function Deployment (QFD) provides a systematic consideration of all relevant stakeholders, a rigorous analysis of stakeholder needs, and a prioritization of design features based on those needs. We build on QFD combined with the Analytical Hierarchy Process (AHP) to develop a novel method applied to the area of advanced biofuel policies. This Multi-Stakeholder Policy QFD (MSP QFD) provides a systematic approach to capture the voice of the stakeholders and align it with the broad range of potential advanced biofuels policies. To account for the policy environment, the MSP QFD utilizes a novel approach to stakeholder importance weights. The MSP QFD adds to the literature as it permits the analysis of the broad range of relevant national policies with regard to the development of advanced biofuels, as compared to more narrowly focused typical QFD applications. It also allows policy developers to gain additional insights into the perceived impacts of policies, as well as international comparisons. - Highlights: • Advanced biofuels are mostly still in research and early commercialization stages. • Government policies are expected to support biofuels stakeholders in market entry. • A Multi-Stakeholder Policy QFD (MSP QFD) links biofuels policies with stakeholders. • MSP QFD employs a novel stakeholder-weighting method. • The case of advanced biofuels in Canada shows the comparative importance of policies.
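
    As a minimal sketch of the AHP step used to derive stakeholder importance weights, the example below computes priority weights and a consistency ratio from a pairwise comparison matrix via the principal eigenvector; the stakeholder groups and the comparison values are illustrative assumptions, not data from the MSP QFD study.

        import numpy as np

        # Hypothetical pairwise comparison matrix on Saaty's 1-9 scale for three
        # stakeholder groups (producers, regulators, investors); A[i, j] is the
        # relative importance of group i over group j, with A[j, i] = 1 / A[i, j].
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)                     # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                                    # normalized priority weights

        # Saaty's consistency index / ratio (random index RI = 0.58 for a 3x3 matrix)
        n = A.shape[0]
        ci = (eigvals.real[k] - n) / (n - 1)
        cr = ci / 0.58
        print("weights:", np.round(w, 3), "consistency ratio:", round(cr, 3))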

  19. Perugia urodynamic method of analysis (PUMA): a new advanced method of urodynamic analysis applied clinically and compared with other advanced methods.

    Science.gov (United States)

    Porena, Massimo; Biscotto, Sauro; Costantini, Elisabetta; Mearini, Ettore; Verdini, Livio

    2003-01-01

    The aim of this study is to compare PUMA curves for different pathologic conditions causing bladder dysfunction in 158 men and 83 women. PUMA results in terms of bladder outlet obstruction and detrusor contractility were compared in 92 men with benign prostatic hypertrophy (BPH) and p(ves) ≈ p(det) (i.e., p(abd) ≈ 0) with the results of the urodynamics operator's opinion, the provisional International Continence Society method, the Abrams and Griffiths diagram, the urethral resistance factor (URA), Schäfer's diagram, and the Watt factor. PUMA curves correlated reliably with different pathologic conditions such as obstructive BPH, orthotopic bladder, cystocele, the neurological bladder, and bladder diverticulum. Statistical analysis indicated excellent agreement between PUMA and URA; agreement with other methods was good in cases of obstruction and non-obstruction. In doubtful cases, as diagnosed by standard methods, PUMA agreed only with the Abrams and Griffiths diagram. PUMA and Wmax were in good agreement on detrusor contraction force. Agreement between PUMA and Schäfer's diagram was excellent for patients with detrusor hypercontractility and good for patients with detrusor hypocontractility and normocontractility. PUMA is the only method applicable to women. It is easy to perform. When integrated with other diagnostic tests, it provides realistic data for diagnosis, medical or surgical therapy, and outcome. Copyright 2003 Wiley-Liss, Inc.

  20. Development and application of a probabilistic evaluation method for advanced process technologies

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H.C.; Rubin, E.S.

    1991-04-01

    The objective of this work is to develop and apply a method for research planning for advanced process technologies. To satisfy requirements for research planning, it is necessary to: (1) identify robust solutions to process design questions in the face of uncertainty to eliminate inferior design options; (2) identify key problem areas in a technology that should be the focus of further research to reduce the risk of technology failure; (3) compare competing technologies on a consistent basis to determine the risks associated with adopting a new technology; and (4) evaluate the effects that additional research might have on comparisons with conventional technology. An important class of process technologies are electric power plants. In particular, advanced clean coal technologies are expected to play a key role in the energy and environmental future of the US, as well as in other countries. Research planning for advanced clean coal technology development is an important part of energy and environmental policy. Thus, the research planning method developed here is applied to case studies focusing on a specific clean coal technology. The purpose of the case studies is both to demonstrate the research planning method and to obtain technology-specific conclusions regarding research strategies.

  1. Development and application of a probabilistic evaluation method for advanced process technologies. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Frey, H.C.; Rubin, E.S.

    1991-04-01

    The objective of this work is to develop and apply a method for research planning for advanced process technologies. To satisfy requirements for research planning, it is necessary to: (1) identify robust solutions to process design questions in the face of uncertainty to eliminate inferior design options; (2) identify key problem areas in a technology that should be the focus of further research to reduce the risk of technology failure; (3) compare competing technologies on a consistent basis to determine the risks associated with adopting a new technology; and (4) evaluate the effects that additional research might have on comparisons with conventional technology. An important class of process technologies are electric power plants. In particular, advanced clean coal technologies are expected to play a key role in the energy and environmental future of the US, as well as in other countries. Research planning for advanced clean coal technology development is an important part of energy and environmental policy. Thus, the research planning method developed here is applied to case studies focusing on a specific clean coal technology. The purpose of the case studies is both to demonstrate the research planning method and to obtain technology-specific conclusions regarding research strategies.

  2. Advanced calculational methods for power reactors and LWR core design parameters

    International Nuclear Information System (INIS)

    1992-12-01

    The purpose of the Specialists Meeting on Advanced Calculational Methods for Power Reactors, held in Cadarache, France, 10-14 September 1990, was to provide a forum for reviewing and discussing selected core physics topics of water cooled reactors (including high convertors). New advanced calculation methods offering a high level of accuracy for advanced fuels and the complex geometries of next generation reactors were discussed, and the importance of supercomputing and on-line monitoring was also acknowledged. The meeting was attended by about 60 participants from 20 countries who presented 30 papers. The Technical Committee Meeting on LWR Core Design Parameters, held in Rez, former Czechoslovakia, 7-11 October 1991, provided an opportunity for participants to exchange their experience on reactor physics aspects of benchmark calculations of various lattices, methods for core parameter calculations, core monitoring and in-core fuel management. At the Workshop there were further discussions related to the benchmark problems, homogenization techniques and cross-section representations. Thirty-five papers were presented by about 43 participants from 19 countries. A separate abstract was prepared for each of the mentioned papers. Refs, figs and tabs

  3. Glomerular filtration rate in children with advanced chronic renal failure: methods of determination and clinical applications.

    Science.gov (United States)

    Chan, J C; Sharpe, A R

    1982-01-01

    The rationale, significance and pitfalls of currently available methods for the determination of glomerular filtration rates in children with advanced chronic renal diseases are reviewed. Normal renal clearances in infants and children from birth to adulthood are presented. Methods of serial measurements of renal function, especially by the reciprocals of serum creatinine concentrations, are evaluated. They are applied clinically to monitor the effectiveness of conservative and new modalities of treatment, such as dietary supplementation with essential amino acids plus their keto analogues, and to determine that 1,25-dihydroxyvitamin-D3 treatment of renal osteodystrophy does not accelerate the deterioration of renal function in children with chronic renal insufficiency.
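
    As a minimal sketch of the reciprocal-creatinine approach to serial monitoring mentioned above, the example below fits a straight line to 1/SCr versus time and extrapolates to an arbitrary threshold; the measurements and the threshold are illustrative assumptions and carry no clinical meaning.

        import numpy as np

        # Hypothetical serial measurements: time in months, serum creatinine in mg/dL.
        t = np.array([0, 3, 6, 9, 12], dtype=float)
        scr = np.array([2.0, 2.3, 2.7, 3.1, 3.6])

        recip = 1.0 / scr
        slope, intercept = np.polyfit(t, recip, 1)      # linear decline of 1/SCr with time

        # Extrapolate to an illustrative threshold (here 1/SCr = 0.1 dL/mg).
        t_threshold = (0.1 - intercept) / slope
        print(f"slope = {slope:.4f} per month; threshold reached at ~{t_threshold:.0f} months")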

  4. Advances in the biometric recognition methods: a survey on iris and fingerprint recognition

    Science.gov (United States)

    Zaeri, Naser; Alkoot, Fuad

    2010-02-01

    Human recognition based on biometrics finds many important applications across many sectors, in particular commerce and law enforcement. This paper aims to give a general overview of the advances in biometric recognition methods. We concentrate on the main methods and accessible ideas presented for human recognition systems based on two types of biometrics: iris and fingerprint. We present a quick overview of the landmark papers that laid the foundation in each track, and then present the latest updates and the important turns and solutions that have developed in each track in the last few years.

  5. An Analysis and Quantification Method of Human Errors of Soft Controls in Advanced MCRs

    International Nuclear Information System (INIS)

    Lee, Seung Jun; Kim, Jae Whan; Jang, Seung Cheol

    2011-01-01

    In this work, a method was proposed for quantifying human errors that may occur during operations executed using soft controls. Soft controls of advanced main control rooms (MCRs) have totally different features from conventional controls, and thus they may have different human error modes and occurrence probabilities. It is important to define the human error modes and to quantify the error probabilities in order to evaluate the reliability of the system and prevent errors. This work suggests a modified K-HRA method for quantifying error probability.
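
    The abstract does not give the details of the modified K-HRA quantification, so the following is only a generic sketch of how a nominal human error probability can be scaled by multiplicative performance shaping factors; the nominal value and the factor names and multipliers are assumptions for illustration.

        # Generic human error probability (HEP) sketch: a nominal probability for a
        # soft-control action is scaled by performance shaping factors (PSFs).
        # The nominal value and PSF multipliers below are illustrative only and
        # are not taken from the K-HRA method referenced in the record.
        nominal_hep = 1.0e-3                      # assumed nominal error probability per action

        psf_multipliers = {
            "unfamiliar screen navigation": 3.0,
            "time pressure": 2.0,
            "adequate labelling and feedback": 0.5,
        }

        hep = nominal_hep
        for m in psf_multipliers.values():
            hep *= m

        hep = min(hep, 1.0)                       # a probability cannot exceed 1
        print(f"adjusted HEP = {hep:.2e}")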

  6. Utilization of Advanced Methods in the Control of a Mechatronic System with Flexible Elements

    Directory of Open Access Journals (Sweden)

    Juhás Martin

    2016-12-01

    Full Text Available An analysis of the negative impact of parameter error in a mechatronic system with flexible elements, and of the possibilities for reducing that impact, is presented in this contribution. Two advanced methods – the Model Predictive Control method and the inclusion of an LMS filter into the control process – are proposed to compensate for the insufficient effect of a double notch filter, which was initially integrated into the system to eliminate the parasitic frequencies of the two-mass flexible joint. Simulation results – the angular velocity response of the system and an analysis of control process quality – confirmed the correctness of the proposal to use these progressive control elements.
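
    As a minimal sketch of the LMS-filter idea, the example below uses a standard least-mean-squares adaptive filter to cancel a narrowband parasitic component from a measured signal, given a reference correlated with the disturbance; the signal model, parasitic frequency, filter length and step size are illustrative assumptions rather than parameters of the system studied in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        fs = 1000.0
        t = np.arange(0, 2.0, 1.0 / fs)

        # Measured angular-velocity signal: slow command profile plus a parasitic
        # oscillation (35 Hz, an assumed value) and measurement noise.
        command = 0.3 * t
        parasitic = 0.2 * np.sin(2 * np.pi * 35.0 * t)
        measured = command + parasitic + 0.02 * rng.normal(size=t.size)

        # Reference input correlated with the disturbance, e.g. a synthesized
        # tone at the identified parasitic frequency.
        reference = np.sin(2 * np.pi * 35.0 * t)

        L, mu = 16, 0.01                          # filter length and LMS step size (assumed)
        w = np.zeros(L)
        cleaned = np.zeros_like(measured)
        for n in range(L, t.size):
            x = reference[n - L:n][::-1]          # most recent reference samples
            y = w @ x                             # filter output = disturbance estimate
            e = measured[n] - y                   # error = signal with disturbance removed
            w += 2 * mu * e * x                   # LMS weight update
            cleaned[n] = e

        residual = cleaned[L:] - command[L:]
        print("residual parasitic RMS:", np.sqrt(np.mean(residual ** 2)))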

  7. Selenium contaminated waters: An overview of analytical methods, treatment options and recent advances in sorption methods.

    Science.gov (United States)

    Santos, Sílvia; Ungureanu, Gabriela; Boaventura, Rui; Botelho, Cidália

    2015-07-15

    Selenium is an essential trace element for many organisms, including humans, but it is bioaccumulative and toxic at higher than homeostatic levels. Both selenium deficiency and toxicity are problems around the world. Mines, coal-fired power plants, oil refineries and agriculture are important examples of anthropogenic sources, generating contaminated waters and wastewaters. For reasons of human health and ecotoxicity, selenium concentration has to be controlled in drinking-water and in wastewater, as it is a potential pollutant of water bodies. This review article provides firstly a general overview about selenium distribution, sources, chemistry, toxicity and environmental impact. Analytical techniques used for Se determination and speciation and water and wastewater treatment options are reviewed. In particular, published works on adsorption as a treatment method for Se removal from aqueous solutions are critically analyzed. Recent published literature has given particular attention to the development and search for effective adsorbents, including low-cost alternative materials. Published works mostly consist in exploratory findings and laboratory-scale experiments. Binary metal oxides and LDHs (layered double hydroxides) have presented excellent adsorption capacities for selenium species. Unconventional sorbents (algae, agricultural wastes and other biomaterials), in raw or modified forms, have also led to very interesting results with the advantage of their availability and low-cost. Some directions to be considered in future works are also suggested. Copyright © 2015 Elsevier B.V. All rights reserved.

  8. Comparing sensitivity analysis methods to advance lumped watershed model identification and evaluation

    Directory of Open Access Journals (Sweden)

    Y. Tang

    2007-01-01

    Full Text Available This study seeks to identify sensitivity tools that will advance our understanding of lumped hydrologic models for the purposes of model improvement, calibration efficiency and improved measurement schemes. Four sensitivity analysis methods were tested: (1) local analysis using parameter estimation software (PEST), (2) regional sensitivity analysis (RSA), (3) analysis of variance (ANOVA), and (4) Sobol's method. The methods' relative efficiencies and effectiveness have been analyzed and compared. These four sensitivity methods were applied to the lumped Sacramento soil moisture accounting model (SAC-SMA) coupled with SNOW-17. Results from this study characterize model sensitivities for two medium sized watersheds within the Juniata River Basin in Pennsylvania, USA. Comparative results for the 4 sensitivity methods are presented for a 3-year time series with 1 h, 6 h, and 24 h time intervals. The results of this study show that model parameter sensitivities are heavily impacted by the choice of analysis method as well as the model time interval. Differences between the two adjacent watersheds also suggest strong influences of local physical characteristics on the sensitivity methods' results. This study also contributes a comprehensive assessment of the repeatability, robustness, efficiency, and ease-of-implementation of the four sensitivity methods. Overall ANOVA and Sobol's method were shown to be superior to RSA and PEST. Relative to one another, ANOVA has reduced computational requirements and Sobol's method yielded more robust sensitivity rankings.
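
    As a minimal sketch of variance-based sensitivity analysis of the kind compared in this study, the example below estimates first-order Sobol indices with a Saltelli-style Monte Carlo estimator for the Ishigami test function; the test function and sample size are stand-in assumptions, not the SAC-SMA/SNOW-17 model or data.

        import numpy as np

        def ishigami(x, a=7.0, b=0.1):
            # Stand-in test function for demonstrating variance-based
            # sensitivity analysis; not the watershed model from the record.
            return (np.sin(x[:, 0]) + a * np.sin(x[:, 1]) ** 2
                    + b * x[:, 2] ** 4 * np.sin(x[:, 0]))

        rng = np.random.default_rng(0)
        N, d = 20000, 3
        A = rng.uniform(-np.pi, np.pi, size=(N, d))
        B = rng.uniform(-np.pi, np.pi, size=(N, d))

        fA, fB = ishigami(A), ishigami(B)
        var_y = np.var(np.concatenate([fA, fB]))

        # Saltelli-style estimator of the first-order Sobol index for each input.
        for i in range(d):
            ABi = A.copy()
            ABi[:, i] = B[:, i]
            S_i = np.mean(fB * (ishigami(ABi) - fA)) / var_y
            print(f"first-order Sobol index S_{i + 1} ~ {S_i:.3f}")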

  9. Impact of different treatment methods on survival in advanced pancreatic cancer

    International Nuclear Information System (INIS)

    Brasiuniene, B.; Juozaityte, E.; Barauskas, G.

    2005-01-01

    The aim of the study was to evaluate the impact of different treatment methods on the survival of patients treated for advanced pancreatic cancer at Kaunas University of Medicine Hospital from 1987 to 2003. Data on 262 patients with advanced pancreatic cancer treated from 1987 to 2003 were analyzed retrospectively. Four groups of patients were analyzed. One hundred eighty patients underwent palliative bypass or endoscopic bile duct stenting, or received observation alone. Forty-three patients, in addition to surgery, were treated by radiotherapy. Twenty-five patients received gemcitabine in standard doses and schedules. Fourteen patients received concomitant chemoradiotherapy (with gemcitabine or 5-fluorouracil). All patients were grouped by treatment method and median survival was analyzed. Median survival of patients treated by palliative surgery only or observation alone was 1.9 months, and for patients treated by palliative surgery and radiotherapy it was 6.1 months (p=0.00007). Median survival of patients treated with gemcitabine was 9.5 months (p<0.001), and median survival of patients treated with concomitant chemoradiotherapy was 8.5 months (p=0.00003). Patients diagnosed with advanced pancreatic cancer should, in addition to surgical treatment, be treated by chemotherapy, concomitant chemoradiotherapy or radiotherapy. (author)

  10. A framework for advanced methods of control of human-induced vibrations

    Science.gov (United States)

    Reynolds, Paul

    2012-04-01

    The vibration serviceability of civil engineering structures under human dynamic excitation is becoming ever more critical with the design and redevelopment of structures with reduced mass, stiffness and damping. A large number of problems have been reported in floors, footbridges, sports stadia, staircases and other structures. Unfortunately, the range of options available to fix such problems is very limited, consisting primarily of structural modification or the implementation of passive vibration control measures, such as tuned mass dampers. This paper presents the initial development of a new framework for advanced methods of control of human-induced vibrations in civil engineering structures. This framework includes both existing passive methods of vibration control and more advanced active, semi-active and hybrid control techniques, which may be further developed as practical solutions for these problems. Through the use of this framework, rational decisions as to the most appropriate technologies for particular human vibration problems may be made and pursued further. This framework is also intended to be used in the design of new civil engineering structures, where advanced control technologies may be used both to increase the achievable slenderness and to reduce the amount of construction materials used and hence their embodied energy. This will be an ever more important consideration with the current drive for structures with reduced environmental impact.
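
    As a small illustration of the passive end of such a framework, the sketch below computes the harmonic response of a single structural mode with and without an attached tuned mass damper; the modal mass, frequencies, damping ratios and mass ratio are illustrative assumptions, not values from any particular structure.

        import numpy as np

        def receptance(omega, m1, c1, k1, m2=None, c2=0.0, k2=0.0):
            # Steady-state displacement of the primary mass per unit harmonic force.
            if m2 is None:                            # bare single-mode structure
                return 1.0 / (k1 + 1j * omega * c1 - omega ** 2 * m1)
            # primary structure plus tuned mass damper (2-DOF model)
            z1 = k1 + k2 + 1j * omega * (c1 + c2) - omega ** 2 * m1
            z2 = k2 + 1j * omega * c2 - omega ** 2 * m2
            z12 = k2 + 1j * omega * c2
            return z2 / (z1 * z2 - z12 ** 2)

        # Illustrative floor mode: 20 t modal mass, 5 Hz, 1% damping.
        m1 = 20000.0
        k1 = m1 * (2 * np.pi * 5.0) ** 2
        c1 = 2 * 0.01 * np.sqrt(k1 * m1)

        # Illustrative TMD: 2% mass ratio, tuned slightly below 5 Hz, 8% damping.
        m2 = 0.02 * m1
        k2 = m2 * (2 * np.pi * 4.9) ** 2
        c2 = 2 * 0.08 * np.sqrt(k2 * m2)

        w = 2 * np.pi * np.linspace(3.0, 7.0, 800)
        bare = np.abs(receptance(w, m1, c1, k1))
        damped = np.abs(receptance(w, m1, c1, k1, m2, c2, k2))
        print(f"peak response reduced by factor {bare.max() / damped.max():.1f}")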

  11. Advanced image based methods for structural integrity monitoring: Review and prospects

    Science.gov (United States)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of the in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics has brought about a paradigm change in the sensing of phenomena. Hence, several widely applicable optical approaches are playing a significant role in support of experiment. The current review describes advanced image based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and they evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.
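
    As a minimal sketch of the subset-matching step at the heart of DIC, the example below locates one subset of a synthetic speckle image in a rigidly shifted copy by maximizing the zero-normalized cross-correlation at integer-pixel positions; practical DIC additionally uses subpixel interpolation and subset shape functions, and the image, subset size and search window here are illustrative assumptions.

        import numpy as np

        def zncc(a, b):
            # Zero-normalized cross-correlation between two equally sized subsets.
            a = a - a.mean()
            b = b - b.mean()
            return float(np.sum(a * b) / np.sqrt(np.sum(a ** 2) * np.sum(b ** 2)))

        rng = np.random.default_rng(0)
        ref = rng.random((200, 200))                     # synthetic speckle pattern
        true_shift = (3, -2)                             # imposed rigid-body motion (rows, cols)
        deformed = np.roll(ref, true_shift, axis=(0, 1))

        # Track one 21 x 21 subset centred at (100, 100) over a +/-5 pixel search window.
        cy, cx, half, search = 100, 100, 10, 5
        subset = ref[cy - half:cy + half + 1, cx - half:cx + half + 1]

        best = (-2.0, 0, 0)
        for dy in range(-search, search + 1):
            for dx in range(-search, search + 1):
                cand = deformed[cy + dy - half:cy + dy + half + 1,
                                cx + dx - half:cx + dx + half + 1]
                score = zncc(subset, cand)
                if score > best[0]:
                    best = (score, dy, dx)

        print(f"estimated displacement (rows, cols) = ({best[1]}, {best[2]}), ZNCC = {best[0]:.3f}")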

  12. Advances in mixed-integer programming methods for chemical production scheduling.

    Science.gov (United States)

    Velez, Sara; Maravelias, Christos T

    2014-01-01

    The goal of this paper is to critically review advances in the area of chemical production scheduling over the past three decades and then present two recently proposed solution methods that have led to dramatic computational enhancements. First, we present a general framework and problem classification and discuss modeling and solution methods with an emphasis on mixed-integer programming (MIP) techniques. Second, we present two solution methods: (a) a constraint propagation algorithm that allows us to compute parameters that are then used to tighten MIP scheduling models and (b) a reformulation that introduces new variables, thus leading to effective branching. We also present computational results and an example illustrating how these methods are implemented, as well as the resulting enhancements. We close with a discussion of open research challenges and future research directions.
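
    For readers unfamiliar with how such scheduling problems are posed as mixed-integer programs, the toy model below schedules three hypothetical batch tasks on one unit over a discretised horizon to minimise total tardiness, using the open-source PuLP modeller. The task data are invented and the formulation is far simpler than the network models reviewed in the paper; it is only meant to show the flavour of a discrete-time MIP scheduling model.

```python
import pulp

# Three hypothetical batch tasks on one processing unit (durations and due dates in hours)
tasks = {"A": {"dur": 2, "due": 4}, "B": {"dur": 3, "due": 6}, "C": {"dur": 1, "due": 3}}
H = 8  # scheduling horizon, discretised into one-hour slots

prob = pulp.LpProblem("single_unit_scheduling", pulp.LpMinimize)
starts = {i: range(H - d["dur"] + 1) for i, d in tasks.items()}
x = {i: {t: pulp.LpVariable(f"x_{i}_{t}", cat="Binary") for t in starts[i]} for i in tasks}
tard = {i: pulp.LpVariable(f"tard_{i}", lowBound=0) for i in tasks}

for i, d in tasks.items():
    prob += pulp.lpSum(x[i].values()) == 1                        # each task starts exactly once
    finish = pulp.lpSum((t + d["dur"]) * x[i][t] for t in starts[i])
    prob += tard[i] >= finish - d["due"]                          # tardiness definition

for t in range(H):                                                # unit capacity: one task at a time
    prob += pulp.lpSum(x[i][s] for i, d in tasks.items()
                       for s in starts[i] if s <= t < s + d["dur"]) <= 1

prob += pulp.lpSum(tard.values())                                 # minimise total tardiness
prob.solve(pulp.PULP_CBC_CMD(msg=False))
for i in tasks:
    print(i, "starts at", next(t for t in starts[i] if x[i][t].value() == 1))
```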

  13. Advances in dynamic and mean field games theory, applications, and numerical methods

    CERN Document Server

    Viscolani, Bruno

    2017-01-01

    This contributed volume considers recent advances in dynamic games and their applications, based on presentations given at the 17th Symposium of the International Society of Dynamic Games, held July 12-15, 2016, in Urbino, Italy. Written by experts in their respective disciplines, these papers cover various aspects of dynamic game theory including mean-field games, stochastic and pursuit-evasion games, and computational methods for dynamic games. Topics covered include Pedestrian flow in crowded environments Models for climate change negotiations Nash Equilibria for dynamic games involving Volterra integral equations Differential games in healthcare markets Linear-quadratic Gaussian dynamic games Aircraft control in wind shear conditions Advances in Dynamic and Mean-Field Games presents state-of-the-art research in a wide spectrum of areas. As such, it serves as a testament to the continued vitality and growth of the field of dynamic games and their applications. It will be of interest to an interdisciplinar...

  14. Trial examination of direct pebble fabrication for advanced tritium breeders by the emulsion method

    Energy Technology Data Exchange (ETDEWEB)

    Hoshino, Tsuyoshi, E-mail: hoshino.tsuyoshi@jaea.go.jp

    2014-10-15

    Highlights: • The integration of raw material preparation and granulation is proposed as a new direct pebble fabrication process. • The emulsion method granulates gel spheres of Li{sub 2}CO{sub 3} and TiO{sub 2} or SiO{sub 2}. • The gel spheres are calcined and sintered in air. • The crush load of the sintered Li{sub 2}TiO{sub 3} or Li{sub 4}SiO{sub 4} pebbles obtained is 37.2 or 59.3 N, respectively. - Abstract: Demonstration power plant reactors require advanced tritium breeders with high thermal stability. For the mass production of advanced tritium breeder pebbles, pebble fabrication by the emulsion method is a promising technique. To develop the most efficient pebble fabrication method, a new direct pebble fabrication process utilizing the emulsion method was implemented. A prior pebble fabrication process consisted of the preparation of raw materials followed by granulation. The new process integrates the preparation and granulation of raw materials. The slurry for the emulsion granulation of Li{sub 2}TiO{sub 3} or Li{sub 4}SiO{sub 4} as a tritium breeder consists of mixtures of Li{sub 2}CO{sub 3} and TiO{sub 2} or SiO{sub 2} at specific ratios. Subsequently, gel spheres of tritium breeders are fabricated by controlling the relative flow speeds of slurry and oil. The average diameter and crush load of the obtained sintered Li{sub 2}TiO{sub 3} or Li{sub 4}SiO{sub 4} pebbles were 1.0 or 1.5 mm and 37.2 or 59.3 N, respectively. The trial fabrication results suggest that the new process has the potential to increase the fabrication efficiency of advanced tritium breeder pebbles.

  15. The Water-Energy-Food Nexus: Advancing Innovative, Policy-Relevant Methods

    Science.gov (United States)

    Crootof, A.; Albrecht, T.; Scott, C. A.

    2017-12-01

    The water-energy-food (WEF) nexus is rapidly expanding in scholarly literature and policy settings as a novel way to address complex Anthropocene challenges. The nexus approach aims to identify tradeoffs and synergies of water, energy, and food systems, internalize social and environmental impacts, and guide development of cross-sectoral policies. However, a primary limitation of the nexus approach is the absence - or gaps and inconsistent use - of adequate methods to advance an innovative and policy-relevant nexus approach. This paper presents an analytical framework to identify robust nexus methods that align with nexus thinking and highlights innovative nexus methods at the frontier. The current state of nexus methods was assessed with a systematic review of 245 journal articles and book chapters. This review revealed (a) use of specific and reproducible methods for nexus assessment is uncommon - less than one-third of the reviewed studies present explicit methods; (b) nexus methods frequently fall short of capturing interactions among water, energy, and food - the very concept they purport to address; (c) assessments strongly favor quantitative approaches - 70% use primarily quantitative tools; (d) use of social science methods is limited (26%); and (e) many nexus methods are confined to disciplinary silos - only about one-quarter combine methods from diverse disciplines and less than one-fifth utilize both quantitative and qualitative approaches. Despite some pitfalls of current nexus methods, there are a host of studies that offer innovative approaches to help quantify nexus linkages and interactions among sectors, conceptualize dynamic feedbacks, and support mixed method approaches to better understand WEF systems. Applying our analytical framework to all 245 studies, we identify, and analyze herein, seventeen studies that implement innovative multi-method and cross-scalar tools to demonstrate promising advances toward improved nexus assessment. This paper

  16. An Advanced Actuator Line Method for Wind Energy Applications and Beyond: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Churchfield, Matthew; Schreck, Scott; Martinez-Tossas, Luis A.; Meneveau, Charles; Spalart, Philippe R.

    2017-03-24

    The actuator line method to represent rotor aerodynamics within computational fluid dynamics has been in use for over a decade. This method applies a body force to the flow field along rotating lines corresponding to the individual rotor blades and employs tabular airfoil data to compute the force distribution. The actuator line method is attractive because compared to blade-resolved simulations, the required mesh is much simpler and the computational cost is lower. This work proposes a higher fidelity variant of the actuator line method meant to fill the space between current actuator line and blade-resolved simulations. It contains modifications in two key areas. The first is that of freestream velocity vector estimation along the line, which is necessary to compute the lift and drag along the line using tabular airfoil data. Most current methods rely on point sampling in which the location of sampling is ambiguous. Here we test a velocity sampling method that uses a properly weighted integral over space, removing this ambiguity. The second area of improvement is the function used to project the one-dimensional actuator line force onto the three-dimensional fluid mesh as a body force. We propose and test a projection function that spreads the force over a region that looks something like a real blade with the hope that it will produce the blade local and near wake flow features with more accuracy and higher fidelity. Our goal is that between these two improvements, not only will the flow field predictions be enhanced, but also the spanwise loading will be made more accurate. We refer to this combination of improvements as the advanced actuator line method. We apply these improvements to two different wind turbine cases. Although there is a strong wind energy motivation in our work, there is no reason these advanced actuator line ideas cannot be used in other applications, such as helicopter rotors.
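
    The projection step discussed above is implemented in the baseline actuator line method with an isotropic Gaussian kernel of width epsilon; the advanced method proposed here replaces it with a blade-like spreading function. The sketch below shows only the conventional Gaussian projection of a single point force onto a uniform mesh, with invented grid and force values, to make the idea of a body-force projection function concrete.

```python
import numpy as np

def project_point_force(grid_x, grid_y, grid_z, point, force, eps):
    """Spread one actuator-line point force onto a 3-D mesh as a body force using
    the conventional isotropic Gaussian kernel
        eta(r) = exp(-(r/eps)^2) / (eps^3 * pi^(3/2)).
    Returns the body-force density field (force per unit volume) per component.
    """
    dx = grid_x - point[0]
    dy = grid_y - point[1]
    dz = grid_z - point[2]
    r2 = dx**2 + dy**2 + dz**2
    eta = np.exp(-r2 / eps**2) / (eps**3 * np.pi**1.5)
    return [f * eta for f in force]

# Illustrative uniform grid around one actuator point (all numbers invented)
n, L = 32, 4.0
coords = np.linspace(-L / 2, L / 2, n)
X, Y, Z = np.meshgrid(coords, coords, coords, indexing="ij")
body_force = project_point_force(X, Y, Z, point=(0.0, 0.0, 0.0),
                                 force=(0.0, 0.0, -1000.0), eps=0.25)
cell_vol = (coords[1] - coords[0]) ** 3
print("integrated z-force ~", body_force[2].sum() * cell_vol)  # should be close to -1000 N
```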

  17. An Advanced Actuator Line Method for Wind Energy Applications and Beyond

    Energy Technology Data Exchange (ETDEWEB)

    Churchfield, Matthew J.; Schreck, Scott; Martinez-Tossas, Luis A.; Meneveau, Charles; Spalart, Philippe R.

    2017-01-09

    The actuator line method to represent rotor aerodynamics within computational fluid dynamics has been in use for over a decade. This method applies a body force to the flow field along rotating lines corresponding to the individual rotor blades and employs tabular airfoil data to compute the force distribution. The actuator line method is attractive because compared to blade-resolved simulations, the required mesh is much simpler and the computational cost is lower. This work proposes a higher fidelity variant of the actuator line method meant to fill the space between current actuator line and blade-resolved simulations. It contains modifications in two key areas. The first is that of freestream velocity vector estimation along the line, which is necessary to compute the lift and drag along the line using tabular airfoil data. Most current methods rely on point sampling in which the location of sampling is ambiguous. Here we test a velocity sampling method that uses a properly weighted integral over space, removing this ambiguity. The second area of improvement is the function used to project the one-dimensional actuator line force onto the three-dimensional fluid mesh as a body force. We propose and test a projection function that spreads the force over a region that looks something like a real blade with the hope that it will produce the blade local and near wake flow features with more accuracy and higher fidelity. Our goal is that between these two improvements, not only will the flow field predictions be enhanced, but also the spanwise loading will be made more accurate. We refer to this combination of improvements as the advanced actuator line method. We apply these improvements to two different wind turbine cases. Although there is a strong wind energy motivation in our work, there is no reason these advanced actuator line ideas cannot be used in other applications, such as helicopter rotors.

  18. Time-domain hybrid method for simulating large amplitude motions of ships advancing in waves

    Directory of Open Access Journals (Sweden)

    Shukui Liu

    2011-03-01

    Full Text Available Typical results obtained by a newly developed, nonlinear time domain hybrid method for simulating large amplitude motions of ships advancing with constant forward speed in waves are presented. The method is hybrid in the way of combining a time-domain transient Green function method and a Rankine source method. The present approach employs a simple double integration algorithm with respect to time to simulate the free-surface boundary condition. During the simulation, the diffraction and radiation forces are computed by pressure integration over the mean wetted surface, whereas the incident wave and hydrostatic restoring forces/moments are calculated on the instantaneously wetted surface of the hull. Typical numerical results of application of the method to the seakeeping performance of a standard containership, namely the ITTC S175, are herein presented. Comparisons have been made between the results from the present method, the frequency domain 3D panel method (NEWDRIFT of NTUA-SDL and available experimental data and good agreement has been observed for all studied cases between the results of the present method and comparable other data.

  19. A complex method of equipment replacement planning. An advanced plan for the replacement of medical equipment.

    Science.gov (United States)

    Dondelinger, Robert M

    2004-01-01

    This complex method of equipment replacement planning is a methodology; it is a means to an end, a process that focuses on the equipment most in need of replacement, rather than the end itself. It uses data available from the maintenance management database, and attempts to quantify those subjective items important in making equipment replacement decisions. Like the simple method of the last issue, it is a starting point--albeit an advanced starting point--which the user can modify to fit their particular organization, but the complex method leaves room for expansion. It is based on sound logic and documented facts, is fully defensible during the decision-making process, and will serve your organization well as it provides a structure for your equipment replacement planning decisions.
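
    The abstract describes the scoring methodology only in general terms. As a purely hypothetical illustration of how maintenance-database fields might be combined into a replacement-priority score, the sketch below weights a few normalised factors and ranks a small fleet; the factor names, weights and records are invented and are not the article's actual method.

```python
# Hypothetical weighted scoring of equipment replacement priority.
# Factor names, weights and normalisation are illustrative assumptions,
# not the article's actual methodology.
WEIGHTS = {
    "age_ratio": 0.30,           # age / expected service life
    "repair_cost_ratio": 0.30,   # cumulative repair cost / acquisition cost
    "downtime_ratio": 0.20,      # downtime hours / scheduled hours
    "mission_criticality": 0.20, # subjective 0..1 rating from clinical staff
}

def replacement_score(record):
    """Return a 0-100 priority score from a maintenance-database record."""
    return 100.0 * sum(WEIGHTS[k] * min(max(record[k], 0.0), 1.0) for k in WEIGHTS)

fleet = [
    {"id": "infusion-pump-17", "age_ratio": 0.9, "repair_cost_ratio": 0.6,
     "downtime_ratio": 0.10, "mission_criticality": 0.8},
    {"id": "ventilator-03", "age_ratio": 0.4, "repair_cost_ratio": 0.2,
     "downtime_ratio": 0.05, "mission_criticality": 1.0},
]
for item in sorted(fleet, key=replacement_score, reverse=True):
    print(item["id"], round(replacement_score(item), 1))
```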

  20. Palliative care in advanced dementia; A mixed methods approach for the development of a complex intervention

    Directory of Open Access Journals (Sweden)

    Tookman Adrian

    2008-07-01

    Full Text Available Abstract Background There is increasing interest in improving the quality of care that patients with advanced dementia receive when they are dying. Our understanding of the palliative care needs of these patients and the natural history of advanced disease is limited. Many people with advanced dementia have unplanned emergency admissions to the acute hospital; this is a critical event: half will die within 6 months. These patients have complex needs but often lack capacity to express their wishes. Often carers are expected to make decisions. Advance care planning discussions are rarely performed, despite potential benefits such as more consistent supportive healthcare, a reduction in emergency admissions to the acute hospital and better resolution of carer bereavement. Design/Methods We have used the MRC complex interventions framework, a "bottom-up" methodology, to develop an intervention for patients with advanced dementia and their carers aiming to (1) define end of life care needs for both patients and carers, (2) pilot a palliative care intervention and (3) produce a framework for advance care planning for patients. The results of qualitative phase 1 work, which involved interviews with carers, hospital and primary care staff from a range of disciplines, have been used to identify key barriers and challenges. For the exploratory trial, 40 patients will be recruited to each of the control and intervention groups. The intervention will be delivered by a nurse specialist. We shall investigate and develop methodology for a phase 3 randomised controlled trial. For example, we shall explore the feasibility of randomisation, how best to optimise recruitment, decide on appropriate outcomes and obtain data for power calculations. We will evaluate whether the intervention is pragmatic, feasible and deliverable on acute hospital wards and test model fidelity and its acceptability to carers, patients and staff. Discussion Results of qualitative phase 1 work

  1. Construction of the first advanced BWRs with giant crawler crane module method

    Energy Technology Data Exchange (ETDEWEB)

    Kitamura, Ichiro; Ushiroda, Koichi; Ejiri, Fumiaki [Hitachi Ltd., Ibaraki (Japan). Hitachi Works

    1995-04-01

    The construction of the No. 6 and 7 plants at Kashiwazaki Kariwa Nuclear Power Station, Tokyo Electric Power Co., Inc., which are the first advanced BWRs, of 1356 MW capacity and the largest class in the world, is proceeding smoothly, aiming at the start of commercial operation in 1996 and 1997. The ABWR was developed with the aim of further improving safety, operating performance and economic efficiency, and new construction methods were adopted for its construction and have proved effective. Hitachi Ltd. utilizes large crawler cranes to their maximum capacity and is expanding the application of the large-section module method in an effort to shorten the construction period, improve quality, and ensure safety in the construction works. The planning and investigation of the installation of large-section modules were carried out by utilizing CAD, exchanging design information and expanding cooperation in execution with architecture companies; in this way, efficiency and quality were improved. Hitachi Ltd. has promoted the construction of the turbine facilities of the No. 6 plant and the nuclear reactor facilities of the No. 7 plant. Features of the No. 6 and 7 plant construction include the reduction of building size by installing internal pumps, the steel-lined reinforced concrete containment vessels, and so on. The construction schedule and the methods of construction for the No. 6 turbine building and the No. 7 reactor building are reported. (K.I.).

  2. Development of Advanced Nuclide Separation and Recovery Methods using Ion-Exchanhge Techniques in Nuclear Backend

    Science.gov (United States)

    Miura, Hitoshi

    The development of compact separation and recovery methods using selective ion-exchange techniques is very important for the reprocessing and high-level liquid wastes (HLLWs) treatment in the nuclear backend field. The selective nuclide separation techniques are effective for the volume reduction of wastes and the utilization of valuable nuclides, and expected for the construction of advanced nuclear fuel cycle system and the rationalization of waste treatment. In order to accomplish the selective nuclide separation, the design and synthesis of novel adsorbents are essential for the development of compact and precise separation processes. The present paper deals with the preparation of highly functional and selective hybrid microcapsules enclosing nano-adsorbents in the alginate gel polymer matrices by sol-gel methods, their characterization and the clarification of selective adsorption properties by batch and column methods. The selective separation of Cs, Pd and Re in real HLLW was further accomplished by using novel microcapsules, and an advanced nuclide separation system was proposed by the combination of selective processes using microcapsules.

  3. Printing, folding and assembly methods for forming 3D mesostructures in advanced materials

    Science.gov (United States)

    Zhang, Yihui; Zhang, Fan; Yan, Zheng; Ma, Qiang; Li, Xiuling; Huang, Yonggang; Rogers, John A.

    2017-03-01

    A rapidly expanding area of research in materials science involves the development of routes to complex 3D structures with feature sizes in the mesoscopic range (that is, between tens of nanometres and hundreds of micrometres). A goal is to establish methods for controlling the properties of materials systems and the function of devices constructed with them, not only through chemistry and morphology, but also through 3D architectures. The resulting systems, sometimes referred to as metamaterials, offer engineered behaviours with optical, thermal, acoustic, mechanical and electronic properties that do not occur in the natural world. Impressive advances in 3D printing techniques represent some of the most broadly recognized developments in this field, but recent successes with strategies based on concepts in origami, kirigami and deterministic assembly provide additional, unique options in 3D design and high-performance materials. In this Review, we highlight the latest progress and trends in methods for fabricating 3D mesostructures, beginning with the development of advanced material inks for nozzle-based approaches to 3D printing and new schemes for 3D optical patterning. In subsequent sections, we summarize more recent methods based on folding, rolling and mechanical assembly, including their application with materials such as designer hydrogels, monocrystalline inorganic semiconductors and graphene.

  4. CHF predictor derived from a 3D thermal-hydraulic code and an advanced statistical method

    International Nuclear Information System (INIS)

    Banner, D.; Aubry, S.

    2004-01-01

    A rod bundle CHF predictor has been determined by using a 3D code (THYC) to compute local thermal-hydraulic conditions at the boiling crisis location. These local parameters have been correlated to the critical heat flux by using an advanced statistical method based on spline functions. The main characteristics of the predictor are presented in conjunction with a detailed analysis of predictions (P/M ratio) in order to prove that the usual safety methodology can be applied with such a predictor. A thermal-hydraulic design criterion is obtained (1.13) and the predictor is compared with the WRB-1 correlation. (author)
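
    The abstract does not give the functional form of the spline-based predictor. The sketch below only illustrates the general idea of correlating a local thermal-hydraulic parameter to the critical heat flux with a smoothing spline and checking the P/M (predicted-to-measured) ratio; the data are synthetic and the single-parameter fit is a simplification of a real multi-parameter CHF predictor.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic stand-in for local thermal-hydraulic conditions at the boiling-crisis
# location (here a single "local quality" parameter) versus critical heat flux;
# real data would come from a 3D thermal-hydraulic code such as THYC and from CHF tests.
rng = np.random.default_rng(1)
local_quality = np.sort(rng.uniform(0.0, 0.5, 200))
chf_true = 3.0 - 4.0 * local_quality + 2.0 * local_quality**2      # arbitrary trend, MW/m^2
chf_measured = chf_true * (1.0 + 0.05 * rng.standard_normal(200))  # 5% scatter

# Fit a smoothing spline correlating the local parameter to CHF
predictor = UnivariateSpline(local_quality, chf_measured, k=3, s=len(local_quality) * 0.01)

# P/M ratio (predicted over measured), the figure of merit quoted in the abstract
p_over_m = predictor(local_quality) / chf_measured
print("mean P/M =", round(p_over_m.mean(), 3), " std =", round(p_over_m.std(), 3))
```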

  5. Advances in research methods for information systems research data mining, data envelopment analysis, value focused thinking

    CERN Document Server

    Osei-Bryson, Kweku-Muata

    2013-01-01

    Advances in social science research methodologies and data analytic methods are changing the way research in information systems is conducted. New developments in statistical software technologies for data mining (DM) such as regression splines or decision tree induction can be used to assist researchers in systematic post-positivist theory testing and development. Established management science techniques like data envelopment analysis (DEA), and value focused thinking (VFT) can be used in combination with traditional statistical analysis and data mining techniques to more effectively explore

  6. Advanced fabrication method for the preparation of MOF thin films: Liquid-phase epitaxy approach meets spin coating method.

    KAUST Repository

    Chernikova, Valeriya

    2016-07-14

    Here we report a new and advanced method for the fabrication of highly oriented/polycrystalline metal-organic framework (MOF) thin films. Building on the attractive features of the liquid-phase epitaxy (LPE) approach, a facile spin coating method was implemented to generate MOF thin films in a high-throughput fashion. Advantageously, this approach offers great prospects for cost-effectively constructing thin films with a significantly shortened preparation time and reduced consumption of chemicals and solvents, as compared to the conventional LPE process. Indeed, this new spin-coating approach has been implemented successfully to construct various MOF thin films, ranging in thickness from a few micrometers down to the nanometer scale, spanning 2-D and 3-D benchmark MOF materials including Cu2(bdc)2•xH2O, Zn2(bdc)2•xH2O, HKUST-1 and ZIF-8. This method was appraised and proved effective on a variety of substrates comprising functionalized gold, silicon, glass, porous stainless steel and aluminum oxide. The facile, high-throughput and cost-effective nature of this approach, coupled with the successful thin film growth and substrate versatility, represents the next generation of methods for MOF thin film fabrication, thereby paving the way for these unique MOF materials to address a wide range of challenges in the areas of sensing devices and membrane technology.

  7. Sensitivity analysis of infectious disease models: methods, advances and their application

    Science.gov (United States)

    Wu, Jianyong; Dhingra, Radhika; Gambhir, Manoj; Remais, Justin V.

    2013-01-01

    Sensitivity analysis (SA) can aid in identifying influential model parameters and optimizing model structure, yet infectious disease modelling has yet to adopt advanced SA techniques that are capable of providing considerable insights over traditional methods. We investigate five global SA methods—scatter plots, the Morris and Sobol' methods, Latin hypercube sampling-partial rank correlation coefficient and the sensitivity heat map method—and detail their relative merits and pitfalls when applied to a microparasite (cholera) and macroparasite (schistosomiasis) transmission model. The methods investigated yielded similar results with respect to identifying influential parameters, but offered specific insights that vary by method. The classical methods differed in their ability to provide information on the quantitative relationship between parameters and model output, particularly over time. The heat map approach provides information about the group sensitivity of all model state variables, and the parameter sensitivity spectrum obtained using this method reveals the sensitivity of all state variables to each parameter over the course of the simulation period, especially valuable for expressing the dynamic sensitivity of a microparasite epidemic model to its parameters. A summary comparison is presented to aid infectious disease modellers in selecting appropriate methods, with the goal of improving model performance and design. PMID:23864497
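
    As an illustration of one of the classical techniques compared in the paper, the sketch below draws a Latin hypercube sample over assumed parameter ranges and computes partial rank correlation coefficients for a toy transmission-model output; the model and ranges are invented for demonstration and do not reproduce the cholera or schistosomiasis models analysed by the authors.

```python
import numpy as np
from scipy.stats import qmc, rankdata

def prcc(X, y):
    """Partial rank correlation coefficient of each parameter (column of X) with output y."""
    R = np.column_stack([rankdata(col) for col in X.T])
    ry = rankdata(y)
    out = []
    for j in range(R.shape[1]):
        others = np.column_stack([np.ones(len(ry)), np.delete(R, j, axis=1)])
        # residuals of parameter j and of the output after removing the other parameters
        res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
        res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
        out.append(np.corrcoef(res_x, res_y)[0, 1])
    return np.array(out)

# Toy final-size proxy: output grows with transmission rate beta and initial
# susceptibles S0, and falls with recovery rate gamma (illustrative only).
def model(beta, gamma, S0):
    return S0 * (1.0 - np.exp(-beta / gamma))

# Latin hypercube sample of the three parameters over assumed ranges
sampler = qmc.LatinHypercube(d=3, seed=0)
unit = sampler.random(500)
lows, highs = np.array([0.1, 0.05, 500.0]), np.array([1.0, 0.5, 5000.0])
params = qmc.scale(unit, lows, highs)
y = model(params[:, 0], params[:, 1], params[:, 2])

for name, value in zip(["beta", "gamma", "S0"], prcc(params, y)):
    print(f"PRCC({name}) = {value:+.2f}")
```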

  8. Advanced methods comparisons of reaction rates in the Purdue Fast Breeder Blanket Facility

    International Nuclear Information System (INIS)

    Hill, R.N.; Ott, K.O.

    1988-01-01

    A review of worldwide results revealed that reaction rates in the blanket region are generally underpredicted with the discrepancy increasing with penetration; however, these results vary widely. Experiments in the large uniform Purdue Fast Breeder Blanket Facility (FBBF) blanket yield an accurate quantification of this discrepancy. Using standard production code methods (diffusion theory with 50 group cross sections), a consistent Calculated/Experimental (C/E) drop-off was observed for various reaction rates. A 50% increase in the calculated results at the outer edge of the blanket is necessary for agreement with experiments. The usefulness of refined group constant generation utilizing specialized weighting spectra and transport theory methods in correcting this discrepancy was analyzed. Refined group constants reduce the discrepancy to half that observed using the standard method. The surprising result was that transport methods had no effect on the blanket deviations; thus, transport theory considerations do not constitute or even contribute to an explanation of the blanket discrepancies. The residual blanket C/E drop-off (about half the standard drop-off) using advanced methods must be caused by some approximations which are applied in all current methods. 27 refs., 3 figs., 1 tab

  9. Effect of method of analysis on iron content of beef from advanced meat recovery systems.

    Science.gov (United States)

    Windham, W R; Field, R A

    2000-12-01

    A field survey was conducted by the USDA, Food Safety Inspection Service (FSIS) to provide analytical data on meat obtained from beef cervical vertebrae processed by advanced meat recovery (AMR) systems. As a result, an added iron performance standard was proposed to limit the amount of marrow in AMR products. The performance standard was based on iron content of hand boned lean compared to AMR lean. Iron content was determined by a hydrochloric wet ash digestion method. The same samples were then analyzed using dry ash digestion. The objectives of the study were to determine differences in iron content of the survey samples due to the digestion method and the impact of this difference on the added iron performance standard. Iron values by the dry ash method were approximately double those of the wet ash method. The difference was a result of incomplete volatilization of the organic matrix by hydrochloric acid in the wet ash procedure. The performance standards developed from the wet and dry ash methods were 1.8 and 3.2 mg added iron per 100 g, respectively. Added iron levels from the dry ash method greater than 3.2 mg per 100 g were present in 60% of the AMR lean, indicating that some marrow was present or that factors other than the amount of iron in hand boned lean should be considered before a performance standard is established.

  10. A Mixed Methods Study: African American Students' Performance Trends and Perceptions Towards Advanced Placement Literature Courses and Examinations

    Science.gov (United States)

    Buford, Brandie J.

    2012-01-01

    The purpose of this mixed methods study was to describe the perceptions of African American students pertaining to their engagement in Advanced Placement English Literature and Composition course and Advanced Placement English Literature and Composition examination. A purposive sampling design was employed to select 12 participants from one urban…

  11. Alternative oil extraction methods from Echium plantagineum L. seeds using advanced techniques and green solvents.

    Science.gov (United States)

    Castejón, Natalia; Luna, Pilar; Señoráns, Francisco J

    2018-04-01

    The edible oil processing industry involves large losses of organic solvent into the atmosphere and long extraction times. In this work, fast and environmentally friendly alternatives for the production of echium oil using green solvents are proposed. Advanced extraction techniques such as Pressurized Liquid Extraction (PLE), Microwave Assisted Extraction (MAE) and Ultrasound Assisted Extraction (UAE) were evaluated to efficiently extract omega-3 rich oil from Echium plantagineum seeds. Extractions were performed with ethyl acetate, ethanol, water and ethanol:water to develop a hexane-free processing method. Optimal PLE conditions with ethanol at 150 °C during 10 min produced a very similar oil yield (31.2%) to Soxhlet using hexane for 8 h (31.3%). UAE optimized method with ethanol at mild conditions (55 °C) produced a high oil yield (29.1%). Consequently, advanced extraction techniques showed good lipid yields and furthermore, the produced echium oil had the same omega-3 fatty acid composition than traditionally extracted oil. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. SIS 2013 Statistical Conference “Advances in Latent Variables. Methods, Models and Applications”

    CERN Document Server

    Brentari, Eugenio; Qannari, El; Advances in Latent Variables : Methods, Models and Applications

    2015-01-01

    The book, belonging to the series “Studies in Theoretical and Applied Statistics – Selected Papers from the Statistical Societies”, presents a peer-reviewed selection of contributions on relevant topics organized by the editors on the occasion of the SIS 2013 Statistical Conference "Advances in Latent Variables. Methods, Models and Applications", held at the Department of Economics and Management of the University of Brescia from June 19 to 21, 2013. The focus of the book is on advances in statistical methods for analyses with latent variables. In fact, in recent years, there has been increasing interest in this broad research area from both a theoretical and an applied point of view, as the statistical latent variable approach allows the effective modeling of complex real-life phenomena in a wide range of research fields. A major goal of the volume is to bring together articles written by statisticians from different research fields, which present different approaches and experiences related to the...

  13. A study on the economics enhancement of OPR1000 applied to advanced construction methods

    International Nuclear Information System (INIS)

    Park, Ki Jo; Yoon, Eun Sang

    2007-01-01

    OPR1000 (Optimized Power Reactor 1000MW) is a thoroughly improved design model of Korean nuclear power plants and the latest 1,000MW nuclear power plant in the Republic of Korea. Shin Kori 1 and 2 and Shin Wolsong 1 and 2 are under construction, and these are OPR1000 types. Although the OPR1000 is an up-to-date 1,000MW nuclear power plant, it is not markedly superior to other nuclear power plants. Under WTO and FTA circumstances, with stiff domestic and overseas competition for nuclear power plants, it is necessary to enhance the economics of the OPR1000. Accordingly, alternatives for enhancing economics are reviewed and advanced construction methods are considered. Based on research and a comprehensive review of nuclear power plant construction experiences, an alternative application of advanced construction methods is developed and compared with the existing OPR1000 in terms of schedule and economics. In this paper, economic analyses of construction cost and levelized electricity generation cost are performed

  14. Advanced methods for the study of PWR cores; Les methodes d'etudes avancees pour les coeurs de REP

    Energy Technology Data Exchange (ETDEWEB)

    Lambert, M.; Salvatores, St.; Ferrier, A. [Electricite de France (EDF), Service Etudes et Projets Thermiques et Nucleaires, 92 - Courbevoie (France); Pelet, J.; Nicaise, N.; Pouliquen, J.Y.; Foret, F. [FRAMATOME ANP, 92 - Paris La Defence (France); Chauliac, C. [CEA Saclay, Dir. de l' Energie Nucleaire (DEN), 91 - Gif sur Yvette (France); Johner, J. [CEA Cadarache, Dept. de Recherches sur la Fusion Controlee (DRFC), 13 - Saint Paul lez Durance (France); Cohen, Ch

    2003-07-01

    This document gathers the transparencies presented at the 6th technical session of the French nuclear energy society (SFEN) in October 2003. The transparencies of the annual meeting are presented in the introductory part: 1 - status of the French nuclear park: nuclear energy results, management of an exceptional climatic situation: the heat wave of summer 2003 and the power generation (J.C. Barral); 2 - status of the research on controlled thermonuclear fusion (J. Johner). Then follows the technical session about the advanced methods for the study of PWR reactor cores: 1 - the evolution approach of study methodologies (M. Lambert, J. Pelet); 2 - the point of view of the nuclear safety authority (D. Brenot); 3 - the improved decoupled methodology for the steam pipe rupture (S. Salvatores, J.Y. Pouliquen); 4 - the MIR method for the pellet-clad interaction (renovated IPG methodology) (E. Baud, C. Royere); 5 - the improved fuel management (IFM) studies for Koeberg (C. Cohen); 6 - principle of the methods of accident study implemented for the European pressurized reactor (EPR) (F. Foret, A. Ferrier); 7 - accident studies with the EPR, steam pipe rupture (N. Nicaise, S. Salvatores); 8 - the co-development platform, a new generation of software tools for the new methodologies (C. Chauliac). (J.S.)

  15. Methods for quantification of soil-transmitted helminths in environmental media: current techniques and recent advances

    Science.gov (United States)

    Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.

    2015-01-01

    Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788

  16. Application of advanced statistical methods in assessment of the late phase of a nuclear accident

    International Nuclear Information System (INIS)

    Hofman, R.

    2008-01-01

    The paper presents a new methodology for improving estimates of the radiological situation on terrain in the late phase of a nuclear accident. Methods of Bayesian filtering are applied to the problem. The estimates are based on a combination of modeled and measured data provided by responsible authorities. Exploiting information on the uncertainty of both data sources, we are able to produce an improved estimate of the true situation on terrain. We also attempt to account for model error, which is unknown and plays a crucial role in the accuracy of the estimates. The main contribution of this paper is the application of an approach based on advanced statistical methods, which allows the model error covariance structure to be estimated from measurements. The model error is estimated on the basis of measured-minus-modeled residuals evaluated from measured and modeled values. The methodology is demonstrated on a sample scenario with simulated measurements. (authors)
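
    The sketch below illustrates the kind of measurement update that underlies such Bayesian filtering approaches: a modeled dose-rate field is corrected with point measurements, with the model-error and measurement-error covariances setting the relative weights. The grid, covariances and numbers are invented, and the paper's actual estimation of the model error covariance from residuals is not reproduced.

```python
import numpy as np

def analysis_update(model_field, H, measurements, B, R):
    """Combine a modeled field with measurements using the standard
    Bayesian/Kalman analysis step:  x_a = x_b + K (y - H x_b),
    with K = B H^T (H B H^T + R)^-1.
    B is the (assumed) model-error covariance, R the measurement-error covariance."""
    innovation = measurements - H @ model_field
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.inv(S)
    x_a = model_field + K @ innovation
    A = (np.eye(len(model_field)) - K @ H) @ B     # analysis error covariance
    return x_a, A

# Toy example: 5 grid points of modeled dose rate (arbitrary units), 2 monitors
x_b = np.array([1.0, 1.2, 1.5, 1.1, 0.9])
H = np.zeros((2, 5))
H[0, 1] = 1.0
H[1, 3] = 1.0                                      # monitors sit at grid points 1 and 3
y = np.array([1.6, 1.0])                           # "measured" dose rates, invented

# Assumed exponential spatial correlation of model error; diagonal measurement error
dist = np.abs(np.subtract.outer(np.arange(5), np.arange(5)))
B = 0.2 * np.exp(-dist / 2.0)
R = 0.01 * np.eye(2)

x_a, A = analysis_update(x_b, H, y, B, R)
print("analysed field:", np.round(x_a, 3))
```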

  17. ARN Training Course on Advance Methods for Internal Dose Assessment: Application of Ideas Guidelines

    International Nuclear Information System (INIS)

    Rojo, A.M.; Gomez Parada, I.; Puerta Yepes, N.; Gossio, S.

    2010-01-01

    Dose assessment in the case of internal exposure involves the estimation of committed effective dose based on the interpretation of bioassay measurements and on assumptions about the characteristics of the radioactive material and the time pattern and pathway of intake. The IDEAS Guidelines provide a method to harmonize dose evaluations using criteria and flow-chart procedures to be followed step by step. The EURADOS Working Group 7 'Internal Dosimetry', in collaboration with the IAEA and the Czech Technical University (CTU) in Prague, promoted the 'EURADOS/IAEA Regional Training Course on Advanced Methods for Internal Dose Assessment: Application of IDEAS Guidelines' to broaden and encourage the use of the IDEAS Guidelines; the course took place in Prague (Czech Republic) from 2-6 February 2009. The ARN recognized the relevance of this training and requested a place to participate in this activity. Subsequently, the first training course in Argentina took place from 24-28 August to train local internal dosimetry experts. (authors)

  18. Research advances in theories and methods of community assembly and succession

    Directory of Open Access Journals (Sweden)

    WenJun Zhang

    2014-09-01

    Full Text Available Community succession refers to the regular and predictable process of species replacement in an environment from which all species have been eliminated or that has been disturbed. Community assembly is the process by which species grow and interact to establish a community. Community assembly stresses the change of a community over a single phase. So far, many theories and methods have been proposed for community assembly and succession. In the present article I introduce research advances in theories and methods of community assembly and succession. Finally, continuing my past propositions, I further propose a unified theory and methodology of community assembly and succession. I suggest that community assembly and succession is a process of self-organization and follows the major principles and mechanisms of self-organization. Agent-based modeling is suggested for describing the dynamics of community assembly and succession.
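
    The article proposes agent-based modeling only at the conceptual level. As a hypothetical toy, the sketch below simulates assembly on a set of local sites where species colonize from a regional pool, stronger competitors displace weaker residents, and random disturbance resets sites; all parameters are invented for illustration.

```python
import random

random.seed(42)

N_SPECIES = 5
GRID = 20 * 20                              # number of local sites
COMPETITIVE_RANK = list(range(N_SPECIES))   # higher index = stronger competitor
COLONIZATION_P = 0.05                       # per-species, per-site colonization probability
DISTURBANCE_P = 0.01                        # per-site probability of local extinction

def step(community):
    """One assembly/succession step: disturbance, then colonization and competitive displacement."""
    for site in range(GRID):
        if community[site] is not None and random.random() < DISTURBANCE_P:
            community[site] = None          # local disturbance empties the site
        for sp in range(N_SPECIES):
            if random.random() < COLONIZATION_P:
                resident = community[site]
                # empty sites are colonized; occupied sites only by stronger competitors
                if resident is None or COMPETITIVE_RANK[sp] > COMPETITIVE_RANK[resident]:
                    community[site] = sp
    return community

community = [None] * GRID
for t in range(200):
    community = step(community)

abundance = {sp: community.count(sp) for sp in range(N_SPECIES)}
print("abundances after 200 steps:", abundance)
```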

  19. INVESTIGATIONS OF THE FLOW INTO A STORAGE TANK BY MEANS OF ADVANCED EXPERIMENTAL AND THEORETICAL METHODS

    DEFF Research Database (Denmark)

    Jordan, Ulrike; Shah, Louise Jivan; Furbo, Simon

    2003-01-01

    a method called Particle Image Velocimetry (PIV) was applied. Particles with a size of 1 to 10 mm were seeded in the water and then illuminated by a laser within a narrow plane. In order to measure the three velocity components of the flow within the plane, the particle displacements between laser pulses......Advanced experimental methods were applied to study flow structures of a water jet entering a tank from the bottom. A squared experimental glass tank with a volume of about 140 l was used. Above the inlet pipe a flat plate was installed, as shown in the figure. The goal of the investigations...... that the luminescence intensity depends on the water temperature, the temperature fields in the tank can be visualized and also be recorded with a camera. The measurements were compared with calculations of the flow and temperature fields carried out with the Computational Fluid Dynamics (CFD) tool Fluent. In future...

  20. Recent advances in computational methods and clinical applications for spine imaging

    CERN Document Server

    Glocker, Ben; Klinder, Tobias; Li, Shuo

    2015-01-01

    This book contains the full papers presented at the MICCAI 2014 workshop on Computational Methods and Clinical Applications for Spine Imaging. The workshop brought together scientists and clinicians in the field of computational spine imaging. The chapters included in this book present and discuss the new advances and challenges in these fields, using several methods and techniques to address different and timely applications more efficiently, involving signal and image acquisition, image processing and analysis, image segmentation, image registration and fusion, computer simulation, image-based modeling, simulation and surgical planning, image-guided robot-assisted surgery and image-based diagnosis. The book also includes papers and reports from the first challenge on vertebra segmentation held at the workshop.

  1. Combining geoelectrical and advanced lysimeter methods to characterize heterogeneous flow and transport under unsaturated transient conditions

    Science.gov (United States)

    Wehrer, M.; Skowronski, J.; Binley, A. M.; Slater, L. D.

    2013-12-01

    Our ability to predict flow and transport processes in the unsaturated critical zone is considerably limited by two characteristics: heterogeneity of flow and transience of boundary conditions. The causes of heterogeneous - or preferential - flow and transport are fairly well understood, yet the characterization and quantification of such processes in natural profiles remains challenging. This is due to current methods of observation, such as staining and isotope tracers, being unable to observe multiple events on the same profile and offering limited spatial information. In our study we demonstrate an approach to characterize preferential flow and transport processes applying a combination of geoelectrical methods and advanced lysimeter techniques. On an agricultural soil profile, which was transferred undisturbed into a lysimeter container, we applied systematically varied input flow boundary conditions, resembling natural precipitation events. We simultaneously measured the breakthrough of a conservative tracer. Flow and transport in the soil column were observed using electrical resistivity tomography (ERT), tensiometers, water content probes and a multicompartment suction plate (MSP). These techniques allowed a direct ground-truthing of soil moisture and pore fluid resistivity changes estimated noninvasively using ERT. We were able to image both the advancing infiltration front and the advancing tracer front using time lapse ERT. Water content changes associated with the advancing infiltration front dominated over pore fluid conductivity changes during short term precipitation events. Conversely, long term displacement of the solute front was monitored during periods of constant water content in between infiltration events. We observed preferential flow phenomena through ERT and through the MSP, which agreed in general terms. The preferential flow fraction was observed to be independent of precipitation rate. This suggests the presence of a fingering process

  2. NATO Advanced Research Workshop, 19-22 May 1997: Rapid Method for Monitoring the Environment for Biological Hazards

    National Research Council Canada - National Science Library

    1997-01-01

    The NATO Advanced Research Workshop met for the purpose of bringing to light rapid methods for monitoring the environment for biological hazards such as biological warfare agents, naturally occurring...

  3. Recent developments in atomic spectrometric methods as tools for the analysis of advanced materials with the example of advanced ceramics

    International Nuclear Information System (INIS)

    Broekaert, J.A.C.

    1995-01-01

    Atomic spectrometric methods are based on emission, absorption and fluorescence processes and use radiation in the UV and VIS region. These methods are suitable for the direct analysis of solids. The first type of methodology is extremely attractive for routine analysis. Atomic spectrometric methods are relative methods and are characterized by comparison with independent methods, which are widely available in analytical laboratories. For solution analysis, inductively coupled plasma (ICP) sources have become available. For electrically non-conducting samples such as ceramics, techniques such as laser ablation in combination with various types of spectrometry are discussed. (A.B.)

  4. Setting health research priorities using the CHNRI method: IV. Key conceptual advances.

    Science.gov (United States)

    Rudan, Igor

    2016-06-01

    Child Health and Nutrition Research Initiative (CHNRI) started as an initiative of the Global Forum for Health Research in Geneva, Switzerland. Its aim was to develop a method that could assist priority setting in health research investments. The first version of the CHNRI method was published in 2007-2008. The aim of this paper was to summarize the history of the development of the CHNRI method and its key conceptual advances. The guiding principle of the CHNRI method is to expose the potential of many competing health research ideas to reduce disease burden and inequities that exist in the population in a feasible and cost-effective way. The CHNRI method introduced three key conceptual advances that led to its increased popularity in comparison to other priority-setting methods and processes. First, it proposed a systematic approach to listing a large number of possible research ideas, using the "4D" framework (description, delivery, development and discovery research) and a well-defined "depth" of proposed research ideas (research instruments, avenues, options and questions). Second, it proposed a systematic approach for discriminating between many proposed research ideas based on a well-defined context and criteria. The five "standard" components of the context are the population of interest, the disease burden of interest, geographic limits, time scale and the preferred style of investing with respect to risk. The five "standard" criteria proposed for prioritization between research ideas are answerability, effectiveness, deliverability, maximum potential for disease burden reduction and the effect on equity. However, both the context and the criteria can be flexibly changed to meet the specific needs of each priority-setting exercise. Third, it facilitated consensus development through measuring collective optimism on each component of each research idea among a larger group of experts using a simple scoring system. This enabled the use of the knowledge of

  5. Advanced Online Flux Mapping of CANDU PHWR by Least-Squares Method

    International Nuclear Information System (INIS)

    Hong, In Seob; Kim, Chang Hyo; Suk, Ho Chun

    2005-01-01

    A least-squares method that solves both the core neutronics design equations and the in-core detector response equations on the least-squares principle is presented as a new advanced online flux-mapping method for CANada Deuterium Uranium (CANDU) pressurized heavy water reactors (PHWRs). The effectiveness of the new flux-mapping method is examined in terms of online flux-mapping calculations with a numerically simulated true flux distribution and detector signals, and with actual core-follow data for the Wolsong CANDU PHWRs in Korea. The effects of core neutronics models as well as detector failures and uncertainties of measured detector signals on the effectiveness of the least-squares flux-mapping calculations are also examined. The following results are obtained. The least-squares method predicts the flux distribution in better agreement with the simulated true flux distribution than the standard core neutronics calculations by the finite difference method (FDM) computer code without using the detector signals. The adoption of the nonlinear nodal method based on the unified nodal method formulation instead of the FDM results in a significant improvement in the prediction accuracy of the flux-mapping calculations. The detector signals estimated from the least-squares flux-mapping calculations are much closer to the measured detector signals than those from the flux synthesis method (FSM), the current online flux-mapping method for CANDU reactors. The effect of detector failures is relatively small, so that the plant can tolerate up to 25% of detector failures without seriously affecting plant operation. The detector signal uncertainties degrade the accuracy of the flux-mapping calculations, yet signal uncertainties of the order of 1% standard deviation can be tolerated without seriously degrading the prediction accuracy of the least-squares method. The least-squares method is disadvantageous because it requires longer CPU time than the
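
    To make the least-squares principle concrete, the sketch below stacks a tiny invented "core neutronics" equation set together with two detector-response equations, weights them, and solves the combined overdetermined system. It only illustrates the idea of blending model equations and detector signals in a single least-squares solve and bears no relation to the actual CANDU formulation.

```python
import numpy as np

# Toy 4-node "core": a discretised neutronics operator A phi = s (invented numbers)
A = np.array([[ 2.0, -1.0,  0.0,  0.0],
              [-1.0,  2.1, -1.0,  0.0],
              [ 0.0, -1.0,  2.2, -1.0],
              [ 0.0,  0.0, -1.0,  2.0]])
s = np.array([1.0, 1.1, 1.2, 1.0])

# Two in-core detectors respond to local flux: d = C phi (+ noise)
C = np.array([[0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0, 0.0]])
d_meas = np.array([1.95, 2.40])          # "measured" signals, invented

# Stack both equation sets, weight them by assumed confidence, and solve
# min || W (M phi - b) ||^2 with numpy's least-squares solver.
w_model, w_det = 1.0, 3.0                # trust detectors more than the coarse model
M = np.vstack([w_model * A, w_det * C])
b = np.concatenate([w_model * s, w_det * d_meas])
phi_ls, *_ = np.linalg.lstsq(M, b, rcond=None)

phi_model_only = np.linalg.solve(A, s)   # standard calculation without detector signals
print("model-only flux   :", np.round(phi_model_only, 3))
print("least-squares flux:", np.round(phi_ls, 3))
```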

  6. The Advanced Aluminum Nitride Synthesis Methods and Its Applications: Patent Review.

    Science.gov (United States)

    Shishkin, Roman A; Elagin, Andrey A; Mayorova, Ekaterina S; Beketov, Askold R

    2016-01-01

    High purity nanosized aluminum nitride synthesis is a current issue for both industry and science. However, there is no up-to-date review considering the major issues and the technical solutions for different methods. This review aims to investigate the advanced methods of aluminum nitride synthesis and its development tendencies. Also the aluminum nitride application patents and prospects for development of the branch have been considered. The patent search on "aluminum nitride synthesis" has been carried out. The research activity has been analyzed. Special attention has been paid to the patenting geography and the leading researchers in aluminum nitride synthesis. Aluminum nitride synthesis methods have been divided into 6 main groups; the most studied approaches are carbothermal reduction (88 patents) and direct nitridation (107 patents). The current issues for each group have been analyzed; the main trends are purification of the final product and nanopowder synthesis. The leading researchers in aluminum nitride synthesis have represented 5 countries, namely: Japan, China, Russia, South Korea and the USA. The main aluminum nitride application spheres are electronics (59.1 percent of applications) and new materials manufacturing (30.9 percent). The review deals with the state-of-the-art data in nanosized aluminum nitride synthesis, the major issues and the technical solutions for different synthesis methods. It gives a full understanding of the development tendencies and of the current leaders in the sphere.

  7. A study on dynamic evaluation methods for human-machine interfaces in advanced control rooms

    International Nuclear Information System (INIS)

    Park, Jin Kyun

    1998-02-01

    Extensive efforts have been made to reveal the factors that most affect the safety of nuclear power plants (NPPs). Among them, human factors are known to be a dominant cause of severe accidents, such as the Three Mile Island and Chernobyl accidents. Thus, much effort has been spent on resolving problems related to human factors, and one of these efforts is advanced control room (ACR) design to enhance human performance and the safety of NPPs. There are two important trends in the design of ACRs. The first is an increasing level of automation, and the second is the development of computer-based compact workstations for control room operations, including intelligent operator aid systems. However, several problems have been reported when other factors are not properly incorporated into the design of ACRs. Among them, one of the most important factors that significantly affects operator performance is the design of human-machine interfaces (HMIs). Thus, HMI evaluation should be emphasized to ensure the appropriateness of HMI designs and the safety of NPPs. In general, two kinds of evaluations have been frequently used to assess the appropriateness of a proposed HMI design. One is static evaluation and the other is dynamic evaluation. Here, static evaluation is based on guidelines extracted from various research on HMI design, while dynamic evaluation generally attempts to evaluate and predict human performance through a model that can describe the cognitive behaviors of humans or the interactions between HMIs and humans. However, static evaluation seems inappropriate because it cannot properly capture the context of the task environment, which strongly affects human performance. In addition, in the case of dynamic evaluations, the development of a model that can sufficiently describe the interactions or cognitive behaviors of human operators is very arduous and laborious. To overcome these problems, dynamic evaluation methods that can

  8. Advanced Thermal Energy Conversion of Temperature under 300°C by Thermoelectric Conversion Method

    Science.gov (United States)

    Ueda, Tadashi; Uchida, Yoshiyuki; Shingu, Hiroyasu

    Many approaches to energy conversion have been developed throughout the world. However, it is difficult to meet the global warming countermeasures based on the Kyoto Protocol. Until now, effective utilization of low temperature thermal energy (under 300°C) has not advanced. For example, no effective utilization method has been established for waste heat energy arising from industrial machine tools, automobiles and internal combustion engines, or for thermal energy from the natural environment. In this paper, we report an experiment on the effective utilization of low temperature (under 300°C) thermal energy conversion. The device used for the measurement is a copper thermo device. A thermo electromotive force of 150mW/cm2 was obtained at 200°C. The obtained thermo electromotive force is about 15 times higher than that of the generally used alumel-chromel thermocouple. Our aim is to utilize low temperature thermal energy effectively by converting it into electricity.

  9. Advances in Fabrication Materials of Honeycomb Structure Films by the Breath-Figure Method

    Science.gov (United States)

    Heng, Liping; Wang, Bin; Li, Muchen; Zhang, Yuqi; Jiang, Lei

    2013-01-01

    Creatures in nature possess almost perfect structures and properties, and exhibit harmonization and unification between structure and function. Biomimetics, mimicking nature for engineering solutions, provides a model for the development of functional surfaces with special properties. Recently, honeycomb structure materials have attracted wide attention for both fundamental research and practical applications and have become an increasingly hot research topic. Though progress in the field of breath-figure formation has been reviewed, advances in the fabrication materials of bio-inspired honeycomb structure films have not been discussed. Here we review the recent progress of honeycomb structure fabrication materials prepared by the breath-figure method. The application of breath figures for the generation of all kinds of honeycomb structures is discussed. PMID:28809319

  10. Production of advanced materials by methods of self-propagating high-temperature synthesis

    CERN Document Server

    Tavadze, Giorgi F

    2013-01-01

    This translation of the original Russian book outlines the production of a variety of materials by methods of self-propagating high-temperature synthesis (SHS). The types of materials discussed include hard, refractory, corrosion- and wear-resistant materials, as well as other advanced and speciality materials. The authors address the issue of optimal parameters for SHS reactions occurring during processes involving a preliminary metallothermic reduction stage, and they calculate these using thermodynamic approaches. In order to confirm the effectiveness of this approach, the authors describe experiments focussing on the synthesis of elemental crystalline boron, boron carbides and nitrides. Other parts of this brief include theoretical and experimental results on single-stage production of hard alloys on the basis of titanium and zirconium borides, as well as the macrokinetics of degassing and compaction of SHS products. This brief is suitable for academics, as well as those working in industrial manufacturing com...

  11. Further development of the Dynamic Control Assemblies Worth Measurement Method for Advanced Reactivity Computers

    International Nuclear Information System (INIS)

    Petenyi, V.; Strmensky, C.; Jagrik, J.; Minarcin, M.; Sarvaic, I.

    2005-01-01

    The dynamic control assemblies worth measurement technique is a quick method for validation of predicted control assemblies worth. The dynamic control assemblies worth measurement utilizes space-time corrections for the measured out-of-core ionization chamber readings, calculated by the DYN 3D computer code. The space-time correction arising from the prompt neutron density redistribution in the measured ionization chamber reading can be directly applied in the advanced reactivity computer. The second correction, concerning the difference in the spatial distribution of delayed neutrons, can be calculated by simulating the measurement procedure with the dynamic version of the DYN 3D code. In the paper, some results of dynamic control assemblies worth measurements applied at NPP Mochovce are presented (Authors)

  12. Advanced Energy Storage Devices: Basic Principles, Analytical Methods, and Rational Materials Design.

    Science.gov (United States)

    Liu, Jilei; Wang, Jin; Xu, Chaohe; Jiang, Hao; Li, Chunzhong; Zhang, Lili; Lin, Jianyi; Shen, Ze Xiang

    2018-01-01

    Tremendous efforts have been dedicated to the development of high-performance energy storage devices with nanoscale design and hybrid approaches. The boundary between electrochemical capacitors and batteries is becoming less distinctive. The same material may display capacitive or battery-like behavior depending on the electrode design and the charge-storage guest ions. Therefore, the underlying mechanisms and the electrochemical processes occurring upon charge storage may be confusing for researchers who are new to the field, as well as for some of the chemists and materials scientists already in the field. This review provides the fundamentals of the similarities and differences between electrochemical capacitors and batteries from a kinetic and materials point of view. Basic techniques and analysis methods to distinguish capacitive and battery-like behavior are discussed. Furthermore, guidelines for material selection, the state-of-the-art materials, and the electrode design rules for advanced electrodes are proposed.

  13. Biophysics of the Eye in Computer Vision: Methods and Advanced Technologies

    Science.gov (United States)

    Hammoud, Riad I.; Hansen, Dan Witzner

    The eyes have it! This chapter describes cutting-edge computer vision methods employed in advanced vision sensing technologies for medical, safety, and security applications, where the human eye represents the object of interest for both the imager and the computer. A camera receives light from the real eye to form a sequence of digital images of it. As the eye scans the environment, or focuses on particular objects in the scene, the computer simultaneously localizes the eye position, tracks its movement over time, and infers measures such as the attention level and the gaze direction in real time and fully automatically. The main focus of this chapter is on computer vision and pattern recognition algorithms for eye appearance variability modeling, automatic eye detection, and robust eye position tracking. This chapter offers good readings and solid methodologies to build the two fundamental low-level building blocks of a vision-based eye tracking technology.
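
    As a concrete, though generic, illustration of the automatic eye detection building block mentioned above, the following sketch uses OpenCV's stock Haar cascades; it is not the appearance-variability models or trackers developed in the chapter, and the cascade choices and parameters are assumptions.

```python
# Hedged sketch: locate eye candidates inside detected faces with OpenCV Haar cascades.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def detect_eyes(frame_bgr):
    """Return (x, y, w, h) boxes for eyes found inside detected faces."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    eyes = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[fy:fy + fh, fx:fx + fw]              # search only inside the face
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            eyes.append((fx + ex, fy + ey, ew, eh))     # map back to frame coordinates
    return eyes
```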

  14. An Advanced Bayesian Method for Short-Term Probabilistic Forecasting of the Generation of Wind Power

    Directory of Open Access Journals (Sweden)

    Antonio Bracale

    2015-09-01

    Full Text Available Currently, among renewable distributed generation systems, wind generators are receiving a great deal of interest due to the significant economic, technological, and environmental incentives they involve. However, the uncertainties due to the intermittent nature of wind energy make it difficult to operate electrical power systems optimally and to make decisions that satisfy the needs of all the stakeholders of the electricity energy market. Thus, there is increasing interest in determining how to forecast wind power production accurately. Most of the methods published in the relevant literature provide deterministic forecasts, even though great interest has recently been focused on probabilistic forecast methods. In this paper, an advanced probabilistic method is proposed for short-term forecasting of wind power production. A mixture of two Weibull distributions was used as a probability function to model the uncertainties associated with wind speed. Then, a Bayesian inference approach with a particularly effective autoregressive integrated moving average model was used to determine the parameters of the mixture Weibull distribution. Numerical applications are also presented to provide evidence of the forecasting performance of the Bayesian-based approach.
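
    The probability model named in the abstract is a two-component Weibull mixture for wind speed. The sketch below fits such a mixture by direct maximum likelihood with scipy on synthetic data; the paper instead estimates the parameters within a Bayesian/ARIMA framework, so the fitting route, starting values and bounds here are assumptions for illustration only.

```python
# Hedged sketch: fit a two-component Weibull mixture to wind-speed samples.
import numpy as np
from scipy import optimize, stats

def mixture_pdf(v, w, k1, c1, k2, c2):
    """Mixture of two Weibull pdfs with weight w, shapes k, scales c."""
    return (w * stats.weibull_min.pdf(v, k1, scale=c1)
            + (1.0 - w) * stats.weibull_min.pdf(v, k2, scale=c2))

def fit_mixture(wind_speeds):
    def nll(theta):
        p = mixture_pdf(wind_speeds, *theta)
        return -np.sum(np.log(p + 1e-12))
    theta0 = [0.5, 2.0, 6.0, 2.0, 10.0]                       # assumed starting point
    bounds = [(0.01, 0.99), (0.1, 10.0), (0.1, 30.0), (0.1, 10.0), (0.1, 30.0)]
    return optimize.minimize(nll, theta0, bounds=bounds).x

rng = np.random.default_rng(0)
v = np.concatenate([stats.weibull_min.rvs(2.0, scale=6.0, size=500, random_state=rng),
                    stats.weibull_min.rvs(2.5, scale=11.0, size=500, random_state=rng)])
print(fit_mixture(v))   # weight, shape/scale of component 1, shape/scale of component 2
```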

  15. Identification of advanced human factors engineering analysis, design and evaluation methods

    International Nuclear Information System (INIS)

    Plott, C.; Ronan, A. M.; Laux, L.; Bzostek, J.; Milanski, J.; Scheff, S.

    2006-01-01

    NUREG-0711 Rev. 2, 'Human Factors Engineering Program Review Model,' provides comprehensive guidance to the Nuclear Regulatory Commission (NRC) in assessing the human factors practices employed by license applicants for nuclear power plant control room designs. As software-based human-system interface (HSI) technologies supplant traditional hardware-based technologies, the NRC may encounter new HSI technologies or seemingly unconventional approaches to human factors design, analysis, and evaluation methods which NUREG-0711 does not anticipate. A comprehensive survey was performed to identify advanced human factors engineering analysis, design and evaluation methods, tools, and technologies that the NRC may encounter in near-term future licensee applications. A review was conducted to identify human factors methods, tools, and technologies relevant to each review element of NUREG-0711. Additionally, emerging trends in technology with the potential to impact review elements, such as augmented cognition and various wireless tools and technologies, were identified. The purpose of this paper is to provide an overview of the survey results and to highlight issues that could be revised or adapted to meet emerging trends. (authors)

  16. An Advanced Method to Apply Multiple Rainfall Thresholds for Urban Flood Warnings

    Directory of Open Access Journals (Sweden)

    Jiun-Huei Jang

    2015-11-01

    Full Text Available Issuing warning information to the public when rainfall exceeds given thresholds is a simple and widely used method to minimize flood risk; however, this method lacks sophistication compared with hydrodynamic simulation. In this study, an advanced methodology is proposed to improve the warning effectiveness of the rainfall threshold method for urban areas through deterministic-stochastic modeling, without sacrificing simplicity and efficiency. With regard to flooding mechanisms, rainfall thresholds of different durations are divided into two groups accounting for flooding caused by drainage overload and by disastrous runoff, which helps in grading the warning level in terms of urgency and severity when the two are observed together. A flood warning is then classified into four levels, distinguished by green, yellow, orange, and red lights in ascending order of priority, which indicate the required measures, from standby and flood defense to evacuation and rescue, respectively. The proposed methodology is tested against 22 historical events from the last 10 years for 252 urbanized townships in Taiwan. The results show satisfactory accuracy in predicting the occurrence and timing of flooding, with a logical warning time series for taking progressive measures. For systems with multiple rainfall thresholds already in place, the methodology can be used to ensure better application of rainfall thresholds in urban flood warnings.
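
    The four-level warning logic described above lends itself to a very small rule set: rainfall accumulations over several durations are checked against the two threshold groups (drainage overload and disastrous runoff) and the combination is mapped to a light color. The sketch below is a hypothetical illustration; all threshold values and the exact combination rule are assumptions, not the calibrated thresholds of the paper.

```python
# Hedged sketch: map rainfall accumulations to a four-level flood warning.
DRAINAGE_THRESHOLDS = {1: 40.0, 3: 70.0}     # mm over 1 h and 3 h (assumed values)
RUNOFF_THRESHOLDS = {6: 150.0, 12: 250.0}    # mm over 6 h and 12 h (assumed values)

def warning_level(rainfall_mm):
    """rainfall_mm: dict mapping duration in hours -> accumulated rainfall in mm."""
    drainage = any(rainfall_mm.get(d, 0.0) >= t for d, t in DRAINAGE_THRESHOLDS.items())
    runoff = any(rainfall_mm.get(d, 0.0) >= t for d, t in RUNOFF_THRESHOLDS.items())
    if drainage and runoff:
        return "red"      # rescue
    if runoff:
        return "orange"   # evacuation
    if drainage:
        return "yellow"   # flood defense
    return "green"        # standby

print(warning_level({1: 45.0, 6: 90.0}))     # -> "yellow"
```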

  17. [Theory and Practice of the Constructive Jigsaw Method in Advancing Domain Knowledge and Skills in Parallel].

    Science.gov (United States)

    Masukawa, Hiroyuki

    2016-01-01

    The Learning Sciences constitute a rapidly expanding discipline that focuses on the learning potential of humans. In this paper, I will discuss the particular learning mechanism involved in the concomitant advancement of domain knowledge and 21st century skills, as well as the Constructive Jigsaw Method of knowledge construction through collaboration-that is, collaborative problem solving. An especially important focus on knowledge construction separates routine experts from adaptive experts. While routine experts develop a core set of skills that they apply throughout their lives with increasing efficiency, adaptive experts are much more likely to change their core skills and continually expand the depth of their expertise. This restructuring of core ideas and skills may reduce their efficiency in the short run but make them more flexible in the long run. The Constructive Jigsaw Method employs a learning mechanism that encourages the development of adaptive experts. Under this method, students first study a piece of material in an expert group. One member from each of several expert groups then joins a new study group, a jigsaw group. The members of this new group then combine what they have learned, creating new knowledge and a deeper understanding of the concept through collaboration, communication, and innovation.

  18. Advances in complexity of beam halo-chaos and its control methods for beam transport networks

    International Nuclear Information System (INIS)

    Fang Jinqing

    2004-11-01

    The complexity theory of beam halo-chaos in beam transport networks and its control methods, a new subject in the high-tech field, is discussed. It is pointed out that in recent years there has been growing interest in high-power proton beams from linear accelerators, due to their attractive features for possible breakthrough applications in national defense and industry. In particular, high-current accelerator-driven clean nuclear power systems for various energy applications have been one of the central issues of current research, because they promise a safer, cleaner and cheaper nuclear energy resource. However, halo-chaos in high-current beam transport networks has become a key issue of concern, because it can generate excessive radioactivity and therefore significantly limits applications. It is very important to study the complexity properties of beam halo-chaos, to understand the basic physical mechanisms of halo-chaos formation, and to develop effective control methods for its suppression. These are very challenging subjects for current research. The main research advances on these subjects, including experimental investigations and theoretical research, and especially some very efficient control methods developed through many years of effort by the authors, are reviewed and summarized. Finally, some research outlooks are given. (author)

  19. Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    International Nuclear Information System (INIS)

    Katsaounis, T D

    2005-01-01

    The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. In summary, the book focuses on the computational and implementational issues involved in solving partial differential equations. The potential reader should have a basic knowledge of PDEs and the finite difference and finite element methods. The examples presented are solved within the programming framework of Diffpack and the reader should have prior experience with the particular software in order to take full advantage of the book. Overall

  20. Advances in Spectral Nodal Methods applied to SN Nuclear Reactor Global calculations in Cartesian Geometry

    International Nuclear Information System (INIS)

    Barros, R.C.; Filho, H.A.; Oliveira, F.B.S.; Silva, F.C. da

    2004-01-01

    Presented here are advances in spectral nodal methods for discrete ordinates (SN) eigenvalue problems in Cartesian geometry. These coarse-mesh methods are based on three ingredients: (i) the use of the standard discretized spatial balance SN equations; (ii) the use of the non-standard spectral diamond (SD) auxiliary equations in the multiplying regions of the domain, e.g. fuel assemblies; and (iii) the use of the non-standard spectral Green's function (SGF) auxiliary equations in the non-multiplying regions of the domain, e.g. the reflector. In slab geometry the hybrid SD-SGF method generates numerical results that are completely free of spatial truncation errors. In X,Y-geometry, we obtain a system of two 'slab-geometry' SN equations for the node-edge average angular fluxes by transverse-integrating the X,Y-geometry SN equations separately in the y- and then in the x-directions within an arbitrary node of the spatial grid set up on the domain. In this paper, we approximate the transverse leakage terms by constants. These are the only approximations considered in the SD-SGF-constant nodal method, as the source terms, which include scattering and possibly fission events, are treated exactly. Moreover, we describe in this paper the progress of the approximate SN albedo boundary conditions for replacing the non-multiplying regions around the nuclear reactor core. We show numerical results for typical model problems to illustrate the accuracy of spectral nodal methods for coarse-mesh SN criticality calculations. (Author)

  1. STEEP STREAMS - Solid Transport Evaluation and Efficiency in Prevention: Sustainable Techniques of Rational Engineering and Advanced MethodS

    Science.gov (United States)

    Armanini, Aronne; Cardoso, Antonio H.; Di Baldassarre, Giuliano; Bellin, Alberto; Breinl, Korbinian; Canelas, Ricardo B.; Larcher, Michele; Majone, Bruno; Matos, Jorges; Meninno, Sabrina; Nucci, Elena; Rigon, Riccardo; Rosatti, Giorgio; Zardi, Dino

    2017-04-01

    The STEEP STREAMS (Solid Transport Evaluation and Efficiency in Prevention: Sustainable Techniques of Rational Engineering and Advanced MethodS) project is a collaboration among the Universities of Trento, Uppsala and Lisbon, who joined in a consortium within the ERANET Water JPI call WaterWorks2014. The aim of the project is to produce new rational criteria for the design of protection works against debris flows, a phenomenon consisting of hyper-concentrated flows of water and sediments, classified as catastrophic events typical of small mountainous basins (area climate change. In this context, one of the key challenges of this project is the use of comparatively coarse RCM projections for the small catchments examined in STEEP STREAMS. Given these changes, conventional protection works and their design criteria may not suffice to provide adequate levels of protection for human life and urban settlements. These structures create a storage area upstream of the alluvial fans and the settlements, thereby reducing the need for channelization in areas often constrained by urban regulations. To optimize the attenuation, and in particular to reduce the peak of the solid mass flux, the deposition basin must be controlled by a slit check dam capable of inducing a controlled sedimentation of the solid mass flux. In order to achieve that, reliable design tools are needed. Driftwood represents another important factor increasing the risk, as clogging induced by vegetal material is a major problem for the operational reliability of slit check dams. Current procedures for compiling hazard maps do not account for such effects. The STEEP STREAMS project aims at developing innovative structural solutions and reliable design criteria to mitigate the impacts of flash floods and debris flows, especially in the presence of the intense woody material transport typical of mountain catchments.

  2. Advances in exergy analysis: a novel assessment of the Extended Exergy Accounting method

    International Nuclear Information System (INIS)

    Rocco, M.V.; Colombo, E.; Sciubba, E.

    2014-01-01

    additional insight into, and more relevant information for, every comparative analysis of energy conversion systems, both at a global and a local level. In the paper, traditional and advanced exergy analysis methods are briefly discussed, and the EEA theoretical foundations and the details of its application are described in detail. Methods: The method converts not only material and energy flows, but externalities as well (labour, capital and environmental costs), into flows of equivalent primary exergy, so that all exchanges between the system and the environment can be completely accounted for on a rigorous thermodynamic basis. The emphasis that decision makers and public opinion alike currently seem to place on sustainability generates the need for continued research in the field of systems analysis, and a preliminary review confirms that exergy may constitute a coherent and rational basis for developing global and local analysis methods. Moreover, extended exergy accounting possesses some specific and peculiar characteristics that make it more suitable for life-cycle and cradle-to-grave (or well-to-wheel) applications. Results: A taxonomy for the classification of exergy-based methods is proposed. A novel assessment of the EEA method is provided, its advantages and drawbacks are discussed, and areas in need of further theoretical investigation are identified. Conclusions: Since EEA is a life-cycle method, it is argued that it represents an improvement over other current methods, in that it provides additional insight into the phenomenological aspects of any “energy conversion chain”. The paper demonstrates that the Extended Exergy cost function can be used within the traditional and very well formalized thermoeconomic framework, replacing the economic cost function in order to evaluate and optimize the consumption of resources of a system in a more complete and rational way. Practical implications: This paper contains some specific proposals as to the further development

  3. Development of Advanced Life Cycle Costing Methods for Technology Benefit/Cost/Risk Assessment

    Science.gov (United States)

    Yackovetsky, Robert (Technical Monitor)

    2002-01-01

    The overall objective of this three-year grant is to provide NASA Langley's System Analysis Branch with improved affordability tools and methods based on probabilistic cost assessment techniques. In order to accomplish this objective, the Aerospace Systems Design Laboratory (ASDL) needs to pursue more detailed affordability, technology impact, and risk prediction methods and to demonstrate them on a variety of advanced commercial transports. The affordability assessment, which is a cornerstone of ASDL methods, relies on the Aircraft Life Cycle Cost Analysis (ALCCA) program originally developed by NASA Ames Research Center and enhanced by ASDL. This grant proposed to improve ALCCA in support of the project objective by updating the research, design, test, and evaluation cost module, as well as the engine development cost module. Investigations into enhancements to ALCCA include improved engine development cost, process-based costing, supportability cost, and system reliability with airline loss of revenue for system downtime. A probabilistic, stand-alone version of ALCCA/FLOPS will also be developed under this grant in order to capture the uncertainty involved in technology assessments. FLOPS (FLight OPtimization System) is an aircraft synthesis and sizing code developed by NASA Langley Research Center. This probabilistic version of the coupled program will be used within a Technology Impact Forecasting (TIF) method to determine what types of technologies would have to be infused in a system in order to meet customer requirements. A probabilistic analysis of the CERs (cost estimating relationships) within ALCCA will also be carried out under this contract in order to gain some insight into the most influential costs and the impact that code fidelity could have on future RDS (Robust Design Simulation) studies.

  4. Training toward Advanced 3D Seismic Methods for CO2 Monitoring, Verification, and Accounting

    Energy Technology Data Exchange (ETDEWEB)

    Christopher Liner

    2012-05-31

    The objective of our work is graduate and undergraduate student training related to improved 3D seismic technology that addresses key challenges related to monitoring movement and containment of CO{sub 2}, specifically better quantification and sensitivity for mapping of caprock integrity, fractures, and other potential leakage pathways. We utilize data and results developed through a previous DOE-funded CO{sub 2} characterization project (DE-FG26-06NT42734) at the Dickman Field of Ness County, KS. Dickman is a type locality for the geology that will be encountered in CO{sub 2} sequestration projects from northern Oklahoma across the U.S. midcontinent to Indiana and Illinois. Since its discovery in 1962, the Dickman Field has produced about 1.7 million barrels of oil from porous Mississippian carbonates within a small structural closure at about 4400 ft drilling depth. Project data include 3.3 square miles of 3D seismic data and 142 wells, with log, some core, and oil/water production data available. Only two wells penetrate the deep saline aquifer. In a previous DOE-funded project, geological and seismic data were integrated to create a geological property model and a flow simulation grid. We believe that sequestration of CO{sub 2} will largely occur in areas of relatively flat geology and a simple near surface, similar to Dickman. The challenge is not complex geology, but development of improved, lower-cost methods for detecting natural fractures and subtle faults. Our project used numerical simulation to test methods of gathering multicomponent, full-azimuth data ideal for this purpose. Our specific objectives were to apply advanced seismic methods to aid in quantifying reservoir properties and lateral continuity of CO{sub 2} sequestration targets. The purpose of the current project is graduate and undergraduate student training related to improved 3D seismic technology that addresses key challenges related to monitoring movement and containment of CO{sub 2

  5. Hydrogen production methods efficiency coupled to an advanced high temperature accelerator driven system

    Energy Technology Data Exchange (ETDEWEB)

    Rodríguez, Daniel González; Lira, Carlos Alberto Brayner de Oliveira [Universidade Federal de Pernambuco (UFPE), Recife, PE (Brazil). Departamento de Energia Nuclear; Fernández, Carlos García, E-mail: danielgonro@gmail.com, E-mail: mmhamada@ipen.br [Instituto Superior de Tecnologías y Ciencias aplicadas (InSTEC), La Habana (Cuba)

    2017-07-01

    The hydrogen economy is one of the most promising concepts for the energy future. In this scenario, oil is replaced by hydrogen as an energy carrier. This hydrogen, rather than oil, must be produced in volumes not provided by the currently employed methods. In this work, two high-temperature hydrogen production methods coupled to an advanced nuclear system are presented. A new design of a pebble-bed accelerator-driven nuclear system called TADSEA is chosen because of the advantages it has in matters of transmutation and safety. For the conceptual design of the high-temperature electrolysis process, a detailed computational fluid dynamics model was developed to analyze the solid oxide electrolytic cell, which has a strong influence on the process efficiency. A detailed flowsheet of the high-temperature electrolysis process coupled to TADSEA through a Brayton gas cycle was developed using chemical process simulation software, Aspen HYSYS®. The model with optimized operating conditions produces 0.1627 kg/s of hydrogen, resulting in an overall process efficiency of 34.51%, a value in the range of results reported by other authors. A conceptual design of the iodine-sulfur thermochemical water-splitting cycle was also developed. The overall efficiency of this process, calculated by performing an energy balance, was 22.56%. The efficiency, hydrogen production rate and energy consumption of the proposed models are within the values considered acceptable in the hydrogen economy concept, and are also compatible with the TADSEA design parameters. (author)
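
    As a quick plausibility check of the reported figures, the overall efficiency is essentially the chemical power of the produced hydrogen divided by the thermal input. The sketch below uses the hydrogen rate and efficiency quoted in the abstract but assumes a lower-heating-value basis, so the implied thermal power it prints is an illustration rather than a value taken from the paper.

```python
# Hedged energy-balance sketch for the high-temperature electrolysis route.
H2_LHV_MJ_PER_KG = 120.0   # assumed lower heating value of hydrogen
H2_RATE_KG_S = 0.1627      # production rate reported in the abstract
EFFICIENCY = 0.3451        # overall efficiency reported in the abstract

chemical_power_mw = H2_RATE_KG_S * H2_LHV_MJ_PER_KG   # MW carried by the hydrogen
implied_thermal_mw = chemical_power_mw / EFFICIENCY   # thermal input this would imply
print(f"H2 chemical power ~ {chemical_power_mw:.1f} MW, "
      f"implied thermal input ~ {implied_thermal_mw:.1f} MW")
```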

  6. ARN Training on Advance Methods for Internal Dose Assessment: Application of Ideas Guidelines

    International Nuclear Information System (INIS)

    Rojo, A.M.; Gomez Parada, I.; Puerta Yepes, N.; Gossio, S.

    2010-01-01

    Dose assessment in cases of internal exposure involves the estimation of committed effective dose based on the interpretation of bioassay measurements, and on assumptions about the characteristics of the radioactive material, the time pattern, and the pathway of intake. The IDEAS Guidelines provide a method to harmonize dose evaluations using criteria and flow-chart procedures to be followed step by step. The EURADOS Working Group 7 'Internal Dosimetry', in collaboration with the IAEA and the Czech Technical University (CTU) in Prague, promoted the 'EURADOS/IAEA Regional Training Course on Advanced Methods for Internal Dose Assessment: Application of IDEAS Guidelines' to broaden and encourage the use of the IDEAS Guidelines; the course took place in Prague (Czech Republic) from 2-6 February 2009. The ARN recognized the relevance of this training and requested a place to participate in this activity. After that, the first training course in Argentina took place from 24-28 August to train local internal dosimetry experts. This paper summarizes the main characteristics of this activity. (authors)

  7. Recent advances in protein-protein interaction prediction: experimental and computational methods.

    Science.gov (United States)

    Jessulat, Matthew; Pitre, Sylvain; Gui, Yuan; Hooshyar, Mohsen; Omidi, Katayoun; Samanfar, Bahram; Tan, Le Hoa; Alamgir, Md; Green, James; Dehne, Frank; Golshani, Ashkan

    2011-09-01

    Proteins within the cell act as part of complex networks, which allow pathways and processes to function. Therefore, understanding how proteins interact is a significant area of current research. This review aims to present an overview of key experimental techniques (yeast two-hybrid, tandem affinity purification and protein microarrays) used to discover protein-protein interactions (PPIs), as well as to briefly discuss certain computational methods for predicting protein interactions based on gene localization, phylogenetic information, 3D structural modeling or primary protein sequence data. Due to the large-scale applicability of primary sequence-based methods, the authors have chosen to focus on this strategy for our review. There is an emphasis on a recent algorithm called Protein Interaction Prediction Engine (PIPE) that can predict global PPIs. The readers will discover recent advances both in the practical determination of protein interactions and in the strategies available to anticipate interactions without the time and costs of experimental work. Global PPI maps can help in understanding the biology of complex diseases and facilitate the identification of novel drug target sites. This study describes different techniques used for PPI prediction that we believe will significantly impact the development of the field in the near future. We expect to see a growing number of similar techniques capable of large-scale PPI predictions.

  8. Application of advanced methods for the prognosis of production energy consumption

    International Nuclear Information System (INIS)

    Stetter, R; Witczak, P; Spindler, C; Hertel, J; Staiger, B

    2014-01-01

    This paper, based on a current research project, describes the application of advanced methods that are frequently used in fault-tolerant control and addresses the issue of predicting production energy consumption. Today, the energy a product requires during its operation is the subject of many activities in research and development. However, the energy necessary for the production of goods is very often not analysed in comparable depth. In the field of electronics, studies conclude that about 80% of the total energy used by a product stems from its production [1]. The energy consumption in production is determined very early in the product development process by designers and engineers, for example through the selection of raw materials, explicit and implicit requirements concerning the manufacturing and assembly processes, or decisions concerning the product architecture. Today, developers and engineers have at their disposal manifold design and simulation tools which can help predict the energy consumption during operation relatively accurately. In contrast, tools aimed at predicting the energy consumption in production and disposal are not available. This paper presents an explorative study of the use of methods such as fuzzy logic to predict production energy consumption early in the product development process.
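
    To make the idea concrete, a zero-order Sugeno-type fuzzy estimator can map an early design parameter to a production-energy figure. The sketch below is only a toy illustration of that kind of fuzzy prediction; the input variable, membership breakpoints and rule consequents are all assumptions, not the project's actual model.

```python
# Hedged sketch: tiny fuzzy estimator of production energy from one design parameter.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def predict_energy_kwh(part_mass_kg):
    memberships = np.array([tri(part_mass_kg, 0.0, 1.0, 5.0),     # "small part"
                            tri(part_mass_kg, 1.0, 5.0, 20.0),    # "medium part"
                            tri(part_mass_kg, 5.0, 20.0, 50.0)])  # "large part"
    rule_energy = np.array([2.0, 8.0, 30.0])                      # assumed kWh per rule
    if memberships.sum() == 0.0:
        return float("nan")
    return float(memberships @ rule_energy / memberships.sum())  # weighted rule average

print(predict_energy_kwh(3.0))   # -> 5.0 kWh for this toy rule base
```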

  9. Hydrogen production methods efficiency coupled to an advanced high temperature accelerator driven system

    International Nuclear Information System (INIS)

    Rodríguez, Daniel González; Lira, Carlos Alberto Brayner de Oliveira

    2017-01-01

    The hydrogen economy is one of the most promising concepts for the energy future. In this scenario, oil is replaced by hydrogen as an energy carrier. This hydrogen, rather than oil, must be produced in volumes not provided by the currently employed methods. In this work, two high-temperature hydrogen production methods coupled to an advanced nuclear system are presented. A new design of a pebble-bed accelerator-driven nuclear system called TADSEA is chosen because of the advantages it has in matters of transmutation and safety. For the conceptual design of the high-temperature electrolysis process, a detailed computational fluid dynamics model was developed to analyze the solid oxide electrolytic cell, which has a strong influence on the process efficiency. A detailed flowsheet of the high-temperature electrolysis process coupled to TADSEA through a Brayton gas cycle was developed using chemical process simulation software, Aspen HYSYS®. The model with optimized operating conditions produces 0.1627 kg/s of hydrogen, resulting in an overall process efficiency of 34.51%, a value in the range of results reported by other authors. A conceptual design of the iodine-sulfur thermochemical water-splitting cycle was also developed. The overall efficiency of this process, calculated by performing an energy balance, was 22.56%. The efficiency, hydrogen production rate and energy consumption of the proposed models are within the values considered acceptable in the hydrogen economy concept, and are also compatible with the TADSEA design parameters. (author)

  10. NATO Advanced Research Workshop on Methods and Mechanisms for Producing Ions from Large Molecules

    CERN Document Server

    Ens, Werner

    1991-01-01

    A NATO Advanced Research Workshop on Methods and Mechanisms for Producing Ions from Large Molecules was held at Minaki Lodge, Minaki, Ontario, Canada, from 24 to 28 June 1990. The workshop was hosted by the time-of-flight group of the Department of Physics at the University of Manitoba, and was attended by 64 invited participants from around the world. Twenty-nine invited talks were given and 19 papers were presented as posters. Of the 48 contributions, 38 are included in these proceedings. The conference was organized to study the rapidly changing field of mass spectrometry of biomolecules. Particle-induced desorption (especially with MeV particles) has been the most effective method of producing molecular ions from biomolecules. An important part of the workshop was devoted to recent developments in this field, particularly to progress in understanding the fundamentals of the desorption process. In this respect, the meeting was similar to previous conferences in Marburg, FRG (1978); Paris, F (1980); Uppsala...

  11. A prospective study of students' and instructors' opinions on Advanced Cardiac Life Support course teaching methods.

    Science.gov (United States)

    Stempien, James; Betz, Martin

    2009-01-01

    The American Heart Association (AHA) revises the Advanced Cardiac Life Support (ACLS) course approximately every 5 years, citing the scientific literature for any changes to content and management recommendations. With ACLS 2005, the AHA also revised the methods used to teach course content. The AHA cited no evidence in making these changes. The ACLS 2005 course, distributed in early 2007, makes greater use of videos to teach students. This prospective study surveyed opinions of both students and instructors in an effort to determine the level of satisfaction with this method of teaching. During 16 consecutive ACLS courses, all students and instructors were asked to complete a questionnaire. The students provided demographic information, but completed the survey anonymously. Four questions probed the participants' opinions about the effectiveness of videos in learning ACLS skills. Experienced participants were asked to compare the new teaching methods with previous courses. Opinions were compared among several subgroups based on sex, occupation and previous experience. Of the 180 students who participated, 71% felt the videos were unequivocally useful for teaching ACLS skills. Fewer first-time students were unequivocally positive (59%) compared with those who had taken 2 or more previous courses (84%). A small proportion of students (13%) desired more hands-on practice time. Of the 16 instructors who participated, 31% felt that the videos were useful for teaching ACLS skills. No differences were found between doctors and nurses, or between men and women. The use of standardized videos in ACLS courses was felt by the majority of students and a minority of instructors to be unequivocally useful. First-time students had more doubts about the effectiveness of videos.

  12. Malignant gliomas: current perspectives in diagnosis, treatment, and early response assessment using advanced quantitative imaging methods

    Directory of Open Access Journals (Sweden)

    Ahmed R

    2014-03-01

    Full Text Available Rafay Ahmed,1 Matthew J Oborski,2 Misun Hwang,1 Frank S Lieberman,3 James M Mountz1 1Department of Radiology, 2Department of Bioengineering, University of Pittsburgh, Pittsburgh, PA, USA; 3Department of Neurology and Department of Medicine, Division of Hematology/Oncology, University of Pittsburgh School of Medicine, Pittsburgh, PA, USA. Abstract: Malignant gliomas consist of glioblastomas, anaplastic astrocytomas, anaplastic oligodendrogliomas and anaplastic oligoastrocytomas, and some less common tumors such as anaplastic ependymomas and anaplastic gangliogliomas. Malignant gliomas have high morbidity and mortality. Even with optimal treatment, median survival is only 12–15 months for glioblastomas and 2–5 years for anaplastic gliomas. However, recent advances in imaging and quantitative analysis of image data have led to earlier diagnosis of tumors and of tumor response to therapy, providing oncologists with a greater time window for therapy management. In addition, improved understanding of tumor biology, genetics, and resistance mechanisms has enhanced surgical techniques, chemotherapy methods, and radiotherapy administration. After proper diagnosis and institution of appropriate therapy, there is now a vital need for quantitative methods that can sensitively detect malignant glioma response to therapy at early follow-up times, when changes in the management of nonresponders can have their greatest effect. Currently, response is largely evaluated by measuring magnetic resonance contrast and size change, but this approach does not take into account the key biologic steps that precede tumor size reduction. Molecular imaging is ideally suited to measuring early response by quantifying cellular metabolism, proliferation, and apoptosis, activities altered early in treatment. We expect that successful integration of quantitative imaging biomarker assessment into the early phase of clinical trials could provide a novel approach for testing new therapies

  13. BOOK REVIEW: Advanced Topics in Computational Partial Differential Equations: Numerical Methods and Diffpack Programming

    Science.gov (United States)

    Katsaounis, T. D.

    2005-02-01

    The scope of this book is to present well known simple and advanced numerical methods for solving partial differential equations (PDEs) and how to implement these methods using the programming environment of the software package Diffpack. A basic background in PDEs and numerical methods is required by the potential reader. Further, a basic knowledge of the finite element method and its implementation in one and two space dimensions is required. The authors claim that no prior knowledge of the package Diffpack is required, which is true, but the reader should be at least familiar with an object oriented programming language like C++ in order to better comprehend the programming environment of Diffpack. Certainly, a prior knowledge or usage of Diffpack would be a great advantage to the reader. The book consists of 15 chapters, each one written by one or more authors. Each chapter is basically divided into two parts: the first part is about mathematical models described by PDEs and numerical methods to solve these models and the second part describes how to implement the numerical methods using the programming environment of Diffpack. Each chapter closes with a list of references on its subject. The first nine chapters cover well known numerical methods for solving the basic types of PDEs. Further, programming techniques on the serial as well as on the parallel implementation of numerical methods are also included in these chapters. The last five chapters are dedicated to applications, modelled by PDEs, in a variety of fields. The first chapter is an introduction to parallel processing. It covers fundamentals of parallel processing in a simple and concrete way and no prior knowledge of the subject is required. Examples of parallel implementation of basic linear algebra operations are presented using the Message Passing Interface (MPI) programming environment. Here, some knowledge of MPI routines is required by the reader. Examples solving in parallel simple PDEs using

  14. Electromagnetic Characterization of Advanced Composites by Voxel-Based Inverse Methods, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The nondestructive characterization of advanced composites, such as carbon-fiber reinforced polymers (cfrp), by electromagnetic means is well established. What is...

  15. An Advanced Electrospinning Method of Fabricating Nanofibrous Patterned Architectures with Controlled Deposition and Desired Alignment

    Science.gov (United States)

    Rasel, Sheikh Md

    We introduce a versatile advanced electrospinning method for fabricating various kinds of nanofibrous patterns with the desired alignment, a controlled amount of deposition, and locally variable density within the architectures. In this method, we employed multiple electrodes whose potentials were switched within milliseconds with the help of a microprocessor-based control system. A key to the success of this method was that the electric field, and hence the charge-carrying fibers, could be switched rapidly from one electrode location to another; as a result, electrospun fibers could be deposited on the designated areas with the desired alignment. A wide range of nanofibrous patterned architectures were constructed using a proper arrangement of multiple electrodes. By controlling the concurrent activation time of two adjacent electrodes, we demonstrated that the amount of fiber going into the pattern can be adjusted and the desired alignment of the electrospun fibers can be obtained. We also showed that the deposition density of electrospun fibers in different areas of the patterned architectures can be varied, and that by controlling the deposition time between two adjacent electrodes a number of functionally graded patterns can be generated with uniaxial alignment. We further demonstrated that this handy method was capable of producing random, aligned, and multidirectional nanofibrous mats by engaging a number of electrodes and switching them in desired patterns. A comprehensive study using the finite element method was carried out to understand the effects of the electric field. Simulation results revealed that the electric field strength changes rapidly according to the electrode switching pattern. Nanofibrous polyvinyl alcohol (PVA) scaffolds and their composites reinforced with wollastonite and wood flour were fabricated using a rotating-drum electrospinning technique. Morphological, mechanical, and thermal properties were characterized for the PVA/wollastonite and PVA/wood flour nanocomposites

  16. Study of photoelectronic properties of semiconductors by the advanced method of transient microwave photoconductivity (AMTMP)

    Science.gov (United States)

    Grabtchak, Serguei Yuryevich

    The current work describes a new experimental method, the Advanced Method of Transient Microwave Photoconductivity (AMTMP), and its capabilities for studying the photoelectronic properties of semiconductors. AMTMP measures not only the effect proportional to the excess conduction band electrons (“photoconductivity” itself, related to changes in the imaginary part of the complex dielectric constant), but also the changes in the real part of the complex dielectric constant (the “photodielectric effect”). In a rigorous treatment of the complex dielectric constant changes in a microwave cavity, general expressions were derived relating the changes in the complex dielectric constant to two experimentally measured quantities (the change in the cavity quality factor and the shift of the resonance frequency). Based on this method it was possible to systematize the basic types of excitations (free electrons, plasma, trapped electrons, excitons) as bound/non-bound states. To interpret the behavior of the kinetics in semiconductors having distributions of localized states in the band gap, a simulation approach based on numerically solving multiple trapping rate equations was developed. With this approach the major basic types of distributions (rectangular, linear, exponential, Gaussian) were explored thoroughly. We tested the method on various semiconductors: two types of polycrystalline CdSe thin films (#1 and #2), semi-insulating (SI) GaAs, single-crystal Si and porous Si. We identified the distribution present in CdSe #1 as an exponential distribution and that in CdSe #2 as a Gaussian-like peaked distribution, and estimated the parameters of the distributions. We showed that, with respect to the distribution, transient methods always probe only the part of the real distribution which is present in a sample. The distribution becomes distorted from its initial form by the Fermi function describing the occupational probability and is truncated at high energies by the demarcation level. For
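
    The multiple-trapping simulation referred to above can be reproduced in outline by coupling the free-electron density to a set of discrete trap levels that capture and thermally re-emit carriers. The sketch below uses an exponential density of states and illustrative trap parameters; all numerical values are assumptions, not those fitted to the CdSe films.

```python
# Hedged sketch: multiple-trapping rate equations for an exponential trap distribution.
import numpy as np
from scipy.integrate import solve_ivp

kT = 0.0259                       # eV, room temperature
E = np.linspace(0.05, 0.5, 10)    # trap depths below the conduction band (eV)
N = np.exp(-E / 0.1)              # assumed exponential density of states (normalized)
c = 1e6 * np.ones_like(E)         # capture coefficients (assumed, normalized units)
nu, tau_rec = 1e10, 1e-5          # attempt-to-escape frequency (1/s), recombination time (s)

def rhs(t, y):
    n, nt = y[0], y[1:]
    capture = c * n * (N - nt)             # capture of free electrons into each level
    emission = nu * np.exp(-E / kT) * nt   # thermal release from each level
    return np.concatenate(([-n / tau_rec - capture.sum() + emission.sum()],
                           capture - emission))

y0 = np.concatenate(([1.0], np.zeros_like(E)))       # normalized photoexcited density
sol = solve_ivp(rhs, (0.0, 1e-3), y0, method="BDF")  # stiff solver for the transient
print(sol.y[0, -1])                                  # free-electron density at the end
```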

  17. Detecting method of subjects' 3D positions and experimental advanced camera control system

    Science.gov (United States)

    Kato, Daiichiro; Abe, Kazuo; Ishikawa, Akio; Yamada, Mitsuho; Suzuki, Takahito; Kuwashima, Shigesumi

    1997-04-01

    Steady progress is being made in the development of an intelligent robot camera capable of automatically shooting pictures with a powerful sense of reality, or of tracking objects whose shooting requires advanced techniques. Currently, only experienced broadcasting cameramen can provide such pictures. To develop an intelligent robot camera with these abilities, we need to clearly understand how a broadcasting cameraman assesses his shooting situation and how his camera is moved during shooting. We use a real-time analyzer to study a cameraman's work and his gaze movements in studios and during sports broadcasts. This time, we have developed a method for detecting subjects' 3D positions and an experimental camera control system to help us further understand the movements required for an intelligent robot camera. The features are as follows: (1) Two sensor cameras shoot a moving subject and detect colors, producing its 3D coordinates. (2) The system is capable of driving a camera based on camera movement data obtained by a real-time analyzer. 'Moving shoot' is the name we have given to the object-position detection technology on which this system is based. We used it in a soccer game, producing computer graphics showing how the players moved. These results are also reported.

  18. REVA Advanced Fuel Design and Codes and Methods - Increasing Reliability, Operating Margin and Efficiency in Operation

    Energy Technology Data Exchange (ETDEWEB)

    Frichet, A.; Mollard, P.; Gentet, G.; Lippert, H. J.; Curva-Tivig, F.; Cole, S.; Garner, N.

    2014-07-01

    For three decades, AREVA has been incrementally implementing upgrades in BWR and PWR fuel designs and codes and methods, leading to ever greater fuel efficiency and easier licensing. For PWRs, AREVA is implementing upgraded versions of its HTP{sup TM} and AFA 3G technologies called HTP{sup TM}-I and AFA3G-I. These fuel assemblies feature improved robustness and dimensional stability through the ultimate optimization of their hold-down system, the use of Q12, the AREVA advanced quaternary alloy, for the guide tubes, the increase in their wall thickness, and the stiffening of the spacer-to-guide-tube connection. But an even bigger step forward has been achieved as AREVA has successfully developed and introduced to the market the GAIA product, which maintains the resistance to grid-to-rod fretting (GTRF) of the HTP{sup TM} product while providing additional thermal-hydraulic margin and high resistance to fuel assembly bow. (Author)

  19. Limitations of the Conventional Phase Advance Method for Constant Power Operation of the Brushless DC Motor

    International Nuclear Information System (INIS)

    Lawler, J.S.

    2001-01-01

    The brushless dc motor (BDCM) has high power density and efficiency relative to other motor types. These properties make the BDCM well suited for applications in electric vehicles, provided a method can be developed for driving the motor over the 4:1 to 6:1 constant power speed range (CPSR) required by such applications. The present state of the art for constant power operation of the BDCM is conventional phase advance (CPA) [1]. In this paper, we identify key limitations of CPA. It is shown that CPA has effective control over the developed power, but that the current magnitude is relatively insensitive to power output and is inversely proportional to motor inductance. If the motor inductance is low, then the rms current at rated power and high speed may be several times larger than the current rating. The inductance required to maintain the rms current within rating is derived analytically and is found to be large relative to that of BDCM designs using high-strength rare-earth magnets. Thus, CPA requires a BDCM with a large equivalent inductance

  20. Evaluation of DNBR calculation methods for advanced digital core protection system

    International Nuclear Information System (INIS)

    Ihn, W. K.; Hwang, D. H.; Pak, Y. H.; Yoon, T. Y.

    2003-01-01

    This study evaluated on-line DNBR calculation methods for an advanced digital core protection system in a PWR, i.e., subchannel analysis and group-channel analysis. The subchannel code MATRA and the four-channel codes CETOP-D and CETOP2 were used here. CETOP2 is the most simplified DNBR analysis code and is implemented in the core protection calculator of Korean standard nuclear power plants. The detailed subchannel code TORC was used as the reference DNBR calculation. The DNBR uncertainty and margin were compared using allowable operating conditions at Yonggwang nuclear units 3-4. The MATRA code, using a nine lumped-channel model, resulted in a smaller mean and a larger standard deviation of the DNBR error distribution. CETOP-D and CETOP2 showed a conservatively biased mean and a relatively smaller standard deviation of the DNBR error distribution. MATRA and CETOP-D, with respect to CETOP2, showed a significant increase of the DNBR available margin at normal operating conditions. Taking account of the DNBR uncertainty, MATRA and CETOP-D were estimated to increase the DNBR net margin over CETOP2 by 2.5%-9.8% and 2.5%-3.3%, respectively

  1. A hybrid method for damage detection and quantification in advanced X-COR composite structures

    Science.gov (United States)

    Neerukatti, Rajesh Kumar; Rajadas, Abhishek; Borkowski, Luke; Chattopadhyay, Aditi; Huff, Daniel W.

    2016-04-01

    Advanced composite structures, such as foam-core carbon fiber reinforced polymer composites, are increasingly being used in applications which require high strength, high in-plane and flexural stiffness, and low weight. However, the presence of in situ damage due to manufacturing defects and/or service conditions can complicate the failure mechanisms and compromise their strength and reliability. In this paper, the capability of detecting damage such as delaminations and foam-core separations in X-COR composite structures using non-destructive evaluation (NDE) and structural health monitoring (SHM) techniques is investigated. Two NDE techniques, flash thermography and low-frequency ultrasonics, were used to detect and quantify the damage sizes and locations. Macro fiber composites (MFCs) were used as actuators and sensors to study the interaction of Lamb waves with delaminations and foam-core separations. The results indicate that both flash thermography and low-frequency ultrasonics were capable of detecting damage in X-COR sandwich structures, although the low-frequency ultrasonic method was capable of detecting through-thickness damage more accurately than flash thermography. It was also observed that the presence of foam-core separations significantly changes the wave behavior when compared to delaminations, which complicates the use of wave-based SHM techniques. Further, a wave propagation model was developed to model the wave interaction with damage at different locations on the X-COR sandwich plate.

  2. Quantifying export flows of used electronics: advanced methods to resolve used goods within trade data.

    Science.gov (United States)

    Duan, Huabo; Miller, T Reed; Gregory, Jeremy; Kirchain, Randolph

    2014-03-18

    There are limited convincing quantitative data on the export of used electronics from the United States (U.S.). Thus, we advance a methodology to quantify the export flows of whole units of used electronics from the U.S. using detailed export trade data, and demonstrate the methodology using laptops. Since used electronics are not explicitly identified in export trade data, we hypothesize that exports with a low unit value, below a used-new threshold specific to a destination world region, are used. The importance of using the most disaggregated trade data set available when distinguishing used from new goods is illustrated. Two detailed U.S. export trade data sets were combined to arrive at quantities and unit values for each port, mode of transport, month, trade partner country, and trade code. We add rigor to the determination of the used-new threshold by utilizing both the neighborhood valley-emphasis method (NVEM) and published sales prices. This analysis found that 748 to 1199 thousand units of used laptops were exported from the U.S. in 2010, of which 78-81% were destined for non-OECD countries. Asia was found to be the largest destination of used laptop exports across all used-new threshold methods. Latin America and the Caribbean was the second largest recipient of these exports. North America and Europe also received used laptops from the U.S. Only a small fraction of used laptops was exported to Africa. However, these quantities are lower-bound estimates because not all shipments of used laptops may be shipped under the proper laptop trade code. Still, this approach has the potential to give insight into the quantities and destinations of the exports if applied to all used electronics product types across a series of years.
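
    The screening step at the heart of the method, comparing each shipment's unit value to a region-specific used-new threshold, can be expressed in a few lines of pandas. The column names and threshold values below are hypothetical stand-ins, not those of the U.S. trade data sets used in the paper.

```python
# Hedged sketch: flag trade records as "used" when the unit value falls below
# a destination-region threshold. Thresholds and column names are assumptions.
import pandas as pd

THRESHOLDS_USD = {"Africa": 250, "Asia": 300, "Latin America & Caribbean": 320, "OECD": 400}

def flag_used(exports: pd.DataFrame) -> pd.DataFrame:
    """exports needs columns: region, value_usd, quantity."""
    out = exports.copy()
    out["unit_value"] = out["value_usd"] / out["quantity"]
    out["used"] = out["unit_value"] < out["region"].map(THRESHOLDS_USD)
    return out

records = pd.DataFrame({"region": ["Asia", "OECD"],
                        "value_usd": [15000, 90000],
                        "quantity": [120, 150]})
print(flag_used(records)[["region", "unit_value", "used"]])
```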

  3. Response monitoring using quantitative ultrasound methods and supervised dictionary learning in locally advanced breast cancer

    Science.gov (United States)

    Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.

    2016-03-01

    A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules, including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients as responders or non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on the Hilbert-Schmidt independence criterion (HSIC) was used for the design of both the feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken "pre-" and "mid-treatment" and "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher-dimensional feature space and computed the population means in the new space, where enhanced group separability is ideally obtained. The results of classification using the developed CAT system indicated an improvement in performance compared to a CAT system with basic features using histograms of intensity.
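
    The quantity driving both modules is the empirical HSIC between two sets of samples. A minimal sketch of that statistic with Gaussian kernels is given below; the kernel bandwidths and the biased estimator form are assumptions made for illustration, not the exact configuration used in the CAT system.

```python
# Hedged sketch: biased empirical HSIC estimate, tr(K H L H) / (n - 1)^2.
import numpy as np

def rbf_kernel(A, sigma):
    sq = np.sum(A**2, axis=1, keepdims=True)
    d2 = sq + sq.T - 2.0 * A @ A.T          # pairwise squared distances
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma_x=1.0, sigma_y=1.0):
    n = X.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n     # centering matrix
    K, L = rbf_kernel(X, sigma_x), rbf_kernel(Y, sigma_y)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 5))
print(hsic(X, X**2), hsic(X, rng.normal(size=(50, 5))))   # dependent vs. independent
```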

  4. Advanced Numerical Methods for Three-Dimensional Parallel Hybrid MHD/PIC

    National Research Council Canada - National Science Library

    McCrory, Robert

    1996-01-01

    .... The main conclusion of our study is that computationally efficient and physically sound description of nonsteady plasmas typical for these applications is possible using the advanced hybrid MHD/PIC...

  5. Analysis of Contracting Methods Employed in the Advanced Concept Technology Demonstration Program

    National Research Council Canada - National Science Library

    Grimes, Jeffrey

    1998-01-01

    The Advanced Concept Technology Demonstration (ACTD) Program, initiated by DoD as a joint acquisition and warfighting community effort, is intended to exploit mature and maturing technologies to assist in solving identified military needs...

  6. Advanced Instrumentation and Control Methods for Small and Medium Reactors with IRIS Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    J. Wesley Hines; Belle R. Upadhyaya; J. Michael Doster; Robert M. Edwards; Kenneth D. Lewis; Paul Turinsky; Jamie Coble

    2011-05-31

    Development and deployment of small-scale nuclear power reactors and their maintenance, monitoring, and control are part of the mission under the Small Modular Reactor (SMR) program. The objectives of this NERI-consortium research project are to investigate, develop, and validate advanced methods for sensing, controlling, monitoring, diagnosis, and prognosis of these reactors, and to demonstrate the methods with application to one of the proposed integral pressurized water reactors (IPWR). For this project, the IPWR design by Westinghouse, the International Reactor Secure and Innovative (IRIS), has been used to demonstrate the techniques developed under this project. The research focuses on three topical areas with the following objectives. Objective 1 - Develop and apply simulation capabilities and sensitivity/uncertainty analysis methods to address sensor deployment analysis and small grid stability issues. Objective 2 - Develop and test an autonomous and fault-tolerant control architecture and apply to the IRIS system and an experimental flow control loop, with extensions to multiple reactor modules, nuclear desalination, and optimal sensor placement strategy. Objective 3 - Develop and test an integrated monitoring, diagnosis, and prognosis system for SMRs using the IRIS as a test platform, and integrate process and equipment monitoring (PEM) and process and equipment prognostics (PEP) toolboxes. The research tasks are focused on meeting the unique needs of reactors that may be deployed to remote locations or to developing countries with limited support infrastructure. These applications will require smaller, robust reactor designs with advanced technologies for sensors, instrumentation, and control. An excellent overview of SMRs is described in an article by Ingersoll (2009). The article refers to these as deliberately small reactors. Most of these have modular characteristics, with multiple units deployed at the same plant site. Additionally, the topics focus

  7. Analyzing Planck and low redshift data sets with advanced statistical methods

    Science.gov (United States)

    Eifler, Tim

    The recent ESA/NASA Planck mission has provided a key data set to constrain cosmology that is most sensitive to physics of the early Universe, such as inflation and primordial non-Gaussianity (Planck 2015 results XIII). In combination with cosmological probes of the Large-Scale Structure (LSS), the Planck data set is a powerful source of information to investigate late-time phenomena (Planck 2015 results XIV), e.g. the accelerated expansion of the Universe, the impact of baryonic physics on the growth of structure, and the alignment of galaxies in their dark matter halos. It is the main objective of this proposal to re-analyze the archival Planck data, 1) with different, more recently developed statistical methods for cosmological parameter inference, and 2) to combine Planck and ground-based observations in an innovative way. We will make the corresponding analysis framework publicly available and believe that it will set a new standard for future CMB-LSS analyses. Advanced statistical methods, such as the Gibbs sampler (Jewell et al 2004, Wandelt et al 2004), have been critical in the analysis of Planck data. More recently, Approximate Bayesian Computation (ABC, see Weyant et al 2012, Akeret et al 2015, Ishida et al 2015, for cosmological applications) has matured into an interesting tool for cosmological likelihood analyses. It circumvents several assumptions that enter the standard Planck (and most LSS) likelihood analyses, most importantly the assumption that the functional form of the likelihood of the CMB observables is a multivariate Gaussian. Beyond applying new statistical methods to Planck data in order to cross-check and validate existing constraints, we plan to combine Planck and DES data in a new and innovative way and run multi-probe likelihood analyses of CMB and LSS observables. The complexity of multi-probe likelihood analyses scales (non-linearly) with the level of correlations amongst the individual probes that are included. For the multi
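    Approximate Bayesian Computation, as referenced above, can be summarized by a rejection sampler that never evaluates an explicit likelihood. The sketch below is a generic toy version (a one-parameter Gaussian mean), not the proposed Planck/DES pipeline; the tolerance, prior and summary statistic are illustrative assumptions.

      import numpy as np

      def abc_rejection(observed, prior_sampler, simulator, summary,
                        n_draws=10000, epsilon=0.1):
          # Keep prior draws whose simulated summary lies within epsilon of the observed one;
          # the accepted draws approximate the posterior without assuming a Gaussian likelihood.
          s_obs = summary(observed)
          accepted = []
          for _ in range(n_draws):
              theta = prior_sampler()
              sim = simulator(theta)
              if np.linalg.norm(summary(sim) - s_obs) < epsilon:
                  accepted.append(theta)
          return np.array(accepted)

      # Toy usage: infer the mean of a Gaussian with known unit variance
      rng = np.random.default_rng(1)
      data = rng.normal(0.5, 1.0, size=200)
      posterior = abc_rejection(
          observed=data,
          prior_sampler=lambda: rng.uniform(-2, 2),
          simulator=lambda mu: rng.normal(mu, 1.0, size=200),
          summary=lambda x: np.array([x.mean()]),
          epsilon=0.05,
      )
      print(posterior.mean(), posterior.size)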

  8. Smart Sensing of the Aux. Feed-water Pump Performance in NPP Severe Accidents Using Advanced GMDH Method

    Energy Technology Data Exchange (ETDEWEB)

    No, Young Gyu; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)]

    2016-05-15

    In order to develop and verify the models, a number of data sets obtained by simulating a station black-out (SBO) scenario for the optimized power reactor 1000 (OPR1000) with the MARS code were used. Most monitoring systems for components have been suggested by using directly measured data. However, it is very difficult to acquire data related to the status of safety-critical components. Therefore, it is necessary to develop a new method that combines data-based modeling, equipped with a learning system, with data mining techniques. Many data-based modeling methods have been applied successfully to nuclear engineering areas, such as signal validation, plant diagnostics and event identification. Data mining is the process of analyzing data from different perspectives and summarizing it into useful information. In this study, the smart sensing technique was developed using an advanced group method of data handling (GMDH) model. The original GMDH is an inductive, self-organizing algebraic model. The advanced GMDH model is equipped with a fuzzy concept. The proposed advanced GMDH model enhances the original GMDH model by reducing the effect of outliers and noise; it assigns data points different weightings according to their importance, as specified by the fuzzy membership grade. The developed model was verified using SBO accident simulation data for the OPR1000 nuclear power plant acquired with the MARS code. The advanced GMDH model was trained using the simulated development data and verified with simulated test data; the development and test data sets were independent. The simulation results show that the performance of the developed advanced GMDH model was very satisfactory, as shown in Table 1. Therefore, if the developed model can be optimized using diverse and specific data, it will be possible to predict the performance of the auxiliary feed-water pump accurately.
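    The outlier-downweighting idea described above can be illustrated with a single GMDH-style quadratic "partial description" refit with fuzzy membership weights. This is a simplified sketch under stated assumptions (a Gaussian membership function and an iteratively reweighted least-squares loop), not the authors' full advanced GMDH model.

      import numpy as np

      def fuzzy_weights(residuals, width=None):
          # Fuzzy membership grades in (0, 1]; large residuals (outliers) get small weights
          if width is None:
              width = 1.4826 * np.median(np.abs(residuals)) + 1e-12   # robust scale estimate
          return np.exp(-(residuals / width) ** 2)

      def fit_quadratic_neuron(x1, x2, y, n_iter=5):
          # One GMDH partial description y ~ a0 + a1*x1 + a2*x2 + a3*x1*x2 + a4*x1^2 + a5*x2^2,
          # refit iteratively with fuzzy weights to reduce the influence of outliers and noise
          A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
          w = np.ones_like(y)
          for _ in range(n_iter):
              sw = np.sqrt(w)
              coef, *_ = np.linalg.lstsq(A * sw[:, None], y * sw, rcond=None)
              w = fuzzy_weights(y - A @ coef)
          return coef

      # Toy usage: a clean quadratic relation contaminated by a few gross outliers
      rng = np.random.default_rng(2)
      x1, x2 = rng.uniform(0, 1, 200), rng.uniform(0, 1, 200)
      y = 1.0 + 2.0 * x1 - x2 + 0.5 * x1 * x2 + 0.02 * rng.normal(size=200)
      y[::25] += 3.0
      print(fit_quadratic_neuron(x1, x2, y).round(2))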

  9. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    Energy Technology Data Exchange (ETDEWEB)

    Harris, C.E.

    1994-09-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance. Separate articles from this report have been indexed into the database.

  10. FAA/NASA International Symposium on Advanced Structural Integrity Methods for Airframe Durability and Damage Tolerance, part 2

    Science.gov (United States)

    Harris, Charles E. (Editor)

    1994-01-01

    The international technical experts in the areas of durability and damage tolerance of metallic airframe structures were assembled to present and discuss recent research findings and the development of advanced design and analysis methods, structural concepts, and advanced materials. The principal focus of the symposium was on the dissemination of new knowledge and the peer-review of progress on the development of advanced methodologies. Papers were presented on the following topics: structural concepts for enhanced durability, damage tolerance, and maintainability; new metallic alloys and processing technology; fatigue crack initiation and small crack effects; fatigue crack growth models; fracture mechanics failure criteria for ductile materials; structural mechanics methodology for residual strength and life prediction; development of flight load spectra for design and testing; and corrosion resistance.

  11. Advances in methods for identification and characterization of plant transporter function

    DEFF Research Database (Denmark)

    Larsen, Bo; Xu, Deyang; Halkier, Barbara Ann

    2017-01-01

    Transport proteins are crucial for cellular function at all levels. Numerous importers and exporters facilitate transport of a diverse array of metabolites and ions intra- and intercellularly. Identification of transporter function is essential for understanding biological processes at both......-based approaches. In this review, we highlight examples that illustrate how new technology and tools have advanced identification and characterization of plant transporter functions....

  12. Advanced test methods for SRAMs: effective solutions for dynamic fault detection in nanoscaled technologies

    National Research Council Canada - National Science Library

    Bosio, Alberto

    2010-01-01

    ... Advanced Test Solutions for Dynamic Faults in SRAM Memories. Authors of the book: Alberto Bosio, Associate Professor, LIRMM/University of Montpellier - France; Luigi Dilillo, CNRS Researcher, LIRMM/CNRS - France; Patrick Girard, CNRS Research Director, LIRMM/CNRS - France; Serge Pravossoudovitch, Professor, LI...

  13. Advanced fire-resistant forms of activated carbon and methods of adsorbing and separating gases using same

    Science.gov (United States)

    Xiong, Yongliang; Wang, Yifeng

    2015-02-03

    Advanced, fire-resistant activated carbon compositions useful in adsorbing gases and having vastly improved fire resistance are provided, and methods for synthesizing the compositions are also provided. The advanced compositions have high gas adsorption capacities and rapid adsorption kinetics (comparable to commercially available activated carbon), without posing any intrinsic fire hazard. They also have superior performance to Mordenites in both adsorption capacity and kinetics. In addition, the advanced compositions do not pose the fibrous inhalation hazard that exists with the use of Mordenites. The fire-resistant compositions combine activated carbon with one or more hydrated and/or carbonate-containing minerals that release H2O and/or CO2 when heated. This effect raises the spontaneous ignition temperature to over 500 °C in most examples, and over 800 °C in some examples. Also provided are methods for removing and/or separating target gases, such as krypton or argon, from a gas stream by using such advanced activated carbons.

  14. A forward-advancing wave expansion method for numerical solution of large-scale sound propagation problems

    Science.gov (United States)

    Rolla, L. Barrera; Rice, H. J.

    2006-09-01

    In this paper a "forward-advancing" field discretization method suitable for solving the Helmholtz equation in large-scale problems is proposed. The forward wave expansion method (FWEM) is derived from a highly efficient discretization procedure based on interpolation of wave functions known as the wave expansion method (WEM). The FWEM computes the propagated sound field by means of an exclusively forward-advancing solution, neglecting the backscattered field. It is thus analogous to methods such as the (one-way) parabolic equation method (PEM), usually discretized using standard finite difference or finite element methods. These techniques do not require the inversion of large system matrices and thus enable the solution of large-scale acoustic problems where backscatter is not of interest. Calculations using the FWEM are presented for two propagation problems, and comparisons with analytical and theoretical solutions show this forward approximation to be highly accurate. Examples of sound propagation over a screen in upwind and downwind refracting atmospheric conditions at low nodal spacings (0.2 per wavelength in the propagation direction) are also included to demonstrate the flexibility and efficiency of the method.

  15. A Model For Teaching Advanced Neuroscience Methods: A Student-Run Seminar to Increase Practical Understanding and Confidence.

    Science.gov (United States)

    Harrison, Theresa M; Ching, Christopher R K; Andrews, Anne M

    2016-01-01

    Neuroscience doctoral students must master specific laboratory techniques and approaches to complete their thesis work (hands-on learning). Due to the highly interdisciplinary nature of the field, learning about a diverse range of methodologies through literature surveys and coursework is also necessary for student success (hands-off learning). Traditional neuroscience coursework stresses what is known about the nervous system with relatively little emphasis on the details of the methods used to obtain this knowledge. Furthermore, hands-off learning is made difficult by a lack of detail in methods sections of primary articles, subfield-specific jargon and vague experimental rationales. We designed a student-taught course to enable first-year neuroscience doctoral students to overcome difficulties in hands-off learning by introducing a new approach to reading and presenting primary research articles that focuses on methodology. In our literature-based course students were encouraged to present a method with which they had no previous experience. To facilitate weekly discussions, "experts" were invited to class sessions. Experts were advanced graduate students who had hands-on experience with the method being covered and served as discussion co-leaders. Self-evaluation worksheets were administered on the first and last days of the 10-week course and used to assess students' confidence in discussing research and methods outside of their primary research expertise. These evaluations revealed that the course significantly increased the students' confidence in reading, presenting and discussing a wide range of advanced neuroscience methods.

  16. Extraction, Analytical and Advanced Methods for Detection of Allura Red AC (E129) in Food and Beverages Products

    Science.gov (United States)

    Rovina, Kobun; Siddiquee, Shafiquzzaman; Shaarani, Sharifudin M.

    2016-01-01

    Allura Red AC (E129) is an azo dye that is widely used in drinks, juices, bakery, meat, and sweets products. High consumption of Allura Red has been claimed to have adverse effects on human health, including allergies, food intolerance, cancer, multiple sclerosis, attention deficit hyperactivity disorder, brain damage, nausea, cardiac disease and asthma, due to the reaction of aromatic azo compounds (R = R′ = aromatic). Several countries have banned or strictly control the use of Allura Red in food and beverage products. This review paper critically summarizes the available analytical and advanced methods for the determination of Allura Red and also concisely discusses the acceptable daily intake, toxicology and extraction methods. PMID:27303385

  17. Advance in study of intelligent diagnostic method for nuclear power plant

    International Nuclear Information System (INIS)

    Zhou Gang; Yang Li

    2008-01-01

    Advances in research on the application of three types of intelligent diagnostic approaches, based on neural networks (ANN), fuzzy logic and expert systems, to the operation status monitoring and fault diagnosis of nuclear power plants (NPP) were reviewed. The research status and characteristics of status monitoring and fault diagnosis approaches based on neural networks, fuzzy logic and expert systems for nuclear power plants were analyzed, and the development trend of applied research on intelligent diagnostic approaches for nuclear power plants was explored. The analysis shows that research achievements on intelligent diagnostic approaches based on fuzzy logic and expert systems for nuclear power plants are relatively few; research on intelligent diagnostics for nuclear power plants concentrates on operation status monitoring and fault diagnosis based on neural networks. The advancing tendency is the combination of various intelligent diagnostic approaches, the combination of neural network diagnostic approaches with other diagnostic approaches, as well as the use of multiple neural network diagnostic approaches. (authors)

  18. Advanced methods for the fabrication of mixed uranium plutonium oxide, monocarbide and mononitride fuels for fast breeder reactors

    International Nuclear Information System (INIS)

    Ganguly, C.

    1988-01-01

    Mixed uranium plutonium monocarbide (MC) and mononitride (MN) are considered as advanced LMFBR fuels. 'Powder-pellet' (POP) is the conventional method for production of MOX, MC and MN fuel pellets starting from UO2 and PuO2 powders. The POP route involves generation and handling of plutonium-bearing fine powder or dust particles, which poses a radiotoxic dust hazard. Further, fine powders have poor flowability, which makes automation and remote fabrication difficult. The combination of the ammonium uranyl plutonyl carbonate (AUPuC) process and low temperature oxidative sintering (LTS) minimises the radiotoxic dust hazard and the fabrication cost of MOX fuel and is considered an advanced POP route. Vibro-sol (or sphere-pac) and sol-gel microsphere pelletisation (SGMP) are advanced methods for fabrication of MOX, MC and MN fuels. Here, dust-free and free-flowing gel microspheres of the oxide or oxide-carbon are prepared from heavy metal nitrate feed solutions by 'internal' or 'external' gelation processes. The gel microspheres are subjected to controlled calcination for MOX or carbothermic reduction for MC and MN. Thereafter, the microspheres are either vibropacked in fuel pins or directly pelletised and sintered. These processes minimise the radiotoxic dust hazard, facilitate automation and remote processing and ensure excellent microhomogeneity. The present paper summarises the state of the art of the POP, vibro-sol and SGMP processes for the fabrication of MOX, MC and MN fuels, highlighting the author's experience with the SGMP process. (author). 21 refs., 11 figs

  19. Advanced Methods for Air Distribution in Occupied Spaces for Reduced Risk from Air-Borne Diseases and Improved Air Quality

    DEFF Research Database (Denmark)

    Bolashikov, Zhecho Dimitrov

    and by protecting medical staff, patients and visitors from cross-infection in hospital wards. The first part of the thesis focuses on improvement of inhaled air quality and thus reduction in the risk from cross-infection by advanced ventilation, providing clean air close to the occupants with personalized...... environment related to control and, handling the spread and treating patients with contagious airborne diseases, as well as problems with insufficient space in hospital wards in times of epidemics and pandemics.......The current Ph.D. thesis deals with new advanced methods of air distribution in occupied places aimed to improve the inhaled air quality and to reduce the risk from airborne cross infection among the occupants. The existing ventilation strategies nowadays are not able to provide enough clean air...

  20. Intercomparison of analysis methods for seismically isolated nuclear structures. Part 1: Advanced test data and numerical methods. Working material

    International Nuclear Information System (INIS)

    1993-01-01

    The purpose of the meeting was to review proposed contributions from CRP participating organizations, to discuss in detail the experimental data on seismic isolators, to review the numerical methods for the analysis of the seismic isolators, and to perform a first comparison of the calculation results. The aim of the CRP was to validate reliable numerical methods used for detailed evaluation of the dynamic behaviour of both isolation devices and isolated nuclear structures of different nuclear power plant types. The full maturity of seismic isolation for nuclear applications was stressed, as well as the excellent behaviour of isolated structures during the recent earthquakes in Japan and the USA. Participants from Italy, the USA, Japan, the Russian Federation, the Republic of Korea, the United Kingdom, India and the European Commission presented overview papers on their present programs and the status of their contributions to the CRP

  1. Multiscale Design of Advanced Materials based on Hybrid Ab Initio and Quasicontinuum Methods

    Energy Technology Data Exchange (ETDEWEB)

    Luskin, Mitchell [Univ. of Minnesota, Minneapolis, MN (United States). School of Mathematics]; James, Richard [Univ. of Minnesota, Minneapolis, MN (United States). School of Mathematics]; Tadmor, Ellad [Univ. of Minnesota, Minneapolis, MN (United States)]

    2014-03-12

    This project united researchers from mathematics, chemistry, computer science, and engineering for the development of new multiscale methods for the design of materials. Our approach was highly interdisciplinary, but it had two unifying themes: first, we utilized modern mathematical ideas about change-of-scale and state-of-the-art numerical analysis to develop computational methods and codes to solve real multiscale problems of DOE interest; and, second, we took very seriously the need for quantum mechanics-based atomistic forces, and based our methods on fast solvers of chemically accurate methods.

  2. Advanced resonance self-shielding method for gray resonance treatment in lattice physics code GALAXY

    International Nuclear Information System (INIS)

    Koike, Hiroki; Yamaji, Kazuya; Kirimura, Kazuki; Sato, Daisuke; Matsumoto, Hideki; Yamamoto, Akio

    2012-01-01

    A new resonance self-shielding method based on the equivalence theory is developed for general application to the lattice physics calculations. The present scope includes commercial light water reactor (LWR) design applications which require both calculation accuracy and calculation speed. In order to develop the new method, all the calculation processes from cross-section library preparation to effective cross-section generation are reviewed and reframed by adopting the current enhanced methodologies for lattice calculations. The new method is composed of the following four key methods: (1) cross-section library generation method with a polynomial hyperbolic tangent formulation, (2) resonance self-shielding method based on the multi-term rational approximation for general lattice geometry and gray resonance absorbers, (3) spatially dependent gray resonance self-shielding method for generation of intra-pellet power profile and (4) integrated reaction rate preservation method between the multi-group and the ultra-fine-group calculations. From the various verifications and validations, applicability of the present resonance treatment is totally confirmed. As a result, the new resonance self-shielding method is established, not only by extension of a past concentrated effort in the reactor physics research field, but also by unification of newly developed unique and challenging techniques for practical application to the lattice physics calculations. (author)

  3. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    Science.gov (United States)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  4. Recent Advances in Targeted and Untargeted Metabolomics by NMR and MS/NMR Methods

    Energy Technology Data Exchange (ETDEWEB)

    Bingol, Ahmet K.

    2018-04-18

    Metabolomics has made significant progress on multiple fronts in the last 18 months. This minireview aims to give an overview of these advancements in the light of their contribution to targeted and untargeted metabolomics. New computational approaches have emerged to overcome the manual absolute quantitation step for metabolites in 1D 1H NMR spectra, providing more consistency in inter-laboratory comparisons. Integration of 2D NMR metabolomics databases under a unified web server has allowed very accurate identification of the metabolites that have been catalogued in these databases. For the remaining uncatalogued and unknown metabolites, new cheminformatics approaches have been developed by combining NMR and mass spectrometry. These hybrid NMR/MS approaches have accelerated the identification of unknowns in untargeted studies, and they now allow profiling of an ever larger number of metabolites in application studies.

  5. Advances in intelligent process-aware information systems concepts, methods, and technologies

    CERN Document Server

    Oberhauser, Roy; Reichert, Manfred

    2017-01-01

    This book provides a state-of-the-art perspective on intelligent process-aware information systems and presents chapters on specific facets and approaches applicable to such systems. Further, it highlights novel advances and developments in various aspects of intelligent process-aware information systems and business process management systems. Intelligence capabilities are increasingly being integrated into or created in many of today’s software products and services. Process-aware information systems provide critical computing infrastructure to support the various processes involved in the creation and delivery of business products and services. Yet the integration of intelligence capabilities into process-aware information systems is a non-trivial yet necessary evolution of these complex systems. The book’s individual chapters address adaptive process management, case management processes, autonomically-capable processes, process-oriented information logistics, process recommendations, reasoning over ...

  6. Radioeffects on the advanced breast cancer treated preoperatively by a single large dose irradiation method

    International Nuclear Information System (INIS)

    Mikuriya, Shuichi; Konoeda, Koichi; Mikami, Akihiko

    1981-01-01

    A single irradiation with a large electron dose was applied on 26 cases of advanced breast cancer in National Medical Center Hospital. Pertinent voltage from 6 to 20 MeV betatron electron was selected in accordance with tumor sizes. In eight patients, 30 Gy were given at once and other patients were irradiated with fractionated dose from 10 to 20 Gy, two or three times within 2 or 3 weeks (total 28 - 38 Gy). The radioresponse in primary and metastatic lesions was macroscopically, histopathologically and immunologically examined. Direct effects of preoperative irradiation of the primary lesions in 18 out of 24 cases (75%) were relatively remarkable by histopathological examinations. Remarkable cellular infiltrations into tumor nests of primary lesions were observed in 14 out of 24 cases (58%). Abscopal effects on metastatic lymph nodes were observed macroscopically in 7 of 20 cases (35%) and microscopic abscopal effects were seen in 10 of 20 cases (50%). In 6 cases among these ten cases macroscopic abscopal effects were associated with microscopic ones. In tests for cellular immunity, blastoid formation rates of lymphocytes induced by PHA in vitro, lymphocytes and absolute numbers of T-cells in peripheral blood slightly decreased after the irradiation. However, in four kinds of skin tests, enhancements of the response were confirmed. The crude survival rate for 3 years of Stage III cases revealed 83.3% and relative survival rate of these cases was 85.5%. A single large electron dose for the treatment of advanced breast cancer may inhibit the depression of immunoreaction in hosts. (J.P.N.)

  7. Advances in the Use of Neuroscience Methods in Research on Learning and Instruction

    Science.gov (United States)

    De Smedt, Bert

    2014-01-01

    Cognitive neuroscience offers a series of tools and methodologies that allow researchers in the field of learning and instruction to complement and extend the knowledge they have accumulated through decades of behavioral research. The appropriateness of these methods depends on the research question at hand. Cognitive neuroscience methods allow…

  8. Variational methods in the kinetic modeling of nuclear reactors: Recent advances

    International Nuclear Information System (INIS)

    Dulla, S.; Picca, P.; Ravetto, P.

    2009-01-01

    The variational approach can be very useful in the study of approximate methods, giving a sound mathematical background to numerical algorithms and computational techniques. The variational approach has been applied to nuclear reactor kinetic equations to obtain a formulation of standard methods such as point kinetics and quasi-statics. More recently, the multipoint method has also been proposed for the efficient simulation of space-energy transients in nuclear reactors and in source-driven subcritical systems. The method is now founded on a variational basis that allows a consistent definition of integral parameters. The mathematical structure of multipoint and modal methods is also investigated, evidencing the merits and shortcomings of both techniques. Some numerical results for simple systems are presented, and the errors with respect to reference calculations are reported and discussed. (authors)

  9. Genomic similarity and kernel methods I: advancements by building on mathematical and statistical foundations.

    Science.gov (United States)

    Schaid, Daniel J

    2010-01-01

    Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
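    The positive-semidefiniteness requirement mentioned above is easy to state computationally. Below is a minimal sketch, assuming a simple centered linear kernel on an allele-count matrix (a stand-in for genomic similarity, not any specific published kernel), together with an eigenvalue check of the PSD property.

      import numpy as np

      def linear_genomic_kernel(G):
          # Similarity kernel from a (subjects x markers) allele-count matrix G:
          # center the columns and take K = Gc Gc^T / m, which is PSD by construction
          Gc = G - G.mean(axis=0)
          return Gc @ Gc.T / G.shape[1]

      def is_positive_semidefinite(K, tol=1e-8):
          # Kernel-function requirement: all eigenvalues of the symmetrized matrix >= 0 (up to tol)
          eigvals = np.linalg.eigvalsh((K + K.T) / 2.0)
          return bool(eigvals.min() >= -tol)

      rng = np.random.default_rng(3)
      genotypes = rng.integers(0, 3, size=(30, 500)).astype(float)   # 0/1/2 allele counts
      K = linear_genomic_kernel(genotypes)
      print(K.shape, is_positive_semidefinite(K))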

  10. Advanced numerical methods for three dimensional two-phase flow calculations

    Energy Technology Data Exchange (ETDEWEB)

    Toumi, I. [Laboratoire d'Etudes Thermiques des Reacteurs, Gif sur Yvette (France)]; Caruge, D. [Institut de Protection et de Surete Nucleaire, Fontenay aux Roses (France)]

    1997-07-01

    This paper is devoted to new numerical methods developed for both one and three dimensional two-phase flow calculations. These methods are finite volume numerical methods and are based on the use of Approximate Riemann Solver concepts to define convective fluxes versus mean cell quantities. The first part of the paper presents the numerical method for a one dimensional hyperbolic two-fluid model including differential terms such as added mass and interface pressure. This numerical solution scheme makes use of the Riemann problem solution to define backward and forward differencing to approximate spatial derivatives. The construction of this approximate Riemann solver uses an extension of Roe's method that has been successfully used to solve gas dynamic equations. As far as the two-fluid model is hyperbolic, this numerical method seems very efficient for the numerical solution of two-phase flow problems. The scheme was applied both to shock tube problems and to standard tests for two-fluid computer codes. The second part describes the numerical method in the three dimensional case. The authors also discuss some improvements performed to obtain a fully implicit solution method that provides fast-running steady state calculations. Such a scheme is now implemented in a thermal-hydraulic computer code devoted to 3-D steady-state and transient computations. Some results obtained for Pressurised Water Reactors concerning upper plenum calculations and a steady state flow in the core with rod bow effect evaluation are presented. In practice these new numerical methods have proved to be stable on non-staggered grids and capable of generating accurate non-oscillating solutions for two-phase flow calculations.
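    The essence of a Roe-type approximate Riemann solver is the upwinded interface flux built from |A|. The sketch below applies this to a generic linear hyperbolic system u_t + A u_x = 0; it is only an illustration of the flux construction and not the authors' two-fluid scheme (which uses a Roe-type linearization of the nonlinear two-phase equations).

      import numpy as np

      def roe_flux_linear(uL, uR, A):
          # Roe-type interface flux for the linear system u_t + A u_x = 0:
          # F = 0.5*(A uL + A uR) - 0.5*|A|(uR - uL), with |A| = R |Lambda| R^{-1}
          lam, R = np.linalg.eig(A)
          absA = R @ np.diag(np.abs(lam)) @ np.linalg.inv(R)
          return 0.5 * (A @ uL + A @ uR) - 0.5 * absA @ (uR - uL)

      # Toy usage: a 2x2 system whose characteristic speeds are +1 and -2
      A = np.array([[0.0, 1.0],
                    [2.0, -1.0]])
      uL = np.array([1.0, 0.0])
      uR = np.array([0.0, 1.0])
      print(roe_flux_linear(uL, uR, A))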

  11. Recent Advances in Fragment-Based QSAR and Multi-Dimensional QSAR Methods

    OpenAIRE

    Myint, Kyaw Zeyar; Xie, Xiang-Qun

    2010-01-01

    This paper provides an overview of recently developed two dimensional (2D) fragment-based QSAR methods as well as other multi-dimensional approaches. In particular, we present recent fragment-based QSAR methods such as fragment-similarity-based QSAR (FS-QSAR), fragment-based QSAR (FB-QSAR), Hologram QSAR (HQSAR), and top priority fragment QSAR in addition to 3D- and nD-QSAR methods such as comparative molecular field analysis (CoMFA), comparative molecular similarity analysis (CoMSIA), Topome...

  12. [The optimization of restoration approaches of advanced hand activity using the sensorial glove and the mCIMT method].

    Science.gov (United States)

    Mozheiko, E Yu; Prokopenko, S V; Alekseevich, G V

    To justify the choice of methods for the restoration of advanced hand activity depending on the severity of motor disturbance in the upper extremity. Eighty-eight patients were randomized into 3 groups: 1) the mCIMT group, 2) the 'touch glove' group, 3) the control group. For assessment of motor activity of the upper extremity, the Fugl-Meyer Assessment Upper Extremity, the Nine-Hole Peg Test and the Motor Assessment Scale were used. Assessment of the non-use phenomenon was carried out with the Motor Activity Log scale. At the stage of severe motor dysfunction, there was restoration of the proximal parts of the hand in all groups, and neither method was superior to the other. In cases of moderate motor deficiency of the upper extremity, the most effective method was the one based on the principle of biological feedback, the 'touch glove'. In the group with mild motor dysfunction, the best recovery was achieved in the mCIMT group.

  13. 1st Advanced School on Exoplanetary Science : Methods of Detecting Exoplanets

    CERN Document Server

    Mancini, Luigi; Sozzetti, Alessandro

    2016-01-01

    In this book, renowned scientists describe the various techniques used to detect and characterize extrasolar planets, or exoplanets, with a view to unveiling the “tricks of the trade” of planet detection to a wider community. The radial velocity method, transit method, microlensing method, and direct imaging method are all clearly explained, drawing attention to their advantages and limitations and highlighting the complementary roles that they can play in improving the characterization of exoplanets’ physical and orbital properties. By probing the planetary frequency at different distances and in different conditions, these techniques are helping astrophysicists to reconstruct the scenarios of planetary formation and to give robust scientific answers to questions regarding the frequency of potentially habitable worlds. Twenty years have passed since the discovery of a Jupiter-mass companion to a main sequence star other than the Sun, heralding the birth of extrasolar planetary research; this book fully...
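    Two of the observables behind the techniques named above are simple to compute from textbook formulas: the transit depth (Rp/Rs)^2 and the radial-velocity semi-amplitude K. The numbers below (a hot Jupiter on a 3-day orbit around a Sun-like star) are illustrative only and do not come from the book.

      import numpy as np

      G = 6.674e-11                       # gravitational constant, m^3 kg^-1 s^-2
      M_sun, M_jup = 1.989e30, 1.898e27   # kg
      R_sun, R_jup = 6.957e8, 7.149e7     # m

      def transit_depth(r_planet, r_star):
          # Fractional flux drop during transit: depth ~ (Rp / Rs)^2
          return (r_planet / r_star) ** 2

      def rv_semi_amplitude(period_s, m_planet, m_star, inclination=np.pi / 2, ecc=0.0):
          # Stellar radial-velocity semi-amplitude K (m/s) induced by an orbiting planet
          return ((2 * np.pi * G / period_s) ** (1 / 3)
                  * m_planet * np.sin(inclination)
                  / (m_star + m_planet) ** (2 / 3)
                  / np.sqrt(1 - ecc ** 2))

      period = 3 * 86400.0
      print(transit_depth(R_jup, R_sun))              # roughly a 1% flux drop
      print(rv_semi_amplitude(period, M_jup, M_sun))  # roughly 140 m/s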

  14. Advancing methods for reliably assessing motivational interviewing fidelity using the motivational interviewing skills code.

    Science.gov (United States)

    Lord, Sarah Peregrine; Can, Doğan; Yi, Michael; Marin, Rebeca; Dunn, Christopher W; Imel, Zac E; Georgiou, Panayiotis; Narayanan, Shrikanth; Steyvers, Mark; Atkins, David C

    2015-02-01

    The current paper presents novel methods for collecting MISC data and accurately assessing reliability of behavior codes at the level of the utterance. The MISC 2.1 was used to rate MI interviews from five randomized trials targeting alcohol and drug use. Sessions were coded at the utterance-level. Utterance-based coding reliability was estimated using three methods and compared to traditional reliability estimates of session tallies. Session-level reliability was generally higher compared to reliability using utterance-based codes, suggesting that typical methods for MISC reliability may be biased. These novel methods in MI fidelity data collection and reliability assessment provided rich data for therapist feedback and further analyses. Beyond implications for fidelity coding, utterance-level coding schemes may elucidate important elements in the counselor-client interaction that could inform theories of change and the practice of MI. Copyright © 2015 Elsevier Inc. All rights reserved.
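    A common chance-corrected agreement statistic at the utterance level is Cohen's kappa; the sketch below is a generic implementation and is not one of the three reliability estimators evaluated in the paper. The behavior-code labels are invented for illustration.

      from collections import Counter

      def cohens_kappa(codes_a, codes_b):
          # Chance-corrected agreement between two coders' utterance-level codes
          assert len(codes_a) == len(codes_b) and codes_a
          n = len(codes_a)
          observed = sum(a == b for a, b in zip(codes_a, codes_b)) / n
          freq_a, freq_b = Counter(codes_a), Counter(codes_b)
          labels = set(codes_a) | set(codes_b)
          expected = sum((freq_a[c] / n) * (freq_b[c] / n) for c in labels)
          return (observed - expected) / (1.0 - expected)

      # Toy usage: two raters coding ten utterances with made-up behavior codes
      rater1 = ["QUO", "REF", "QUO", "GI", "REF", "QUO", "GI", "QUO", "REF", "GI"]
      rater2 = ["QUO", "REF", "GI",  "GI", "REF", "QUO", "GI", "QUO", "QUO", "GI"]
      print(round(cohens_kappa(rater1, rater2), 3))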

  15. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report consists of an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Particularly with regard to safety assessments for nuclear power plants, a lot of HRA methods have been developed. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  16. Effects of advanced selection methods on sperm quality and ART outcome : a systematic review

    NARCIS (Netherlands)

    Said, Tamer M.; Land, Jolande A.

    2011-01-01

    BACKGROUND: Current routine semen preparation techniques do not inclusively target all intrinsic sperm characteristics that may impact the fertilization potential. In order to address these characteristics, several methods have been recently developed and applied to sperm selection. The objective of

  17. Advancing Inverse Sensitivity/Uncertainty Methods for Nuclear Fuel Cycle Applications

    Energy Technology Data Exchange (ETDEWEB)

    Arbanas, Goran [ORNL]; Williams, Mark L [ORNL]; Leal, Luiz C [ORNL]; Dunn, Michael E [ORNL]; Khuwaileh, Bassam A. [North Carolina State University]; Wang, C [North Carolina State University]; Abdel-Khalik, Hany [North Carolina State University]

    2015-01-01

    The inverse sensitivity/uncertainty quantification (IS/UQ) method has recently been implemented in the Inverse Sensitivity/UnceRtainty Estimator (INSURE) module of the AMPX system [1]. The IS/UQ method aims to quantify and prioritize the cross section measurements, along with their uncertainties, needed to yield a given target response uncertainty for a nuclear application, and to do so at a minimum cost. Since in some cases the extant uncertainties of the differential cross section data are already near the limits of present-day state-of-the-art measurements, requiring significantly smaller uncertainties may be unrealistic. Therefore we have incorporated integral benchmark experiment (IBE) data into the IS/UQ method using the generalized linear least-squares method, and have implemented it in the INSURE module. We show how the IS/UQ method can be applied to systematic and statistical uncertainties in a self-consistent way, and how it can be used to optimize uncertainties of IBEs and differential cross section data simultaneously.
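    The generalized linear least-squares adjustment mentioned above has a compact closed form. The sketch below is a generic GLLS/Bayesian update of prior parameters given integral responses, not the INSURE implementation; the matrices and toy numbers are assumptions.

      import numpy as np

      def glls_update(x0, C0, S, y, V):
          # Generalized linear least-squares adjustment:
          #   x0: prior parameter (cross-section) vector, C0: its covariance
          #   S : sensitivity matrix mapping parameters to integral responses (y ~ S x)
          #   y : measured integral benchmark responses, V: their covariance
          G = C0 @ S.T @ np.linalg.inv(S @ C0 @ S.T + V)   # gain matrix
          x = x0 + G @ (y - S @ x0)                        # adjusted parameters
          C = C0 - G @ S @ C0                              # reduced covariance
          return x, C

      # Toy usage: two parameters constrained by a single integral measurement
      x0 = np.array([1.0, 2.0])
      C0 = np.diag([0.04, 0.09])
      S = np.array([[0.5, 0.5]])
      y = np.array([1.8])
      V = np.array([[0.01]])
      x, C = glls_update(x0, C0, S, y, V)
      print(x, np.diag(C))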

  18. Parallel supercomputing: Advanced methods, algorithms, and software for large-scale linear and nonlinear problems

    Energy Technology Data Exchange (ETDEWEB)

    Carey, G.F.; Young, D.M.

    1993-12-31

    The program outlined here is directed to research on methods, algorithms, and software for distributed parallel supercomputers. Of particular interest are finite element methods and finite difference methods together with sparse iterative solution schemes for scientific and engineering computations of very large-scale systems. Both linear and nonlinear problems will be investigated. In the nonlinear case, applications with bifurcation to multiple solutions will be considered using continuation strategies. The parallelizable numerical methods of particular interest are a family of partitioning schemes embracing domain decomposition, element-by-element strategies, and multi-level techniques. The methods will be further developed incorporating parallel iterative solution algorithms with associated preconditioners in parallel computer software. The schemes will be implemented on distributed memory parallel architectures such as the CRAY MPP, Intel Paragon, the NCUBE3, and the Connection Machine. We will also consider other new architectures such as the Kendall-Square (KSQ) and proposed machines such as the TERA. The applications will focus on large-scale three-dimensional nonlinear flow and reservoir problems with strong convective transport contributions. These are legitimate grand challenge class computational fluid dynamics (CFD) problems of significant practical interest to DOE. The methods developed and algorithms will, however, be of wider interest.

  19. Recent advances in conventional and contemporary methods for remediation of heavy metal-contaminated soils.

    Science.gov (United States)

    Sharma, Swati; Tiwari, Sakshi; Hasan, Abshar; Saxena, Varun; Pandey, Lalit M

    2018-04-01

    Remediation of heavy metal-contaminated soils has been drawing attention for quite some time, and a need for developing new reclamation methods has emerged. Conventional methods of heavy metal-contaminated soil remediation have been in use for decades and have shown great results, but they have their own setbacks. The chemical and physical techniques, when used singly, generally generate by-products (toxic sludge or pollutants) and are not cost-effective, while biological processes are very slow and time-consuming. Hence, to overcome these limitations, an amalgamation of two or more techniques is being used. In view of these facts, new methods of biosorption, nanoremediation and microbial fuel cell techniques have been developed, which utilize the metabolic activities of microorganisms for bioremediation purposes. These are cost-effective and efficient methods of remediation, which are now becoming an integral part of environmental and bioresource technology. In this contribution, we highlight various augmentations of physical, chemical, and biological methods for the remediation of heavy metal-contaminated soils, weighing up their pros and cons. Further, we discuss the amalgamation of the above techniques, such as physiochemical and physiobiological methods, with recent literature for the removal of heavy metals from contaminated soils. These combinations have shown synergistic effects with a manyfold increase in the removal efficiency of heavy metals, along with economic feasibility.

  20. A treatment planning method for better management of radiation-induced oral mucositis in locally advanced head and neck cancer

    Directory of Open Access Journals (Sweden)

    Hao Howard Zhang

    2018-01-01

    Full Text Available Purpose/Aim: To describe a two-phase intensity-modulated radiation therapy (IMRT) treatment planning approach that is promising for reduction of oral mucositis risk in locally advanced head-and-neck cancer. Materials and Methods: Ten locally advanced head-and-neck cancer patients who underwent RT were retrospectively collected. Conventional IMRT and volumetric-modulated arc therapy (VMAT) plans were generated for these patients following the clinical protocol. Following the first phase of generating conventional IMRT plans, our approach utilized data from Monte Carlo-based kernel superposition dose calculations corresponding to beam apertures (generated from the conventional IMRT plans) and used an exact mathematical programming-based optimization approach applying linear programming (LP) to dose optimization in the second phase. Results: Compared with conventional IMRT and VMAT treatment plans, our novel method achieved better preservation of the oral cavity (16%–29% lower mean dose, P < 0.01), parotid glands (6%–17% lower mean dose, P < 0.04), and spinal cord (3-11 Gy lower maximum dose, P < 0.03), and lower doses to non-organ-at-risk/non-target normal tissues, with the same or better target coverage. Conclusions: Our LP-based method can be practically implemented in routine clinical use with a goal of limiting radiation-induced oral mucositis for head-and-neck cancer patients.
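    The second-phase linear program can be illustrated in a toy form: minimize the mean dose to an organ at risk subject to a minimum dose on every target voxel, with beamlet weights as the variables. The dose-deposition matrices below are random stand-ins, not Monte Carlo kernel-superposition data, and the real clinical formulation involves many more structures and constraints.

      import numpy as np
      from scipy.optimize import linprog

      rng = np.random.default_rng(4)
      n_beamlets, n_target, n_oar = 20, 30, 25

      # Dose per unit beamlet weight for target and organ-at-risk voxels (random stand-ins)
      D_target = rng.uniform(0.5, 1.5, size=(n_target, n_beamlets))
      D_oar = rng.uniform(0.0, 0.6, size=(n_oar, n_beamlets))

      prescription = 60.0                      # minimum dose to every target voxel (Gy)
      c = D_oar.mean(axis=0)                   # objective: mean organ-at-risk dose
      A_ub = -D_target                         # encodes D_target @ w >= prescription
      b_ub = -prescription * np.ones(n_target)

      res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                    bounds=[(0, None)] * n_beamlets, method="highs")
      w = res.x
      print("mean OAR dose:", float(c @ w), "min target dose:", float((D_target @ w).min()))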

  1. Method for the determination of the equation of state of advanced fuels based on the properties of normal fluids

    International Nuclear Information System (INIS)

    Hecht, M.J.; Catton, I.; Kastenberg, W.E.

    1976-12-01

    An equation of state based on the properties of normal fluids, the law of rectilinear averages, and the second law of thermodynamics can be derived for advanced LMFBR fuels on the basis of the vapor pressure, enthalpy of vaporization, change in heat capacity upon vaporization, and liquid density at the melting point. The method consists of estimating an equation of state by means of the law of rectilinear averages and the second law of thermodynamics, integrating by means of the second law until an instability is reached, and then extrapolating by means of a self-consistent estimation of the enthalpy of vaporization

  2. Divalent and Multivalent Activation in Phosphate Triesters: A Versatile Method for the Synthesis of Advanced Polyol Synthons.

    Science.gov (United States)

    Thomas, Christopher D; McParland, James P; Hanson, Paul R

    2009-11-01

    The construction of mono- and bicyclic phosphate triesters possessing divalent and multivalent activation and their subsequent use in the production of advanced polyol synthons is presented. The method highlights efforts to employ phosphate tethers as removable, functionally active tethers capable of multipositional activation, and their subsequent role as leaving groups in selective cleavage reactions. The development of phosphate tethers represents an integrated platform for a new and versatile tether for natural product synthesis and sheds light on new approaches to the facile construction of small molecules.

  3. Potential of advance NDT method for water measurement in a bulk paper-recycling

    International Nuclear Information System (INIS)

    Norpaiza Mohamad Hasan; Noraini Haron; Glam Hadzir Patai Mohamad; Rasif Mohd Zain; Ismail Mustapha; Syed Yusainee Syed Yahya

    2010-01-01

    Paper recycling industries usually buy their raw material from suppliers. Bulk used paper supplied to the recycling industry may contain water in its internal voids. Because the price of used paper is currently based on weight, there is a strong incentive for suppliers to add water to increase the price. The aims of our experiment were to establish a neutron calibration curve and to develop a correction factor for weight measurement during purchasing. This study presents an advanced non-destructive testing technique for rapid and in-situ measurement of water content in bulk used paper. A fast neutron source (Am-Be 241) and a portable backscattering neutron detector were used for the water measurement. The experiments were conducted by measuring a series of wet paper samples to which known amounts of water had been added. As a result, a neutron calibration curve for water measurement in bulk used paper was established, and a total of six bands for weight correction based on the calibration curve have been proposed. (author)
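    A calibration curve of this kind can be used very simply once it is established: fit count rate against added water, invert it for field readings, and deduct the estimated water from the purchased weight. The sketch below uses invented numbers purely to illustrate the workflow, not the measured calibration reported in the paper.

      import numpy as np

      # Hypothetical calibration points: added water (kg) vs. backscattered neutron count rate
      water_kg = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])
      counts = np.array([112.0, 178.0, 241.0, 309.0, 372.0, 441.0])

      # Linear calibration curve: counts = a * water + b
      a, b = np.polyfit(water_kg, counts, deg=1)

      def estimated_water(count_rate):
          # Invert the calibration curve to estimate hidden water from a field reading
          return (count_rate - b) / a

      def corrected_weight(gross_weight_kg, count_rate):
          # Deduct the estimated water mass from the purchased bale weight
          return gross_weight_kg - estimated_water(count_rate)

      print(round(estimated_water(300.0), 2), round(corrected_weight(250.0, 300.0), 2))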

  4. IMPROVED COMPUTATIONAL NEUTRONICS METHODS AND VALIDATION PROTOCOLS FOR THE ADVANCED TEST REACTOR

    Energy Technology Data Exchange (ETDEWEB)

    David W. Nigg; Joseph W. Nielsen; Benjamin M. Chase; Ronnie K. Murray; Kevin A. Steuhm

    2012-04-01

    The Idaho National Laboratory (INL) is in the process of modernizing the various reactor physics modeling and simulation tools used to support operation and safety assurance of the Advanced Test Reactor (ATR). Key accomplishments so far have encompassed both computational as well as experimental work. A new suite of stochastic and deterministic transport theory based reactor physics codes and their supporting nuclear data libraries (HELIOS, KENO6/SCALE, NEWT/SCALE, ATTILA, and an extended implementation of MCNP5) has been installed at the INL. Corresponding models of the ATR and ATRC are now operational with all five codes, demonstrating the basic feasibility of the new code packages for their intended purpose. Of particular importance, a set of as-run core depletion HELIOS calculations for all ATR cycles since August 2009 was successfully completed during 2011. This demonstration supported a decision late in the year to proceed with the phased incorporation of the HELIOS methodology into the ATR fuel cycle management process beginning in 2012. On the experimental side of the project, new hardware was fabricated, measurement protocols were finalized, and the first four of six planned physics code validation experiments based on neutron activation spectrometry were conducted at the ATRC facility. Data analysis for the first three experiments, focused on characterization of the neutron spectrum in one of the ATR flux traps, has been completed. The six experiments will ultimately form the basis for a flexible, easily-repeatable ATR physics code validation protocol that is consistent with applicable ASTM standards.

  5. Mass Spectrometry-Based Methods for Identifying Oxidized Proteins in Disease: Advances and Challenges

    Directory of Open Access Journals (Sweden)

    Ivan Verrastro

    2015-04-01

    Full Text Available Many inflammatory diseases have an oxidative aetiology, which leads to oxidative damage to biomolecules, including proteins. It is now increasingly recognized that oxidative post-translational modifications (oxPTMs) of proteins affect cell signalling and behaviour, and can contribute to pathology. Moreover, oxidized proteins have potential as biomarkers for inflammatory diseases. Although many assays for generic protein oxidation and breakdown products of protein oxidation are available, only advanced tandem mass spectrometry approaches have the power to localize specific oxPTMs in identified proteins. While much work has been carried out using untargeted or discovery mass spectrometry approaches, identification of oxPTMs in disease has benefitted from the development of sophisticated targeted or semi-targeted scanning routines, combined with chemical labeling and enrichment approaches. Nevertheless, many potential pitfalls exist which can result in incorrect identifications. This review explains the limitations, advantages and challenges of all of these approaches to detecting oxidatively modified proteins, and provides an update on recent literature in which they have been used to detect and quantify protein oxidation in disease.

  6. Evaluation of advanced automatic PET segmentation methods using nonspherical thin-wall inserts

    International Nuclear Information System (INIS)

    Berthon, B.; Marshall, C.; Evans, M.; Spezi, E.

    2014-01-01

    Purpose: The use of positron emission tomography (PET) within radiotherapy treatment planning requires the availability of reliable and accurate segmentation tools. PET automatic segmentation (PET-AS) methods have been recommended for the delineation of tumors, but there is still a lack of thorough validation and cross-comparison of such methods using clinically relevant data. In particular, studies validating PET segmentation tools mainly use phantoms with thick-walled plastic inserts of simple spherical geometry and have not specifically investigated the effect of the target object geometry on the delineation accuracy. Our work therefore aimed at generating clinically realistic data using nonspherical thin-wall plastic inserts, for the evaluation and comparison of a set of eight promising PET-AS approaches. Methods: Sixteen nonspherical inserts were manufactured with a plastic wall of 0.18 mm and scanned within a custom plastic phantom. These included ellipsoids and toroids of different volumes, as well as tubes and pear- and drop-shaped inserts with different aspect ratios. A set of six spheres of volumes ranging from 0.5 to 102 ml was used for a baseline study. A selection of eight PET-AS methods, written in house, was applied to the images obtained. The methods represented promising segmentation approaches such as adaptive iterative thresholding, region-growing, clustering and gradient-based schemes. The delineation accuracy was measured in terms of overlap with the computed tomography reference contour, using the Dice similarity coefficient (DSC), and in terms of error in dimensions. Results: The delineation accuracy was lower for nonspherical inserts than for spheres of the same volume in 88% of cases. Slice-by-slice gradient-based methods showed particularly lower DSC for tori (DSC 0.76 except for tori) but showed the largest errors in the recovery of pear and drop dimensions (higher than 10% and 30% of the true length, respectively). Large errors were visible
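    The overlap measure used above, the Dice similarity coefficient, has a one-line definition for binary masks: DSC = 2|A ∩ B| / (|A| + |B|). The sketch below is a minimal implementation on a toy grid, not the study's evaluation code.

      import numpy as np

      def dice_similarity(mask_a, mask_b):
          # DSC = 2|A ∩ B| / (|A| + |B|): 1.0 for perfect overlap, 0.0 for none
          a = np.asarray(mask_a, dtype=bool)
          b = np.asarray(mask_b, dtype=bool)
          denom = a.sum() + b.sum()
          if denom == 0:
              return 1.0              # both masks empty: treat as perfect agreement
          return 2.0 * np.logical_and(a, b).sum() / denom

      # Toy usage: a PET-AS contour versus the CT reference contour on a 10x10 grid
      reference = np.zeros((10, 10), dtype=bool); reference[2:7, 2:7] = True
      segmented = np.zeros((10, 10), dtype=bool); segmented[3:8, 3:8] = True
      print(round(dice_similarity(reference, segmented), 3))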

  7. Analysis Method for Laterally Loaded Pile Groups Using an Advanced Modeling of Reinforced Concrete Sections

    Directory of Open Access Journals (Sweden)

    Stefano Stacul

    2018-02-01

    Full Text Available A Boundary Element Method (BEM) approach was developed for the analysis of pile groups. The proposed method includes: the non-linear behavior of the soil, via a hyperbolic modulus reduction curve; the non-linear response of reinforced concrete pile sections, also taking into account the influence of tension stiffening; the influence of suction, by increasing the stiffness of shallow portions of soil, modeled using the Modified Kovacs model; and the pile group shadowing effect, modeled using an approach similar to that proposed in the Strain Wedge Model for pile group analyses. The proposed BEM method saves computational effort compared to more sophisticated codes such as VERSAT-P3D, PLAXIS 3D and FLAC-3D, and provides reliable results using input data from a standard site investigation. The reliability of the method was verified by comparing results with data from full-scale and centrifuge tests on single piles and pile groups. A comparison is presented between measured and computed data for a laterally loaded fixed-head pile group composed of reinforced concrete bored piles. The results of the proposed method are shown to be in good agreement with those obtained in situ.

  8. A review of recent advances in the spherical harmonics expansion method for semiconductor device simulation.

    Science.gov (United States)

    Rupp, K; Jungemann, C; Hong, S-M; Bina, M; Grasser, T; Jüngel, A

    The Boltzmann transport equation is commonly considered to be the best semi-classical description of carrier transport in semiconductors, providing precise information about the distribution of carriers with respect to time (one dimension), location (three dimensions), and momentum (three dimensions). However, numerical solutions for the seven-dimensional carrier distribution functions are very demanding. The most common solution approach is the stochastic Monte Carlo method, because the gigabytes of memory required by deterministic direct solution approaches have not been available until recently. As a remedy, the higher accuracy provided by solutions of the Boltzmann transport equation is often exchanged for lower computational expense by using simpler models based on macroscopic quantities such as carrier density and mean carrier velocity. Recent developments in the deterministic spherical harmonics expansion method have reduced the computational cost of solving the Boltzmann transport equation, enabling the computation of carrier distribution functions even for spatially three-dimensional device simulations within minutes to hours. We summarize recent progress for the spherical harmonics expansion method and show that small currents, reasonable execution times, and rare events such as low-frequency noise, which are all hard or even impossible to simulate with the established Monte Carlo method, can be handled in a straightforward manner. The applicability of the method for important practical applications is demonstrated for noise simulation, small-signal analysis, hot-carrier degradation, and avalanche breakdown.

  9. ADVANCEMENTS IN TIME-SPECTRA ANALYSIS METHODS FOR LEAD SLOWING-DOWN SPECTROSCOPY

    International Nuclear Information System (INIS)

    Smith, Leon E.; Anderson, Kevin K.; Gesh, Christopher J.; Shaver, Mark W.

    2010-01-01

    Direct measurement of Pu in spent nuclear fuel remains a key challenge for safeguarding nuclear fuel cycles of today and tomorrow. Lead slowing-down spectroscopy (LSDS) is an active nondestructive assay method that has the potential to provide independent, direct measurement of Pu and U isotopic mass with an uncertainty lower than the approximately 10 percent typical of today's confirmatory assay methods. Pacific Northwest National Laboratory's (PNNL) previous work to assess the viability of LSDS for the assay of pressurized water reactor (PWR) assemblies indicated that the method could provide direct assay of Pu-239 and U-235 (and possibly Pu-240 and Pu-241) with uncertainties less than a few percent, assuming suitably efficient instrumentation, an intense pulsed neutron source, and improvements in the time-spectra analysis methods used to extract isotopic information from a complex LSDS signal. This previous simulation-based evaluation used relatively simple PWR fuel assembly definitions (e.g. constant burnup across the assembly) and a constant initial enrichment and cooling time. The time-spectra analysis method was founded on a preliminary analytical model of self-shielding intended to correct for assay-signal nonlinearities introduced by attenuation of the interrogating neutron flux within the assembly.

  10. Recent advances in sample preparation techniques and methods of sulfonamides detection - A review.

    Science.gov (United States)

    Dmitrienko, Stanislava G; Kochuk, Elena V; Apyari, Vladimir V; Tolmacheva, Veronika V; Zolotov, Yury A

    2014-11-19

    Sulfonamides (SAs) have been the most widely used antimicrobial drugs for more than 70 years, and their residues in foodstuffs and environmental samples pose serious health hazards. For this reason, sensitive and specific methods for the quantification of these compounds in numerous matrices have been developed. This review intends to provide an updated overview of the recent trends over the past five years in sample preparation techniques and methods for detecting SAs. Examples of the sample preparation techniques, including liquid-liquid and solid-phase extraction, dispersive liquid-liquid microextraction and QuEChERS, are given. Different methods of detecting the SAs present in food and feed and in environmental, pharmaceutical and biological samples are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Advanced Numerical Methods for Computing Statistical Quantities of Interest from Solutions of SPDES

    Science.gov (United States)

    2012-01-19

    Indexed text fragments from the report: ... error(E) = ‖E[(u_ADM − u_DNS)(x, T)]‖_{L2(Ω)} and error(var) = ‖var[(u_ADM − u_DNS)(x, T)]‖_{L2(Ω)}; a convergence table lists h, error(E), rate(E), error(var), rate(var) with entries such as h = 1/4 (error(E) = 0.0724682, error(var) = 0.00319198) and h = 1/8 (error(E) = 0.0297043, rate(E) = 1.287), ... the viscosity method developed by Tadmor and co-workers in [25] and several related papers and later extended to the finite element and wavelet cases ... finite element techniques; Comput. Methods Appl. Mech. Engrg. 190 (2001), 6359-6372. [18] D. Diez, M. Gunzburger and A. Kunoth, An adaptive wavelet viscosity ...

  12. Methods and tools for the evaluation of the sensitivity to natural radiations of advanced integrated circuits

    International Nuclear Information System (INIS)

    Peronnard, P.

    2009-10-01

    Atmospheric neutrons, whose fluxes and energies depend on the altitude, the solar activity and the geographic coordinates, have been identified as being capable of provoking SEEs (Single Event Effects), by indirect ionization, in integrated devices built with advanced manufacturing processes (nanometric devices). This concerns not only avionics but also applications operating at ground level. The evaluation of the sensitivity to SEEs provoked by natural radiation thus becomes a mandatory step in the selection of devices intended for applications requiring high reliability. The sensitivity to SEEs can be mitigated by different approaches at different levels, from the manufacturing level (use of particular process technologies such as SOI, Silicon On Insulator) to the system level (hardware/software redundancy). Independently of the adopted hardening approach, so-called radiation ground tests are mandatory to evaluate the error rates of a device or a system. During such tests, the DUT (Device Under Test) is exposed to a flux of particles while it performs a given activity. For SEU (Single Event Upset) radiation ground testing, two main strategies exist. In the static test, the circuit areas supposed to be sensitive to SEUs (registers, memories, ...) are initialized with a reference pattern, and the content of the sensitive areas is periodically compared to the reference pattern to identify potential SEUs. In the dynamic test, the DUT performs an activity representative of the one it will execute in the final application. Static test strategies are frequently adopted as they provide the intrinsic sensitivity, in terms of the average number of particles needed to provoke an SEU, of the different sensitive areas of the device; from such a strategy a 'worst-case estimation' of the device sensitivity can thus be obtained. This thesis aims at describing and validating the methodologies required to estimate the sensitivity to radiations of two types of
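
    As an illustration of the static-test bookkeeping described above (a minimal sketch only: the memory size, reference pattern and the simulated read/write routines are stand-ins for whatever access interface a real DUT exposes), the procedure reduces to writing a reference pattern and periodically counting bit flips against it:

```python
import random

MEM_WORDS = 1024          # hypothetical size of the sensitive memory under test
PATTERN = 0xAAAAAAAA      # reference pattern written before irradiation
memory = {}               # stand-in for the DUT memory (a real test reads the device)

def write_memory(addr, value):
    memory[addr] = value

def read_memory(addr):
    value = memory[addr]
    if random.random() < 1e-4:              # crude stand-in for a beam-induced bit flip
        value ^= 1 << random.randrange(32)
    return value

def static_seu_test(n_readouts=100):
    """Count SEUs by periodically comparing the sensitive area to the reference pattern."""
    for addr in range(MEM_WORDS):
        write_memory(addr, PATTERN)
    upsets = 0
    for _ in range(n_readouts):              # each iteration = one periodic readout
        for addr in range(MEM_WORDS):
            flipped = read_memory(addr) ^ PATTERN
            if flipped:
                upsets += bin(flipped).count("1")
                write_memory(addr, PATTERN)  # re-arm the word so new flips are counted
    return upsets

print("observed upsets:", static_seu_test())
```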

  13. Combining advanced networked technology and pedagogical methods to improve collaborative distance learning.

    Science.gov (United States)

    Staccini, Pascal; Dufour, Jean-Charles; Raps, Hervé; Fieschi, Marius

    2005-01-01

    Making educational material available on a network cannot be reduced to merely implementing hypermedia and interactive resources on a server. A pedagogical schema has to be defined to guide students in their learning and to provide teachers with guidelines to prepare valuable and upgradeable resources. The components of a learning environment, as well as the interactions between students and other roles such as author, tutor and manager, can be deduced from the cognitive foundations of learning, such as the constructivist approach. Scripting the way a student will navigate among information nodes and interact with tools to build his/her own knowledge is a good way of deducing the features of the graphic interface related to the management of these objects. We defined a typology of pedagogical resources, their data model and their logic of use. We implemented a generic, web-based authoring and publishing platform (called J@LON for Join And Learn On the Net) within an object-oriented and open-source programming environment (called Zope) embedding a content management system (called Plone). Workflow features have been used to mark the progress of students and to trace the life cycle of resources shared by the teaching staff. The platform integrates advanced online authoring features to create interactive exercises and to support live course broadcasting. The platform engine has been generalized to the whole curriculum of medical studies in our faculty; it also supports an international master's degree in health care risk management and will be extended to all other continuing education diplomas.

  14. Recent advances in methods to assess the activity of the kinome

    DEFF Research Database (Denmark)

    Radu, Maria; Chernoff, Jonathan

    2017-01-01

    to treatment because of adaptation of cellular signaling pathways to bypass targeted kinases. So that the basal and adaptive responses of kinases in cancer can be better understood, new methods have emerged that allow simultaneous and unbiased measurement of the activation state of a substantial fraction...

  15. Prediction Study of Tunnel Collapse Risk in Advance based on Efficacy Coefficient Method and Geological Forecast

    Directory of Open Access Journals (Sweden)

    QIU Daohong

    2014-08-01

    Full Text Available Collapse is one of the most common accidents in underground construction. Risk evaluation is the means of measuring the risk of chamber collapse. To ensure the safety of construction, a risk evaluation model of tunnel collapse based on the efficacy coefficient method and geological prediction was put forward. Based on a comprehensive analysis of collapse factors, five main factors, including rock uniaxial compressive strength, surrounding rock integrity coefficient, the state of discontinuous structural planes, the angle between the tunnel axis and the major structural plane, and underground water, were chosen as the risk evaluation indices of tunnel collapse. The evaluation indices were quantitatively described by using the TSP203 system and core drilling, to establish a risk early-warning model of tunnel collapse based on the basic principle of the efficacy coefficient method. The model established in this research was applied to collapse risk recognition for the Kiaochow Bay subsea tunnel in Qingdao, China. The results showed that the collapse risk recognition method achieves high prediction accuracy and provides a new approach to the risk prediction of tunnel collapse.
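
    For context on the scoring step, the sketch below shows a generic form of the efficacy coefficient method with purely illustrative thresholds and weights (not the paper's calibrated indices): each index is scored against a 'satisfactory' and a 'not-allowed' value on a 60-100 scale, and the scores are combined with weights.

```python
def efficacy_coefficient(value, not_allowed, satisfactory):
    """Single-index efficacy coefficient on the conventional 60-100 scale:
    d = 60 + 40 * (value - not_allowed) / (satisfactory - not_allowed), clipped to [60, 100]."""
    d = 60.0 + 40.0 * (value - not_allowed) / (satisfactory - not_allowed)
    return max(60.0, min(100.0, d))

def total_score(indices):
    """Weighted sum of single-index coefficients; `indices` maps name -> (value, not_allowed, satisfactory, weight)."""
    return sum(w * efficacy_coefficient(v, na, sat) for v, na, sat, w in indices.values())

# Purely illustrative index values, thresholds and weights (not the paper's calibration).
indices = {
    "uniaxial_compressive_strength_MPa": (40.0, 5.0, 100.0, 0.25),
    "rock_integrity_coefficient":        (0.55, 0.15, 0.75, 0.25),
    "structural_plane_condition_score":  (3.0, 1.0, 5.0, 0.20),
    "axis_plane_angle_deg":              (50.0, 10.0, 90.0, 0.15),
    "groundwater_condition_score":       (2.0, 1.0, 5.0, 0.15),
}
print(f"collapse-risk score: {total_score(indices):.1f}  (higher = safer in this sketch)")
```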

  16. ADVANCED FOREIGN METHODS OF PRODUCT DEVELOPMENT PROGRAM IN THE AIRCRAFT INDUSTRY

    OpenAIRE

    N. G. Ageeva; S. V. Gromov

    2014-01-01

    The state and development prospects of methods for managing product development programs in the aircraft industry are considered. A modern W-model of the life-cycle design process is proposed. Modern design technology for aircraft construction is described.

  17. ADVANCED FOREIGN METHODS OF PRODUCT DEVELOPMENT PROGRAM IN THE AIRCRAFT INDUSTRY

    Directory of Open Access Journals (Sweden)

    N. G. Ageeva

    2014-01-01

    Full Text Available The state and development prospects of methods for managing product development programs in the aircraft industry are considered. A modern W-model of the life-cycle design process is proposed. Modern design technology for aircraft construction is described.

  18. Final Report: Advanced Methods for Accessing and Disseminating Nuclear Data, August 13, 1996 - March 15, 1999

    International Nuclear Information System (INIS)

    Stone, Craig A.

    1999-01-01

    Scientific Digital Visions, Inc. developed methods of accessing and disseminating nuclear data contained within the databases of the National Nuclear Data Center (NNDC) at Brookhaven National Laboratory, supporting a long-standing and important DOE program to provide scientists with access to NNDC databases. The NNDC participated as a partner in this effort.

  19. Theoretical comparison of advanced methods for calculating nitrous oxide fluxes using non-steady state chambers

    Science.gov (United States)

    Several flux-calculation (FC) schemes are available for determining soil-to-atmosphere emissions of nitrous oxide (N2O) and other trace gases using data from non-steady-state flux chambers. Recently developed methods claim to provide more accuracy in estimating the true pre-deployment flux (f0) comp...
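
    As a minimal illustration of what a flux-calculation scheme does (a generic linear-regression estimate of the flux from a chamber concentration time series; the chamber dimensions, readings and unit-conversion factor below are assumptions, and the methods compared in the study use more elaborate nonlinear fits), the flux follows from the rate of concentration change scaled by chamber volume over footprint area:

```python
import numpy as np

def linear_chamber_flux(t_s, c_ppb, volume_m3, area_m2, mol_per_m3_per_ppb=4.1e-8):
    """Estimate the surface flux (mol m^-2 s^-1) from the slope of a linear fit
    to chamber headspace concentrations; the simplest of the flux-calculation schemes."""
    slope_ppb_per_s = np.polyfit(t_s, c_ppb, 1)[0]
    return slope_ppb_per_s * mol_per_m3_per_ppb * volume_m3 / area_m2

# Hypothetical 30-minute deployment with readings every 10 minutes.
t = np.array([0.0, 600.0, 1200.0, 1800.0])
c = np.array([330.0, 338.0, 345.0, 351.0])            # N2O mixing ratio in ppb
print(f"f0 ≈ {linear_chamber_flux(t, c, volume_m3=0.015, area_m2=0.05):.2e} mol m^-2 s^-1")
```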

  20. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project.

    Science.gov (United States)

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A

    2011-01-01

    Large-scale epidemiologic studies can assess health indicators that differentiate social groups, together with important health outcomes such as the incidence and mortality of cancer, cardiovascular disease and other conditions, establishing a solid knowledge base for the prevention and management of the causes of premature morbidity and mortality. This study presents advanced methods of data collection and data management, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing a high-quality and solid knowledge base. The functional requirements of PONS data collection are supported by advanced web-based IT methods and fulfilled by the IT system, which delivers medical data of high quality together with data security, data quality assessment, process control and evolution monitoring. Data from disparate and distributed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The practical, implemented solution of modern advanced database technologies and a remote software/hardware structure successfully supports the research of the large PONS study project. Follow-up control of the consistency and quality of the data analysis and of the processes feeding the PONS sub-databases shows excellent measurement properties, with data consistency of more than 99%. The project itself, through a tailored hardware/software application, shows the positive impact of Quality Assurance (QA) on the quality of outcome analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators through the elimination of common errors in research questionnaires and medical

  1. Motivating and Facilitating Advancements in Space Weather Real-Time Data Availability: Factors, Data, and Access Methods

    Science.gov (United States)

    Pankratz, C. K.; Baker, D. N.; Jaynes, A. N.; Elkington, S. R.; Baltzer, T.; Sanchez, F.

    2017-12-01

    Society's growing reliance on complex and highly interconnected technological systems makes us increasingly vulnerable to the effects of space weather events - maybe more than for any other natural hazard. An extreme solar storm today could conceivably impact hundreds of the more than 1400 operating Earth satellites. Such an extreme storm could cause collapse of the electrical grid on continental scales. The effects on navigation, communication, and remote sensing of our home planet could be devastating to our social functioning. Thus, it is imperative that the scientific community address the question of just how severe events might become. At least as importantly, it is crucial that policy makers and public safety officials be informed by the facts on what might happen during extreme conditions. This requires essentially real-time alerts, warnings, and also forecasts of severe space weather events, which in turn demands measurements, models, and associated data products to be available via the most effective data discovery and access methods possible. Similarly, advancement in the fundamental scientific understanding of space weather processes is also vital, requiring that researchers have convenient and effective access to a wide variety of data sets and models from multiple sources. The space weather research community, as with many scientific communities, must access data from dispersed and often uncoordinated data repositories to acquire the data necessary for the analysis and modeling efforts that advance our understanding of solar influences and space physics on the Earth's environment. The Laboratory for Atmospheric and Space Physics (LASP), as a leading institution in both producing data products and advancing the state of scientific understanding of space weather processes, is well positioned to address many of these issues. In this presentation, we will outline the motivating factors for effective space weather data access, summarize the various data

  2. Effects of Different Palliative Jaundice Reducing Methods on Immunologic Functions in Patients with Advanced Malignant Obstructive Jaundice.

    Science.gov (United States)

    Tang, Kun; Sui, Lu-Lu; Xu, Gang; Zhang, Tong; Liu, Qiang; Liu, Xiao-Fang

    2017-08-01

    This study aimed to investigate the effects of three treatment methods on the immunological function of patients with advanced malignant obstructive jaundice (MOJ). Patients with advanced MOJ were randomly divided into three groups according to the biliary drainage method. Levels of multiple indices were measured at different time points. After drainage, the levels of complement 3 (C3) and complement 4 (C4) were increased. Fourteen days post-operation, the levels of immunoglobulin G (IgG), immunoglobulin A (IgA) and immunoglobulin M (IgM) in the group undergoing palliative surgery decreased significantly compared to those in both the percutaneous transhepatic cholangio drainage (PTCD) and endoscopic retrograde biliary drainage (ERBD) groups. The level of serum endotoxin in the group undergoing palliative surgery decreased gradually. Palliative surgery for reducing jaundice is superior to PTCD and ERBD in improving the immune function of patients with MOJ. Copyright© 2017, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  3. Advanced Research and Data Methods in Women's Health: Big Data Analytics, Adaptive Studies, and the Road Ahead.

    Science.gov (United States)

    Macedonia, Christian R; Johnson, Clark T; Rajapakse, Indika

    2017-02-01

    Technical advances in science have had broad implications in reproductive and women's health care. Recent innovations in population-level data collection and storage have made available an unprecedented amount of data for analysis while computational technology has evolved to permit processing of data previously thought too dense to study. "Big data" is a term used to describe data that are a combination of dramatically greater volume, complexity, and scale. The number of variables in typical big data research can readily be in the thousands, challenging the limits of traditional research methodologies. Regardless of what it is called, advanced data methods, predictive analytics, or big data, this unprecedented revolution in scientific exploration has the potential to dramatically assist research in obstetrics and gynecology broadly across subject matter. Before implementation of big data research methodologies, however, potential researchers and reviewers should be aware of strengths, strategies, study design methods, and potential pitfalls. Examination of big data research examples contained in this article provides insight into the potential and the limitations of this data science revolution and practical pathways for its useful implementation.

  4. Role of emerging methods in advanced DFT study of novel compounds with high-efficiency photovoltaic applications

    Energy Technology Data Exchange (ETDEWEB)

    Palacios, Pablo [Universidad Politecnica de Madrid, Madrid (Spain); Consejo Superior de Investigaciones Cientificas, Cantoblanco (Spain); Wahnon, Perla; Sanchez, Kefren; Aguilera, Irene; Fernandez, Julio Juan [Universidad Politecnica de Madrid, Madrid (Spain); Conesa, Jose Carlos [Consejo Superior de Investigaciones Cientificas, Cantoblanco (Spain)

    2008-07-01

    An exhaustive study of compounds with unusual electronic structure, used as new efficient photovoltaic materials, is presented. These intermediate band materials, obtained by selective substitution of atoms of a host semiconductor by transition metals, present a narrow, partially-filled band inside the semiconductor band-gap, separated from the host valence and conduction bands. Additional current carriers can therefore be obtained from the absorption of low-energy photons, enabling electrons to reach the conduction band through the intermediate band. Although standard DFT is a suitable tool for our purposes, in order to enhance prediction capabilities for these new materials there are some features, e.g. the host semiconductor gap, which can be improved with advanced methods such as GW, exact exchange or hybrid functionals. Exact exchange studies have been carried out on several candidate intermediate band materials, as well as advanced analyses of correlation effects. The exact exchange method was implemented by our group within SIESTA. So far, these studies have supported the reliability of the prediction of intermediate band materials to such an extent that experimental samples are being grown. State-of-the-art techniques could complement the results obtained to date.

  5. Accurate Characterization of Winter Precipitation Using Multi-Angle Snowflake Camera, Visual Hull, Advanced Scattering Methods and Polarimetric Radar

    Directory of Open Access Journals (Sweden)

    Branislav M. Notaroš

    2016-06-01

    Full Text Available This article proposes and presents a novel approach to the characterization of winter precipitation and the modeling of radar observables through a synergistic use of advanced optical disdrometers for microphysical and geometrical measurements of ice and snow particles (in particular, a multi-angle snowflake camera—MASC), image processing methodology, advanced method-of-moments scattering computations, and state-of-the-art polarimetric radars. The article also describes the newly built and established MASCRAD (MASC + Radar) in-situ measurement site, under the umbrella of the CSU-CHILL Radar, as well as the MASCRAD project and the 2014/2015 winter campaign. We apply a visual hull method to reconstruct 3D shapes of ice particles based on high-resolution MASC images, and perform “particle-by-particle” scattering computations to obtain polarimetric radar observables. The article also presents and discusses selected illustrative observation data, results, and analyses for three cases with widely differing meteorological settings that involve contrasting hydrometeor forms. Illustrative results of scattering calculations based on MASC images captured during these events, in comparison with radar data, as well as selected comparative studies of snow habits from MASC, 2D video-disdrometer, and CHILL radar data, are presented, along with an analysis of the microphysical characteristics of the particles. In the longer term, this work has the potential to significantly improve radar-based quantitative winter-precipitation estimation.
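
    For orientation on the shape-reconstruction step, the following is a toy space-carving sketch of the visual hull idea using three idealized orthographic views (the MASCRAD processing chain works from calibrated MASC camera geometry, so this is only a conceptual illustration): a voxel is kept only if it projects inside the silhouette seen from every viewpoint.

```python
import numpy as np

def visual_hull_orthographic(silhouettes, n=64):
    """Toy visual hull by space carving on an n^3 voxel grid.
    `silhouettes` are boolean images seen along the +x, +y and +z axes (orthographic)."""
    sx, sy, sz = silhouettes
    # hull[i, j, k] is True only if all three projections lie inside the silhouettes.
    return sx[None, :, :] & sy[:, None, :] & sz[:, :, None]

# Illustrative input: three circular silhouettes carve out (approximately) a sphere.
n = 64
yy, xx = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
disc = (xx - n / 2) ** 2 + (yy - n / 2) ** 2 < (n / 3) ** 2
hull = visual_hull_orthographic((disc, disc, disc), n)
print("voxels in hull:", int(hull.sum()))
```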

  6. Recent advances in sample preparation methods for analysis of endocrine disruptors from various matrices.

    Science.gov (United States)

    Singh, Baljinder; Kumar, Ashwini; Malik, Ashok Kumar

    2014-01-01

    Due to the high toxicity of endocrine disruptors (EDs), studies are being undertaken to design effective techniques for separation and detection of EDs in various matrices. Recently, research activities in this area have shown that a diverse range of chromatographic techniques are available for the quantification and analysis of EDs. Therefore, on the basis of significant, recent original publications, we aimed at providing an overview of different separation and detection methods for the determination of trace-level concentrations of selected EDs. The biological effects of EDs and current pretreatment techniques applied to EDs are also discussed. Various types of chromatographic techniques are presented for quantification, highlighting time- and cost-effective techniques that separate and quantify trace levels of multiple EDs from various environmental matrices. Reports related to methods for the quantification of EDs from various matrices primarily published since 2008 have been cited.

  7. The convergence problem for dissipative autonomous systems classical methods and recent advances

    CERN Document Server

    Haraux, Alain

    2015-01-01

    The book investigates classical and more recent methods of study for the asymptotic behavior of dissipative continuous dynamical systems with applications to ordinary and partial differential equations, the main question being convergence (or not) of the solutions to an equilibrium. After reviewing the basic concepts of topological dynamics and the definition of gradient-like systems on a metric space, the authors present a comprehensive exposition of stability theory relying on the so-called linearization method. For the convergence problem itself, when the set of equilibria is infinite, the only general results that do not require very special features of the non-linearities are presently consequences of a gradient inequality discovered by S. Lojasiewicz. The application of this inequality jointly with the so-called Liapunov-Schmidt reduction requires a rigorous exposition of Semi-Fredholm operator theory and the theory of real analytic maps on infinite dimensional Banach spaces, which cannot be found anywh...
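
    For reference, the gradient inequality mentioned above is commonly stated in the following form, with E the underlying energy functional and φ an equilibrium (generic notation, not necessarily the book's):

```latex
% Łojasiewicz gradient inequality (stated generically): there exist
% theta in (0, 1/2], c > 0 and sigma > 0 such that
\[
  \|E'(u)\| \;\ge\; c\,\bigl|E(u) - E(\varphi)\bigr|^{1-\theta}
  \qquad \text{whenever } \|u - \varphi\| < \sigma .
\]
```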

  8. High Energy Beam Impacts on Beam Intercepting Devices: Advanced Numerical Methods and Experimental Set-Up

    CERN Document Server

    Bertarelli, A; Carra, F; Cerutti, F; Dallocchio, A; Mariani, N; Timmins, M; Peroni, L; Scapin, M

    2011-01-01

    Beam Intercepting Devices are potentially exposed to severe accidental events triggered by direct impacts of energetic particle beams. State-of-the-art numerical methods are required to simulate the behaviour of affected components. A review of the different dynamic response regimes is presented, along with an indication of the most suited tools to treat each of them. The consequences on LHC tungsten collimators of a number of beam abort scenarios were extensively studied, resorting to a novel category of numerical explicit methods, named Hydrocodes. Full shower simulations were performed providing the energy deposition distribution. Structural dynamics and shock wave propagation analyses were carried out with varying beam parameters, identifying important thresholds for collimator operation, ranging from the onset of permanent damage up to catastrophic failure. Since the main limitation of these tools lies in the limited information available on constitutive material models under extreme conditions, a dedica...

  9. High Energy Beam Impacts on Beam Intercepting Devices: Advanced Numerical Methods and Experimental Set-up

    CERN Document Server

    Bertarelli, A; Carra, F; Cerutti, F; Dallocchio, A; Mariani, N; Timmins, M; Peroni, L; Scapin, M

    2011-01-01

    Beam Intercepting Devices are potentially exposed to severe accidental events triggered by direct impacts of energetic particle beams. State-of-the-art numerical methods are required to simulate the behaviour of affected components. A review of the different dynamic response regimes is presented, along with an indication of the most suited tools to treat each of them. The consequences on LHC tungsten collimators of a number of beam abort scenarios were extensively studied, resorting to a novel category of numerical explicit methods, named Hydrocodes. Full shower simulations were performed providing the energy deposition distribution. Structural dynamics and shock wave propagation analyses were carried out with varying beam parameters, identifying important thresholds for collimator operation, ranging from the onset of permanent damage up to catastrophic failure. Since the main limitation of these tools lies in the limited information available on constitutive material models under extreme conditions, a dedica...

  10. International Symposium on Boundary Element Methods : Advances in Solid and Fluid Mechanics

    CERN Document Server

    Tseng, Kadin

    1990-01-01

    The Boundary Element Method (BEM) has become established as an effective tool for the solutions of problems in engineering science. The salient features of the BEM have been well documented in the open literature and therefore will not be elaborated here. The BEM research has progressed rapidly, especially in the past decade and continues to evolve worldwide. This Symposium was organized to provide an international forum for presentation of current research in BEM for linear and nonlinear problems in solid and fluid mechanics and related areas. To this end, papers on the following topics were included: rotary-wing aerodynamics, unsteady aerodynamics, design and optimization, elasticity, elastodynamics and elastoplasticity, fracture mechanics, acoustics, diffusion and wave motion, thermal analysis, mathematical aspects and boundary/finite element coupled methods. A special session was devoted to parallel/vector supercomputing with emphasis on massive parallelism. This Symposium was sponsored by United ...

  11. Advances in intelligent diagnosis methods for pulmonary ground-glass opacity nodules.

    Science.gov (United States)

    Yang, Jing; Wang, Hailin; Geng, Chen; Dai, Yakang; Ji, Jiansong

    2018-02-07

    Pulmonary nodules are among the important lesions of lung cancer and are mainly divided into two categories: solid nodules and ground-glass nodules. Improving the diagnosis of lung cancer has major clinical significance, and this can be supported by machine learning techniques. To date, much research has focused on solid nodules, whereas research on ground-glass nodules started later and results remain scarce. This paper summarizes the research progress in methods for the intelligent diagnosis of pulmonary nodules since 2014. It is described in detail from four aspects: nodule signs, data analysis methods, prediction models and system evaluation. This paper aims to provide research material for researchers working on the clinical diagnosis and intelligent analysis of lung cancer, and to further improve the precision of pulmonary ground-glass nodule diagnosis.

  12. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances

    Directory of Open Access Journals (Sweden)

    Abut F

    2015-08-01

    Full Text Available Fatih Abut, Mehmet Fatih Akay; Department of Computer Engineering, Çukurova University, Adana, Turkey. Abstract: Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, a lot of studies have been conducted in the last years to predict VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview about the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance
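
    As a small illustration of the kind of data-driven model surveyed here (a generic support vector regression sketch on synthetic predictor data, not any of the published prediction models; the predictor choices and coefficients are made up), VO2max is regressed on a few variables and evaluated with the two metrics named in the abstract, R and standard error of estimate:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic predictors: age [years], body-mass index, self-reported activity score.
X = np.column_stack([rng.uniform(15, 60, 200),
                     rng.uniform(18, 35, 200),
                     rng.uniform(1, 10, 200)])
# Synthetic "measured" VO2max (ml/kg/min) with noise; coefficients are made up.
y = 60 - 0.4 * X[:, 0] - 0.8 * (X[:, 1] - 22) + 1.5 * X[:, 2] + rng.normal(0, 3, 200)

X_train, X_test, y_train, y_test = X[:150], X[150:], y[:150], y[150:]
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.5))
model.fit(X_train, y_train)

pred = model.predict(X_test)
R = np.corrcoef(y_test, pred)[0, 1]              # multiple correlation coefficient
see = np.sqrt(np.mean((y_test - pred) ** 2))     # standard error of estimate
print(f"R = {R:.2f}, SEE = {see:.2f} ml/kg/min")
```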

  13. Advanced Corrections of Hydrogen Bonding and Dispersion for Semiempirical Quantum Mechanical Methods

    Czech Academy of Sciences Publication Activity Database

    Řezáč, Jan; Hobza, Pavel

    2012-01-01

    Roč. 8, č. 1 (2012), s. 141-151 ISSN 1549-9618 Grant - others:European Social Fund(XE) CZ.1.05/2.1.00/03.0058 Institutional research plan: CEZ:AV0Z40550506 Keywords : tight-binding method * noncovalent complexes * base-pairs * interaction energies Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 5.389, year: 2012

  14. Safety and reliability of pressure components with special emphasis on advanced methods of NDT. Vol. 1

    International Nuclear Information System (INIS)

    1986-01-01

    24 papers discuss various methods for nondestructive testing of materials, e.g. eddy current measurement, EMAG analyser, tomography, ultrasound, holographic interferometry, and optical sound field camera. Special consideration is given to mathematical programmes and tests allowing to determine fracture-mechanical parameters and to assess cracks in various components, system parts and individual specimens both in pressurized systems and NPP systems. Studies focus on weld seams and adjacent areas. (DG) [de

  15. Advances in methods for detection of anaerobic ammonium oxidizing (anammox) bacteria

    OpenAIRE

    Li, Meng; Gu, Ji-Dong

    2011-01-01

    Anaerobic ammonium oxidation (anammox), the biochemical process oxidizing ammonium into dinitrogen gas using nitrite as an electron acceptor, was recognized for its significant role in the global nitrogen cycle only relatively recently, and its ubiquitous distribution in a wide range of environments has changed our knowledge about the contributors to the global nitrogen cycle. Currently, several groups of methods are used for the detection of anammox bacteria based on their physiological and biochem...

  16. Advanced Magnetic Materials Methods and Numerical Models for Fluidization in Microgravity and Hypogravity

    Science.gov (United States)

    Atwater, James; Wheeler, Richard, Jr.; Akse, James; Jovanovic, Goran; Reed, Brian

    2013-01-01

    To support long-duration manned missions in space, such as a permanent lunar base, Mars transit, or a Mars surface mission, improved methods for the treatment of solid wastes, particularly methods that recover valuable resources, are needed. The ability to operate under microgravity and hypogravity conditions is essential to meet this objective. The utilization of magnetic forces to manipulate granular magnetic media has provided the means to treat solid wastes under variable gravity conditions by filtration using a consolidated magnetic media bed, followed by thermal processing of the solid wastes in a fluidized bed reactor. Non-uniform magnetic fields will produce a magnetic field gradient in a bed of magnetically susceptible media toward the distributor plate of a fluidized bed reactor. A correctly oriented magnetic field gradient will generate a downward directed force on magnetic media that can substitute for gravitational force in microgravity, or which may augment low levels of gravity, such as on the Moon or Mars. This approach is termed Gradient Magnetically Assisted Fluidization (G-MAFB), in which the magnitude of the force on the fluidized media depends upon the intensity of the magnetic field (H), the intensity of the field gradient (dH/dz), and the magnetic susceptibility of the media. Fluidized beds based on the G-MAFB process can operate in any gravitational environment by tuning the magnetic field appropriately. Magnetic materials and methods have been developed that enable G-MAFB operation under variable gravity conditions.
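
    The dependence described above can be written compactly. A commonly used magnetostatic expression for the vertical body force on a particle of volume V and susceptibility χ in a field with a vertical gradient (a generic relation, not necessarily the exact form used in the project) is:

```latex
\[
  F_z \;=\; \mu_0\, \chi\, V\, H\, \frac{dH}{dz},
\]
% so the orientation and magnitude of dH/dz determine whether the magnetic force
% substitutes for, or merely augments, gravity in the fluidized bed.
```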

  17. Enabling Advanced Wind-Tunnel Research Methods Using the NASA Langley 12-Foot Low Speed Tunnel

    Science.gov (United States)

    Busan, Ronald C.; Rothhaar, Paul M.; Croom, Mark A.; Murphy, Patrick C.; Grafton, Sue B.; O-Neal, Anthony W.

    2014-01-01

    Design of Experiment (DOE) testing methods were used to gather wind tunnel data characterizing the aerodynamic and propulsion forces and moments acting on a complex vehicle configuration with 10 motor-driven propellers, 9 control surfaces, a tilt wing, and a tilt tail. This paper describes the potential benefits and practical implications of using DOE methods for wind tunnel testing - with an emphasis on describing how it can affect model hardware, facility hardware, and software for control and data acquisition. With up to 23 independent variables (19 model and 2 tunnel) for some vehicle configurations, this recent test also provides an excellent example of using DOE methods to assess critical coupling effects in a reasonable timeframe for complex vehicle configurations. Results for an exploratory test using conventional angle-of-attack sweeps to assess aerodynamic hysteresis are summarized, and DOE results are presented for an exploratory test used to set the data sampling time for the overall test. DOE results are also shown for one production test characterizing normal force in the Cruise mode for the vehicle.

  18. Advanced testing method to evaluate the performance of respirator filter media.

    Science.gov (United States)

    Wang, Qiang; Golshahi, Laleh; Chen, Da-Ren

    2016-10-02

    Filter media for respirator applications are typically exposed to cyclic flow conditions, which differ from the constant flow condition adopted in filter testing standards. To understand the real performance of respirator filter media in the field, it is necessary to investigate the penetration of particles through respirator filters under cyclic flow conditions representing human breathing flow patterns. This article reports a new testing method for studying the individual effects of breathing frequency (BF) and peak inhalation flow rate (PIFR) on particle penetration through respirator filter media. The new method uses DMA (Differential Mobility Analyzer)-classified particles at the most penetrating particle size, MPPS (determined at a constant flowrate equal to the mean inhalation flow rate, MIFR), as the test aerosol. Two condensation particle counters (CPCs) are used to measure the particle concentrations upstream and downstream of the test filter media at the same time. Given the 10 Hz sampling rate of the CPCs, close-to-instantaneous particle penetration could be measured. A pilot study was performed to demonstrate the new testing method. It was found that the effect of BF on the particle penetration of the tested respirator filter media is important at all tested peak inhalation flow rates (PIFRs), which differs from what was reported in previous work.

  19. Extraction, Analytical and Advanced Methods for Detection of Allura Red AC (E129) in Food and Beverage Products

    Directory of Open Access Journals (Sweden)

    Shafiquzzaman eSiddiquee

    2016-05-01

    Full Text Available Allura Red AC (E129) is an azo dye that is widely used in drinks, juices, bakery, meat and sweet products. High consumption of Allura Red has been claimed to have adverse effects on human health, including allergies, food intolerance, cancer, multiple sclerosis, attention deficit hyperactivity disorder (ADHD), brain damage, nausea, cardiac disease and asthma, due to the reactivity of aromatic azo compounds (R = R' = aromatic). Several countries have banned or strictly controlled the use of Allura Red in food and beverage products. This review critically summarizes the available analytical and advanced methods for the determination of Allura Red and also concisely discusses the acceptable daily intake (ADI), toxicology and extraction methods.

  20. Experiences of end of life amongst family carers of people with advanced dementia: longitudinal cohort study with mixed methods.

    Science.gov (United States)

    Moore, Kirsten J; Davis, Sarah; Gola, Anna; Harrington, Jane; Kupeli, Nuriye; Vickerstaff, Victoria; King, Michael; Leavey, Gerard; Nazareth, Irwin; Jones, Louise; Sampson, Elizabeth L

    2017-07-03

    Many studies have examined the mental health of carers of people with dementia. Few have examined their experiences in the advanced stages of disease and into bereavement. We aimed to understand the experiences of carers during advanced dementia exploring the links between mental health and experiences of end of life care. Mixed methods longitudinal cohort study. Thirty-five family carers of people with advanced dementia (6 at home, 29 in care homes) were recruited and assessed monthly for up to nine months or until the person with dementia died, then at two and seven months into bereavement. Assessments included: Hospital Anxiety and Depression Scale, Short Form 12 health-related quality of life, 22-item Zarit Burden Interview, Brief Coping Orientation to Problems Experienced, Inventory of Complicated Grief and Satisfaction with Care at End of Life in Dementia. Subsequently, 12 carers (34%) were bereaved and 12 undertook a qualitative interview two months after death; these data were analysed thematically. We analysed quantitative and qualitative data independently and then merged findings at the point of interpretation. At study entry psychological distress was high; 26% reached caseness for depression and 41% for anxiety and median complicated grief scores were 27 [IQR 22-37] indicating that on average 11 of the 16 grief symptoms occurred at least monthly. Physical health reflected population norms (mean = 50) and median burden scores were 17 [IQR 9-30]. Three qualitative themes were identified: the importance of relationships with care services, understanding of the progression of dementia, and emotional responses to advanced dementia. An overarching theme tying these together was the carer's ability to control and influence end of life care. While carers report high levels of psychological distress during advanced dementia, the experience of end of life care in dementia may be amenable to change with the provision of sensitive and timely information about

  1. Advancing Reactive Tracer Methods for Measurement of Thermal Evolution in Geothermal Reservoirs: Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell A. Plummer; Carl D. Palmer; Earl D. Mattson; Laurence C. Hull; George D. Redden

    2011-07-01

    The injection of cold fluids into engineered geothermal systems (EGS) and conventional geothermal reservoirs may be done to help extract heat from the subsurface or to maintain pressures within the reservoir (e.g., Rose et al., 2001). As these injected fluids move along fractures, they acquire heat from the rock matrix and remove it from the reservoir as they are extracted to the surface. A consequence of such injection is the migration of a cold-fluid front through the reservoir (Figure 1) that could eventually reach the production well and result in a lowering of the temperature of the produced fluids (thermal breakthrough). Efficient operation of an EGS, as well as of conventional geothermal systems involving cold-fluid injection, requires accurate and timely information about thermal depletion of the reservoir in response to operation. In particular, accurate predictions of the time to thermal breakthrough and the subsequent rate of thermal drawdown are necessary for reservoir management, design of fracture stimulation and well drilling programs, and forecasting of economic return. A potential method for estimating the migration of a cold front between an injection well and a production well is the application of reactive tracer tests, using a chemical whose rate of degradation depends on the reservoir temperature between the two wells (e.g., Robinson 1985). With repeated tests, the rate of migration of the thermal front can be determined, and the time to thermal breakthrough calculated. While the basic theory behind the concept of thermal tracers has been understood for some time, effective application of the method has yet to be demonstrated. This report describes results of a study that used several methods to investigate the application of reactive tracers to monitoring the thermal evolution of a geothermal reservoir. These methods included (1) mathematical investigation of the sensitivity of known and hypothetical reactive tracers, (2) laboratory testing of novel
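
    To make the temperature dependence concrete, the sketch below models a thermally degrading tracer with a first-order Arrhenius rate (the pre-exponential factor and activation energy are made-up values, not the report's tracer chemistry): the fraction of tracer recovered after a given residence time drops sharply as reservoir temperature rises.

```python
import numpy as np

R_GAS = 8.314  # J mol^-1 K^-1

def surviving_fraction(temp_K, residence_time_s, A=1.0e7, Ea=1.0e5):
    """First-order decay with Arrhenius rate k = A * exp(-Ea / (R T));
    returns the fraction of injected tracer still intact after the residence time."""
    k = A * np.exp(-Ea / (R_GAS * temp_K))
    return np.exp(-k * residence_time_s)

# Illustrative comparison: one week of residence at different reservoir temperatures.
week = 7 * 24 * 3600.0
for T_C in (120.0, 160.0, 200.0):
    frac = surviving_fraction(T_C + 273.15, week)
    print(f"T = {T_C:5.1f} °C  surviving fraction = {frac:.3f}")
```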

  2. First 3D thermal mapping of an active volcano using an advanced photogrammetric method

    Science.gov (United States)

    Antoine, Raphael; Baratoux, David; Lacogne, Julien; Lopez, Teodolina; Fauchard, Cyrille; Bretar, Frédéric; Arab-Sedze, Mélanie; Staudacher, Thomas; Jacquemoud, Stéphane; Pierrot-Deseilligny, Marc

    2014-05-01

    Thermal infrared data obtained in the 7-14 micron spectral range are widely used in many Earth Science disciplines. These studies are almost exclusively based on the analysis of 2D information. In that case, a quantitative analysis of the surface energy budget remains limited, as it may be difficult to estimate the radiative contribution of the topography, the thermal influence of winds on the surface, or potential imprints of subsurface flows on the soil without a precise DEM. Draping a thermal image over a recent DEM is a common way to obtain a 3D thermal map of a surface. However, this method has several disadvantages: i) errors can be significant in the orientation process of the thermal images, due to the lack of tie points between the images and the DEM; ii) the use of a recent DEM implies the use of another remote sensing technique to quantify the topography; iii) finally, the characterization of the evolution of a surface requires the simultaneous acquisition of thermal data and topographic information, which may be expensive in most cases. The stereophotogrammetry method allows the relief of an object to be reconstructed from photos taken from different positions. Recently, substantial progress has been made in the generation of high-spatial-resolution topographic surfaces using stereophotogrammetry. However, the presence of shadows, homogeneous textures and/or weak contrasts in the visible spectrum (e.g., flowing lavas, uniform lithologies) may prevent the use of such a method, because of the difficulty of finding tie points on each image. Such situations are more favorable in the thermal infrared spectrum, as any variation in the thermal properties or geometric orientation of the surfaces may induce temperature contrasts that are detectable with a thermal camera. These systems, usually built around an array sensor (Focal Plane Array) and an optical device, have geometric characteristics similar to those of digital cameras. Thus, it may be possible

  3. Advancing Research Methods to Detect Impact of Climate Change on Health in Grand'Anse, Haiti

    Science.gov (United States)

    Barnhart, S.; Coq, R. N.; Frederic, R.; DeRiel, E.; Camara, H.; Barnhart, K. R.

    2013-12-01

    Haiti is considered particularly vulnerable to the effects of climate change, but directly linking climate change to health effects is limited by the lack of robust data and by the multiple determinants of health. Worsening storms and rising temperatures in this rugged country with high poverty are likely to adversely affect economic activity, population growth and other determinants of health. For the past two years, the Univ. of Washington has supported the public hospital in the department of Grand'Anse. Grand'Anse, a relatively contained region in SW Haiti with an area of 11,912 km2, is predominantly rural with a population of 350,000 and is bounded to the south by peaks up to 2,347 m. Grand'Anse would serve as an excellent site to assess the interface between climate change and health. The Demographic and Health Survey (DHS) shows health status is low relative to other countries. Estimates of climate change for Jeremie, the largest city in Grand'Anse, predict that the mean monthly temperature will increase from 26.1 to 27.3 °C while mean monthly rainfall will decrease from 80.5 to 73.5 mm over the next 60 years. The potential impact of these changes ranges from threatening food security to greater mortality. Use of available secondary data, such as indicators of climate change and DHS health status, is not likely to offer sufficient resolution to detect positive or negative impacts of climate change on health. How might a mixed methods approach incorporating secondary data and quantitative and qualitative survey data on climate, economic activity, health and determinants of health address the hypothesis: Climate change does not adversely affect health? For example, in Haiti most women deliver at home. Maternal mortality is high at 350 deaths/100,000 deliveries. This compares to deliveries in facilities, where the median rate is less than 100/100,000. Thus, maternal mortality is closely linked to access to health care in this rugged mountainous country. Climate change

  4. Machine learning and statistical methods for the prediction of maximal oxygen uptake: recent advances.

    Science.gov (United States)

    Abut, Fatih; Akay, Mehmet Fatih

    2015-01-01

    Maximal oxygen uptake (VO2max) indicates how many milliliters of oxygen the body can consume in a state of intense exercise per minute. VO2max plays an important role in both sport and medical sciences for different purposes, such as indicating the endurance capacity of athletes or serving as a metric in estimating the disease risk of a person. In general, the direct measurement of VO2max provides the most accurate assessment of aerobic power. However, despite a high level of accuracy, practical limitations associated with the direct measurement of VO2max, such as the requirement of expensive and sophisticated laboratory equipment or trained staff, have led to the development of various regression models for predicting VO2max. Consequently, a lot of studies have been conducted in the last years to predict VO2max of various target audiences, ranging from soccer athletes, nonexpert swimmers, cross-country skiers to healthy-fit adults, teenagers, and children. Numerous prediction models have been developed using different sets of predictor variables and a variety of machine learning and statistical methods, including support vector machine, multilayer perceptron, general regression neural network, and multiple linear regression. The purpose of this study is to give a detailed overview about the data-driven modeling studies for the prediction of VO2max conducted in recent years and to compare the performance of various VO2max prediction models reported in related literature in terms of two well-known metrics, namely, multiple correlation coefficient (R) and standard error of estimate. The survey results reveal that with respect to regression methods used to develop prediction models, support vector machine, in general, shows better performance than other methods, whereas multiple linear regression exhibits the worst performance.

  5. An advanced method to assess the diet of free-ranging large carnivores based on scats.

    Directory of Open Access Journals (Sweden)

    Bettina Wachter

    Full Text Available BACKGROUND: The diet of free-ranging carnivores is an important part of their ecology. It is often determined from prey remains in scats. In many cases, scat analyses are the most efficient method but they require correction for potential biases. When the diet is expressed as proportions of consumed mass of each prey species, the consumed prey mass to excrete one scat needs to be determined and corrected for prey body mass because the proportion of digestible to indigestible matter increases with prey body mass. Prey body mass can be corrected for by conducting feeding experiments using prey of various body masses and fitting a regression between consumed prey mass to excrete one scat and prey body mass (correction factor 1). When the diet is expressed as proportions of consumed individuals of each prey species and includes prey animals not completely consumed, the actual mass of each prey consumed by the carnivore needs to be controlled for (correction factor 2). No previous study controlled for this second bias. METHODOLOGY/PRINCIPAL FINDINGS: Here we use an extended series of feeding experiments on a large carnivore, the cheetah (Acinonyx jubatus), to establish both correction factors. In contrast to previous studies which fitted a linear regression for correction factor 1, we fitted a biologically more meaningful exponential regression model where the consumed prey mass to excrete one scat reaches an asymptote at large prey sizes. Using our protocol, we also derive correction factor 1 and 2 for other carnivore species and apply them to published studies. We show that the new method increases the number and proportion of consumed individuals in the diet for large prey animals compared to the conventional method. CONCLUSION/SIGNIFICANCE: Our results have important implications for the interpretation of scat-based studies in feeding ecology and the resolution of human-wildlife conflicts for the conservation of large carnivores.
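
    As a sketch of the curve-fitting step behind correction factor 1 (a generic saturating-exponential fit with fabricated example numbers, not the cheetah feeding-trial data), the consumed prey mass per excreted scat is modeled as rising to an asymptote with prey body mass:

```python
import numpy as np
from scipy.optimize import curve_fit

def saturating_exp(prey_mass, a, b):
    """Consumed prey mass per scat rising to an asymptote a as prey body mass grows."""
    return a * (1.0 - np.exp(-b * prey_mass))

# Illustrative (made-up) feeding-trial points: prey body mass [kg] vs. kg eaten per scat.
prey_mass = np.array([1.0, 3.0, 10.0, 25.0, 60.0, 120.0])
mass_per_scat = np.array([0.35, 0.80, 1.60, 2.10, 2.35, 2.45])

(a_hat, b_hat), _ = curve_fit(saturating_exp, prey_mass, mass_per_scat, p0=(2.5, 0.1))
print(f"asymptote ≈ {a_hat:.2f} kg per scat, rate ≈ {b_hat:.3f} per kg")
print("predicted kg per scat for a 40 kg prey:", round(saturating_exp(40.0, a_hat, b_hat), 2))
```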

  6. Advances in Spectral Methods for UQ in Incompressible Navier-Stokes Equations

    KAUST Repository

    Le Maitre, Olivier

    2014-01-06

    In this talk, I will present two recent contributions to the development of efficient methodologies for uncertainty propagation in the incompressible Navier-Stokes equations. The first one concerns the reduced basis approximation of stochastic steady solutions, using Proper Generalized Decompositions (PGD). An Arnoldi problem is projected to obtain a low dimensional Galerkin problem. The construction then amounts to the resolution of a sequence of uncoupled deterministic Navier-Stokes like problem and simple quadratic stochastic problems, followed by the resolution of a low-dimensional coupled quadratic stochastic problem, with a resulting complexity which has to be contrasted with the dimension of the whole Galerkin problem for classical spectral approaches. An efficient algorithm for the approximation of the stochastic pressure field is also proposed. Computations are presented for uncertain viscosity and forcing term to demonstrate the effectiveness of the reduced method. The second contribution concerns the computation of stochastic periodic solutions to the Navier-Stokes equations. The objective is to circumvent the well-known limitation of spectral methods for long-time integration. We propose to directly determine the stochastic limit-cycles through the definition of its stochastic period and an initial condition over the cycle. A modified Newton method is constructed to compute iteratively both the period and initial conditions. Owing to the periodic character of the solution, and by introducing an appropriate time-scaling, the solution can be approximated using low-degree polynomial expansions with large computational saving as a result. The methodology is illustrated for the von-Karman flow around a cylinder with stochastic inflow conditions.

  7. A Review of Methods for Sensing the Nitrogen Status in Plants: Advantages, Disadvantages and Recent Advances

    Directory of Open Access Journals (Sweden)

    Rosalia V. Ocampo-Velazquez

    2013-08-01

    Full Text Available Nitrogen (N) plays a key role in the plant life cycle. It is the main plant mineral nutrient needed for chlorophyll production and other plant cell components (proteins, nucleic acids, amino acids). Crop yield is affected by plant N status. Thus, the optimization of nitrogen fertilization has become the object of intense research due to its environmental and economic impact. This article focuses on reviewing current methods and techniques used to determine plant N status. Kjeldahl digestion and Dumas combustion have been used as reference methods for N determination in plants, but they are destructive and time consuming. By using spectroradiometers, reflectometers, imagery from satellite sensors and digital cameras, optical properties have been measured to estimate N in plants, such as crop canopy reflectance, leaf transmittance, chlorophyll and polyphenol fluorescence. High correlation has been found between optical parameters and plant N status, and those techniques are not destructive. However, some drawbacks include chlorophyll saturation, atmospheric and soil interference, and the high cost of instruments. Electrical properties of plant tissue have been used to estimate quality in fruits, and water content in plants, as well as nutrient deficiency, which suggests that they have potential for use in plant N determination.

  8. Advances in the replacement and enhanced replacement method in QSAR and QSPR theories.

    Science.gov (United States)

    Mercader, Andrew G; Duchowicz, Pablo R; Fernández, Francisco M; Castro, Eduardo A

    2011-07-25

    The selection of an optimal set of molecular descriptors from a much greater pool of such regression variables is a crucial step in the development of QSAR and QSPR models. The aim of this work is to further improve this important selection process. For this reason, three different alternatives for the initial steps of our recently developed enhanced replacement method (ERM) and replacement method (RM) are proposed. These approaches had previously been shown to yield near-optimal results with a much smaller number of linear regressions than the full search. The algorithms were tested on four different experimental data sets, formed by collections of 116, 200, 78, and 100 experimental records from different compounds and 1268, 1338, 1187, and 1306 molecular descriptors, respectively. The comparisons showed that one of the new alternatives further improves the ERM, which has been shown to be superior to genetic algorithms for the selection of an optimal set of molecular descriptors from a much greater pool. The newly proposed alternative also improves the simpler and computationally less demanding RM.
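
    For readers unfamiliar with the RM family, the sketch below shows the basic replacement idea in simplified form: starting from a random descriptor subset, one descriptor at a time is swapped against the whole pool, and a swap is kept whenever it lowers the residual standard deviation of the linear fit. This is a generic illustration on synthetic data, not the authors' ERM code or their new initialization alternatives.

```python
import numpy as np

def fit_sd(X, y, subset):
    """Standard deviation of residuals for an ordinary least-squares fit on the chosen columns."""
    A = np.column_stack([np.ones(len(y)), X[:, subset]])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return np.sqrt(np.mean(resid ** 2))

def replacement_method(X, y, d, seed=0):
    """Greedy replacement: start from a random d-descriptor subset and repeatedly
    replace one descriptor with the pool candidate that lowers the residual SD."""
    rng = np.random.default_rng(seed)
    subset = list(rng.choice(X.shape[1], size=d, replace=False))
    best = fit_sd(X, y, subset)
    improved = True
    while improved:
        improved = False
        for pos in range(d):
            for cand in range(X.shape[1]):
                if cand in subset:
                    continue
                trial = subset.copy()
                trial[pos] = cand
                sd = fit_sd(X, y, trial)
                if sd < best:
                    subset, best, improved = trial, sd, True
    return subset, best

# Tiny synthetic example: 60 "compounds", 40 candidate descriptors, 3 truly relevant.
rng = np.random.default_rng(1)
X = rng.normal(size=(60, 40))
y = 2 * X[:, 3] - X[:, 17] + 0.5 * X[:, 29] + rng.normal(scale=0.1, size=60)
print(replacement_method(X, y, d=3))
```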

  9. Advances in industrial biopharmaceutical batch process monitoring: Machine-learning methods for small data problems.

    Science.gov (United States)

    Tulsyan, Aditya; Garvin, Christopher; Ündey, Cenk

    2018-04-06

    Biopharmaceutical manufacturing comprises multiple distinct processing steps that require effective and efficient monitoring of many variables simultaneously in real time. State-of-the-art real-time multivariate statistical batch process monitoring (BPM) platforms have been in use in recent years to ensure that comprehensive monitoring is in place as a complementary tool for continued process verification to detect weak signals. This article addresses a longstanding, industry-wide problem in BPM, referred to as the "Low-N" problem, wherein a product has a limited production history. The current best industrial practice to address the Low-N problem is to switch from a multivariate to a univariate BPM until sufficient product history is available to build and deploy a multivariate BPM platform. Every batch run without a robust multivariate BPM platform poses a risk of not detecting potential weak signals developing in the process that might have an impact on process and product performance. In this article, we propose an approach to solve the Low-N problem by generating an arbitrarily large number of in silico batches through a combination of hardware exploitation and machine-learning methods. To the best of the authors' knowledge, this is the first article to provide a solution to the Low-N problem in biopharmaceutical manufacturing using machine-learning methods. Several industrial case studies from bulk drug substance manufacturing are presented to demonstrate the efficacy of the proposed approach for BPM under various Low-N scenarios. © 2018 Wiley Periodicals, Inc.
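
    The article's generator combines hardware exploitation with machine learning and is not reproduced here; as a minimal sketch of the general idea of manufacturing in silico batches under a Low-N scenario, the code below fits a simple multivariate Gaussian to the few available batch trajectories at each time point and samples synthetic batches from it. The shapes, shrinkage factor and data are invented.

        import numpy as np

        def in_silico_batches(historical, n_synthetic, rng=np.random.default_rng(0)):
            """Generate synthetic batch trajectories from a small history ("Low-N").

            historical: array of shape (n_batches, n_timepoints, n_variables)
            Returns an array of shape (n_synthetic, n_timepoints, n_variables).
            Each time point is modelled as a multivariate Gaussian fitted across the
            available batches -- a deliberately simple stand-in for the machine-learning
            generators discussed in the article.
            """
            n_b, n_t, n_v = historical.shape
            synthetic = np.empty((n_synthetic, n_t, n_v))
            for t in range(n_t):
                mu = historical[:, t, :].mean(axis=0)
                # Shrink the covariance toward its diagonal to stay well-posed when n_b is tiny
                cov = np.cov(historical[:, t, :], rowvar=False)
                cov = 0.7 * cov + 0.3 * np.diag(np.diag(cov))
                synthetic[:, t, :] = rng.multivariate_normal(mu, cov, size=n_synthetic)
            return synthetic

        # Example: 4 historical batches, 50 time points, 6 process variables
        rng = np.random.default_rng(42)
        history = rng.normal(size=(4, 50, 6)).cumsum(axis=1)
        print(in_silico_batches(history, n_synthetic=500).shape)   # (500, 50, 6)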

  10. Advances in digital technology and orthodontics: a reference to the Invisalign method.

    Science.gov (United States)

    Melkos, Aristides B

    2005-05-01

    Increased aesthetic demands during orthodontic treatment resulted in several treatment alternatives. However, the need to avoid conventional fixed orthodontic appliances led, with the use of computer-aided scanning, imaging, and manufacturing technology, to the development of new therapy concepts such as Invisalign. The Invisalign orthodontic technique involves a series of clear removable appliances and has been applied to correct a variety of malocclusions. The Invisalign method is an aesthetic orthodontic option for many patients, but it is suited mainly to adults or adolescents who have a fully erupted dentition and it has its indications and limitations. It handles simple to moderate non-extraction alignments better than mild to moderate extraction cases. The aligners are clear and therefore aesthetically ideal for the patient; they are comfortable to wear and, as they are removable, they provide simplicity of care and better oral hygiene. They also allow the evaluation of treatment options in detail before beginning treatment by using a virtual treatment model. It is also important to point out that this method has some disadvantages, which are associated with patient compliance, limited control over specific tooth movements, and additional documentation time. The Invisalign concept is an aesthetic alternative in orthodontic treatment, with advantages and disadvantages. It can be utilized to treat simple to moderate alignment cases, especially in adults, and serves as an additional part of the armamentarium of the orthodontist.

  11. Development of advanced methods for planning electric energy distribution systems. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Goenen, T.; Foote, B.L.; Thompson, J.C.; Fagan, J.E.

    1979-10-01

    An extensive search was made to identify and collect reports published in the open literature that describe distribution planning methods and techniques. In addition, a questionnaire was prepared and sent to a large number of electric power utility companies. A large number of these companies were visited and/or their distribution planners interviewed to identify and describe the distribution system planning methods and techniques used by these electric power utility companies and other commercial entities. Distribution system planning models were reviewed, and a set of new mixed-integer programming models was developed for the optimal expansion of distribution systems. The models help the planner to select: (1) optimum substation locations; (2) optimum substation expansions; (3) optimum substation transformer sizes; (4) optimum load transfers between substations; (5) optimum feeder routes and sizes, subject to a set of specified constraints. The models permit following existing rights-of-way and avoiding areas where feeders and substations cannot be constructed. The results of computer runs were analyzed for adequacy in serving projected loads within regulation limits for both normal and emergency operation.
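
    The report's models are considerably richer, but the flavor of such a mixed-integer formulation (fixed-cost substation siting plus load assignment under capacity limits) can be sketched with the open-source PuLP package; all names, costs and capacities below are invented for illustration and are not taken from the report.

        from pulp import LpProblem, LpVariable, LpMinimize, lpSum, LpBinary, PULP_CBC_CMD

        substations = {"S1": {"fixed": 900, "cap": 40},
                       "S2": {"fixed": 700, "cap": 30}}
        loads = {"L1": 15, "L2": 20, "L3": 12}                     # MW demands
        feeder_cost = {("S1", "L1"): 4, ("S1", "L2"): 6, ("S1", "L3"): 9,
                       ("S2", "L1"): 7, ("S2", "L2"): 3, ("S2", "L3"): 5}

        prob = LpProblem("distribution_expansion", LpMinimize)
        build = {s: LpVariable(f"build_{s}", cat=LpBinary) for s in substations}
        assign = {(s, l): LpVariable(f"assign_{s}_{l}", cat=LpBinary)
                  for s in substations for l in loads}

        # Objective: substation construction cost + feeder cost per MW served
        prob += (lpSum(substations[s]["fixed"] * build[s] for s in substations)
                 + lpSum(feeder_cost[s, l] * loads[l] * assign[s, l] for (s, l) in assign))

        for l in loads:                      # every load served by exactly one substation
            prob += lpSum(assign[s, l] for s in substations) == 1
        for s in substations:                # substation capacity counts only if it is built
            prob += lpSum(loads[l] * assign[s, l] for l in loads) <= substations[s]["cap"] * build[s]

        prob.solve(PULP_CBC_CMD(msg=False))
        print({s: build[s].value() for s in substations})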

  12. Roadmap on biosensing and photonics with advanced nano-optical methods

    KAUST Repository

    Di Fabrizio, Enzo M.

    2016-05-10

    This roadmap, through the contributions of ten groups worldwide, contains different techniques, methods and materials devoted to sensing in nanomedicine. Optics is used in different ways in the detection schemes. Raman, fluorescence and infrared spectroscopies, plasmonics, second harmonic generation and optical tweezers are all used in applications from single molecule detection (both in highly diluted and in highly concentrated solutions) to single cell manipulation. In general, each optical scheme, through device miniaturization and electromagnetic field localization, exploits an intrinsic optical enhancement mechanism in order to increase the sensitivity and selectivity of the device with respect to the complex molecular construct. The materials used for detection include nanoparticles and nanostructures fabricated with different 2D and 3D lithographic methods. It is shown that sensitivity to a single molecule is already accessible whether the system under study is a single cell or a multitude of cells in a molecular mixture. Throughout the roadmap there is an attempt to foresee and to suggest future directions in this interdisciplinary field. © 2016 IOP Publishing Ltd.

  13. Advanced All-Gas Chemical Generation of Atomic Iodine for a COIL, and Testing the COIL Operation Including This Method of Atomic Iodine Generation

    National Research Council Canada - National Science Library

    Kodymova, Jarmila; Spalek, Otomar; Jirasek, Vit; Censky, Miroslav

    2004-01-01

    This report results from a contract tasking Academy of Sciences as follows: The Grantee will investigate advanced methods for chemical generation of atomic iodine for a Chemical Oxygen-Iodine Laser (COIL...

  14. An advanced three-phase physical, experimental and numerical method for tsunami induced boulder transport

    Science.gov (United States)

    Oetjen, Jan; Engel, Max; Prasad Pudasaini, Shiva; Schüttrumpf, Holger; Brückner, Helmut

    2017-04-01

    Coasts around the world are affected by high-energy wave events like storm surges or tsunamis, depending on their regional climatological and geological settings. Focusing on tsunami impacts, we combine the abilities and experience of different scientific fields, aiming at improved insights into near- and onshore tsunami hydrodynamics. We investigate the transport of coarse clasts - so-called boulders - due to tsunami impacts by a multi-methodology approach of numerical modelling, laboratory experiments, and sedimentary field records. Coupled numerical hydrodynamic and boulder transport models (BTM) are widely applied for analysing the impact characteristics of transport by tsunami, such as wave height and flow velocity. Numerical models able to simulate past tsunami events and the corresponding boulder transport patterns with high accuracy and acceptable computational effort can be utilized as powerful forecasting models predicting the impact of a coast-approaching tsunami. We have conducted small-scale physical experiments in a tilting flume with realistically shaped boulder models. Utilizing the structure-from-motion technique (Westoby et al., 2012), we reconstructed real boulders from a field study on the island of Bonaire (Lesser Antilles, Caribbean Sea; Engel & May, 2012). The obtained three-dimensional boulder meshes are used to create downscaled replicas of the real boulders for the physical experiments. The results for the irregularly shaped boulders are compared to experiments with regularly shaped boulder models to achieve better insight into the shape-related influence on transport patterns. The numerical model is based on the general two-phase mass flow model by Pudasaini (2012), enhanced for boulder transport simulations. The boulder is implemented using the immersed boundary technique (Peskin, 2002) and the direct forcing approach. In this method, Cartesian grids (fluid and particle phase) and Lagrangian meshes (boulder) are combined. By applying the

  15. The Rheological Behavior of Multiphase Liquids Characterized by an Advanced Dilatometric Method

    Science.gov (United States)

    Helo, C. S.; Hess, K.; Potuzak, M.; Dingwell, D. B.

    2006-12-01

    Silicic volcanic rocks often contain a glassy groundmass. During formation of the glass, when the liquid line of descent intersects the glass transition interval, all magmatic processes (e.g. crystallization, nucleation of bubbles) are frozen in. A corresponding 'glass transition temperature' Tg can be determined by measuring changes in the temperature dependence of enthalpy (H) or volume (V) of the sample during subsequent reheating. The glass transition temperature depends strongly on the cooling rate which the sample experienced when the glass matrix was formed. Especially for silica-rich rocks, the calorimetric analysis is usually hampered by the loss of the characteristic peaks in the time-derivative curves, due to the presence of phenocrysts, microlites, or bubbles. However, it is possible to extract two characteristic features from the dilatometric dV/dt curve (in contrast to dH/dt): a dilatometric 'onset' and a 'softening' temperature (Tg,on and Tg,soft, respectively). The method has been calibrated using a series of well-known synthetic melt compositions ranging in NBO/T (non-bridging oxygens NBO over tetrahedrally coordinated cations T) from 0 to 0.6. For these melts the viscosities at Tg,on and Tg,soft have been calculated. The viscosity at Tg,on shows no dependence on composition at fixed cooling and heating rates, and the precision is comparable to that of conventional dilatometric and calorimetric measurements. The determination of Tg,soft is even more precise (only approximately half the standard deviation) compared with the standard methods. Moreover, with one single measurement it is possible to determine directly the fragility of the melt, defined as the temperature dependence of the structural relaxation time at Tg,on, by using a new parameter Fv = (Tg,soft - Tg,on) / Tg,on. Measurements on selected pristine samples from several silicic volcanic centers have shown different trends. Throughout repeated heating/cooling with fixed rates
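
    To make the new parameter concrete, a worked example with invented temperatures (the abstract quotes no numbers): for a hypothetical sample with Tg,on = 1023 K and Tg,soft = 1063 K,

        F_v \;=\; \frac{T_{g,\mathrm{soft}} - T_{g,\mathrm{on}}}{T_{g,\mathrm{on}}}
        \;=\; \frac{1063\ \mathrm{K} - 1023\ \mathrm{K}}{1023\ \mathrm{K}} \;\approx\; 0.039,

    so a wider spread between the softening and onset temperatures at the given rates simply yields a larger F_v.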

  16. Direct methods for limit and shakedown analysis of structures advanced computational algorithms and material modelling

    CERN Document Server

    Pisano, Aurora; Weichert, Dieter

    2015-01-01

    Articles in this book examine various materials and how to determine directly the limit state of a structure, in the sense of limit analysis and shakedown analysis. Apart from classical applications in mechanical and civil engineering contexts, the book reports on the emerging field of material design beyond the elastic limit, which has further industrial design and technological applications. Readers will discover that “Direct Methods” and the techniques presented here can in fact be used to numerically estimate the strength of structured materials such as composites or nano-materials, which represent fruitful fields of future applications.   Leading researchers outline the latest computational tools and optimization techniques and explore the possibility of obtaining information on the limit state of a structure whose post-elastic loading path and constitutive behavior are not well defined or well known. Readers will discover how Direct Methods allow rapid and direct access to requested information in...

  17. Recent advances in the application of capillary electromigration methods for food analysis and Foodomics.

    Science.gov (United States)

    Herrero, Miguel; García-Cañas, Virginia; Simo, Carolina; Cifuentes, Alejandro

    2010-01-01

    The use of capillary electromigration methods to analyze foods and food components is reviewed in this work. Papers that were published during the period April 2007 to March 2009 are included following the previous review by García-Cañas and Cifuentes (Electrophoresis, 2008, 29, 294-309). These works include the analysis of amino acids, biogenic amines, peptides, proteins, DNAs, carbohydrates, phenols, polyphenols, pigments, toxins, pesticides, vitamins, additives, small organic and inorganic ions and other compounds found in foods and beverages, as well as those applications of CE for monitoring food interactions and food processing. The use of microchips, CE-MS, chiral-CE as well as other foreseen trends in food analysis are also discussed including their possibilities in the very new field of Foodomics.

  18. Recent Advances in Silicon Nanowire Biosensors: Synthesis Methods, Properties, and Applications

    Science.gov (United States)

    Namdari, Pooria; Daraee, Hadis; Eatemadi, Ali

    2016-09-01

    The application of silicon nanowire (SiNW) biosensors as sensitive, label-free, and electrical tools has been extensively demonstrated by several researchers over the past few decades. The ability to delicately fabricate SiNWs and control their chemical configuration, morphology, and arrangement, either separately or in combination with other materials, has led to the development of a nanomaterial with specific and efficient electronic and catalytic properties useful in the fields of biological sciences and renewable energy. This review highlights the various synthesis methods for SiNWs, together with the optical and electrical properties that make them one of the most applicable nanomaterials in the fields of biomolecule sensing, photoelectrochemical conversion, and disease diagnostics.

  19. Advanced nuclear power plant regulation using risk-informed and performance-based methods

    International Nuclear Information System (INIS)

    Modarres, Mohammad

    2009-01-01

    This paper proposes and discusses the implications of a largely probabilistic regulatory framework using best-estimate, goal-driven, risk-informed, and performance-based methods. This framework relies on continuous probabilistic assessment of the performance of a set of time-dependent, safety-critical systems, structures, components, and procedures that assure attainment of a broad set of overarching technology-neutral protective, mitigative, and preventive goals under all phases of plant operations. In this framework, acceptable levels of performance are set through formal apportionment so that they are commensurate with the overarching goals. Regulatory acceptance would be based on the confidence level with which the plant conforms to these goals and performance objectives. The proposed framework uses the traditional defense-in-depth design and operation regulatory philosophy when uncertainty in conforming to specific goals and objectives is high. Finally, the paper discusses the steps needed to develop a corresponding technology-neutral regulatory approach from the proposed framework.

  20. Advanced experimental method for self-optimizing control system to a new energy conversion plant

    International Nuclear Information System (INIS)

    Vasiliev, V.V.

    1992-01-01

    The progress in the development and study of new methods of producing electric energy, based on direct conversion of heat, light, fuel or chemical energy into electric energy, raises the problem of making more effective use of their power characteristics. In this paper, a self-optimizing control system for an object with a unimodal quality function is disclosed. The system comprises the object, a divider, a band-pass filter, an averaging filter, a multiplier, a final control element and an adder, and further includes a search signal generator. The method and the system are described in patents USSR No. 684510, USA No. 4179730, France No. 2386854, Germany No. 2814963 and Japan No. 1369882
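
    The abstract only lists the building blocks, but they are those of classical extremum-seeking control (a dither/search signal, synchronous demodulation by a multiplier, filtering, and an integrating final control element). The sketch below is a generic discrete-time simulation of that idea with an invented unimodal quality function and made-up gains; it is not the patented system.

        import numpy as np

        # Unimodal quality function of the controlled object (maximum at u = 2.0)
        def plant(u):
            return 5.0 - (u - 2.0) ** 2

        dt, a, omega, k = 0.01, 0.2, 5.0, 0.8       # step, dither amplitude, dither freq, gain
        u_hat, grad_est = 0.0, 0.0                  # parameter estimate and filtered gradient
        y_hp = 0.0                                  # high-pass filtered output
        y_prev = plant(u_hat)

        for n in range(20000):
            t = n * dt
            dither = a * np.sin(omega * t)          # search signal
            y = plant(u_hat + dither)               # perturbed measurement
            y_hp = 0.95 * y_hp + (y - y_prev)       # crude high-pass (removes the DC level)
            y_prev = y
            demod = y_hp * np.sin(omega * t)        # multiplier: synchronous demodulation
            grad_est += dt * 10.0 * (demod - grad_est)   # averaging (low-pass) filter
            u_hat += dt * k * grad_est              # integrator drives u toward the optimum

        print(round(u_hat, 2))                      # settles near the optimum u = 2.0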

  1. Advances on the molecular characterization, clinical relevance, and detection methods of Gadiform parvalbumin allergens.

    Science.gov (United States)

    Fernandes, Telmo J R; Costa, Joana; Carrapatoso, Isabel; Oliveira, Maria Beatriz P P; Mafra, Isabel

    2017-10-13

    The Gadiform order includes several fish families, among them Gadidae and Merlucciidae, which comprise some of the most commercially important and highly appreciated fish species, such as cod, pollock, haddock, and hake. Parvalbumins, classified as calcium-binding proteins, are considered the main components involved in the majority of fish allergies. Nine and thirteen parvalbumins were identified in different fish species from the Gadidae and Merlucciidae families, respectively. This review intends to describe their molecular characterization and clinical relevance, as well as the prevalence of fish allergy. In addition, the main protein- and DNA-based methods to detect fish allergens are fully reviewed owing to their importance in safeguarding sensitized/allergic individuals.

  2. Academia, advocacy, and industry: a collaborative method for clinical research advancement.

    Science.gov (United States)

    Vanzo, Rena J; Lortz, Amanda; Calhoun, Amy R U L; Carey, John C

    2014-07-01

    Professionals who work in academia, advocacy, and industry often carry out mutually exclusive activities related to research and clinical care. However, there are several examples of collaboration among such professionals that ultimately allow for improved scientific and clinical understanding. This commentary recounts our particular experience (a collaboration between geneticists at the Universities of Minnesota and Utah, the 4p- Support Group, and Lineagen, Inc.) and reviews other similar projects. We formally propose this collaborative method as a conduit for future clinical research programs. Specifically, we encourage academicians, directors of family/advocacy/support groups, and members of industry to establish partnerships and document their experiences. The medical community as a whole will benefit from such partnerships and, specifically, families will teach us lessons that could never be learned in a laboratory or textbook. © 2014 Wiley Periodicals, Inc.

  3. Advanced methods for the analysis, design, and optimization of SMA-based aerostructures

    Science.gov (United States)

    Hartl, D. J.; Lagoudas, D. C.; Calkins, F. T.

    2011-09-01

    Engineers continue to apply shape memory alloys to aerospace actuation applications due to their high energy density, robust solid-state actuation, and silent and shock-free operation. Past design and development of such actuators relied on experimental trial and error and empirically derived graphical methods. Over the last two decades, however, it has been repeatedly demonstrated that existing SMA constitutive models can capture stabilized SMA transformation behaviors with sufficient accuracy. This work builds upon past successes and suggests a general framework by which predictive tools can be used to assess the responses of many possible design configurations in an automated fashion. By applying methods of design optimization, it is shown that the integrated implementation of appropriate analysis tools can guide engineers and designers to the best design configurations. A general design optimization framework is proposed for the consideration of any SMA component or assembly of such components that applies when the set of design variables includes many members. This is accomplished by relying on commercially available software and utilizing tools already well established in the design optimization community. Such tools are combined with finite element analysis (FEA) packages that consider a multitude of structural effects. The foundation of this work is a three-dimensional thermomechanical constitutive model for SMAs applicable for arbitrarily shaped bodies. A reduced-order implementation also allows computationally efficient analysis of structural components such as wires, rods, beams and shells. The use of multiple optimization schemes, the consideration of assembled components, and the accuracy of the implemented constitutive model in full and reduced-order forms are all demonstrated.
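
    The framework itself couples commercially available optimizers with SMA-capable FEA; as a toy illustration of the same pattern (an optimizer repeatedly calling an analysis routine and a constraint check), the sketch below substitutes an invented analytic actuator model for the finite element analysis. All formulas, bounds and numbers are placeholders.

        import numpy as np
        from scipy.optimize import minimize

        def actuator_analysis(x):
            """Placeholder for the FEA of an SMA wire actuator.

            x = [wire_diameter_mm, wire_length_mm]; returns (mass_g, stroke_mm).
            Both formulas are invented for illustration only.
            """
            d, L = x
            mass = 6.45e-3 * np.pi * (d / 2.0) ** 2 * L        # rough NiTi density in g/mm^3
            stroke = 0.04 * L * (1.0 - np.exp(-d))             # made-up recoverable stroke model
            return mass, stroke

        def objective(x):
            return actuator_analysis(x)[0]                      # minimize actuator mass

        def stroke_constraint(x):
            return actuator_analysis(x)[1] - 3.0                # require at least 3 mm of stroke

        result = minimize(objective, x0=[0.5, 100.0],
                          bounds=[(0.1, 2.0), (20.0, 300.0)],
                          constraints=[{"type": "ineq", "fun": stroke_constraint}],
                          method="SLSQP")
        print(result.x, result.fun)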

  4. Improvements of defects sizing reliability in steam generators tubes through advanced NDT methods

    International Nuclear Information System (INIS)

    Benoist, B.; Gondard, C.

    1994-01-01

    As the population of nuclear power plants ages, new defects are appearing in steam generator tubes (stress corrosion cracking, corrosion pitting and intergranular corrosion). Utilities require additional data to characterise defects after their detection, i.e. their depth, length and orientation, in order to optimise any tube plugging decision. Eddy current (EC) inspection is the reference non-destructive testing (NDT) method for SG tube inspection, due to the long experience gained in the field and its rapidity. But in some cases, such as circumferential cracks or multi-directional crack areas, its capabilities are limited (depth evaluation). Therefore, the application of ultrasonic (UT) inspection as a complementary method can be helpful. We present in this paper new developments in the field of rotating-probe testing and data processing which improve defect detection and sizing. Present on-site EC inspections use axial bobbin coils running along the full length of the tubes. This allows a first and fast inspection for volumetric defects (deposits, wear, bumps, ...). For areas such as tube sheets and tube support plates, rotating probes have been developed in order to improve the circumferential resolution (detection of transversal defects). To extend our experience, an inspection of the retired SG at DAMPIERRE has been undertaken with the rotating probe. Real EC signals of primary wall stress corrosion cracking (PWSCC) will be presented. The detection procedure is based on visual examination of EC images (2D surface mappings or Lissajous figures). Image processing is used for automatic detection of defect signals. A first approach consists of using image enhancement techniques such as median filtering, Sobel gradient, thresholding and binary morphology to obtain a binary image delineating the defect areas. Results will be shown on artificial defects and on the DAMPIERRE signals. (Author) 10 refs
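
    As a minimal sketch of the enhancement chain named above (median filter, Sobel gradient, thresholding, binary morphology) applied to a 2D eddy-current surface mapping stored as a NumPy array, the code below uses SciPy's ndimage routines. The threshold rule, structuring element and synthetic test image are placeholders, not the authors' settings.

        import numpy as np
        from scipy import ndimage

        def detect_defect_areas(ec_image, k=3.0):
            """Rough defect segmentation of an eddy-current surface mapping."""
            smoothed = ndimage.median_filter(ec_image, size=3)         # suppress speckle noise
            gx = ndimage.sobel(smoothed, axis=0)
            gy = ndimage.sobel(smoothed, axis=1)
            grad = np.hypot(gx, gy)                                    # Sobel gradient magnitude
            thresh = grad.mean() + k * grad.std()                      # simple global threshold
            binary = grad > thresh
            binary = ndimage.binary_opening(binary, structure=np.ones((3, 3)))  # clean-up morphology
            labels, n = ndimage.label(binary)                          # connected defect regions
            return labels, n

        # Synthetic 2D mapping with one local anomaly standing in for a crack signal
        img = np.random.default_rng(0).normal(0.0, 0.05, size=(128, 128))
        img[60:64, 40:90] += 1.0
        labels, n = detect_defect_areas(img)
        print(n)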

  5. Practical advances in cortisol (F) and dehydroepiandrosterone sulfate (DS) radioimmunoassay using the microfilter paper method

    Energy Technology Data Exchange (ETDEWEB)

    Pang, S.; Shine, S.; Levine, L.S.; New, M.I.

    1980-04-01

    A new reliable microfilter paper technique for measuring F and DS by RIA has been developed to overcome the difficulties of venipuncture and the inconvenience of storage and transportation of serum samples. This method requires only a drop of whole capillary blood, to impregnate filter paper for obtaining a 1/8-inch disc specimen. The dried filter paper specimen was eluted with buffer, and a further diluted eluate was used in RIA directly for DS and, after ether extraction, for F. Steroid concentrations of F and DS were not detectable in either hemolysate of 20 μl of packed washed red blood cells or dexamethasone-suppressed whole blood specimens. The percent recoveries of both added radiolabeled and unlabeled steroid from whole blood filter paper specimens were similar to those of the plasma studies. These steroid concentrations remained unchanged in dried whole blood on filter paper for up to 30 days when stored at room temperature. Evaluation of plasma volume in disc samples with varying hematocrits (27% to 51%) indicated a constant plasma volume in the disc (1.52 to 1.72 μl per 1/8-inch disc) regardless of the hematocrit range. Furthermore, plasma concentrations of these steroids in the filter paper disc, based on a mean constant plasma volume in the disc (1.66 μl per 1/8-inch disc), were similar to that of the plasma sample. This indicates that the microfilter paper method has the specificity, sensitivity, accuracy, and precision of RIA of DS and F in whole plasma, and simplifies sample collection, storage, and transportation.

  6. Practical advances in cortisol (F) and dehydroepiandrosterone sulfate (DS) radioimmunoassay using the microfilter paper method

    International Nuclear Information System (INIS)

    Pang, S.; Shine, S.; Levine, L.S.; New, M.I.

    1980-01-01

    A new reliable microfilter paper technique for measuring F and DS by RIA has been developed to overcome the difficulties of venipuncture and the inconvenience of storage and transportation of serum samples. This method requires only a drop of whole capillary blood, to impregnate filter paper for obtaining a 1/8-inch disc specimen. The dried filter paper specimen was eluted with buffer, and a further diluted eluate was used in RIA directly for DS and, after ether extraction, for F. Steroid concentrations of F and DS were not detectable in either hemolysate of 20 μl of packed washed red blood cells or dexamethasone-suppressed whole blood specimens. The percent recoveries of both added radiolabeled and unlabeled steroid from whole blood filter paper specimens were similar to those of the plasma studies. These steroid concentrations remained unchanged in dried whole blood on filter paper for up to 30 days when stored at room temperature. Evaluation of plasma volume in disc samples with varying hematocrits (27% to 51%) indicated a constant plasma volume in the disc (1.52 to 1.72 μl per 1/8-inch disc) regardless of the hematocrit range. Furthermore, plasma concentrations of these steroids in the filter paper disc, based on a mean constant plasma volume in the disc (1.66 μl per 1/8-inch disc), were similar to that of the plasma sample. This indicates that the microfilter paper method has the specificity, sensitivity, accuracy, and precision of RIA of DS and F in whole plasma, and simplifies sample collection, storage, and transportation

  7. Method and Process Development of Advanced Atmospheric Plasma Spraying for Thermal Barrier Coatings

    Science.gov (United States)

    Mihm, Sebastian; Duda, Thomas; Gruner, Heiko; Thomas, Georg; Dzur, Birger

    2012-06-01

    Over the last few years, global economic growth has triggered a dramatic increase in the demand for resources, resulting in steady rise in prices for energy and raw materials. In the gas turbine manufacturing sector, process optimizations of cost-intensive production steps involve a heightened potential of savings and form the basis for securing future competitive advantages in the market. In this context, the atmospheric plasma spraying (APS) process for thermal barrier coatings (TBC) has been optimized. A constraint for the optimization of the APS coating process is the use of the existing coating equipment. Furthermore, the current coating quality and characteristics must not change so as to avoid new qualification and testing. Using experience in APS and empirically gained data, the process optimization plan included the variation of e.g. the plasma gas composition and flow-rate, the electrical power, the arrangement and angle of the powder injectors in relation to the plasma jet, the grain size distribution of the spray powder and the plasma torch movement procedures such as spray distance, offset and iteration. In particular, plasma properties (enthalpy, velocity and temperature), powder injection conditions (injection point, injection speed, grain size and distribution) and the coating lamination (coating pattern and spraying distance) are examined. The optimized process and resulting coating were compared to the current situation using several diagnostic methods. The improved process significantly reduces costs and achieves the requirement of comparable coating quality. Furthermore, a contribution was made towards better comprehension of the APS of ceramics and the definition of a better method for future process developments.

  8. Advanced methods for the analysis, design, and optimization of SMA-based aerostructures

    International Nuclear Information System (INIS)

    Hartl, D J; Lagoudas, D C; Calkins, F T

    2011-01-01

    Engineers continue to apply shape memory alloys to aerospace actuation applications due to their high energy density, robust solid-state actuation, and silent and shock-free operation. Past design and development of such actuators relied on experimental trial and error and empirically derived graphical methods. Over the last two decades, however, it has been repeatedly demonstrated that existing SMA constitutive models can capture stabilized SMA transformation behaviors with sufficient accuracy. This work builds upon past successes and suggests a general framework by which predictive tools can be used to assess the responses of many possible design configurations in an automated fashion. By applying methods of design optimization, it is shown that the integrated implementation of appropriate analysis tools can guide engineers and designers to the best design configurations. A general design optimization framework is proposed for the consideration of any SMA component or assembly of such components that applies when the set of design variables includes many members. This is accomplished by relying on commercially available software and utilizing tools already well established in the design optimization community. Such tools are combined with finite element analysis (FEA) packages that consider a multitude of structural effects. The foundation of this work is a three-dimensional thermomechanical constitutive model for SMAs applicable for arbitrarily shaped bodies. A reduced-order implementation also allows computationally efficient analysis of structural components such as wires, rods, beams and shells. The use of multiple optimization schemes, the consideration of assembled components, and the accuracy of the implemented constitutive model in full and reduced-order forms are all demonstrated

  9. Advancing New 3D Seismic Interpretation Methods for Exploration and Development of Fractured Tight Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    James Reeves

    2005-01-31

    In a study funded by the U.S. Department of Energy and GeoSpectrum, Inc., new P-wave 3D seismic interpretation methods to characterize fractured gas reservoirs are developed. A data driven exploratory approach is used to determine empirical relationships for reservoir properties. Fractures are predicted using seismic lineament mapping through a series of horizon and time slices in the reservoir zone. A seismic lineament is a linear feature seen in a slice through the seismic volume that has negligible vertical offset. We interpret that in regions of high seismic lineament density there is a greater likelihood of fractured reservoir. Seismic AVO attributes are developed to map brittle reservoir rock (low clay) and gas content. Brittle rocks are interpreted to be more fractured when seismic lineaments are present. The most important attribute developed in this study is the gas sensitive phase gradient (a new AVO attribute), as reservoir fractures may provide a plumbing system for both water and gas. Success is obtained when economic gas and oil discoveries are found. In a gas field previously plagued with poor drilling results, four new wells were spotted using the new methodology and recently drilled. The wells have estimated best of 12-months production indicators of 2106, 1652, 941, and 227 MCFGPD. The latter well was drilled in a region of swarming seismic lineaments but has poor gas sensitive phase gradient (AVO) and clay volume attributes. GeoSpectrum advised the unit operators that this location did not appear to have significant Lower Dakota gas before the well was drilled. The other three wells are considered good wells in this part of the basin and among the best wells in the area. These new drilling results have nearly doubled the gas production and the value of the field. The interpretation method is ready for commercialization and gas exploration and development. The new technology is adaptable to conventional lower cost 3D seismic surveys.

  10. Advanced methods for the quantification of trabecular bone structure and density in micro computed tomography images

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Jing

    2011-07-01

    Bone remodeling is a lifelong process composed of bone formation and resorption. Imbalance between bone formation and resorption is a cause of metabolic bone diseases. Thus, the understanding of factors that affect the remodeling balance is of great importance. Conventionally, bone structure is measured using histomorphometry of thin stained sections, which is destructive and non-reproducible. In contrast, volumetric micro-computed tomography (μCT) imaging is a powerful tool for quantifying the bone quality of small samples non-destructively. The aim of this thesis is to develop an analysis tool to quantify trabecular bone of mouse tibiae with high efficiency, accuracy and reproducibility. Materials and Methods: The trabecular volume of interest (VOI) definition in the proximal metaphysis of mouse tibiae includes three segmentation steps: the periosteal surface, the primary spongiosa and the proximal metaphysis. All these segmentation algorithms are hybrid volume-growing-based approaches including automatic threshold estimation, volume growing with different criteria and combined morphological operations. To preserve the connectivity of the trabecular network, volume growing with local adaptive thresholding (LAT) is used for the segmentation of the trabeculae. In order to accelerate this process, the algorithm is only applied to voxels with gray values in an interval defined by two global thresholds. These are automatically determined and depend on the voxel-to-object-size ratio of the dataset. Standard bone structural parameters were implemented [29, 30, 62]. For the assessment of tissue mineral density (TMD), a calibration phantom made of epoxy resin-based material with two hydroxyapatite (HA) inserts was developed. Experiments were performed with the μCT FORBILD scanner of the IMP to validate the homogeneity of the phantom inserts, the water equivalence of the epoxy resin-based plastic, the effect of beam hardening and the stability of the μCT calibration
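
    The thesis's hybrid volume-growing segmentation is more elaborate, but the core idea of growing a region only into voxels that pass a locally adapted threshold can be sketched as follows (2D here for brevity; window size, offset and the synthetic slice are illustrative, not the thesis's settings).

        import numpy as np
        from collections import deque
        from scipy import ndimage

        def grow_with_local_threshold(img, seed, window=7, offset=0.05):
            """Region growing where a candidate pixel is accepted if it exceeds the
            local mean of its neighbourhood by `offset` (local adaptive threshold)."""
            local_mean = ndimage.uniform_filter(img.astype(float), size=window)
            accept = img > local_mean + offset          # per-pixel adaptive criterion
            grown = np.zeros(img.shape, dtype=bool)
            queue = deque([seed])
            grown[seed] = True
            while queue:
                r, c = queue.popleft()
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    rr, cc = r + dr, c + dc
                    if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                            and accept[rr, cc] and not grown[rr, cc]):
                        grown[rr, cc] = True
                        queue.append((rr, cc))
            return grown

        # Synthetic slice: a bright "trabecular" strut on a darker background
        rng = np.random.default_rng(0)
        slice_ = rng.normal(0.2, 0.02, (64, 64))
        slice_[20:44, 30:34] = 0.8
        mask = grow_with_local_threshold(slice_, seed=(30, 31))
        print(mask.sum())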

  11. NERI PROJECT 99-119. TASK 1. ADVANCED CONTROL TOOLS AND METHODS. FINAL REPORT

    Energy Technology Data Exchange (ETDEWEB)

    March-Leuba, J.A.

    2002-09-09

    Nuclear plants of the 21st century will employ higher levels of automation and fault tolerance to increase availability, reduce accident risk, and lower operating costs. Key developments in control algorithms, fault diagnostics, fault tolerance, and communication in a distributed system are needed to implement the fully automated plant. Equally challenging will be integrating developments in separate information and control fields into a cohesive system, which collectively achieves the overall goals of improved performance, safety, reliability, maintainability, and cost-effectiveness. Under the Nuclear Energy Research Initiative (NERI), the U.S. Department of Energy is sponsoring a project to address some of the technical issues involved in meeting the long-range goal of 21st century reactor control systems. This project, "A New Paradigm for Automated Development of Highly Reliable Control Architectures for Future Nuclear Plants," involves researchers from Oak Ridge National Laboratory, the University of Tennessee, and North Carolina State University. This paper documents a research effort to develop methods for automated generation of control systems that can be traced directly to the design requirements. Our final goal is to allow the designer to specify only high-level requirements and stress factors that the control system must survive (e.g. a list of transients, or a requirement to withstand a single failure). To this end, the "control engine" automatically selects and validates control algorithms and parameters that are optimized to the current state of the plant, and that have been tested under the prescribed stress factors. The control engine then automatically generates the control software from validated algorithms. Examples of stress factors that the control system must "survive" are: transient events (e.g., set-point changes, or expected occurrences such as a load rejection) and postulated

  12. Accurate Characterization of Winter Precipitation Using In-Situ Instrumentation, CSU-CHILL Radar, and Advanced Scattering Methods

    Science.gov (United States)

    Newman, A. J.; Notaros, B. M.; Bringi, V. N.; Kleinkort, C.; Huang, G. J.; Kennedy, P.; Thurai, M.

    2015-12-01

    We present a novel approach to remote sensing and characterization of winter precipitation and modeling of radar observables through the synergistic use of advanced in-situ instrumentation for microphysical and geometrical measurements of ice and snow particles, image processing methodology to reconstruct complex three-dimensional (3D) particle shapes, computational electromagnetics to analyze realistic precipitation scattering, and a state-of-the-art polarimetric radar. Our in-situ measurement site at the Easton Valley View Airport, La Salle, Colorado, consists of two advanced optical imaging disdrometers within a 2/3-scaled double fence intercomparison reference wind shield, and also includes a PLUVIO snow measuring gauge, a VAISALA weather station, and a collocated NCAR GPS advanced upper-air sounding system. Our primary radar is the CSU-CHILL radar, with a dual-offset Gregorian antenna featuring very high polarization purity and excellent side-lobe performance in any plane; the in-situ instrumentation site is conveniently located at a range of 12.92 km from the radar. A multi-angle snowflake camera (MASC) is used to capture multiple high-resolution views of an ice particle in free fall, along with its fall speed. We apply a visual hull geometrical method for reconstruction of 3D particle shapes based on the images collected by the MASC, and convert these shapes into models for computational electromagnetic scattering analysis using a higher-order method of moments. A two-dimensional video disdrometer (2DVD), collocated with the MASC, provides 2D contours of a hydrometeor, along with the fall speed and other important parameters. We use the fall speed from the MASC and the 2DVD, along with state parameters measured at the Easton site, to estimate the particle mass (Böhm's method), and then the dielectric constant of the particles, based on a Maxwell-Garnett formula. By calculation of the "particle-by-particle" scattering

  13. An advanced boundary element method (BEM) implementation for the forward problem of electromagnetic source imaging

    International Nuclear Information System (INIS)

    Akalin-Acar, Zeynep; Gencer, Nevzat G

    2004-01-01

    The forward problem of electromagnetic source imaging has two components: a numerical model to solve the related integral equations and a model of the head geometry. This study is on the boundary element method (BEM) implementation for numerical solutions and realistic head modelling. The use of second-order (quadratic) isoparametric elements and the recursive integration technique increase the accuracy in the solutions. Two new formulations are developed for the calculation of the transfer matrices to obtain the potential and magnetic field patterns using realistic head models. The formulations incorporate the use of the isolated problem approach for increased accuracy in solutions. If a personal computer is used for computations, each transfer matrix is calculated in 2.2 h. After this pre-computation period, solutions for arbitrary source configurations can be obtained in milliseconds for a realistic head model. A hybrid algorithm that uses snakes, morphological operations, region growing and thresholding is used for segmentation. The scalp, skull, grey matter, white matter and eyes are segmented from the multimodal magnetic resonance images and meshes for the corresponding surfaces are created. A mesh generation algorithm is developed for modelling the intersecting tissue compartments, such as eyes. To obtain more accurate results quadratic elements are used in the realistic meshes. The resultant BEM implementation provides more accurate forward problem solutions and more efficient calculations. Thus it can be the firm basis of the future inverse problem solutions

  14. Advanced Methods for Robot-Environment Interaction towards an Industrial Robot Aware of Its Volume

    Directory of Open Access Journals (Sweden)

    Fabrizio Romanelli

    2011-01-01

    Full Text Available A fundamental aspect of robot-environment interaction in industrial settings is the capability of the control system to model structured and unstructured environment features. Industrial robots have to perform complex tasks at high speeds and have to satisfy hard cycle times while keeping operations extremely precise. The capability of the robot to perceive the presence of environmental objects is still missing in real industrial contexts. Although anthropomorphic robot producers have faced problems related to the interaction between the robot and its environment, there is no exhaustive study on the capability of the robot to be aware of its own volume and of the tools that may be mounted on its flange. In this paper, a solution is shown for modelling the environment of the robot in order to make it capable of perceiving and avoiding collisions with the objects in its surroundings. Furthermore, the model is extended to also take into account the volume of the robot tool in order to extend the perception capabilities of the entire system. Testing results are shown in order to validate the method, proving that the system is able to cope with complex real surroundings.

  15. Bayesian calibration of terrestrial ecosystem models: a study of advanced Markov chain Monte Carlo methods

    Science.gov (United States)

    Lu, Dan; Ricciuto, Daniel; Walker, Anthony; Safta, Cosmin; Munger, William

    2017-09-01

    Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the number of uncertain parameters is usually large and the existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. The result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.
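
    DREAM itself adds subspace sampling, multiple crossover values and outlier handling; the kernel it builds on is the differential-evolution proposal of ter Braak's DE-MC, in which each chain proposes a jump along the difference of two other randomly chosen chains. A minimal sketch of that kernel on a toy posterior (standing in for the DALEC parameter posterior, which is not reproduced here) follows.

        import numpy as np

        def log_post(theta):
            """Toy bimodal log-posterior used as a stand-in target."""
            return np.logaddexp(-0.5 * np.sum((theta - 2.0) ** 2),
                                -0.5 * np.sum((theta + 2.0) ** 2))

        def de_mc(log_post, n_chains=8, n_dim=2, n_iter=5000, rng=np.random.default_rng(0)):
            gamma = 2.38 / np.sqrt(2 * n_dim)                # standard DE jump scaling
            chains = rng.normal(size=(n_chains, n_dim))
            logp = np.array([log_post(c) for c in chains])
            samples = []
            for _ in range(n_iter):
                for i in range(n_chains):
                    a, b = rng.choice([j for j in range(n_chains) if j != i], size=2, replace=False)
                    prop = (chains[i] + gamma * (chains[a] - chains[b])
                            + 1e-6 * rng.normal(size=n_dim))  # small noise keeps the chain ergodic
                    logp_prop = log_post(prop)
                    if np.log(rng.random()) < logp_prop - logp[i]:   # Metropolis accept/reject
                        chains[i], logp[i] = prop, logp_prop
                samples.append(chains.copy())
            return np.concatenate(samples[n_iter // 2:])      # discard burn-in

        draws = de_mc(log_post)
        print(draws.mean(axis=0), draws.shape)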

  16. Improved Test Planning and Analysis Through the Use of Advanced Statistical Methods

    Science.gov (United States)

    Green, Lawrence L.; Maxwell, Katherine A.; Glass, David E.; Vaughn, Wallace L.; Barger, Weston; Cook, Mylan

    2016-01-01

    The goal of this work is, through computational simulations, to provide statistically-based evidence to convince the testing community that a distributed testing approach is superior to a clustered testing approach for most situations. For clustered testing, numerous, repeated test points are acquired at a limited number of test conditions. For distributed testing, only one or a few test points are requested at many different conditions. The statistical techniques of Analysis of Variance (ANOVA), Design of Experiments (DOE) and Response Surface Methods (RSM) are applied to enable distributed test planning, data analysis and test augmentation. The D-Optimal class of DOE is used to plan an optimally efficient single- and multi-factor test. The resulting simulated test data are analyzed via ANOVA and a parametric model is constructed using RSM. Finally, ANOVA can be used to plan a second round of testing to augment the existing data set with new data points. The use of these techniques is demonstrated through several illustrative examples. To date, many thousands of comparisons have been performed and the results strongly support the conclusion that the distributed testing approach outperforms the clustered testing approach.
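
    A self-contained toy version of the comparison (not taken from the paper; the true response, noise level and run budget are invented): the same 12 runs are spent either clustered at 3 conditions or distributed over 12 conditions, a quadratic response surface is fit to each by least squares, and the average prediction error against the known truth is compared over repeated simulations.

        import numpy as np

        rng = np.random.default_rng(0)
        true = lambda x: 1.0 + 2.0 * x - 1.5 * x ** 2 + 3.0 * x ** 3   # truth has a cubic term
        noisy = lambda x: true(x) + rng.normal(0.0, 0.1, size=np.shape(x))

        def fit_quadratic_rsm(x, y):
            X = np.column_stack([np.ones_like(x), x, x ** 2])           # quadratic RSM model
            return np.linalg.lstsq(X, y, rcond=None)[0]

        def prediction_rmse(coef):
            xg = np.linspace(0.0, 1.0, 101)
            yg = np.column_stack([np.ones_like(xg), xg, xg ** 2]) @ coef
            return float(np.sqrt(np.mean((yg - true(xg)) ** 2)))

        x_clustered = np.repeat([0.0, 0.5, 1.0], 4)      # 12 runs piled on 3 conditions
        x_distributed = np.linspace(0.0, 1.0, 12)        # the same 12 runs spread out

        rmse_c = np.mean([prediction_rmse(fit_quadratic_rsm(x_clustered, noisy(x_clustered)))
                          for _ in range(500)])
        rmse_d = np.mean([prediction_rmse(fit_quadratic_rsm(x_distributed, noisy(x_distributed)))
                          for _ in range(500)])
        # When the assumed RSM misses part of the true behavior (here, the cubic term),
        # the distributed design usually recovers the underlying trend with lower error.
        print(rmse_c, rmse_d)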

  17. Recent advances in thermoluminescence and photostimulated luminescence detection methods for irradiated foods

    International Nuclear Information System (INIS)

    Sanderson, D.C.W.; Carmichael, L.A.; Naylor, J.D.

    1996-01-01

    Thermoluminescence (TL) and photostimulated luminescence (PSL) are radiation-specific phenomena resulting from energy storage by trapped charge carriers in dielectric materials following irradiation. Releasing such stored energy by thermal or optical stimulation can result in detectable luminescence emission during the relaxation processes which follow. These approaches can be applied to inorganic components present either as inherent parts of foods or as adhering contaminants, and to bio-inorganic systems. The strength of these techniques lies in their radiation specificity and the wide range of sample types which may be analysed. The Scottish Universities Research and Reactor Centre (SURRC) has been involved in the development and application of luminescence methods since 1986, during which time over 4000 analyses of more than 800 different food samples have been performed for research purposes, or in support of UK food labelling regulations. This paper discusses the present scope of luminescence techniques, identifies areas where recent work has extended the range of applications, and indicates areas where further investigations may be worthwhile. (author)

  18. Bayesian calibration of terrestrial ecosystem models: a study of advanced Markov chain Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    D. Lu

    2017-09-01

    Full Text Available Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented by Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties using their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters for the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. The calibration of DREAM results in a better model fit and predictive performance compared to the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions while AM only identifies one mode. The application suggests that DREAM is very suitable to calibrate complex terrestrial ecosystem models, where the number of uncertain parameters is usually large and the existence of local optima is always a concern. In addition, this effort justifies the assumptions of the error model used in Bayesian calibration according to the residual analysis. The result indicates that a heteroscedastic, correlated, Gaussian error model is appropriate for the problem, and the consequent constructed likelihood function can alleviate the underestimation of parameter uncertainty that is usually caused by using uncorrelated error models.

  19. Advanced methods for preparation and characterization of infrared detector materials. [mercury cadmium tellurides

    Science.gov (United States)

    Lehoczky, S. L.; Szofran, F. R.; Martin, B. G.

    1980-01-01

    Mercury cadmium telluride crystals were prepared by the Bridgman method with a wide range of crystal growth rates and temperature gradients adequate to prevent constitutional supercooling under diffusion-limited, steady-state growth conditions. The longitudinal compositional gradients for different growth conditions and alloy compositions were calculated and compared with experimental data to develop a quantitative model of the crystal growth kinetics for the Hg(1-x)CdxTe alloys, and measurements were performed to ascertain the effect of growth conditions on radial compositional gradients. The pseudobinary HgTe-CdTe constitutional phase diagram was determined by precision differential thermal analysis measurements and used to calculate the segregation coefficient of Cd as a function of x and interface temperature. Computer algorithms specific to Hg(1-x)CdxTe were developed for calculations of the charge carrier concentrations, charge carrier mobilities, Hall coefficient, optical absorptance, and Fermi energy as functions of x, temperature, ionized donor and acceptor concentrations, and neutral defect concentrations.

  20. Advanced DNA-Based Point-of-Care Diagnostic Methods for Plant Diseases Detection

    Directory of Open Access Journals (Sweden)

    Han Yih Lau

    2017-12-01

    Full Text Available Diagnostic technologies for the detection of plant pathogens with point-of-care capability and high multiplexing ability are an essential tool in the fight to reduce the large agricultural production losses caused by plant diseases. The main desirable characteristics for such diagnostic assays are high specificity, sensitivity, reproducibility, speed, cost efficiency and high-throughput multiplex detection capability. This article describes and discusses various DNA-based point-of-care diagnostic methods for applications in plant disease detection. Polymerase chain reaction (PCR) is the most common DNA amplification technology used for detecting various plant and animal pathogens. However, subsequent to PCR-based assays, several types of nucleic acid amplification technologies have been developed to achieve higher sensitivity, rapid detection and suitability for field applications, such as loop-mediated isothermal amplification, helicase-dependent amplification, rolling circle amplification, recombinase polymerase amplification, and molecular inversion probes. The principles behind these technologies have been thoroughly discussed in several review papers; herein, we emphasize their application to the detection of plant pathogens by outlining the advantages and disadvantages of each technology in detail.

  1. Characterization of mechanical properties of aluminized coatings in advanced gas turbine blades using a small punch method

    Energy Technology Data Exchange (ETDEWEB)

    Sugita, Y.; Ito, M. [Chuba Electric Power Co., Nagoya (Japan). Electric Power R and D Center; Sakurai, S. [Hitachi Ltd. (Japan). Mechanical Engineering Research Lab.; Bloomer, T.E.; Kameda, J. [Ames Lab., IA (United States)]|[Iowa State Univ., Ames, IA (United States). Center for Advanced Technology Development

    1997-04-01

    Advanced technologies of superalloy casting and coating enable one to enhance the performance of combined-cycle gas turbines for electric power generation by increasing the firing temperature. This paper describes the examination of the microstructure/composition and mechanical properties (22-950 C) of aluminized CoCrAlY coatings of advanced gas turbine blades using a scanning Auger microprobe and a small punch (SP) testing method. The aluminized coatings consisted of a layered structure divided into four regimes: (1) an Al-enriched and Cr-depleted region, (2) an Al and Cr graded region, (3) a fine-grained microstructure with a mixture of Al- and Cr-enriched phases and (4) a Ni/Co interdiffusion zone adjacent to the interface. SP specimens were prepared so that the specimen surface would be located in the various coating regions. SP tests indicated a strong dependence of the fracture properties on the various coating regimes. Coatings 1 and 2, with very high microhardness, showed much easier formation of brittle cracks over a wide temperature range compared to coatings 3 and 4, although coating 2 showed improved ductility at 950 C. Coating 3 had lower room-temperature ductility than coating 4; however, the ductility in coating 3 exceeded that in region 4 above 730 C due to a precipitous ductility increase. The integrity of aluminized coatings in service is discussed in light of the variation of the low-cycle fatigue life as well as the ductility in the layered structure.

  2. Advancing internal erosion monitoring using seismic methods in field and laboratory studies

    Science.gov (United States)

    Parekh, Minal L.

    This dissertation presents research involving laboratory and field investigation of passive and active methods for monitoring and assessing earthen embankment infrastructure such as dams and levees. Internal erosion occurs as soil particles in an earthen structure migrate to an exit point under seepage forces. This process is a primary failure mode for dams and levees. Current dam and levee monitoring practices are not able to identify early stages of internal erosion, and often the result is loss of structure utility and costly repairs. This research contributes to innovations for detection and monitoring by studying internal erosion and monitoring through field experiments, laboratory experiments, and social and political framing. The field research in this dissertation included two studies (2009 and 2012) of a full-scale earthen embankment at the IJkdijk in the Netherlands. In both of these tests, internal erosion occurred as evidenced by seepage followed by sand traces and boils, and in 2009, eventual failure. With the benefit of arrays of closely spaced piezometers, pore pressure trends indicated internal erosion near the initiation time. Temporally and spatially dense pore water pressure measurements detected two pore water pressure transitions characteristic to the development of internal erosion, even in piezometers located away from the backward erosion activity. At the first transition, the backward erosion caused anomalous pressure decrease in piezometers, even under constant or increasing upstream water level. At the second transition, measurements stabilized as backward erosion extended further upstream of the piezometers, as shown in the 2009 test. The transitions provide an indication of the temporal development and the spatial extent of backward erosion. The 2012 IJkdijk test also included passive acoustic emissions (AE) monitoring. This study analyzed AE activity over the course of the 7-day test using a grid of geophones installed on the

  3. Developing advanced X-ray scattering methods combined with crystallography and computation.

    Science.gov (United States)

    Perry, J Jefferson P; Tainer, John A

    2013-03-01

    The extensive use of small angle X-ray scattering (SAXS) over the last few years is rapidly providing new insights into protein interactions, complex formation and conformational states in solution. This SAXS methodology allows for detailed biophysical quantification of samples of interest. Initial analyses provide a judgment of sample quality, revealing the potential presence of aggregation, the overall extent of folding or disorder, the radius of gyration, maximum particle dimensions and oligomerization state. Structural characterizations include ab initio approaches from SAXS data alone, and, when combined with previously determined crystal/NMR structures, atomistic modeling can further enhance structural solutions and assess their validity. This combination can provide definitions of architectures, spatial organizations of protein domains within a complex, including those not determined by crystallography or NMR, as well as defining key conformational states of a protein interaction. SAXS is not generally constrained by macromolecule size, and the rapid collection of data in a 96-well plate format provides methods to screen sample conditions. This includes screening for co-factors, substrates, differing protein or nucleotide partners or small molecule inhibitors, to more fully characterize the variations within assembly states and key conformational changes. Such analyses may be useful for screening constructs and conditions to determine those most likely to promote crystal growth of a complex under study. Moreover, these high throughput structural determinations can be leveraged to define how polymorphisms affect assembly formations and activities. This is in addition to potentially providing architectural characterizations of complexes and interactions for systems biology-based research, and distinctions in assemblies and interactions in comparative genomics. Thus, SAXS combined with crystallography/NMR and computation provides a unique set of tools that should be considered
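    As context for the radius-of-gyration estimates mentioned above, the low-q (Guinier) region of a SAXS curve is routinely fit to obtain Rg and the forward scattering I(0). The sketch below is a generic illustration, not code from the cited work; the function name, the iterative q*Rg < 1.3 window, and the assumption of a sorted, background-subtracted, monotonically decaying low-q profile are choices made here for illustration only.

```python
import numpy as np

def guinier_rg(q, intensity, n_init=20, q_rg_max=1.3):
    """Estimate the radius of gyration Rg and forward scattering I(0)
    from the low-q (Guinier) region of a SAXS profile.

    Guinier approximation: I(q) ~ I(0) * exp(-(q * Rg)**2 / 3),
    valid roughly for q * Rg < 1.3. A straight-line fit of ln I(q)
    versus q**2 has slope -Rg**2 / 3 and intercept ln I(0).
    """
    q = np.asarray(q, float)
    i = np.asarray(intensity, float)
    mask = np.zeros(q.shape, dtype=bool)
    mask[:n_init] = True                      # start from the lowest-q points
    for _ in range(10):                       # refine the q*Rg < 1.3 window
        slope, intercept = np.polyfit(q[mask] ** 2, np.log(i[mask]), 1)
        rg = np.sqrt(-3.0 * slope)
        new_mask = q * rg < q_rg_max
        if new_mask.sum() < 5 or np.array_equal(new_mask, mask):
            break
        mask = new_mask
    return rg, np.exp(intercept)
```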

  4. Sensitivity analysis of exergy destruction in a real combined cycle power plant based on advanced exergy method

    International Nuclear Information System (INIS)

    Boyaghchi, Fateme Ahmadi; Molaie, Hanieh

    2015-01-01

    Highlights: • The advanced exergy destruction components of a real CCPP are calculated. • The effects of TIT and r_c variation on the exergy destruction parts of the cycle are investigated. • Increasing TIT and r_c raises the improvement potential in most components. • Increasing TIT and r_c reduces the unavoidable part in some components. - Abstract: Advanced exergy analysis extends engineering knowledge beyond the respective conventional methods and thereby improves the design and operation of energy conversion systems. In advanced exergy analysis, the exergy destruction is split into endogenous/exogenous and avoidable/unavoidable parts. In this study, an advanced exergy analysis of a real combined cycle power plant (CCPP) with supplementary firing is performed. The endogenous/exogenous irreversibilities of each component, as well as their combination with the avoidable/unavoidable irreversibilities, are determined. A parametric study is presented discussing the sensitivity of various performance indicators to the turbine inlet temperature (TIT) and compressor pressure ratio (r_c). It is observed that the thermal and exergy efficiencies increase when TIT and r_c rise. Results show that the combustion chamber (CC) concentrates most of the exergy destruction (more than 62%), dominantly in unavoidable endogenous form, which decreases by 11.89% and 13.12% while the avoidable endogenous exergy destruction increases, being multiplied by factors of 1.3 and 8.6, with increasing TIT and r_c, respectively. In addition, TIT growth strongly increases the endogenous avoidable exergy destruction in the high pressure superheater (HP.SUP), CC and low pressure evaporator (LP.EVAP). It also increases the exogenous avoidable exergy destruction of the HP.SUP and low pressure steam turbine (LP.ST) and leads to a large decrement (about 98.8%) in the endogenous exergy destruction of the preheater (PRE). Furthermore, r_c growth sharply raises the endogenous avoidable exergy destruction of gas
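    The four-way decomposition described in the abstract follows simple bookkeeping once the endogenous and unavoidable terms are available from the hybrid-cycle and best-achievable-condition calculations. A minimal sketch with made-up numbers (not the plant data of the study) is:

```python
# Minimal sketch of the four-way split used in advanced exergy analysis.
# Input values are hypothetical: e_d comes from a conventional exergy balance,
# e_d_en from the "hybrid" cycle in which only the component of interest is
# irreversible, and e_d_un from operating the component at its best achievable
# (unavoidable) conditions.
def split_exergy_destruction(e_d, e_d_en, e_d_un, e_d_un_en):
    """Return the endogenous/exogenous x avoidable/unavoidable decomposition."""
    e_d_ex = e_d - e_d_en              # exogenous part
    e_d_av = e_d - e_d_un              # avoidable part
    e_d_un_ex = e_d_un - e_d_un_en     # unavoidable exogenous
    e_d_av_en = e_d_en - e_d_un_en     # avoidable endogenous
    e_d_av_ex = e_d_ex - e_d_un_ex     # avoidable exogenous
    # The four parts must add back up to the total exergy destruction.
    assert abs((e_d_un_en + e_d_un_ex + e_d_av_en + e_d_av_ex) - e_d) < 1e-9
    return dict(EN=e_d_en, EX=e_d_ex, AV=e_d_av, UN=e_d_un,
                UN_EN=e_d_un_en, UN_EX=e_d_un_ex,
                AV_EN=e_d_av_en, AV_EX=e_d_av_ex)

# Example with made-up values (MW) for a combustion chamber:
print(split_exergy_destruction(e_d=120.0, e_d_en=95.0, e_d_un=80.0, e_d_un_en=65.0))
```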

  5. A Demonstration of Advanced Safety Analysis Tools and Methods Applied to Large Break LOCA and Fuel Analysis for PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Szilard, Ronaldo Henriques [Idaho National Laboratory]; Smith, Curtis Lee [Idaho National Laboratory]; Martineau, Richard Charles [Idaho National Laboratory]

    2016-03-01

    The U.S. Nuclear Regulatory Commission (NRC) is currently proposing a rulemaking designated as 10 CFR 50.46c to revise the loss-of-coolant accident (LOCA)/emergency core cooling system acceptance criteria to include the effects of higher burnup on fuel/cladding performance. We propose a demonstration problem of a representative four-loop PWR plant to study the impact of this new rule in the US nuclear fleet. Within the scope of evaluation for the 10 CFR 50.46c rule, aspects of safety, operations, and economics are considered in the industry application demonstration presented in this paper. An advanced safety analysis approach is used, by integrating the probabilistic element with deterministic methods for LOCA analysis, a novel approach to solving these types of multi-physics, multi-scale problems.

  6. A Bayesian analysis of rare B decays with advanced Monte Carlo methods

    Energy Technology Data Exchange (ETDEWEB)

    Beaujean, Frederik

    2012-11-12

    Searching for new physics in rare B meson decays governed by b {yields} s transitions, we perform a model-independent global fit of the short-distance couplings C{sub 7}, C{sub 9}, and C{sub 10} of the {Delta}B=1 effective field theory. We assume the standard-model set of b {yields} s{gamma} and b {yields} sl{sup +}l{sup -} operators with real-valued C{sub i}. A total of 59 measurements by the experiments BaBar, Belle, CDF, CLEO, and LHCb of observables in B{yields}K{sup *}{gamma}, B{yields}K{sup (*)}l{sup +}l{sup -}, and B{sub s}{yields}{mu}{sup +}{mu}{sup -} decays are used in the fit. Our analysis is the first of its kind to harness the full power of the Bayesian approach to probability theory. All main sources of theory uncertainty explicitly enter the fit in the form of nuisance parameters. We make optimal use of the experimental information to simultaneously constrain the Wilson coefficients as well as hadronic form factors - the dominant theory uncertainty. Generating samples from the posterior probability distribution to compute marginal distributions and predict observables by uncertainty propagation is a formidable numerical challenge for two reasons. First, the posterior has multiple well separated maxima and degeneracies. Second, the computation of the theory predictions is very time consuming. A single posterior evaluation requires O(1s), and a few million evaluations are needed. Population Monte Carlo (PMC) provides a solution to both issues; a mixture density is iteratively adapted to the posterior, and samples are drawn in a massively parallel way using importance sampling. The major shortcoming of PMC is the need for cogent knowledge of the posterior at the initial stage. In an effort towards a general black-box Monte Carlo sampling algorithm, we present a new method to extract the necessary information in a reliable and automatic manner from Markov chains with the help of hierarchical clustering. Exploiting the latest 2012 measurements, the fit
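    The Population Monte Carlo scheme sketched above (adapt a mixture proposal to the posterior, draw importance samples in parallel, repeat) can be illustrated on a toy one-dimensional target. The following is a simplified stand-in, with a Gaussian mixture proposal, a Rao-Blackwellised update, and the effective sample size as a diagnostic; it is not the analysis code of the cited fit.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Toy bimodal stand-in for a multimodal posterior: two unit-variance Gaussians.
    return np.logaddexp(-0.5 * (x + 3.0) ** 2, -0.5 * (x - 3.0) ** 2)

K, n = 4, 2000                       # mixture components, samples per iteration
means = rng.uniform(-6.0, 6.0, K)
sigmas = np.full(K, 2.0)
weights = np.full(K, 1.0 / K)

for _ in range(15):
    # 1. Draw a population from the current mixture proposal.
    comp = rng.choice(K, size=n, p=weights)
    x = rng.normal(means[comp], sigmas[comp])
    # 2. Importance weights = target / proposal, evaluated in the log domain.
    log_comp = (np.log(weights)[:, None]
                - 0.5 * ((x - means[:, None]) / sigmas[:, None]) ** 2
                - np.log(sigmas[:, None] * np.sqrt(2.0 * np.pi)))
    log_q = np.logaddexp.reduce(log_comp, axis=0)
    lw = log_target(x) - log_q
    w = np.exp(lw - lw.max())
    w /= w.sum()
    # 3. Rao-Blackwellised adaptation of the mixture to the weighted population.
    resp = np.exp(log_comp - log_q)            # component responsibilities per sample
    resp /= resp.sum(axis=0)
    wk = np.maximum(resp @ w, 1e-12)
    means = (resp * x) @ w / wk
    sigmas = np.maximum(np.sqrt((resp * (x - means[:, None]) ** 2) @ w / wk), 1e-3)
    weights = wk / wk.sum()

# Effective sample size of the final population as an adaptation diagnostic.
print("effective sample size:", 1.0 / np.sum(w ** 2), "of", n)
```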

  7. Advanced methods of process/quality control in nuclear reactor fuel manufacture. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2000-07-01

    Nuclear fuel plays an essential role in ensuring the competitiveness of nuclear energy and its acceptance by the public. The economic and market situation is not favorable at present for nuclear fuel designers and suppliers. The reduction in fuel prices (mainly to compete with fossil fuels) and in the number of fuel assemblies to be delivered to customers (mainly due to burnup increase) has been offset by the rising number of safety and other requirements, e.g. the choice of fuel and structural materials and the qualification of equipment. In this respect, higher burnup and thermal rates, longer fuel cycles and the use of MOX fuels are the real means to improve the economics of the nuclear fuel cycle as a whole. Therefore, utilities and fuel vendors have recently initiated new research and development programmes aimed at improving fuel quality, design and materials to produce robust and reliable fuel for safe and reliable reactor operation under more demanding conditions. In this connection, improvement of fuel quality occupies an important place and this requires continuous effort on the part of fuel researchers, designers and producers. In the early years of commercial fuel fabrication, emphasis was given to advancements in quality control/quality assurance related mainly to the product itself. Now, the emphasis is transferred to improvements in process control and to implementation of overall total quality management (TQM) programmes. In the area of fuel quality control, statistical methods are now widely implemented, replacing 100% inspection. The IAEA, recognizing the importance of obtaining and maintaining high standards in fuel fabrication, has paid particular attention to this subject. In response to the rapid progress in development and implementation of advanced methods of process/quality control in nuclear fuel manufacture and on the recommendation of the International Working Group on Water Reactor Fuel Performance and Technology, the IAEA conducted a

  8. Introduction to special section of the Journal of Family Psychology, advances in mixed methods in family psychology: integrative and applied solutions for family science.

    Science.gov (United States)

    Weisner, Thomas S; Fiese, Barbara H

    2011-12-01

    Mixed methods in family psychology refer to the systematic integration of qualitative and quantitative techniques to represent family processes and settings. Over the past decade, significant advances have been made in study design, analytic strategies, and technological support (such as software) that allow for the integration of quantitative and qualitative methods and for making appropriate inferences from mixed methods. This special section of the Journal of Family Psychology illustrates how mixed methods may be used to advance knowledge in family science through identifying important cultural differences in family structure, beliefs, and practices, and revealing patterns of family relationships to generate new measurement paradigms and inform clinical practice. Guidance is offered to advance mixed methods research in family psychology through sound principles of peer review.

  9. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    Science.gov (United States)

    Stoner, Mary Cecilia; Hehir, Austin R.; Ivanco, Marie L.; Domack, Marcia S.

    2016-01-01

    This cost-benefit analysis assesses the benefits of the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. These preliminary, rough order-of-magnitude results report a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Production cost savings of 35 to 58 percent were reported over the composite manufacturing technique used in this study for comparison; however, the ANNST concept was heavier. In this study, the predicted return on investment of equipment required for the ANNST method was ten cryogenic tank barrels when compared with conventional metallic manufacturing. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. A case study compared these three alternatives for manufacturing a cylinder of specified geometry, with particular focus placed on production costs and process complexity, with cost analyses performed by the analogy and parametric methods. Furthermore, a scalability study was conducted for three tank diameters to assess the highest potential payoff of the ANNST process for manufacture of large-diameter cryogenic tanks. The analytical hierarchy process (AHP) was subsequently used with a group of selected subject matter experts to assess the value of the various benefits achieved by the ANNST method for potential stakeholders. The AHP study results revealed that decreased final cylinder mass and quality assurance were the most valued benefits of cylinder manufacturing methods, therefore emphasizing the relevance of the benefits achieved with the ANNST process for future projects.
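    The analytic hierarchy process used for the benefit assessment reduces, at its core, to deriving a priority vector from a reciprocal pairwise-comparison matrix (commonly via its principal eigenvector) and checking a consistency ratio. The sketch below uses hypothetical judgments for three of the benefits named above; it does not reproduce the study's actual expert inputs.

```python
import numpy as np

def ahp_priorities(pairwise):
    """Priority weights from a reciprocal pairwise-comparison matrix (Saaty AHP).

    Returns the normalized principal eigenvector and the consistency ratio (CR);
    CR < 0.1 is the usual acceptance threshold.
    """
    a = np.asarray(pairwise, float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()
    ci = (eigvals[k].real - n) / (n - 1)                  # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24}[n]  # random index (extend for larger n)
    cr = ci / ri if ri else 0.0
    return w, cr

# Hypothetical 3-criterion example: final cylinder mass, production cost, quality assurance.
A = np.array([[1.0, 1 / 3, 1 / 2],
              [3.0, 1.0,   2.0],
              [2.0, 1 / 2, 1.0]])
weights, cr = ahp_priorities(A)
print(weights, cr)
```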

  10. Evaluation of standard and advanced preprocessing methods for the univariate analysis of blood serum 1H-NMR spectra.

    Science.gov (United States)

    De Meyer, Tim; Sinnaeve, Davy; Van Gasse, Bjorn; Rietzschel, Ernst-R; De Buyzere, Marc L; Langlois, Michel R; Bekaert, Sofie; Martins, José C; Van Criekinge, Wim

    2010-10-01

    Proton nuclear magnetic resonance ((1)H-NMR)-based metabolomics enables the high-resolution and high-throughput assessment of a broad spectrum of metabolites in biofluids. Despite the straightforward character of the experimental methodology, the analysis of spectral profiles is rather complex, particularly due to the requirement of numerous data preprocessing steps. Here, we evaluate how several of the most common preprocessing procedures affect the subsequent univariate analyses of blood serum spectra, with a particular focus on how the standard methods perform compared to more advanced examples. Carr-Purcell-Meiboom-Gill 1D (1)H spectra were obtained for 240 serum samples from healthy subjects of the Asklepios study. We studied the impact of different preprocessing steps--integral (standard method) and probabilistic quotient normalization; no, equidistant (standard), and adaptive-intelligent binning; mean (standard) and maximum bin intensity data summation--on the resonance intensities of three different types of metabolites: triglycerides, glucose, and creatinine. The effects were evaluated by correlating the differently preprocessed NMR data with the independently measured metabolite concentrations. The analyses revealed that the standard methods performed inferiorly and that a combination of probabilistic quotient normalization after adaptive-intelligent binning and maximum intensity variable definition yielded the best overall results (triglycerides, R = 0.98; glucose, R = 0.76; creatinine, R = 0.70). Therefore, at least in the case of serum metabolomics, these or equivalent methods should be preferred above the standard preprocessing methods, particularly for univariate analyses. Additional optimization of the normalization procedure might further improve the analyses.
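    Probabilistic quotient normalization, which contributed to the best-performing combination above, rescales each spectrum by the median of its point-wise quotients against a reference spectrum, after an initial integral normalization. A minimal sketch (spectra assumed stored as rows of an array; not the authors' code) is:

```python
import numpy as np

def pqn_normalize(spectra):
    """Probabilistic quotient normalization (Dieterle et al., 2006) of NMR spectra.

    spectra : 2-D array, one spectrum per row (intensities over bins/ppm points).
    """
    x = np.asarray(spectra, float)
    # 1. Integral (total-area) normalization of each spectrum.
    x = x / x.sum(axis=1, keepdims=True)
    # 2. Reference spectrum: median across samples at each point.
    ref = np.median(x, axis=0)
    # 3. Point-wise quotients versus the reference (skip zero-reference points).
    ok = ref > 0
    quotients = x[:, ok] / ref[ok]
    # 4. Divide each spectrum by the median of its quotients (its dilution factor).
    dilution = np.median(quotients, axis=1, keepdims=True)
    return x / dilution
```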

  11. Understanding advanced statistical methods

    CERN Document Server

    Westfall, Peter

    2013-01-01

    Introduction: Probability, Statistics, and ScienceReality, Nature, Science, and ModelsStatistical Processes: Nature, Design and Measurement, and DataModelsDeterministic ModelsVariabilityParametersPurely Probabilistic Statistical ModelsStatistical Models with Both Deterministic and Probabilistic ComponentsStatistical InferenceGood and Bad ModelsUses of Probability ModelsRandom Variables and Their Probability DistributionsIntroductionTypes of Random Variables: Nominal, Ordinal, and ContinuousDiscrete Probability Distribution FunctionsContinuous Probability Distribution FunctionsSome Calculus-Derivatives and Least SquaresMore Calculus-Integrals and Cumulative Distribution FunctionsProbability Calculation and SimulationIntroductionAnalytic Calculations, Discrete and Continuous CasesSimulation-Based ApproximationGenerating Random NumbersIdentifying DistributionsIntroductionIdentifying Distributions from Theory AloneUsing Data: Estimating Distributions via the HistogramQuantiles: Theoretical and Data-Based Estimate...

  12. Development of human performance evaluation methods and systems for human factors validation in an advanced control room

    International Nuclear Information System (INIS)

    Ha, Jun Su

    2008-02-01

    Advanced control room (ACR) human-machine interface (HMI) design of advanced nuclear power plants (NPPs) such as APR (advanced power reactor)-1400 can be validated through performance-based tests to determine whether it acceptably supports safe operation of the plant. In this paper, plant performance, personnel task performance, situation awareness, workload, teamwork, and anthropometric/physiological factors are considered as factors for the human performance evaluation. For development of measures in each of the factors, measures generally used in various industries and empirically proven to be useful are adopted as main measures with some modifications. In addition, helpful measures are developed as complementary measures in order to overcome some of the limitations associated with the main measures. The development of the measures is addressed based on the theoretical and empirical background and also based on the regulatory guidelines. A computerized system, which is called HUPESS (human performance evaluation support system), is developed based on the measures developed in this paper. The development of HUPESS is described with respect to the system configuration, the development process, and integrated measurement, evaluation, and analysis. HUPESS supports evaluators (or experimenters) to effectively measure, analyze, and evaluate the human performance for the HMI design validation in ACRs. Hence HUPESS is expected to be used as an effective tool for the human factors validation in the ACR of Shin Kori 3 and 4 NPPs (APR-1400 type) which are under construction in South Korea. Also, two measures of attentional-resource effectiveness based on cost-benefit analysis are developed. One of them is Fixation to Importance Ratio (FIR) which represents the attentional resources spent on an information source compared to the importance of the information source. The other measure is selective attention effectiveness (SAE) which incorporates the FIRs for all information
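    The Fixation to Importance Ratio compares the share of attention an operator devotes to an information source with that source's relative importance; values far from one flag over- or under-attended displays. The following toy sketch uses simple share-of-total definitions and hypothetical display names, which are assumptions rather than the thesis's exact formulation.

```python
def fixation_to_importance_ratio(fixation_time, importance):
    """FIR per information source: attention share divided by importance share."""
    total_fix = sum(fixation_time.values())
    total_imp = sum(importance.values())
    return {src: (fixation_time[src] / total_fix) / (importance[src] / total_imp)
            for src in fixation_time}

# Hypothetical eye-tracking dwell times (s) and importance weights for three displays.
fir = fixation_to_importance_ratio(
    fixation_time={"alarm_panel": 42.0, "trend_display": 95.0, "procedure_view": 23.0},
    importance={"alarm_panel": 0.5, "trend_display": 0.3, "procedure_view": 0.2})
print(fir)   # FIR > 1: over-attended relative to importance; FIR < 1: under-attended
```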

  13. New Analytical Methods for the Surface/Interface and the Micro-Structures in Advanced Nanocomposite Materials by Synchrotron Radiation

    Directory of Open Access Journals (Sweden)

    K. Nakamae

    2010-12-01

    Analytical methods for the surface/interface structure and micro-structure of advanced nanocomposite materials using synchrotron radiation are introduced. Recent results obtained with energy-tunable, highly collimated brilliant X-rays and high-accuracy in-situ wide angle/small angle X-ray diffraction are reviewed. It is shown that small angle X-ray scattering is one of the best methods to characterize nanoparticle dispersibility and filler aggregate/agglomerate structures, and for in-situ observation of hierarchical structure deformation in filled rubber under cyclic stretching. Grazing incidence (small and wide angle) X-ray scattering is powerful for in-situ observation of the sintering process of metal nanoparticles, as well as for analyzing the orientation of polymer molecules and the crystalline orientation in the very thin surface layer (ca. 7 nm) of a polymer film, while the interaction and conformation of adsorbed molecules at the interface can be investigated using high-energy X-ray photoelectron spectroscopy (XPS) with sufficient probing depth (ca. 9 µm).

  14. Characterization and detection of Vero cells infected with Herpes Simplex Virus type 1 using Raman spectroscopy and advanced statistical methods.

    Science.gov (United States)

    Salman, A; Shufan, E; Zeiri, L; Huleihel, M

    2014-07-01

    Herpes viruses are involved in a variety of human disorders. Herpes Simplex Virus type 1 (HSV-1) is the most common among the herpes viruses and is primarily involved in human cutaneous disorders. Although the symptoms of infection by this virus are usually minimal, in some cases HSV-1 might cause serious infections in the eyes and the brain leading to blindness and even death. A drug, acyclovir, is available to counter this virus. The drug is most effective when used during the early stages of the infection, which makes early detection and identification of these viral infections highly important for successful treatment. In the present study we evaluated the potential of Raman spectroscopy as a sensitive, rapid, and reliable method for the detection and identification of HSV-1 viral infections in cell cultures. Using Raman spectroscopy followed by advanced statistical methods enabled us, with sensitivity approaching 100%, to differentiate between a control group of Vero cells and another group of Vero cells that had been infected with HSV-1. Cell sites that were "rich in membrane" gave the best results in the differentiation between the two categories. The major changes were observed in the 1195-1726 cm(-1) range of the Raman spectrum. The features in this range are attributed mainly to proteins, lipids, and nucleic acids. Copyright © 2014. Published by Elsevier Inc.

  15. Surface renewal: an advanced micrometeorological method for measuring and processing field-scale energy flux density data.

    Science.gov (United States)

    McElrone, Andrew J; Shapland, Thomas M; Calderon, Arturo; Fitzmaurice, Li; Paw U, Kyaw Tha; Snyder, Richard L

    2013-12-12

    Advanced micrometeorological methods have become increasingly important in soil, crop, and environmental sciences. For many scientists without formal training in atmospheric science, these techniques are relatively inaccessible. Surface renewal and other flux measurement methods require an understanding of boundary layer meteorology and extensive training in instrumentation and multiple data management programs. To improve accessibility of these techniques, we describe the underlying theory of surface renewal measurements, demonstrate how to set up a field station for surface renewal with eddy covariance calibration, and utilize our open-source turnkey data logger program to perform flux data acquisition and processing. The new turnkey program returns to the user a simple data table with the corrected fluxes and quality control parameters, and eliminates the need for researchers to shuttle between multiple processing programs to obtain the final flux data. An example of data generated from these measurements demonstrates how crop water use is measured with this technique. The output information is useful to growers for making irrigation decisions in a variety of agricultural ecosystems. These stations are currently deployed in numerous field experiments by researchers in our group and the California Department of Water Resources in the following crops: rice, wine and raisin grape vineyards, alfalfa, almond, walnut, peach, lemon, avocado, and corn.
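    For orientation, the surface renewal estimate of sensible heat flux is commonly written in terms of ramp events detected in the high-frequency air temperature trace; the form below is the standard textbook expression (with an eddy-covariance-derived calibration factor), not a formula quoted verbatim from the cited paper.

```latex
% Standard surface renewal expression for sensible heat flux (hedged sketch):
%   a      mean ramp amplitude of the air temperature trace (K)
%   l + s  mean total ramp period: quiescent plus ramp duration (s)
%   z      measurement height (m)
%   alpha  calibration factor, typically obtained against eddy covariance
%   rho    air density; c_p specific heat of air at constant pressure
H = \alpha \, \rho \, c_p \, z \, \frac{a}{l + s}
```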

  16. Advanced quadrature sets and acceleration and preconditioning techniques for the discrete ordinates method in parallel computing environments

    Science.gov (United States)

    Longoni, Gianluca

    In the nuclear science and engineering field, radiation transport calculations play a key-role in the design and optimization of nuclear devices. The linear Boltzmann equation describes the angular, energy and spatial variations of the particle or radiation distribution. The discrete ordinates method (SN) is the most widely used technique for solving the linear Boltzmann equation. However, for realistic problems, the memory and computing time require the use of supercomputers. This research is devoted to the development of new formulations for the SN method, especially for highly angular dependent problems, in parallel environments. The present research work addresses two main issues affecting the accuracy and performance of SN transport theory methods: quadrature sets and acceleration techniques. New advanced quadrature techniques which allow for large numbers of angles with a capability for local angular refinement have been developed. These techniques have been integrated into the 3-D SN PENTRAN (Parallel Environment Neutral-particle TRANsport) code and applied to highly angular dependent problems, such as CT-Scan devices, that are widely used to obtain detailed 3-D images for industrial/medical applications. In addition, the accurate simulation of core physics and shielding problems with strong heterogeneities and transport effects requires the numerical solution of the transport equation. In general, the convergence rate of the solution methods for the transport equation is reduced for large problems with optically thick regions and scattering ratios approaching unity. To remedy this situation, new acceleration algorithms based on the Even-Parity Simplified SN (EP-SSN) method have been developed. A new stand-alone code system, PENSSn (Parallel Environment Neutral-particle Simplified SN), has been developed based on the EP-SSN method. The code is designed for parallel computing environments with spatial, angular and hybrid (spatial/angular) domain
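    As a simple illustration of what a discrete ordinates quadrature set looks like (a generic Gauss-Legendre-by-Chebyshev product set, not the locally refined sets developed in this dissertation), the directions and weights over the unit sphere can be generated so that the weights sum to 4*pi:

```python
import numpy as np

def product_quadrature(n_polar, n_azimuthal):
    """Gauss-Legendre (polar) x equally weighted (azimuthal) product quadrature
    on the unit sphere, as used for SN transport sweeps. Weights sum to 4*pi."""
    mu, w_mu = np.polynomial.legendre.leggauss(n_polar)       # polar cosines, weights sum to 2
    phi = (np.arange(n_azimuthal) + 0.5) * 2.0 * np.pi / n_azimuthal
    w_phi = np.full(n_azimuthal, 2.0 * np.pi / n_azimuthal)   # azimuthal weights sum to 2*pi
    ordinates, weights = [], []
    for m, wm in zip(mu, w_mu):
        s = np.sqrt(1.0 - m * m)
        for p, wp in zip(phi, w_phi):
            ordinates.append((s * np.cos(p), s * np.sin(p), m))   # (Omega_x, Omega_y, Omega_z)
            weights.append(wm * wp)
    return np.array(ordinates), np.array(weights)

omega, w = product_quadrature(4, 8)
print(w.sum(), 4.0 * np.pi)   # both approximately 12.566
```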

  17. Effect of advanced location methods on search and rescue duration for general aviation aircraft accidents in the contiguous United States

    Science.gov (United States)

    Wallace, Ryan J.

    The purpose of this study was to determine the impact of advanced search and rescue devices and techniques on search duration for general aviation aircraft crashes. The study assessed three categories of emergency locator transmitters, including 121.5 MHz, 406 MHz, and GPS-Assisted 406 MHz devices. The impact of the COSPAS-SARSAT organization ceasing satellite monitoring for 121.5 MHz ELTs in 2009 was factored into the study. Additionally, the effect of using radar forensic analysis and cellular phone forensic search methods was also assessed. The study's data was derived from an Air Force Rescue Coordination Center database and included 365 historical general aviation search and rescue missions conducted between 2006 and 2011. Highly skewed data was transformed to meet normality requirements for parametric testing. The significance of each ELT model was assessed using a combination of Brown-Forsythe Means Testing or Orthogonal Contrast Testing. ANOVA and Brown-Forsythe Means testing was used to evaluate cellular phone and radar forensic search methods. A Spearman's Rho test was used to determine if the use of multiple search methods produced an additive effect in search efficiency. Aircraft which utilized an Emergency Locator Transmitter were located in shorter search durations than those which did not use such devices. Aircraft utilizing GPS-Aided 406 MHz ELTs appeared to require less time to locate than if equipped with other ELT models; however, this assessment requires further study due to limited data. Aircraft equipped with 406 MHz ELTs required slightly less time to locate than aircraft equipped with older 121.5 MHz ELTs. The study found no substantial difference in the search durations for 121.5 MHz ELTs monitored by COSPAS-SARSAT versus those which were not. Significance testing revealed that the use of cellular phone forensic data and radar forensic data both resulted in substantially higher mission search durations. Some possible explanations for this
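    The hypothesis tests named above are straightforward to reproduce with standard tools. The sketch below runs a Brown-Forsythe test (Levene's test centered on medians), a one-way ANOVA, and a Spearman rank correlation on synthetic, hypothetical search-duration data; it is illustrative only and does not use the study's AFRCC records.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical log-transformed search durations (hours) by ELT category.
elt_121 = rng.normal(2.4, 0.6, 40)
elt_406 = rng.normal(2.1, 0.5, 40)
elt_406_gps = rng.normal(1.7, 0.5, 15)

# Brown-Forsythe test = Levene's test using the group medians.
bf_stat, bf_p = stats.levene(elt_121, elt_406, elt_406_gps, center='median')
# One-way ANOVA on the transformed durations.
f_stat, f_p = stats.f_oneway(elt_121, elt_406, elt_406_gps)
# Spearman's rho: does using more search methods relate to shorter searches?
n_methods = rng.integers(1, 4, 60)
duration = 3.0 - 0.3 * n_methods + rng.normal(0, 0.5, 60)
rho, rho_p = stats.spearmanr(n_methods, duration)
print(bf_p, f_p, rho, rho_p)
```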

  18. Innovative and Advanced Coupled Neutron Transport and Thermal Hydraulic Method (Tool) for the Design, Analysis and Optimization of VHTR/NGNP Prismatic Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Rahnema, Farzad; Garimeela, Srinivas; Ougouag, Abderrafi; Zhang, Dingkang

    2013-11-29

    This project will develop a 3D, advanced coarse mesh transport method (COMET-Hex) for steady- state and transient analyses in advanced very high-temperature reactors (VHTRs). The project will lead to a coupled neutronics and thermal hydraulic (T/H) core simulation tool with fuel depletion capability. The computational tool will be developed in hexagonal geometry, based solely on transport theory without (spatial) homogenization in complicated 3D geometries. In addition to the hexagonal geometry extension, collaborators will concurrently develop three additional capabilities to increase the code’s versatility as an advanced and robust core simulator for VHTRs. First, the project team will develop and implement a depletion method within the core simulator. Second, the team will develop an elementary (proof-of-concept) 1D time-dependent transport method for efficient transient analyses. The third capability will be a thermal hydraulic method coupled to the neutronics transport module for VHTRs. Current advancements in reactor core design are pushing VHTRs toward greater core and fuel heterogeneity to pursue higher burn-ups, efficiently transmute used fuel, maximize energy production, and improve plant economics and safety. As a result, an accurate and efficient neutron transport, with capabilities to treat heterogeneous burnable poison effects, is highly desirable for predicting VHTR neutronics performance. This research project’s primary objective is to advance the state of the art for reactor analysis.

  19. Investigating Microbe-Mineral Interactions: Recent Advances in X-Ray and Electron Microscopy and Redox-Sensitive Methods

    Science.gov (United States)

    Miot, Jennyfer; Benzerara, Karim; Kappler, Andreas

    2014-05-01

    Microbe-mineral interactions occur in diverse modern environments, from the deep sea and subsurface rocks to soils and surface aquatic environments. They may have played a central role in the geochemical cycling of major (e.g., C, Fe, Ca, Mn, S, P) and trace (e.g., Ni, Mo, As, Cr) elements over Earth's history. Such interactions include electron transfer at the microbe-mineral interface that left traces in the rock record. Geomicrobiology consists in studying interactions at these organic-mineral interfaces in modern samples and looking for traces of past microbe-mineral interactions recorded in ancient rocks. Specific tools are required to probe these interfaces and to understand the mechanisms of interaction between microbes and minerals from the scale of the biofilm to the nanometer scale. In this review, we focus on recent advances in electron microscopy, in particular in cryoelectron microscopy, and on a panel of electrochemical and synchrotron-based methods that have recently provided new understanding and imaging of the microbe-mineral interface, ultimately opening new fields to be explored.

  20. Numerical Investigation of Cross Flow Phenomena in a Tight-Lattice Rod Bundle Using Advanced Interface Tracking Method

    Science.gov (United States)

    Zhang, Weizhong; Yoshida, Hiroyuki; Ose, Yasuo; Ohnuki, Akira; Akimoto, Hajime; Hotta, Akitoshi; Fujimura, Ken

    In relation to the design of an innovative FLexible-fuel-cycle Water Reactor (FLWR), investigation of thermal-hydraulic performance in tight-lattice rod bundles of the FLWR is being carried out at Japan Atomic Energy Agency (JAEA). The FLWR core adopts a tight triangular lattice arrangement with about 1 mm gap clearance between adjacent fuel rods. In view of the importance of accurate prediction of cross flow between subchannels in the evaluation of the boiling transition (BT) in the FLWR core, this study presents a statistical evaluation of numerical simulation results obtained by a detailed two-phase flow simulation code, TPFIT, which employs an advanced interface tracking method. In order to clarify mechanisms of cross flow in such tight lattice rod bundles, the TPFIT is applied to simulate water-steam two-phase flow in two modeled subchannels. Attention is focused on instantaneous fluctuation characteristics of cross flow. With the calculation of correlation coefficients between differential pressure and gas/liquid mixing coefficients, time scales of cross flow are evaluated, and effects of mixing section length, flow pattern and gap spacing on correlation coefficients are investigated. Differences in mechanism between gas and liquid cross flows are pointed out.
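    The statistical evaluation described above hinges on correlating two fluctuating signals. A minimal sketch of a lagged cross-correlation between a differential-pressure trace and a mixing-coefficient trace (synthetic signals, purely illustrative of the idea of extracting a cross-flow time scale) is:

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of two equally sampled time series at integer lags."""
    x = (x - x.mean()) / x.std()
    y = (y - y.mean()) / y.std()
    lags = np.arange(-max_lag, max_lag + 1)
    corr = []
    for k in lags:
        if k >= 0:
            corr.append(np.mean(x[:len(x) - k] * y[k:]))
        else:
            corr.append(np.mean(x[-k:] * y[:len(y) + k]))
    return lags, np.array(corr)

# Synthetic example: the mixing coefficient lags the pressure difference by 5 samples.
rng = np.random.default_rng(2)
dp = rng.normal(size=2000)
mix = np.roll(dp, 5) + 0.5 * rng.normal(size=2000)
lags, corr = lagged_correlation(dp, mix, max_lag=20)
print(lags[np.argmax(corr)])   # ~5: an estimate of the cross-flow time scale in samples
```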

  1. The finite volume method in computational fluid dynamics an advanced introduction with OpenFOAM and Matlab

    CERN Document Server

    Moukalled, F; Darwish, M

    2016-01-01

    This textbook explores both the theoretical foundation of the Finite Volume Method (FVM) and its applications in Computational Fluid Dynamics (CFD). Readers will discover a thorough explanation of the FVM numerics and algorithms used for the simulation of incompressible and compressible fluid flows, along with a detailed examination of the components needed for the development of a collocated unstructured pressure-based CFD solver. Two particular CFD codes are explored. The first is uFVM, a three-dimensional unstructured pressure-based finite volume academic CFD code, implemented within Matlab. The second is OpenFOAM®, an open source framework used in the development of a range of CFD programs for the simulation of industrial scale flow problems. With over 220 figures, numerous examples and more than one hundred exercises on FVM numerics, programming, and applications, this textbook is suitable for use in an introductory course on the FVM, in an advanced course on numerics, and as a reference for CFD programm...
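    To give a flavor of the finite volume discretization the book develops, the sketch below solves a generic 1-D steady diffusion problem, d/dx(Gamma dphi/dx) + S = 0, with Dirichlet boundaries on a cell-centred mesh; it is a self-contained illustration, not code from uFVM or OpenFOAM.

```python
import numpy as np

def solve_1d_diffusion(n=20, length=1.0, gamma=1.0, source=0.0,
                       phi_left=0.0, phi_right=1.0):
    """Cell-centred finite volume solution of d/dx(gamma dphi/dx) + S = 0 on [0, L]
    with Dirichlet boundaries, using central differencing for the diffusive fluxes."""
    dx = length / n
    a_w = np.full(n, gamma / dx)        # west-face coefficients
    a_e = np.full(n, gamma / dx)        # east-face coefficients
    b = np.full(n, source * dx)         # volumetric source contribution
    # Boundary cells: half-cell distance to the boundary faces (coefficient 2*gamma/dx).
    a_w[0] = 0.0
    a_e[-1] = 0.0
    b[0] += 2.0 * gamma / dx * phi_left
    b[-1] += 2.0 * gamma / dx * phi_right
    a_p = a_w + a_e
    a_p[0] += 2.0 * gamma / dx
    a_p[-1] += 2.0 * gamma / dx
    # Assemble and solve the tridiagonal system A * phi = b.
    A = np.diag(a_p) - np.diag(a_e[:-1], 1) - np.diag(a_w[1:], -1)
    return np.linalg.solve(A, b)

phi = solve_1d_diffusion()
print(phi)   # approaches the exact linear profile between phi_left and phi_right
```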

  2. Assessing health impacts in complex eco-epidemiological settings in the humid tropics: Advancing tools and methods

    International Nuclear Information System (INIS)

    Winkler, Mirko S.; Divall, Mark J.; Krieger, Gary R.; Balge, Marci Z.; Singer, Burton H.; Utzinger, Juerg

    2010-01-01

    In the developing world, large-scale projects in the extractive industry and natural resources sectors are often controversial and associated with long-term adverse health consequences to local communities. In many industrialised countries, health impact assessment (HIA) has been institutionalized for the mitigation of anticipated negative health effects while enhancing the benefits of projects, programmes and policies. However, in developing country settings, relatively few HIAs have been performed. Hence, more HIAs with a focus on low- and middle-income countries are needed to advance and refine tools and methods for impact assessment and subsequent mitigation measures. We present a promising HIA approach, developed within the frame of a large gold-mining project in the Democratic Republic of the Congo. The articulation of environmental health areas, the spatial delineation of potentially affected communities and the use of a diversity of sources to obtain quality baseline health data are utilized for risk profiling. We demonstrate how these tools and data are fed into a risk analysis matrix, which facilitates ranking of potential health impacts for subsequent prioritization of mitigation strategies. The outcomes encapsulate a multitude of environmental and health determinants in a systematic manner, and will assist decision-makers in the development of mitigation measures that minimize potential adverse health effects and enhance positive ones.

  3. Predictive Method for Correct Identification of Archaeological Charred Grape Seeds: Support for Advances in Knowledge of Grape Domestication Process

    Science.gov (United States)

    Ucchesu, Mariano; Orrù, Martino; Grillo, Oscar; Venora, Gianfranco; Paglietti, Giacomo; Ardu, Andrea; Bacchetta, Gianluigi

    2016-01-01

    The identification of archaeological charred grape seeds is a difficult task due to the alteration of the seeds' morphological shape. In archaeobotanical studies, the correct discrimination between Vitis vinifera subsp. sylvestris and Vitis vinifera subsp. vinifera grape seeds is very important for understanding the history and origin of the domesticated grapevine. In this work, different carbonisation experiments were carried out using a hearth to reproduce the same burning conditions that occurred in archaeological contexts. In addition, several carbonisation trials on modern wild and cultivated grape seeds were performed using a muffle furnace. For comparison with archaeological materials, modern grape seed samples were obtained using seven different temperatures of carbonisation ranging between 180 and 340ºC for 120 min. Analysing the grape seed size and shape by computer vision techniques, and applying the stepwise linear discriminant analysis (LDA) method, discrimination of the wild from the cultivated charred grape seeds was possible. An overall correct classification of 93.3% was achieved. Applying the same statistical procedure to compare modern charred with archaeological grape seeds, found in Sardinia and dating back to the Early Bronze Age (2017–1751 2σ cal. BC), allowed 75.0% of the cases to be identified as wild grape. The proposed method proved to be a useful and effective procedure in identifying, with high accuracy, the charred grape seeds found in archaeological sites. Moreover, it may be considered valid support for advances in the knowledge and comprehension of viticulture adoption and the grape domestication process. The same methodology may also be successful when applied to other plant remains, and provide important information about the history of domesticated plants. PMID:26901361
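    The discrimination step above can be approximated with standard tools. The sketch below pairs scikit-learn's linear discriminant analysis with a simple sequential feature selector (a stand-in for the paper's stepwise LDA) on hypothetical seed size/shape descriptors, so the feature values and accuracy are illustrative only.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Hypothetical morphometric features per charred seed (length, width, area,
# perimeter, roundness) and labels: 0 = wild (sylvestris), 1 = cultivated (vinifera).
rng = np.random.default_rng(3)
X_wild = rng.normal(loc=[5.0, 3.0, 11.0, 13.0, 0.80], scale=0.4, size=(60, 5))
X_cult = rng.normal(loc=[6.0, 3.3, 14.0, 15.0, 0.70], scale=0.4, size=(60, 5))
X = np.vstack([X_wild, X_cult])
y = np.array([0] * 60 + [1] * 60)

# Forward feature selection followed by LDA classification, evaluated by 5-fold CV.
model = make_pipeline(
    SequentialFeatureSelector(LinearDiscriminantAnalysis(),
                              n_features_to_select=3, direction='forward'),
    LinearDiscriminantAnalysis())
scores = cross_val_score(model, X, y, cv=5)
print(f"cross-validated accuracy: {scores.mean():.1%}")
```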

  4. Exploring the relationship between the engineering and physical sciences and the health and life sciences by advanced bibliometric methods.

    Science.gov (United States)

    Waltman, Ludo; van Raan, Anthony F J; Smart, Sue

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach to analyze the 'EPS-HLS interface' is based on term map visualizations of HLS research fields. We consider 16 clinical fields and five life science fields. On the basis of expert judgment, EPS research in these fields is studied by identifying EPS-related terms in the term maps. In the second approach, a large-scale citation-based network analysis is applied to publications from all fields of science. We work with about 22,000 clusters of publications, each representing a topic in the scientific literature. Citation relations are used to identify topics at the EPS-HLS interface. The two approaches complement each other. The advantages of working with textual data compensate for the limitations of working with citation relations and the other way around. An important advantage of working with textual data is in the in-depth qualitative insights it provides. Working with citation relations, on the other hand, yields many relevant quantitative statistics. We find that EPS research contributes to HLS developments mainly in the following five ways: new materials and their properties; chemical methods for analysis and molecular synthesis; imaging of parts of the body as well as of biomaterial surfaces; medical engineering mainly related to imaging, radiation therapy, signal processing technology, and other medical instrumentation; mathematical and statistical methods for data analysis. In our analysis, about 10% of all EPS and HLS publications are classified as being at the EPS-HLS interface. This percentage has remained more or less constant during the past decade.

  5. Advances in Alkenone Paleotemperature Proxies: Analytical Methods, Novel Structures and Haptophyte Species, Biosynthesis, New indices and Ecological Aspects

    Science.gov (United States)

    Huang, Y.; Longo, W. M.; Zheng, Y.; Richter, N.; Dillon, J. T.; Theroux, S.; D'Andrea, W. J.; Toney, J. L.; Wang, L.; Amaral-Zettler, L. A.

    2017-12-01

    Alkenones are mature, well-established paleo-sea surface temperature proxies that have been widely applied for more than three decades. However, recent advances across a broad range of alkenone-related topics at Brown University are inviting new paleoclimate and paleo-environmental applications for these classic biomarkers. In this presentation, I will summarize our progress in the following areas: (1) Discovery of a freshwater alkenone-producing haptophyte species and structural elucidation of novel alkenone structures unique to the species, performing in-situ temperature calibrations, and classifying alkenone-producing haptophytes into three groups based on molecular ecological approaches (with the new species belonging to Group I Isochrysidales); (2) A global survey of Group I haptophyte distributions and environmental conditions favoring the presence of this alga, as well as examples of using Group I alkenones for paleotemperature reconstructions; (3) New gas chromatographic columns that allow unprecedented resolution of alkenones and alkenoates and associated structural isomers, and development of a new suite of paleotemperature and paleoenvironmental proxies; (4) A new liquid chromatographic separation technique that allows efficient cleanup of alkenones and alkenoates (without the need for saponification) for subsequent coelution-free gas chromatographic analysis; (5) Novel structural features revealed by new analytical methods that now allow a comprehensive re-assessment of taxonomic features of various haptophyte species, with principal component analysis capable of fully resolving species biomarker distributions; (6) Development of UK37 double prime (UK37'') for Group II haptophytes (e.g., those occurring in saline lakes and estuaries), that differs from the traditional unsaturation indices used for SST reconstructions; (7) New assessment of how mixed inputs from different alkenone groups may affect SST reconstructions in marginal ocean environments and
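    For context, the traditional alkenone unsaturation index referred to in item (6) is based on the relative abundances of the di- and tri-unsaturated C37 alkenones; its most common form is shown below (the modified index introduced for Group II haptophytes is defined differently and is not reproduced here).

```latex
% Classic alkenone unsaturation index used for SST calibration:
U^{K'}_{37} = \frac{[\mathrm{C}_{37:2}]}{[\mathrm{C}_{37:2}] + [\mathrm{C}_{37:3}]}
```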

  6. Assessment of Crack Detection in Heavy-Walled Cast Stainless Steel Piping Welds Using Advanced Low-Frequency Ultrasonic Methods

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, Michael T.; Crawford, Susan L.; Cumblidge, Stephen E.; Denslow, Kayte M.; Diaz, Aaron A.; Doctor, Steven R.

    2007-03-01

    Studies conducted at the Pacific Northwest National Laboratory in Richland, Washington, have focused on assessing the effectiveness and reliability of novel approaches to nondestructive examination (NDE) for inspecting coarse-grained, cast stainless steel reactor components. The primary objective of this work is to provide information to the U.S. Nuclear Regulatory Commission on the effectiveness and reliability of advanced NDE methods as related to the inservice inspection of safety-related components in pressurized water reactors (PWRs). This report provides progress, recent developments, and results from an assessment of low frequency ultrasonic testing (UT) for detection of inside surface-breaking cracks in cast stainless steel reactor piping weldments as applied from the outside surface of the components. Vintage centrifugally cast stainless steel piping segments were examined to assess the capability of low-frequency UT to adequately penetrate challenging microstructures and determine acoustic propagation limitations or conditions that may interfere with reliable flaw detection. In addition, welded specimens containing mechanical and thermal fatigue cracks were examined. The specimens were fabricated using vintage centrifugally cast and statically cast stainless steel materials, which are typical of configurations installed in PWR primary coolant circuits. Ultrasonic studies on the vintage centrifugally cast stainless steel piping segments were conducted with a 400-kHz synthetic aperture focusing technique and phased array technology applied at 500 kHz, 750 kHz, and 1.0 MHz. Flaw detection and characterization on the welded specimens was performed with the phased array method operating at the frequencies stated above. This report documents the methodologies used and provides results from laboratory studies to assess baseline material noise, crack detection, and length-sizing capability for low-frequency UT in cast stainless steel piping.
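    The synthetic aperture focusing technique mentioned above is, at its core, a delay-and-sum reconstruction: each image point is formed by summing A-scan samples at the round-trip delays from every transducer position. A compact, generic sketch for a monostatic scan (hypothetical variable names; not PNNL's processing code) is:

```python
import numpy as np

def saft_image(ascans, x_elems, fs, c, x_grid, z_grid):
    """Delay-and-sum SAFT reconstruction for a monostatic (pulse-echo) scan along x.

    ascans : (n_positions, n_samples) time traces, one per scan position
    x_elems: transducer x-positions (m); fs: sampling rate (Hz); c: sound speed (m/s)
    """
    n_pos, n_samp = ascans.shape
    image = np.zeros((len(z_grid), len(x_grid)))
    for iz, z in enumerate(z_grid):
        for ix, x in enumerate(x_grid):
            # Round-trip travel time from each scan position to the image point and back.
            dist = np.hypot(x_elems - x, z)
            idx = np.rint(2.0 * dist / c * fs).astype(int)
            ok = idx < n_samp
            image[iz, ix] = ascans[np.arange(n_pos)[ok], idx[ok]].sum()
    return image
```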

  7. Improvement in treatment results for hypopharyngeal cancer through the advance of treatment methods. The usefulness of team practice

    International Nuclear Information System (INIS)

    Umeno, Hirohito; Chijiwa, Hideki; Sakamoto, Kikuo; Nakashima, Tadashi; Sueyoshi, Susumu; Fujita, Hiromasa; Inoue, Yojiro; Kiyokawa, Kensuke; Mori, Kazunori

    2006-01-01

    Treatment results were analyzed in 392 hypopharyngeal cancer patients who were radically treated at Kurume University Hospital between 1960 and 2003. In the analysis, they were divided into three chronological groups: the first consisting of 37 patients treated between 1960 and 1970, the second of 122 patients treated between 1971 and 1988, and the third of 233 patients treated between 1989 and 2003. The cause-specific 5-year survival rate for the second group was 38%. In contrast, the survival rate for the third group rose to 67%. The first choice of treatment for early hypopharyngeal cancer in the first and second groups was partial pharyngectomy. However, that of the third group was CO2 laser resection or radiotherapy followed by laser resection. The cause-specific 5-year survival rate with laser surgery was 86%. Over the years, the method of reconstruction after total resection for advanced hypopharyngeal cancer has changed. At present, free jejunum reconstructive surgery after total pharyngo-laryngo-esophagectomy is considered to be a safe and most stable method, because it enables resection of the primary lesion with a sufficient margin as required. Failure of reconstruction by jejunum graft was detected in only three of 137 patients (2%) who received total pharyngo-laryngo-esophagectomy. In patients who received free jejunum reconstructive surgery, the rate of death due to the primary lesion or metastatic lymph nodes decreased as a result of resecting the primary lesion with a sufficient margin as required, bilateral neck dissection, bilateral Rouviere and paratracheal node dissection, and post-operative radiation. The findings indicate that treatment results for hypopharyngeal cancer improved dramatically by team practice involving a head and neck surgeon, surgeon, plastic surgeon and radiation oncologist. (author)

  8. Bioinformatics Methods and Tools to Advance Clinical Care. Findings from the Yearbook 2015 Section on Bioinformatics and Translational Informatics.

    Science.gov (United States)

    Soualmia, L F; Lecroq, T

    2015-08-13

    To summarize excellent current research in the field of Bioinformatics and Translational Informatics with application in the health domain and clinical care. We provide a synopsis of the articles selected for the IMIA Yearbook 2015, from which we attempt to derive a synthetic overview of current and future activities in the field. As last year, a first step of selection was performed by querying MEDLINE with a list of MeSH descriptors completed by a list of terms adapted to the section. Each section editor evaluated separately the set of 1,594 articles and the evaluation results were merged for retaining 15 articles for peer-review. The selection and evaluation process of this Yearbook's section on Bioinformatics and Translational Informatics yielded four excellent articles regarding data management and genome medicine that are mainly tool-based papers. In the first article, the authors present PPISURV, a tool for uncovering the role of specific genes in cancer survival outcome. The second article describes the classifier PredictSNP, which combines six performing tools for predicting disease-related mutations. In the third article, by presenting a high-coverage map of the human proteome using high resolution mass spectrometry, the authors highlight the need for using mass spectrometry to complement genome annotation. The fourth article is also related to patient survival and decision support. The authors present data mining methods applied to large-scale datasets of past transplants. The objective is to identify chances of survival. The current research activities still attest to the continuous convergence of Bioinformatics and Medical Informatics, with a focus this year on dedicated tools and methods to advance clinical care. Indeed, there is a need for powerful tools for managing and interpreting complex, large-scale genomic and biological datasets, but also a need for user-friendly tools developed for the clinicians in their daily practice. All the recent research and

  9. Contribution of developing advanced engineering methods in interdisciplinary studying the piston rings from 1.6 spark ignited Ford engine at Technical University of Cluj-Napoca

    Science.gov (United States)

    Cherecheş, Ioan-Aurel; Borzan, Adela-Ioana; Băldean, Doru-Laurean

    2017-10-01

    The study of the construction and wear processes of piston rings and other significant components of internal combustion engines regularly yields creative and useful optimization ideas, in both the design and manufacturing phases. The main objective of the present paper is to carry out interdisciplinary research using advanced methods for the evaluation of the piston rings of a common road vehicle, a Ford Focus FYDD. The specific objectives are a theoretical study of advanced analysis methods for piston-ring evaluation and applied research carried out at the Technical University of Cluj-Napoca on a motor vehicle undergoing repair.

  10. Cost-Benefit Analysis for the Advanced Near Net Shape Technology (ANNST) Method for Fabricating Stiffened Cylinders

    Science.gov (United States)

    Ivanco, Marie L.; Domack, Marcia S.; Stoner, Mary Cecilia; Hehir, Austin R.

    2016-01-01

    Low Technology Readiness Levels (TRLs) and high levels of uncertainty make it challenging to develop cost estimates of new technologies in the R&D phase. It is however essential for NASA to understand the costs and benefits associated with novel concepts, in order to prioritize research investments and evaluate the potential for technology transfer and commercialization. This paper proposes a framework to perform a cost-benefit analysis of a technology in the R&D phase. This framework was developed and used to assess the Advanced Near Net Shape Technology (ANNST) manufacturing process for fabricating integrally stiffened cylinders. The ANNST method was compared with the conventional multi-piece metallic construction and composite processes for fabricating integrally stiffened cylinders. Following the definition of a case study for a cryogenic tank cylinder of specified geometry, data was gathered through interviews with Subject Matter Experts (SMEs), with particular focus placed on production costs and process complexity. This data served as the basis to produce process flowcharts and timelines, mass estimates, and rough order-of-magnitude cost and schedule estimates. The scalability of the results was subsequently investigated to understand the variability of the results based on tank size. Lastly, once costs and benefits were identified, the Analytic Hierarchy Process (AHP) was used to assess the relative value of these achieved benefits for potential stakeholders. These preliminary, rough order-of-magnitude results predict a 46 to 58 percent reduction in production costs and a 7-percent reduction in weight over the conventional metallic manufacturing technique used in this study for comparison. Compared to the composite manufacturing technique, these results predict cost savings of 35 to 58 percent; however, the ANNST concept was heavier. In this study, the predicted return on investment of equipment required for the ANNST method was ten cryogenic tank barrels

  11. Interactive 3D imaging technologies: application in advanced methods of jaw bone reconstruction using stem cells/pre-osteoblasts in oral surgery.

    Science.gov (United States)

    Wojtowicz, Andrzej; Jodko, Monika; Perek, Jan; Popowski, Wojciech

    2014-09-01

    Cone beam computed tomography has created a revolution in maxillofacial imaging, facilitating the transition of diagnosis from 2D to 3D and expanding the role of imaging from diagnosis to actual treatment planning. There are many varieties of cone beam computed tomography-related software available, from basic DICOM viewers to very advanced planning modules, such as InVivo Anatomage and SimPlant (Materialise Dental). Through the use of these programs, scans can be processed into a three-dimensional, high-quality simulation which enables planning of the overall treatment. In this article, methods of visualization are demonstrated and compared using two example cases of reconstruction of advanced jaw bone defects with tissue engineering. Advanced imaging methods allow one to plan a minimally invasive treatment, including assessment of the bone defect's shape and localization, planning of the surgical approach, and individual graft preparation.

  12. Advancing methods for health priority setting practice through the contribution of systems theory: Lessons from a case study in Ethiopia.

    Science.gov (United States)

    Petricca, Kadia; Bekele, Asfaw; Berta, Whitney; Gibson, Jennifer; Pain, Clare

    2017-12-09

    Setting priorities for health services is a complex and value laden process. Over the past twenty years, there has been considerable scholarly attention paid to strengthening fairness and legitimacy using the prominent ethical framework, Accountability for Reasonableness (A4R). A variety of case studies applying A4R have advanced our conceptual understanding of procedural fairness, and have highlighted the significance of context through its application. There is a paucity of research, however, that rigorously examines how and to what extent context influences health priority setting processes and the establishment of procedural fairness. We argue here that to study context rigorously requires taking a holistic view of the system by examining the dynamics and interrelationships within it. Using the Transformative Systems Change Framework (TSCF), this investigation sought to examine the influence of system factors on priority setting practice and procedural fairness. A qualitative case study of Ethiopian district health planning was undertaken in 2010 and 2011. Methods included 58 qualitative interviews with decision makers, participant observation, and document analysis. Data analysis followed in three phases: i) an inductive analysis of district health priority setting to highlight experiences across each of the three districts selected, ii) deductive analysis applying A4R and the TSCF independently; and iii) a synthesis of concepts of priority setting practice and procedural fairness within a broader, theoretical understanding of the system. Through the application of the TSCF, a nuanced understanding of priority setting practice is revealed that situates this process within a system of interdependent components that include: norms, operations, regulations, and resources. This paper offers a practical guide attuned to system features influencing the design, implementation, and sustainability of greater fairness in health priority setting practice. Copyright

  13. Reactor physics methods, models, and applications used to support the conceptual design of the Advanced Neutron Source

    International Nuclear Information System (INIS)

    Gehin, J.C.; Worley, B.A.; Renier, J.P.; Wemple, C.A.; Jahshan, S.N.; Ryskammp, J.M.

    1995-08-01

    This report summarizes the neutronics analysis performed during 1991 and 1992 in support of characterization of the conceptual design of the Advanced Neutron Source (ANS). The methods used in the analysis, parametric studies, and key results supporting the design and safety evaluations of the conceptual design are presented. The analysis approach used during the conceptual design phase followed the same approach used in early ANS evaluations: (1) a strong reliance on Monte Carlo theory for beginning-of-cycle reactor performance calculations and (2) a reliance on few-group diffusion theory for reactor fuel cycle analysis and for evaluation of reactor performance at specific time steps over the fuel cycle. The Monte Carlo analysis was carried out using the MCNP continuous-energy code, and the few-group diffusion theory calculations were performed using the VENTURE and PDQ code systems. The MCNP code was used primarily for its capability to model the reflector components in realistic geometries as well as the inherent circumvention of cross-section processing requirements and use of energy-collapsed cross sections. The MCNP code was used for evaluations of reflector component reactivity effects and of heat loads in these components. The code was also used as a benchmark comparison against the diffusion-theory estimates of key reactor parameters such as region fluxes, control rod worths, reactivity coefficients, and material worths. The VENTURE and PDQ codes were used to provide independent evaluations of burnup effects, power distributions, and small perturbation worths. The performance and safety calculations performed over the subject time period are summarized, and key results are provided. The key results include flux and power distributions over the fuel cycle, silicon production rates, fuel burnup rates, component reactivities, control rod worths, component heat loads, shutdown reactivity margins, reactivity coefficients, and isotope production rates

  14. A comparative study of two digestion methods employed for the determination of boron in ferroboron used as an advanced shielding material

    International Nuclear Information System (INIS)

    Kamble, Granthali S.; Manisha, V.; Venkatesh, K.

    2015-01-01

    Shielding of the nuclear reactor core is an important requirement of fast reactors. An important objective of future Fast Breeder Reactors (FBRs) is to reduce the volume of shields. A large number of materials have been considered for use to reduce the neutron flux to acceptable levels. A shield material which brings down the energy of neutrons by elastic and inelastic scattering along with absorption will be more effective. Ferroboron is identified as one of the advanced shielding materials considered for use in the future FBRs planned to be constructed in India. Ferroboron is an economical and indigenously available material which qualifies as a promising shield material through literature survey and scoping calculations. Experiments have been conducted in the KAMINI reactor to understand the effectiveness of the prospective shield material ferroboron as an in-core shield material for future FBRs. The ferroboron used in these experiments contained 11.8% and 15% of boron. Precise determination of the boron content in these ferroboron samples is very important to determine its effectiveness as a shield material. In this work a comparative study was carried out to determine the boron content in ferroboron samples. In the first method the sample was treated with incremental amounts of nitric acid under reflux (to prevent vigorous reaction and volatilisation of boron). The solution was gradually heated and then filtered through a Whatman filter paper no. 41. The undissolved ferroboron residue collected on the filter paper after filtration is transferred to a platinum crucible, mixed with sodium carbonate and ashed. The crucible is placed over a burner for 1 h to fuse the contents. The fused mass is leached in dilute hydrochloric acid, added to the nitric acid filtrate and made up to a pre-determined volume

  15. Grid-Free LES 3D Vortex Method for the Simulation of Turbulent Flows Over Advanced Lifting Surfaces, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Turbulent flows associated with advanced aerodynamic designs represent a considerable challenge for accurate prediction. For example, the flow past low-speed wings...

  16. Advanced Instrumentation and Control Methods for Small and Medium Reactors with IRIS Demonstration. Final Report. Volume 1

    International Nuclear Information System (INIS)

    Hines, J. Wesley; Upadhyaya, Belle R.; Doster, J. Michael; Edwards, Robert M.; Lewis, Kenneth D.; Turinsky, Paul; Coble, Jamie

    2011-01-01

    Development and deployment of small-scale nuclear power reactors and their maintenance, monitoring, and control are part of the mission under the Small Modular Reactor (SMR) program. The objectives of this NERI-consortium research project are to investigate, develop, and validate advanced methods for sensing, controlling, monitoring, diagnosis, and prognosis of these reactors, and to demonstrate the methods with application to one of the proposed integral pressurized water reactors (IPWR). For this project, the IPWR design by Westinghouse, the International Reactor Innovative and Secure (IRIS), has been used to demonstrate the techniques developed under this project. The research focuses on three topical areas with the following objectives. Objective 1 - Develop and apply simulation capabilities and sensitivity/uncertainty analysis methods to address sensor deployment analysis and small grid stability issues. Objective 2 - Develop and test an autonomous and fault-tolerant control architecture and apply to the IRIS system and an experimental flow control loop, with extensions to multiple reactor modules, nuclear desalination, and optimal sensor placement strategy. Objective 3 - Develop and test an integrated monitoring, diagnosis, and prognosis system for SMRs using the IRIS as a test platform, and integrate process and equipment monitoring (PEM) and process and equipment prognostics (PEP) toolboxes. The research tasks are focused on meeting the unique needs of reactors that may be deployed to remote locations or to developing countries with limited support infrastructure. These applications will require smaller, robust reactor designs with advanced technologies for sensors, instrumentation, and control. An excellent overview of SMRs is described in an article by Ingersoll (2009). The article refers to these as deliberately small reactors. Most of these have modular characteristics, with multiple units deployed at the same plant site. Additionally, the topics focus

  17. A Takagi–Sugeno fuzzy power-distribution method for a prototypical advanced reactor considering pump degradation

    OpenAIRE

    Yue Yuan; Jamie Coble

    2017-01-01

    Advanced reactor designs often feature longer operating cycles between refueling and new concepts of operation beyond traditional baseload electricity production. Owing to this increased complexity, traditional proportional–integral control may not be sufficient across all potential operating regimes. The prototypical advanced reactor (PAR) design features two independent reactor modules, each connected to a single dedicated steam generator that feeds a common balance of plant for electricity...

  18. "It's the Method, Stupid." Interrelations between Methodological and Theoretical Advances: The Example of Comparing Higher Education Systems Internationally

    Science.gov (United States)

    Hoelscher, Michael

    2017-01-01

    This article argues that strong interrelations between methodological and theoretical advances exist. Progress in, especially comparative, methods may have important impacts on theory evaluation. By using the example of the "Varieties of Capitalism" approach and an international comparison of higher education systems, it can be shown…

  19. Earth observing data and methods for advancing water harvesting technologies in the semi-arid rain-fed environments of India

    Science.gov (United States)

    Sharma, C.; Thenkabail, P.; Sharma, R. R.

    2011-01-01

    The paper develops approaches and methods of modeling and mapping land and water productivity of rain-fed crops in semi-arid environments of India using hyperspectral, hyperspatial, and advanced multispectral remote sensing data and linking the same to field-plot data and climate station data. The overarching goal is to provide information to advance water harvesting technologies in the agricultural croplands of the semi-arid environments of India by conducting research in a representative pilot site in Jodhpur, Rajasthan, India. © 2011 IEEE.

  20. Exploring the relationship between the engineering and physical sciences and the health and life sciences by advanced bibliometric methods

    NARCIS (Netherlands)

    Waltman, L.R.; Van Raan, A.F.J.; Smart, S.

    2014-01-01

    We investigate the extent to which advances in the health and life sciences (HLS) are dependent on research in the engineering and physical sciences (EPS), particularly physics, chemistry, mathematics, and engineering. The analysis combines two different bibliometric approaches. The first approach

  1. Probe diagnostics in the far scrape-off layer plasma of Korea Superconducting Tokamak Advanced Research tokamak using a sideband harmonic method

    International Nuclear Information System (INIS)

    Kim, Dong-Hwan; Hong, Suk-Ho; Park, Il-Seo; Lee, Hyo-Chang; Kang, Hyun-Ju; Chung, Chin-Wook

    2015-01-01

    Plasma characteristics in the far scrape-off layer region of a tokamak play a crucial role in stable plasma operation and its sustainability. Because the facility is very large, the electrical diagnostic systems used to measure plasma properties require extremely long cables, which result in large stray currents. To overcome this problem, a sideband harmonic method was applied to the Korea Superconducting Tokamak Advanced Research tokamak plasma. The sideband method allows the measurement of the electron temperature and the plasma density without the effect of the stray current. The measured plasma densities are compared with those from the interferometer, and the results demonstrate the reliability of the method

  2. Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales

    Energy Technology Data Exchange (ETDEWEB)

    Kollias, Pavlos [McGill Univ., Montreal, QC (Canada)]

    2016-09-06

    This is the final report for DE-SC0007096 - Advancing Clouds Lifecycle Representation in Numerical Models Using Innovative Analysis Methods that Bridge ARM Observations and Models Over a Breadth of Scales - PI: Pavlos Kollias. The report outlines the main findings of the research conducted under the aforementioned award in the area of cloud research, from the cloud scale (10-100 m) to the mesoscale (20-50 km).

  3. Advanced computational methods for the assessment of reactor core behaviour during reactivity initiated accidents. Final report; Fortschrittliche Rechenmethoden zum Kernverhalten bei Reaktivitaetsstoerfaellen. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Pautz, A.; Perin, Y.; Pasichnyk, I.; Velkov, K.; Zwermann, W.; Seubert, A.; Klein, M.; Gallner, L.; Krzycacz-Hausmann, B.

    2012-05-15

    The document at hand serves as the final report for the reactor safety research project RS1183 ''Advanced Computational Methods for the Assessment of Reactor Core Behavior During Reactivity-Initiated Accidents''. The work performed in the framework of this project was dedicated to the development, validation and application of advanced computational methods for the simulation of transients and accidents of nuclear installations. These simulation tools describe in particular the behavior of the reactor core (with respect to neutronics, thermal-hydraulics and thermal mechanics) at a very high level of detail. The overall goal of this project was the deployment of a modern nuclear computational chain which provides, besides advanced 3D tools for coupled neutronics/ thermal-hydraulics full core calculations, also appropriate tools for the generation of multi-group cross sections and Monte Carlo models for the verification of the individual calculational steps. This computational chain shall primarily be deployed for light water reactors (LWR), but should beyond that also be applicable for innovative reactor concepts. Thus, validation on computational benchmarks and critical experiments was of paramount importance. Finally, appropriate methods for uncertainty and sensitivity analysis were to be integrated into the computational framework, in order to assess and quantify the uncertainties due to insufficient knowledge of data, as well as due to methodological aspects.

  4. Research and development project of production method of advanced chemicals by utilizing marine organisms. Kokino kagaku seihin nado seizoho (kaiyo seibutsu katsuyo) no kenkyu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    1990-10-01

    This report presents the fiscal 1989 results of the "Research and development project of production method of advanced chemicals by utilizing marine organisms" promoted by NEDO. After marine organisms were sampled in warm-current and subtropical sea areas, aerobic bacteria, yeasts, algae and invertebrates were separated from one another. A separation method for microorganisms inhabiting cold areas was also established through their separate culture. The amounts and species of zooplankton and phytoplankton in tropical sea areas were surveyed to settle on organism sampling plans. An analysis method for lipids in marine microorganisms was established by applying conventional chemical analysis methods, and the features of their constitutive fatty acids were clarified. A screening method for adhesive microorganisms with a protein-decomposing function was established, and such bacteria could be separated. Patents and documents on equipment related to the sampling, separation, culture and preservation of marine organisms were surveyed, and a basic design of a remote-controlled organism sampler was produced.

  5. A Takagi–Sugeno fuzzy power-distribution method for a prototypical advanced reactor considering pump degradation

    Directory of Open Access Journals (Sweden)

    Yue Yuan

    2017-08-01

    Advanced reactor designs often feature longer operating cycles between refueling and new concepts of operation beyond traditional baseload electricity production. Owing to this increased complexity, traditional proportional–integral control may not be sufficient across all potential operating regimes. The prototypical advanced reactor (PAR) design features two independent reactor modules, each connected to a single dedicated steam generator that feeds a common balance of plant for electricity generation and process heat applications. In the current research, the PAR is expected to operate in a load-following manner to produce electricity to meet grid demand over a 24-hour period. Over the operational lifetime of the PAR system, primary and intermediate sodium pumps are expected to degrade in performance. The independent operation of the two reactor modules in the PAR may allow the system to continue operating under degraded pump performance by shifting the power production between reactor modules in order to meet overall load demands. This paper proposes a Takagi–Sugeno (T–S) fuzzy logic-based power distribution system. Two T–S fuzzy power distribution controllers have been designed and tested. Simulation shows that the devised T–S fuzzy controllers provide improved performance over traditional controls during daily load-following operation under different levels of pump degradation.

  6. A Takagi-Sugeno fuzzy power-distribution method for a prototypical advanced reactor considering pump degradation

    Energy Technology Data Exchange (ETDEWEB)

    Yuan, Yue [Institute of Nuclear and New Energy Technology, Tsinghua University, Collaborative Innovation Center of Advanced Nuclear Energy Technology, Key Laboratory of Advanced Reactor Engineering and Safety of Ministry of Education, Beijing (China); Coble, Jamie [Dept. of Nuclear Engineering, University of Tennessee, Knoxville (United States)

    2017-08-15

    Advanced reactor designs often feature longer operating cycles between refueling and new concepts of operation beyond traditional baseload electricity production. Owing to this increased complexity, traditional proportional–integral control may not be sufficient across all potential operating regimes. The prototypical advanced reactor (PAR) design features two independent reactor modules, each connected to a single dedicated steam generator that feeds a common balance of plant for electricity generation and process heat applications. In the current research, the PAR is expected to operate in a load-following manner to produce electricity to meet grid demand over a 24-hour period. Over the operational lifetime of the PAR system, primary and intermediate sodium pumps are expected to degrade in performance. The independent operation of the two reactor modules in the PAR may allow the system to continue operating under degraded pump performance by shifting the power production between reactor modules in order to meet overall load demands. This paper proposes a Takagi–Sugeno (T–S) fuzzy logic-based power distribution system. Two T–S fuzzy power distribution controllers have been designed and tested. Simulation shows that the devised T–S fuzzy controllers provide improved performance over traditional controls during daily load-following operation under different levels of pump degradation.
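
    The record above does not reproduce the controller equations; as a rough, hedged illustration of how a Takagi–Sugeno rule base can split a total load demand between two reactor modules according to (hypothetical) pump-health indicators, a minimal sketch is given below. All membership functions, rule consequents and variable names are illustrative assumptions, not the authors' design.

        # Minimal Takagi-Sugeno (T-S) fuzzy power-split sketch (illustrative only).
        # Inputs: pump "health" h1, h2 in [0, 1] for modules 1 and 2 (1.0 = no degradation).
        # Output: fraction of the total load demand assigned to module 1.

        def mu_low(h):
            """Triangular membership for 'degraded' pump health (assumed shape)."""
            return max(0.0, min(1.0, (0.7 - h) / 0.7))

        def mu_high(h):
            """Membership for 'healthy' pump (complement of 'degraded')."""
            return 1.0 - mu_low(h)

        def ts_power_split(h1, h2):
            # Rule consequents are constant fractions for module 1 (assumed values):
            # R1: h1 healthy,  h2 healthy  -> 0.5   R2: h1 healthy,  h2 degraded -> 0.7
            # R3: h1 degraded, h2 healthy  -> 0.3   R4: h1 degraded, h2 degraded -> 0.5
            rules = [
                (mu_high(h1) * mu_high(h2), 0.5),
                (mu_high(h1) * mu_low(h2),  0.7),
                (mu_low(h1)  * mu_high(h2), 0.3),
                (mu_low(h1)  * mu_low(h2),  0.5),
            ]
            w_total = sum(w for w, _ in rules)
            return sum(w * f for w, f in rules) / w_total  # weighted T-S defuzzification

        demand = 600.0  # MW(e), assumed total grid demand
        share1 = ts_power_split(h1=0.95, h2=0.60)
        print(f"module 1: {share1 * demand:.1f} MW, module 2: {(1 - share1) * demand:.1f} MW")

    With module 2 degraded, the weighted rule output shifts load toward module 1, which is the qualitative behaviour the paper describes; the actual controllers use the PAR plant model rather than constants.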

  7. A Fast Numerical Method for the Calculation of the Equilibrium Isotopic Composition of a Transmutation System in an Advanced Fuel Cycle

    Directory of Open Access Journals (Sweden)

    F. Álvarez-Velarde

    2012-01-01

    A fast numerical method for the calculation, in a zero-dimensional approach, of the equilibrium isotopic composition of an iteratively used transmutation system in an advanced fuel cycle, based on the Banach fixed point theorem, is described in this paper. The method divides the fuel cycle into successive stages: fuel fabrication, storage, irradiation inside the transmutation system, cooling, reprocessing, and incorporation of the external material into the new fresh fuel. The change of the fuel isotopic composition, represented by an isotope vector, is described in a matrix formulation. The resulting matrix equations are solved using direct methods with arbitrary precision arithmetic. The method has been successfully applied to a double-strata fuel cycle with light water reactors and accelerator-driven subcritical systems. After comparison with the results of the EVOLCODE 2.0 burn-up code, the observed differences are about a few percent in the mass estimates of the main actinides.
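
    The cycle described above acts on an isotope vector through a sequence of stage matrices, and the equilibrium composition is the fixed point of one full cycle. A minimal sketch of that idea is shown below; the two-isotope vector, the stage matrices and the external feed are invented placeholders (the real method builds them from burn-up data and uses arbitrary-precision direct solves rather than plain iteration).

        import numpy as np

        # Sketch of a Banach fixed-point iteration for the equilibrium isotopic
        # composition of a multi-stage fuel cycle (all matrices/vectors are invented).

        fabrication  = np.array([[0.99, 0.00], [0.00, 0.98]])
        irradiation  = np.array([[0.80, 0.05], [0.10, 0.85]])
        reprocessing = np.array([[0.995, 0.0], [0.0, 0.990]])
        external_feed = np.array([0.20, 0.02])      # fresh material added each cycle

        def one_cycle(n):
            """Apply the cycle stages to the isotope vector n and add external feed."""
            return reprocessing @ irradiation @ fabrication @ n + external_feed

        n = np.array([1.0, 0.1])                    # arbitrary initial composition
        for k in range(200):
            n_next = one_cycle(n)
            if np.linalg.norm(n_next - n) < 1e-12:  # contraction => convergence
                break
            n = n_next
        print(f"equilibrium composition after {k} iterations: {n_next}")

    Because the composite cycle map is a contraction here, the iteration converges to a unique fixed point regardless of the starting vector, which is the property the Banach fixed point theorem guarantees.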

  8. Experimental tests and qualification of analytical methods to address thermohydraulic phenomena in advanced water cooled reactors. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2000-05-01

    Worldwide there is considerable experience in nuclear power technology, especially in water cooled reactor technology. Of the operating plants, in September 1998, 346 were light water reactors (LWRs) totalling 306 GW(e) and 29 were heavy water reactors (HWRs) totalling 15 GW(e). The accumulated experience and lessons learned from these plants are being incorporated into new advanced reactor designs. Utility requirements documents have been formulated to guide these design activities by incorporating this experience, and results from research and development programmes, with the aim of reducing costs and licensing uncertainties by establishing the technical bases for the new designs. Common goals for advanced designs are high availability, user-friendly features, competitive economics and compliance with internationally recognized safety objectives. Large water cooled reactors with power outputs of 1300 MW(e) and above, which possess inherent safety characteristics (e.g. negative Doppler moderator temperature coefficients, and negative moderator void coefficient) and incorporate proven, active engineered systems to accomplish safety functions are being developed. Other designs with power outputs from, for example, 220 MW(e) up to about 1300 MW(e) which also possess inherent safety characteristics and which place more emphasis on utilization of passive safety systems are being developed. Passive systems are based on natural forces and phenomena such as natural convection and gravity, making safety functions less dependent on active systems and components like pumps and diesel generators. In some cases, further experimental tests for the thermohydraulic conditions of interest in advanced designs can provide improved understanding of the phenomena. Further, analytical methods to predict reactor thermohydraulic behaviour can be qualified for use by comparison with the experimental results. These activities should ultimately result in more economical designs. The

  9. Recent Advances in the Analysis of Macromolecular Interactions Using the Matrix-Free Method of Sedimentation in the Analytical Ultracentrifuge

    Directory of Open Access Journals (Sweden)

    Stephen E. Harding

    2015-03-01

    Sedimentation in the analytical ultracentrifuge is a matrix-free solution technique with no immobilisation, columns, or membranes required and can be used to study self-association and complex or “hetero”-interactions, stoichiometry, reversibility and interaction strength of a wide variety of macromolecular types and across a very large dynamic range (dissociation constants from 10⁻¹² M to 10⁻¹ M). We extend an earlier review specifically highlighting advances in sedimentation velocity and sedimentation equilibrium in the analytical ultracentrifuge applied to protein interactions and mucoadhesion and review recent applications in protein self-association (tetanus toxoid, agrin), protein-like carbohydrate association (aminocelluloses), carbohydrate-protein interactions (polysaccharide-gliadin), nucleic acid-protein (G-duplexes), nucleic acid-carbohydrate (DNA-chitosan) and finally carbohydrate-carbohydrate (xanthan-chitosan) and ternary polysaccharide complex interactions.

  10. The Status and Promise of Advanced M&V: An Overview of “M&V 2.0” Methods, Tools, and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Franconi, Ellen [Rocky Mountain Inst., Boulder, CO (United States); Gee, Matt [Univ. of Chicago, IL (United States); Goldberg, Miriam [DNV GL, Oslo (Norway); Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Guiterman, Tim [EnergySavvy, Seattle, WA (United States); Li, Michael [U.S. Department of Energy (DOE), Baltimore, MD (United States); Smith, Brian Arthur [Pacific Gas and Electric, San Francisco, CA (United States)

    2017-04-11

    Advanced measurement and verification (M&V) of energy efficiency savings, often referred to as M&V 2.0 or advanced M&V, is currently an object of much industry attention. Thus far, however, there has been a lack of clarity about what techniques M&V 2.0 includes, how those techniques differ from traditional approaches, what the key considerations are for their use, and what value propositions M&V 2.0 presents to different stakeholders. The objective of this paper is to provide background information and frame key discussion points related to advanced M&V. The paper identifies the benefits, methods, and requirements of advanced M&V and outlines key technical issues for applying these methods. It presents an overview of the distinguishing elements of M&V 2.0 tools and of how the industry is addressing needs for tool testing, consistency, and standardization, and it identifies opportunities for collaboration. In this paper, we consider two key features of M&V 2.0: (1) automated analytics that can provide ongoing, near-real-time savings estimates, and (2) increased data granularity in terms of frequency, volume, or end-use detail. Greater data granularity for large numbers of customers, such as that derived from comprehensive implementation of advanced metering infrastructure (AMI) systems, leads to very large data volumes. This drives interest in automated processing systems. It is worth noting, however, that automated processing can provide value even when applied to less granular data, such as monthly consumption data series. Likewise, more granular data, such as interval or end-use data, delivers value with or without automated processing, provided the processing is manageable. But it is the combination of greater data detail with automated processing that offers the greatest opportunity for value. Using M&V methods that capture load shapes together with automated processing can determine savings in near-real time to provide stakeholders with more timely and
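
    As a hedged illustration of the kind of automated analytics the paper refers to, the sketch below fits a simple temperature and hour-of-week baseline regression to pre-retrofit interval data and estimates savings as baseline-predicted minus metered use in the post period. The model form, variable names and synthetic data are assumptions for illustration, not a specification from the paper.

        import numpy as np

        # Hedged sketch of an automated M&V baseline: least squares on outdoor
        # temperature plus 168 hour-of-week indicator variables (names invented).

        def design_matrix(temp, hour_of_week):
            return np.column_stack([temp, np.eye(168)[hour_of_week]])

        def fit_baseline(temp, hour_of_week, load):
            beta, *_ = np.linalg.lstsq(design_matrix(temp, hour_of_week), load, rcond=None)
            return beta

        rng = np.random.default_rng(0)
        hours = np.arange(24 * 30)                       # one month of hourly intervals
        how = hours % 168
        temp_pre = 15 + 10 * np.sin(hours / 24)
        load_pre = 50 + 2.0 * temp_pre + 5 * (how < 120) + rng.normal(0, 1, hours.size)

        beta = fit_baseline(temp_pre, how, load_pre)

        # Post-retrofit period: metered load falls below the baseline prediction.
        load_post = load_pre - 8.0 + rng.normal(0, 1, hours.size)
        savings = design_matrix(temp_pre, how) @ beta - load_post
        print(f"estimated savings: {savings.sum():.0f} kWh over the post period")

    In practice such a baseline would be refit and screened automatically for large customer populations, which is where the combination of granular AMI data and automated processing pays off.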

  11. WE-EF-303-04: An Advanced Image Processing Method to Improve the Spatial Resolution of Proton Radiographies

    Energy Technology Data Exchange (ETDEWEB)

    Rinaldi, I [Lyon 1 University and CNRS/IN2P3, UMR 5822, Villeurbanne (France); Ludwig Maximilian University, Garching, DE (Germany); Heidelberg University Hospital, Heidelberg, DE (Germany); Parodi, K [Ludwig Maximilian University, Garching, DE (Germany); Heidelberg University Hospital, Heidelberg, DE (Germany); Krah, N [Heidelberg Collaboratory for Image Processing, Heidelberg, DE (Germany)

    2015-06-15

    Purpose: We present an optimization method to improve the spatial resolution and the water equivalent thickness accuracy of proton radiographies. Methods: The method is designed for imaging systems measuring only the residual range of protons without relying on tracker detectors to determine the beam trajectory before and after the target. Specifically, the method was used for an imaging set-up consisting of a stack of 61 parallel-plate ionization chambers (PPIC) working as a range telescope. The method uses a decomposition approach of the residual range signal measured by the PPIC and constructs subimages with small-size pixels geometrically rearranged and appropriately averaged to be merged into a final single radiography. The method was tested using Monte Carlo simulated and experimental proton radiographies of a PMMA step phantom and an anthropomorphic head phantom. Results: For the step phantom, the effective spatial resolution was found to be 4 and 3 times higher than the nominal resolution for the simulated and experimental radiographies, respectively. For the head phantom, a gamma index was calculated to quantify the conformity of the simulated proton radiographies with a digitally reconstructed X-ray radiography convolved with a Gaussian kernel equal to the proton beam spot-size. For DTA=2.5 mm and RD=2.5%, the passing ratio was 100%/85% for the optimized/non-optimized case, respectively. An extension of the method allows reducing the dose given to the patient during radiography acquisition. We show that despite a dose reduction of 25 times (leading to a dose of 0.016 mGy for the current imaging set-up), the image quality of the optimized radiographies remains fairly unaffected for both the simulated and experimental results. Conclusion: The optimization method leads to a significant increase of the spatial resolution, allowing the recovery of image details that are unresolved in non-optimized radiographies. These results represent a major step towards clinical

  12. Stenting as a palliative method in the management of advanced squamous cell carcinoma of the oesophagus and gastro-oesophageal junction

    Directory of Open Access Journals (Sweden)

    Janusz Wlodarczyk

    2016-03-01

    Advanced squamous cell carcinoma of the oesophagus and gastroesophageal junction usually requires palliative treatment, and the method of choice is stenting. There are several types of stents currently available, including: self-expandable metallic stents (fully or partially covered); self-expandable plastic stents; and biodegradable stents. Each of the mentioned stents has its advantages and limitations, and requires a proper, patient-tailored selection. Due to the close anatomical relationship between the oesophagus and bronchial tree, some patients may require bilateral stenting. Oesophageal stenting may not only be considered as a palliative procedure, but can also be implemented to alleviate dysphagia during preoperative chemotherapy and/or radiotherapy.

  13. Advancing Dose-Response Assessment Methods for Environmental Regulatory Impact Analysis: A Bayesian Belief Network Approach Applied to Inorganic Arsenic.

    Science.gov (United States)

    Zabinski, Joseph W; Garcia-Vargas, Gonzalo; Rubio-Andrade, Marisela; Fry, Rebecca C; Gibson, Jacqueline MacDonald

    2016-05-10

    Dose-response functions used in regulatory risk assessment are based on studies of whole organisms and fail to incorporate genetic and metabolomic data. Bayesian belief networks (BBNs) could provide a powerful framework for incorporating such data, but no prior research has examined this possibility. To address this gap, we develop a BBN-based model predicting birthweight at gestational age from arsenic exposure via drinking water and maternal metabolic indicators using a cohort of 200 pregnant women from an arsenic-endemic region of Mexico. We compare BBN predictions to those of prevailing slope-factor and reference-dose approaches. The BBN outperforms prevailing approaches in balancing false-positive and false-negative rates. Whereas the slope-factor approach had 2% sensitivity and 99% specificity and the reference-dose approach had 100% sensitivity and 0% specificity, the BBN's sensitivity and specificity were 71% and 30%, respectively. BBNs offer a promising opportunity to advance health risk assessment by incorporating modern genetic and metabolomic data.
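
    The comparison above rests on sensitivity and specificity computed against observed outcomes; a small generic sketch of that calculation is given below. The predictions and outcomes are invented binary labels purely to illustrate the arithmetic, not the study's cohort data.

        # Generic sensitivity/specificity calculation used to compare risk models
        # (all labels below are invented, not the cohort data from the study).

        def sensitivity_specificity(predicted, observed):
            tp = sum(1 for p, o in zip(predicted, observed) if p and o)
            tn = sum(1 for p, o in zip(predicted, observed) if not p and not o)
            fp = sum(1 for p, o in zip(predicted, observed) if p and not o)
            fn = sum(1 for p, o in zip(predicted, observed) if not p and o)
            return tp / (tp + fn), tn / (tn + fp)

        observed = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = adverse birth outcome (hypothetical)
        bbn_pred = [1, 0, 1, 0, 0, 1, 1, 0]   # hypothetical BBN classification
        sens, spec = sensitivity_specificity(bbn_pred, observed)
        print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")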

  14. Large Scale Screening of Low Cost Ferritic Steel Designs For Advanced Ultra Supercritical Boiler Using First Principles Methods

    Energy Technology Data Exchange (ETDEWEB)

    Ouyang, Lizhi [Tennessee State Univ. Nashville, TN (United States)

    2016-11-29

    Advanced Ultra Supercritical Boiler (AUSC) requires materials that can operate in a corrosive environment at temperature and pressure as high as 760°C (or 1400°F) and 5000 psi, respectively, while at the same time maintaining good ductility at low temperature. We developed automated simulation software tools to enable fast, large-scale screening studies of candidate designs. While direct evaluation of creep rupture strength and ductility is currently not feasible, properties such as energy, elastic constants, surface energy, interface energy, and stacking fault energy can be used to assess relative ductility and creep strength. We implemented software to automate the complex calculations and minimize human input in the tedious screening studies, which involve model structure generation, settings for first-principles calculations, and results analysis and reporting. The software developed in the project, together with the library of computed mechanical properties of phases found in ferritic steels (many of them complex solid solutions estimated for the first time), will certainly help the development of low-cost ferritic steel for AUSC.
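
    Screening of this kind often ranks candidates by simple ductility proxies derived from the computed elastic constants; as a hedged illustration (not necessarily the metric used in the project), the sketch below ranks fabricated candidate phases by the Pugh ratio B/G, where values above roughly 1.75 are conventionally read as ductile-leaning. The candidate names and moduli are placeholders.

        # Hedged sketch of a screening step: rank candidate phases by the Pugh ratio
        # B/G computed from (fabricated) first-principles bulk and shear moduli.
        # B/G above roughly 1.75 is conventionally taken to indicate ductile behaviour.

        candidates = {
            "candidate_A": {"B_GPa": 170.0, "G_GPa": 80.0},
            "candidate_B": {"B_GPa": 195.0, "G_GPa": 95.0},
            "candidate_C": {"B_GPa": 160.0, "G_GPa": 110.0},
        }

        def pugh_ratio(props):
            return props["B_GPa"] / props["G_GPa"]

        for name, props in sorted(candidates.items(), key=lambda kv: -pugh_ratio(kv[1])):
            ratio = pugh_ratio(props)
            verdict = "ductile-leaning" if ratio > 1.75 else "brittle-leaning"
            print(f"{name}: B/G = {ratio:.2f} ({verdict})")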

  15. Advanced method for automatic processing of seismic and infra-sound data; Methodes avancees de traitement automatique de donnees sismiques et infrasoniques

    Energy Technology Data Exchange (ETDEWEB)

    Cansi, Y.; Crusem, R. [CEA Centre d`Etudes de Limeil, 94 - Villeneuve-Saint-Georges (France)

    1997-11-01

    Governmental organizations have expressed their need for rapid and precise information in the two main fields covered by operational seismology, i.e.: major earthquake alerts and the detection of nuclear explosions. To satisfy both of these constraints, it is necessary to implement increasingly elaborate automation methods for processing the data. Automatic processing methods are mainly based on the following elementary steps: detection of a seismic signal on a recording; identification of the type of wave associated with the signal; linking of the different detected arrivals to the same seismic event; localization of the source, which also determines the characteristics of the event. Furthermore, two main categories of processing may be distinguished: methods suitable for large-aperture networks, which are characterized by single-channel treatment for detection and identification, and antenna-type methods which are based on searching for consistent signals on the scale of the network. Within the two main fields of research mentioned here, our effort has focused on regional-scale seismic waves in relation to large-aperture networks as well as on detection techniques using a mini-network (antenna). We have taken advantage of an extensive set of examples in order to implement an automatic procedure for identifying regional seismic waves on single-channel recordings. With the mini-networks, we have developed a novel, universally applicable method that has been successfully applied to various types of recording (e.g. seismic, micro-barometric, etc.) and to networks adapted to different wavelength bands. (authors) 7 refs.
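
    The detection step listed above is commonly automated with an energy-ratio trigger; the sketch below shows a short-term/long-term average (STA/LTA) detector as one plausible realization. The STA/LTA choice, all parameters and the synthetic trace are assumptions for illustration and are not taken from the report.

        import numpy as np

        # Hedged sketch of the 'detection' step of an automatic seismic pipeline
        # using a classical STA/LTA energy-ratio trigger (parameters are illustrative).

        def sta_lta_trigger(signal, fs, sta_win=1.0, lta_win=30.0, threshold=4.0):
            """Return sample indices (window ends) where STA/LTA exceeds the threshold."""
            nsta, nlta = int(sta_win * fs), int(lta_win * fs)
            csum = np.concatenate(([0.0], np.cumsum(signal ** 2)))
            sta = (csum[nsta:] - csum[:-nsta]) / nsta      # short-term average energy
            lta = (csum[nlta:] - csum[:-nlta]) / nlta      # long-term average energy
            n = min(sta.size, lta.size)
            ratio = sta[-n:] / np.maximum(lta[:n], 1e-12)  # aligned by window end
            return np.flatnonzero(ratio > threshold) + (nlta - 1)

        fs = 100.0                                         # Hz, assumed sampling rate
        t = np.arange(0, 120, 1 / fs)
        rng = np.random.default_rng(1)
        trace = rng.normal(0, 1, t.size)
        trace[6000:6300] += 8 * np.sin(2 * np.pi * 5 * t[6000:6300])  # synthetic arrival
        picks = sta_lta_trigger(trace, fs)
        if picks.size:
            print(f"first trigger at t = {picks[0] / fs:.1f} s")
        else:
            print("no trigger")

    In a full system the triggered windows would then pass to the phase-identification, association and location steps enumerated in the abstract.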

  16. Advances in the discontinuous Galerkin method: Hybrid schemes and applications to the reactive infiltration instability in an upwelling compacting mantle

    Science.gov (United States)

    Schiemenz, Alan R.

    High-order methods are emerging in the scientific computing community as superior alternatives to the classical finite difference, finite volume, and continuous finite element methods. The discontinuous Galerkin (DG) method in particular combines many of the positive features of all of these methods. This thesis presents two projects involving the DG method. First, a Hybrid scheme is presented, which implements DG in areas where the solution is considered smooth, while dropping the order of the scheme elsewhere and implementing a finite volume scheme with high-order, non-oscillatory solution reconstructions suitable for unstructured meshes. Two such reconstructions from the ENO class are considered in the Hybrid. Successful numerical results are presented for nonlinear systems of conservation laws in one dimension. Second, the high-order discontinuous Galerkin and Fourier spectral methods are applied to a model of three-phase fluid flow through a porous medium undergoing solid-fluid reaction due to the reactive infiltration instability (RII). This model incorporates a solid upwelling term and an equation to track the abundance of the reacting mineral orthopyroxene (opx). After validating the numerical discretization, results are given that provide new insight into the formation of melt channels in the Earth's mantle. Mantle heterogeneities are observed to be one catalyst for the development of melt channels, and the dissolution of opx produces interesting bifurcations in the melt channels. An alternative formulation is considered where the mass transfer rate relative to velocity is taken to be infinitely large. In this setting, the stiffest terms are removed, greatly reducing the cost of time integration.

  17. An Experience of Forest Inventory by Photo Interpretation Method Based on Advanced Firmware and Digital Aerial Photographs of New Generation

    Directory of Open Access Journals (Sweden)

    V. I. Arkhipov

    2014-10-01

    The main stages of the developed technology of forest inventory by the interpretation method, named «From survey – to project», with the use of modern aerial survey data and special software and hardware, are discussed in the paper. The need to develop a high-end forest inventory technology stems from the increasing demands of the state, business, and civil society for up-to-date and accurate information about forests. The research tasks were: integration of software and hardware into a single technology, testing on a real object, and development of recommendations for introduction into production and for forming a system of training specialists in forest interpretation. Positive results of experimental work on measurement-based and analytical forest interpretation in stereo mode using photogrammetric software were obtained by specialists from Russia, Croatia, Belarus, and Sweden. In the technology «From survey – to project», the following instruments are used: the photogrammetric complex Vision Map A3, the digital photogrammetric system Photomod, the program «ESAUL», GIS ArcGIS, and special hardware for stereo visualization. Results of testing this technology are shown using the example of a model territory. Comparison of forest inventory results obtained by the interpretation method with control inventory results obtained by the enumeration method demonstrated that the errors in determining the main forest inventory characteristics do not exceed the norms. The advantages of practical use of the technology are shown. It is noted that forest inventory by the interpretation method is a complex psychophysiological process that requires highly qualified specialists with special training. The necessity of forming a system for training forest inventory specialists in the interpretation method is indicated. The curricula and training manuals designed and prepared for the interpretation method in forestry are listed.

  18. Validation of noninvasive methods to predict the presence of gastroesophageal varices in a cohort of patients with compensated advanced chronic liver disease.

    Science.gov (United States)

    Llop, Elba; Lopez, Marta; de la Revilla, Juan; Fernandez, Natalia; Trapero, Maria; Hernandez, Marta; Fernández-Carrillo, Carlos; Pons, Fernando; Martinez, Jose Luis; Calleja, Jose Luis

    2017-11-01

    The aim was to validate noninvasive methods to predict the presence of gastroesophageal varices (GEV) in patients with suspected compensated advanced chronic liver disease. We retrospectively reviewed clinical and radiological data collected prospectively between September 2013 and September 2015. We reviewed 442 consecutive patients with suspected compensated advanced chronic liver disease measured by transient elastography (TE) and a gastroscopy. We evaluated platelets, spleen diameter, TE, liver stiffness × spleen size/platelets (LSPS), variceal risk index (VRI), Baveno VI strategy, and Augustin algorithm. One hundred sixty-one out of 442 patients were included. Patients with GEV were compared with patients without GEV and showed statistically significant differences in platelet count (117 SD 51 vs 149 SD 62; P = 0.02), spleen diameter (13.0 SD 1.9 vs 11.5 SD 2; P = 0.003), and TE (28 SD 15 vs 19 SD 10; P = 0.001). Single methods (platelet count and TE) diagnosed correctly 51% and 71.4% of patients. Combined methods (LSPS, VRI, Baveno VI, and Augustin algorithm) diagnosed correctly 78%, 83.6%, 45.3%, and 57.1% of patients. Patients with GEV misdiagnosed: platelets 5/161 (3.1%), TE 6/161 (3.7%), LSPS 16/159 (10%), VRI 18/159 (11.3%), Baveno VI 3/161 (1.8%), and Augustin algorithm 6/161 (3.7%). Rate of unnecessary gastroscopies: platelets 46%, TE 25%, LSPS 13%, VRI 6%, Baveno VI 53%, and Augustin algorithm 39.1%. A significant number of patients were classified correctly using TE, LSPS, and VRI; however, LSPS and VRI had unacceptable rates of misdiagnoses. TE is the best noninvasive single method and the Baveno VI strategy the best combined method. © 2017 Journal of Gastroenterology and Hepatology Foundation and John Wiley & Sons Australia, Ltd.
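
    For readers who want to reproduce the combined indices, the sketch below computes LSPS as defined above (liver stiffness × spleen size / platelets) and applies the Baveno VI rule, under which screening endoscopy can be spared when liver stiffness is below 20 kPa and platelets exceed 150 × 10⁹/L. The example values are illustrative, not patient data from the study, and any cut-off other than the Baveno VI thresholds is an assumption.

        # Sketch of the combined non-invasive indices discussed above.
        # LSPS follows the definition in the abstract; the Baveno VI rule (spare
        # endoscopy if liver stiffness < 20 kPa and platelets > 150 x 10^9/L) is the
        # published criterion.  Example values below are illustrative only.

        def lsps(liver_stiffness_kpa, spleen_diameter_cm, platelets_10e9_per_l):
            """Liver stiffness x spleen size / platelet count."""
            return liver_stiffness_kpa * spleen_diameter_cm / platelets_10e9_per_l

        def baveno_vi_spare_endoscopy(liver_stiffness_kpa, platelets_10e9_per_l):
            return liver_stiffness_kpa < 20.0 and platelets_10e9_per_l > 150.0

        patient = {"liver_stiffness_kpa": 28.0, "spleen_diameter_cm": 13.0,
                   "platelets_10e9_per_l": 117.0}   # illustrative values
        print("LSPS =", round(lsps(**patient), 2))
        print("Baveno VI spares endoscopy:",
              baveno_vi_spare_endoscopy(patient["liver_stiffness_kpa"],
                                        patient["platelets_10e9_per_l"]))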

  19. AgriSense-STARS: Advancing Methods of Agricultural Monitoring for Food Security in Smallholder Regions - the Case for Tanzania

    Science.gov (United States)

    Dempewolf, J.; Becker-Reshef, I.; Nakalembe, C. L.; Tumbo, S.; Maurice, S.; Mbilinyi, B.; Ntikha, O.; Hansen, M.; Justice, C. J.; Adusei, B.; Kongo, V.

    2015-12-01

    In-season monitoring of crop conditions provides critical information for agricultural policy and decision making and, most importantly, for food security planning and management. Nationwide agricultural monitoring in countries dominated by smallholder farming systems generally relies on extensive networks of field data collectors. In Tanzania, extension agents make up this network and report on conditions across the country, approaching a "near-census". Data are collected on paper, which is resource- and time-intensive as well as prone to errors. Data quality is ambiguous and there is a general lack of clear and functional feedback loops between farmers, extension agents, analysts and decision makers. Moreover, the data are not spatially explicit, limiting the usefulness for analysis and the quality of policy outcomes. Despite significant advances in remote sensing and information communication technologies (ICT) for monitoring agriculture, the full potential of these new tools is yet to be realized in Tanzania. Their use is constrained by the lack of resources, skills and infrastructure to access and process these data. The use of ICT technologies for data collection, processing and analysis is equally limited. The AgriSense-STARS project is developing and testing a system for national-scale in-season monitoring of smallholder agriculture using a combination of three main tools: 1) GLAM-East Africa, an automated MODIS satellite image processing system, 2) field data collection using GeoODK and unmanned aerial vehicles (UAVs), and 3) the Tanzania Crop Monitor, a collaborative online portal for data management and reporting. These tools are developed and applied in Tanzania through the National Food Security Division of the Ministry of Agriculture, Food Security and Cooperatives (MAFC) within a statistically representative sampling framework (area frame) that ensures data quality, representativeness and resource efficiency.

  20. Advanced analytical method of nereistoxin using mixed-mode cationic exchange solid-phase extraction and GC/MS.

    Science.gov (United States)

    Park, Yujin; Choe, Sanggil; Lee, Heesang; Jo, Jiyeong; Park, Yonghoon; Kim, Eunmi; Pyo, Jaesung; Jung, Jee H

    2015-07-01

    Nereistoxin (NTX) originates from the marine annelid worm Lumbriconereis heteropoda, and its analogue pesticides, including cartap, bensultap, thiocyclam and thiobensultap, have been commonly used in agriculture because of their low toxicity and high insecticidal activity. However, NTX has been reported to be neurotoxic in humans and animals: by blocking the nicotinic acetylcholine receptor it causes significant neuromuscular toxicity, resulting in respiratory failure. We developed a new method to determine NTX in biological fluids. The method involves mixed-mode cationic exchange based solid phase extraction and gas chromatography/mass spectrometry for final identification and quantitative analysis. The limit of detection and recovery were substantially better than those of other methods using liquid-liquid extraction or headspace solid phase microextraction. Good recoveries (97 ± 14%) were obtained in blood samples, and calibration curves over the range 0.05-20 mg/L had R² values greater than 0.99. The developed method was applied to a fatal case of cartap intoxication of a 74-year-old woman who ingested cartap hydrochloride to commit suicide. Cartap and NTX were detected in the postmortem specimens and the cause of death was ruled to be nereistoxin intoxication. The concentrations of NTX were 2.58 mg/L, 3.36 mg/L and 1479.7 mg/L in heart blood, femoral blood and stomach liquid content, respectively. The heart blood/femoral blood ratio of NTX was 0.76. Copyright © 2015. Published by Elsevier Ireland Ltd.
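
    Quantification against a linear GC/MS calibration curve of the kind described (0.05-20 mg/L, R² > 0.99) is a simple regression; the sketch below uses invented peak-area responses purely to illustrate the calculation.

        import numpy as np

        # Sketch of quantification from a linear GC/MS calibration curve.
        # Concentrations span the range given in the abstract; peak areas are invented.

        conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 20.0])      # mg/L standards
        area = np.array([110, 215, 1050, 2090, 10400, 20800, 41500])  # detector response

        slope, intercept = np.polyfit(conc, area, 1)
        r2 = np.corrcoef(conc, area)[0, 1] ** 2
        print(f"calibration: area = {slope:.1f}*conc + {intercept:.1f}, R^2 = {r2:.4f}")

        unknown_area = 6980.0                        # hypothetical blood-extract response
        print(f"estimated NTX concentration: {(unknown_area - intercept) / slope:.2f} mg/L")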

  1. Recent advances in the spectral Green's function method for monoenergetic slab-geometry fixed-source adjoint transport problems in S{sub N} formulation

    Energy Technology Data Exchange (ETDEWEB)

    Curbelo, Jesus P.; Alves Filho, Hermes; Barros, Ricardo C., E-mail: jperez@iprj.uerj.br, E-mail: halves@iprj.uerj.br, E-mail: rcbarros@pq.cnpq.br [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Instituto Politecnico. Programa de Pos-Graduacao em Modelagem Computacional; Hernandez, Carlos R.G., E-mail: cgh@instec.cu [Instituto Superior de Tecnologias y Ciencias Aplicadas (InSTEC), La Habana (Cuba)

    2015-07-01

    The spectral Green's function (SGF) method is a numerical method that is free of spatial truncation errors for slab-geometry fixed-source discrete ordinates (S{sub N}) adjoint problems. The method is based on the standard spatially discretized adjoint S{sub N} balance equations and a nonstandard adjoint auxiliary equation expressing the node-average adjoint angular flux, in each discretization node, as a weighted combination of the node-edge outgoing adjoint fluxes. The auxiliary equation contains parameters which act as Green's functions for the cell-average adjoint angular flux. These parameters are determined by means of a spectral analysis which yields the local general solution of the S{sub N} equations within each node of the discretization grid. In this work a number of advances in the SGF adjoint method are presented: the method is extended to adjoint S{sub N} problems considering linearly anisotropic scattering and non-zero prescribed boundary conditions for the forward source-detector problem. Numerical results for typical model problems are presented to illustrate the efficiency and accuracy of the offered method. (author)
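
    For orientation, the auxiliary equation described above can be written generically as the node-average adjoint flux being a weighted combination of the node-edge outgoing adjoint fluxes; the notation below is chosen here for illustration and is not copied from the paper:

        $$ \overline{\psi}^{\,\dagger}_{j,m} \;=\; \sum_{n\,\in\,\mathrm{out}} \theta_{j,m,n}\, \psi^{\dagger}_{j,n}, $$

    where $\overline{\psi}^{\,\dagger}_{j,m}$ is the node-average adjoint angular flux in node $j$ and direction $m$, $\psi^{\dagger}_{j,n}$ are the node-edge outgoing adjoint fluxes, and the weights $\theta_{j,m,n}$ play the role of the Green's-function parameters determined by the spectral analysis.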

  2. Advances in the development of a subgroup method for the self-shielding of resonant isotopes in arbitrary geometries

    International Nuclear Information System (INIS)

    Hebert, A.

    1997-01-01

    The subgroup method is used to compute self-shielded cross sections defined over coarse energy groups in the resolved energy domain. The validity of the subgroup approach was extended beyond the unresolved energy domain by partially taking into account correlation effects between the slowing-down source and the collision probability terms of the transport equation. This approach enables one to obtain a pure subgroup solution of the self-shielding problem without relying on any form of equivalence in dilution. Specific improvements are presented on existing subgroup methods: an N-term rational approximation for the fuel-to-fuel collision probability, a new Padé deflation technique for computing probability tables, and the introduction of a superhomogenization correction. The absorption rates obtained after self-shielding are compared with exact values obtained using an elastic slowing-down calculation where each resonance is modeled individually in the resolved energy domain
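
    For readers unfamiliar with probability tables, the conventional dilution-based evaluation that the paper improves upon recovers the self-shielded cross section of reaction x in coarse group g from the probability-table weights and cross-section levels; a generic form (notation chosen here, not quoted from the paper) is

        $$ \sigma_{x,g}^{\mathrm{eff}} \;=\; \frac{\displaystyle\sum_{k} p_k\,\sigma_{x,k}\,\frac{\sigma_b}{\sigma_{t,k}+\sigma_b}}{\displaystyle\sum_{k} p_k\,\frac{\sigma_b}{\sigma_{t,k}+\sigma_b}}, $$

    where $\sigma_b$ is the background (dilution) cross section and $(p_k,\sigma_{x,k},\sigma_{t,k})$ are the subgroup probabilities and levels. The advances described above aim at obtaining the self-shielded rates directly from the transport equation, without introducing $\sigma_b$.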

  3. New advances in the forced response computation of periodic structures using the wave finite element (WFE) method

    OpenAIRE

    Mencik , Jean-Mathieu

    2014-01-01

    The wave finite element (WFE) method is investigated to describe the harmonic forced response of one-dimensional periodic structures like those composed of complex substructures and encountered in engineering applications. The dynamic behavior of these periodic structures is analyzed over wide frequency bands where complex spatial dynamics, inside the substructures, are likely to occur. Within the WFE framework, the dynamic behavior of periodic structures is described in ...

  4. An advanced method for flood risk analysis in river deltas, applied to societal flood fatality risk in the Netherlands

    OpenAIRE

    K. M. de Bruijn; F. L. M. Diermanse; J. V. L. Beckers

    2014-01-01

    This paper discusses a new method for flood risk assessment in river deltas. Flood risk analysis of river deltas is complex, because both storm surges and river discharges may cause flooding and the effect of upstream breaches on downstream water levels and flood risk must be taken into account. This paper presents a Monte Carlo-based flood risk analysis framework for policy making, which considers both storm surges and river flood waves and includes effects from hydrodynami...
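
    The truncated record does not give the framework's distributions or failure criteria; purely to illustrate the Monte Carlo idea of sampling both drivers (storm surge and river discharge) and counting exceedances, a toy sketch is shown below. Every distribution, coefficient and threshold in it is invented.

        import numpy as np

        # Toy Monte Carlo flood-risk sketch: sample storm surge and river discharge,
        # declare "flooding" when a combined load exceeds a notional defence level.
        # Distributions, the load combination and the threshold are all invented.

        rng = np.random.default_rng(4)
        n = 100_000
        surge = rng.gumbel(loc=1.0, scale=0.4, size=n)           # m, coastal surge level
        discharge = rng.lognormal(mean=7.5, sigma=0.3, size=n)   # m3/s, river discharge

        water_level = surge + 0.0008 * discharge                 # notional combined level (m)
        defence_level = 4.5                                      # m, notional crest height
        p_flood = np.mean(water_level > defence_level)
        print(f"estimated annual flooding probability: {p_flood:.4f}")

    A full delta analysis would additionally propagate upstream breach effects to downstream water levels and translate exceedances into fatality risk, as the paper describes.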

  5. Current advances in molecular methods for detection of nitrite-dependent anaerobic methane oxidizing bacteria in natural environments

    OpenAIRE

    Chen, Jing; Dick, Richard; Lin, Jih-Gaw; Gu, Ji-Dong

    2016-01-01

    The nitrite-dependent anaerobic methane oxidation (n-damo) process uniquely links the microbial nitrogen and carbon cycles. Research on n-damo bacteria has progressed quickly, with experimental evidence obtained through enrichment cultures. Polymerase chain reaction (PCR)-based methods for detecting them in various natural ecosystems and engineered systems play a very important role in the discovery of their distribution, abundance, and biodiversity in these ecosystems. Important characteristics of n-damo enrichmen...

  6. New theory of discriminant analysis after R. Fisher advanced research by the feature selection method for microarray data

    CERN Document Server

    Shinmura, Shuichi

    2016-01-01

    This is the first book to compare eight LDFs using different types of datasets, such as Fisher's iris data, medical data with collinearities, Swiss banknote data that is linearly separable data (LSD), student pass/fail determination using student attributes, 18 pass/fail determinations using exam scores, Japanese automobile data, and six microarray datasets (the datasets) that are LSD. We developed the 100-fold cross-validation for the small sample method (Method 1) instead of the LOO method. We proposed a simple model selection procedure to choose the best model having minimum M2, and Revised IP-OLDF based on the MNM criterion was found to be better than the other M2s in the above datasets. We compared two statistical LDFs and six MP-based LDFs. Those were Fisher's LDF, logistic regression, three SVMs, Revised IP-OLDF, and another two OLDFs. Only a hard-margin SVM (H-SVM) and Revised IP-OLDF could discriminate LSD theoretically (Problem 2). We solved the defect of the generalized inverse matrices (Problem 3). For ...

  7. Comparison of the Performance of Two Advanced Spectral Methods for the Analysis of Times Series in Paleoceanography

    Directory of Open Access Journals (Sweden)

    Eulogio Pardo-Igúzquiza

    2015-08-01

    Many studies have revealed the cyclicity of past ocean/atmosphere dynamics at a wide range of time scales (from decadal to millennial time scales), based on the spectral analysis of time series of climate proxies obtained from deep sea sediment cores. Among the many techniques available for spectral analysis, the maximum entropy method and the Thomson multitaper approach have frequently been used because of their good statistical properties and high resolution with short time series. The novelty of the present study is that we compared the two methods according to the performance of their statistical tests for assessing the statistical significance of their power spectrum estimates. The statistical significance of maximum entropy estimates was assessed by a random permutation test (Pardo-Igúzquiza and Rodríguez-Tovar, 2000), while the statistical significance of the Thomson multitaper method was assessed by an F-test (Thomson, 1982). We compared the results obtained in a case study using simulated data, where the spectral content of the time series was known, and in a case study with real data. In both cases the results are similar: while the cycles identified as significant by maximum entropy and the permutation test have a clear physical interpretation, the F-test with the Thomson multitaper estimator tends to classify the low-frequency peaks as not significant and tends to flag more spurious peaks in the middle and high frequencies as significant. Nevertheless, the best strategy is to use both techniques and to exploit the advantages of each of them.
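
    To make the comparison concrete, the sketch below computes a Thomson-style multitaper spectrum from DPSS (Slepian) tapers and uses a crude permutation benchmark for peak significance on a synthetic series. The taper parameters, the permutation scheme and the test series are illustrative assumptions, not the procedures of the paper (which uses the maximum entropy estimator with the Pardo-Igúzquiza and Rodríguez-Tovar permutation test and the Thomson F-test).

        import numpy as np
        from scipy.signal.windows import dpss

        # Illustrative material for the comparison above: multitaper spectrum plus a
        # simple permutation benchmark for spectral-peak significance (synthetic data).

        def multitaper_psd(x, nw=3.0, k=5):
            tapers = dpss(len(x), NW=nw, Kmax=k)               # Slepian tapers
            spectra = [np.abs(np.fft.rfft(x * taper)) ** 2 for taper in tapers]
            return np.mean(spectra, axis=0)

        rng = np.random.default_rng(2)
        n = 512
        t = np.arange(n)
        x = np.sin(2 * np.pi * t / 41) + rng.normal(0, 1, n)    # one known 41-sample cycle

        psd = multitaper_psd(x)

        # Permutation benchmark: shuffling the series destroys any true cycle, so the
        # 95th percentile of shuffled spectra acts as a per-frequency significance level.
        null = np.array([multitaper_psd(rng.permutation(x)) for _ in range(200)])
        signif = psd > np.percentile(null, 95, axis=0)
        freqs = np.fft.rfftfreq(n)
        mask = signif & (freqs > 0)
        print("significant peaks near periods:", np.round(1 / freqs[mask], 1))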

  8. Diffusion-weighted magnetic resonance imaging during radiotherapy of locally advanced cervical cancer - treatment response assessment using different segmentation methods

    DEFF Research Database (Denmark)

    Haack, Søren; Tanderup, Kari; Kallehauge, Jesper Folsted

    2015-01-01

    BACKGROUND: Diffusion-weighted magnetic resonance imaging (DW-MRI) and the derived apparent diffusion coefficient (ADC) value have potential for monitoring tumor response to radiotherapy (RT). The method used for segmentation of volumes with reduced diffusion will influence both volume size and observed ... T2-weighted MR images using the Jaccard similarity index (JSI). ADC values from segmented volumes were compared and changes of ADC values during therapy were evaluated. RESULTS: A significant difference between the four volumes (GTV, DWIcluster, DWISD4 and DWIregion) was found (p
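
    The Jaccard similarity index used to compare the segmented volumes is simply the overlap divided by the union of two binary masks; a minimal sketch (with toy masks, not the study's image data) is given below.

        import numpy as np

        # Jaccard similarity index (JSI) between two binary segmentation masks.
        # Toy 2D masks are used here purely to illustrate the calculation.

        def jaccard(mask_a, mask_b):
            a, b = mask_a.astype(bool), mask_b.astype(bool)
            union = np.logical_or(a, b).sum()
            return np.logical_and(a, b).sum() / union if union else 1.0

        gtv = np.zeros((10, 10), dtype=bool); gtv[2:7, 2:7] = True   # e.g. GTV-like volume
        dwi = np.zeros((10, 10), dtype=bool); dwi[3:8, 3:8] = True   # e.g. DWI-based volume
        print(f"JSI = {jaccard(gtv, dwi):.2f}")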

  9. The calibration of rating models estimation of the probability of default based on advanced pattern classification methods

    CERN Document Server

    Konrad, Paul Markus

    2014-01-01

    All across Europe, a drama of historical proportions is unfolding as the debt crisis continues to rock the worldwide financial landscape. Whilst insecurity rises, the general public, policy makers, scientists and academics are searching high and low for independent and objective analyses that may help to assess this unusual situation. For more than a century, rating agencies have developed methods and standards to evaluate and analyze companies, projects or even sovereign countries. However, due to their dated internal processes, the independence of these rating agencies is being questioned, ra

  10. Advanced Mechanistic 3D Spatial Modeling and Analysis Methods to Accurately Represent Nuclear Facility External Event Scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Sezen, Halil [The Ohio State Univ., Columbus, OH (United States). Dept. of Civil, Environmental and Geodetic Engineering; Aldemir, Tunc [The Ohio State Univ., Columbus, OH (United States). College of Engineering, Nuclear Engineering Program, Dept. of Mechanical and Aerospace Engineering; Denning, R. [The Ohio State Univ., Columbus, OH (United States); Vaidya, N. [Rizzo Associates, Pittsburgh, PA (United States)

    2017-12-29

    Probabilistic risk assessment of nuclear power plants initially focused on events initiated by internal faults at the plant, rather than external hazards including earthquakes and flooding. Although the importance of external hazards risk analysis is now well recognized, the methods for analyzing low probability external hazards rely heavily on subjective judgment of specialists, often resulting in substantial conservatism. This research developed a framework to integrate the risk of seismic and flooding events using realistic structural models and simulation of response of nuclear structures. The results of four application case studies are presented.

  11. A new and reliable method for live imaging and quantification of reactive oxygen species in Botrytis cinerea: technological advancement.

    Science.gov (United States)

    Marschall, Robert; Tudzynski, Paul

    2014-10-01

    Reactive oxygen species (ROS) are produced in conserved cellular processes either as by-products of the cellular respiration in mitochondria, or purposefully for defense mechanisms, signaling cascades or cell homeostasis. ROS have two diametrically opposed attributes due to their highly damaging potential for DNA, lipids and other molecules and due to their indispensability for signaling and developmental processes. In filamentous fungi, the role of ROS in growth and development has been studied in detail, but these analyses were often hampered by the lack of reliable and specific techniques to monitor different activities of ROS in living cells. Here, we present a new method for live cell imaging of ROS in filamentous fungi. We demonstrate that by use of a mixture of two fluorescent dyes it is possible to monitor H2O2 and superoxide specifically and simultaneously in distinct cellular structures during various hyphal differentiation processes. In addition, the method allows for reliable fluorometric quantification of ROS. We demonstrate that this can be used to characterize different mutants with respect to their ROS production/scavenging potential. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. Advanced modeling strategy for the analysis of heart valve leaflet tissue mechanics using high-order finite element method.

    Science.gov (United States)

    Mohammadi, Hadi; Bahramian, Fereshteh; Wan, Wankei

    2009-11-01

    Modeling soft tissue using the finite element method is one of the most challenging areas in the field of biomechanical engineering. To date, many models have been developed to describe heart valve leaflet tissue mechanics, which are accurate to some extent. Nevertheless, there is no comprehensive method for modeling soft tissue mechanics. This is because (1) the degree of anisotropy in the heart valve leaflet changes layer by layer due to a variety of collagen fiber densities and orientations that cannot be taken into account in the model and (2) a constitutive material model fully describing the mechanical properties of the leaflet structure is not available in the literature. In this framework, we develop a new high-order element using a p-type finite element formulation to create anisotropic material properties similar to those of the heart valve leaflet tissue in only one single element. This element also takes the nonlinearity of the leaflet tissue into consideration using a bilinear material model. This new element is composed of a two-dimensional finite element in the principal directions of the leaflet tissue and a p-type finite element in the thickness direction. The proposed element is easy to implement and much more efficient than standard elements available in commercial finite element packages. This study is one step towards the modeling of soft tissue mechanics using a meshless finite element approach to be applied in real-time haptic feedback of soft-tissue models in virtual reality simulation.
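
    A bilinear material model of the kind mentioned can be written, in its simplest one-dimensional form, as a stress-strain law with two slopes; the expression below is generic background, with symbols chosen here for illustration rather than taken from the paper:

        $$ \sigma(\varepsilon)=\begin{cases} E_1\,\varepsilon, & \varepsilon \le \varepsilon^{*},\\[4pt] E_1\,\varepsilon^{*}+E_2\,(\varepsilon-\varepsilon^{*}), & \varepsilon > \varepsilon^{*}, \end{cases} $$

    where $E_1$ and $E_2$ are the pre- and post-transition moduli and $\varepsilon^{*}$ is the transition strain; in the leaflet element these parameters would differ along the principal material directions to capture the anisotropy discussed above.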

  13. An advanced algorithm for construction of Integral Transport Matrix Method operators using accumulation of single cell coupling factors

    International Nuclear Information System (INIS)

    Powell, B. P.; Azmy, Y. Y.

    2013-01-01

    The Integral Transport Matrix Method (ITMM) has been shown to be an effective method for solving the neutron transport equation in large domains on massively parallel architectures. In the limit of a very large number of processors, the speed of the algorithm, and its suitability for unstructured meshes, i.e. other than an ordered Cartesian grid, is limited by the construction of four matrix operators required for obtaining the solution in each sub-domain. The existing algorithm used for construction of these matrix operators, termed the differential mesh sweep, is computationally expensive and was developed for a structured grid. This work proposes the use of a new algorithm for construction of these operators based on the construction of a single, fundamental matrix representing the transport of a particle along every possible path throughout the sub-domain mesh. Each of the operators is constructed by multiplying an element of this fundamental matrix by two factors dependent only upon the operator being constructed and on properties of the emitting and incident cells. The ITMM matrix operator construction time for the new algorithm is demonstrated to be shorter than that of the existing algorithm in all tested cases with both isotropic and anisotropic scattering considered. While also being a more efficient algorithm on a structured Cartesian grid, the new algorithm is promising in its geometric robustness and potential for being applied to an unstructured mesh, with the ultimate goal of application to an unstructured tetrahedral mesh on a massively parallel architecture. (authors)
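
    The construction described, in which each operator entry is a fundamental-matrix element scaled by one factor tied to the emitting cell and one tied to the incident cell, can be sketched in a few lines. The fundamental matrix, the factor functions and the sizes below are placeholders chosen for illustration, not the authors' data or their actual factors.

        import numpy as np

        # Sketch of operator assembly from a single fundamental path matrix F:
        # each operator entry is F[i, j] scaled by a factor of the emitting cell j
        # and a factor of the incident cell i (all quantities here are placeholders).

        n_cells = 4
        rng = np.random.default_rng(3)
        F = rng.random((n_cells, n_cells))            # stand-in fundamental matrix

        def build_operator(F, emit_factor, incident_factor):
            j = np.arange(F.shape[1])
            i = np.arange(F.shape[0])
            return F * emit_factor(j)[np.newaxis, :] * incident_factor(i)[:, np.newaxis]

        # Placeholder per-cell factors (e.g. depending on cell cross sections/widths).
        sigma_t = np.linspace(0.5, 1.0, n_cells)
        op_example = build_operator(F, lambda j: 1.0 / sigma_t[j],
                                       lambda i: np.ones(i.size))
        print(op_example.round(3))

    The point of this construction is that the expensive path information in F is computed once per sub-domain, after which each of the four ITMM operators is obtained by cheap elementwise scaling.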

  14. Advances in the application of molecular microbiological methods in the oil and gas industry and links to microbiologically influenced corrosion

    DEFF Research Database (Denmark)

    Eckert, Rickard; Skovhus, Torben Lund

    2018-01-01

    While the oil and gas industry has witnessed increased applications of molecular microbiological methods (MMMs) for diagnosing and managing microbiologically influenced corrosion (MIC) in the past decade, the process for establishing clear links between microbiological conditions and corrosion mechanisms is still emerging. Different MMMs provide various types of information about microbial diversity, abundance, activity and function, all of which are quite different from the culture-based results that are familiar to oil and gas industry corrosion professionals. In addition, a multidisciplinary process for establishing the significance of molecular microbiological data in regard to corrosion threat identification, mitigation and monitoring has yet to be clearly established. As a result, the benefits of employing MMMs for MIC management are not yet being fully realized or appreciated. Regardless...

  15. Advances in integrated and sustainable supply chain planning concepts, methods, tools and solution approaches toward a platform for industrial practice

    CERN Document Server

    Laínez-Aguirre, José Miguel

    2015-01-01

    Decision making at the enterprise level often encompasses not only production operations and product R&D, but also other strategic functions such as financial planning and marketing. With the aim of maximizing growth and a firm’s value, companies often focus on co-ordinating these functional components as well as traditional hierarchical decision levels. Understanding this interplay can enhance enterprise capabilities of adaptation and response to uncertainties arising from internal processes as well as the external environment. This book presents concepts, methods, tools and solutions based on mathematical programming, which provides the quantitative support needed for integrated decision-making and ultimately for improving the allocation of overall corporate resources (e.g., materials, cash and personnel). Through a systems perspective, the integrated planning of the supply chain also promotes activities of reuse, reduction and recycling for achieving more sustainable environmental impacts of production/di...

  16. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management and analytical techniques taught to undergraduate and graduate students in U.S. systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical and quantitative management techniques, and of process performance modeling, to identify and eliminate sources of variation, continually improve process performance, reduce cost and predict future outcomes. The research study identifies and discusses in detail the gaps between the process improvement and quantitative analysis techniques taught in U.S. systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis that identifies the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models. The research also includes a Monte Carlo simulation optimization
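
    As a hedged illustration of the kind of quantitative technique implied by CMMI high maturity (hypothetical phase statistics, not taken from the dissertation), the following Python sketch shows a tiny Monte Carlo process-performance model that estimates the probability of meeting a schedule target.

      # Tiny Monte Carlo process-performance model with assumed historical phase statistics.
      import numpy as np

      rng = np.random.default_rng(42)
      n_sims = 100_000

      # Assumed phase durations in days (mean, standard deviation) -- hypothetical values.
      phases = {"requirements": (10, 2), "design": (15, 4), "coding": (25, 6), "testing": (20, 5)}

      totals = np.zeros(n_sims)
      for mean, sd in phases.values():
          totals += rng.normal(mean, sd, n_sims)        # sample each phase independently

      target = 75.0
      prob_on_time = np.mean(totals <= target)
      print(f"P(total duration <= {target} days) ~ {prob_on_time:.3f}")
      print(f"80th percentile duration ~ {np.percentile(totals, 80):.1f} days")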

  17. Communication with children about a parent's advanced cancer and measures of parental anxiety and depression: a cross-sectional mixed-methods study.

    Science.gov (United States)

    Hailey, Claire E; Yopp, Justin M; Deal, Allison M; Mayer, Deborah K; Hanson, Laura C; Grunfeld, Gili; Rosenstein, Donald L; Park, Eliza M

    2018-01-01

    Parents with advanced cancer face difficult decisions about how to communicate about their illness with their children. The objectives of this study were to describe how parents communicated with their children about advanced cancer and to explore associations between communication and parental depression and anxiety. This was a cross-sectional, mixed-methods study of 42 patients with stage IV solid tumor malignancies who had at least one child less than 18 years of age. Participants completed a semi-structured interview and the Hospital Anxiety and Depression Scale (HADS). We used multiple linear regression to evaluate the association between extent of communication and HADS Anxiety and Depression scores. Interview data were analyzed using standard qualitative content and thematic techniques and triangulated with survey data. Higher HADS Anxiety scores, but not HADS Depression scores, were cross-sectionally associated with greater extent of parental communication (p = 0.003), even when controlling for performance status and children's ages. In qualitative analyses, parents who acknowledged the terminal nature of their illness or experienced higher symptom burden were more likely to report that they also communicated more extensively with their children. A third of parents (n = 14, 33%) described difficulty with illness-related communication with their children. In this pilot study, parents with advanced cancer who reported more illness-related communication with their children also reported more symptoms of general anxiety. Future interventions should address psychological distress relevant to parenting and further assess how parental communication may be linked to parental mood symptoms.

  18. A building characterization-based method for the advancement of knowledge on external architectural features of traditional rural buildings

    Directory of Open Access Journals (Sweden)

    Porto, S. M. C.

    2013-12-01

    The significant role that traditional rural buildings have with regard to environmental conservation and rural development is widely acknowledged by the scientific community. These buildings must be protected from inappropriate building interventions that may stem from a rather superficial knowledge of them. Therefore, this study was directed towards overcoming such a limitation by developing a method based on the characterization of traditional rural buildings. In particular, the study aimed at the characterization of building materials and techniques used for the construction of a number of building components that make up the external envelope of traditional rural buildings. The application of the method to a homogeneous area of the Etna Regional Park (Italy) highlighted the need to improve the technical norms of the park's Territorial Coordination Plan to respect the building characteristics of the traditional rural buildings located in the protected area.

  19. An Exploratory Analysis for the Selection and Implementation of Advanced Manufacturing Technology by Fuzzy Multi-criteria Decision Making Methods: A Comparative Study

    Science.gov (United States)

    Nath, Surajit; Sarkar, Bijan

    2017-08-01

    Advanced Manufacturing Technologies (AMTs) offer opportunities for manufacturing organizations to improve their competitiveness and, in turn, their manufacturing effectiveness. Proper selection and evaluation of AMTs is a significant task in today's industry, but it involves considerable uncertainty and vagueness because many conflicting criteria must be dealt with. The task of selecting and evaluating AMTs therefore becomes tedious for evaluators, who are unable to provide crisp data for the criteria. Fuzzy Multi-criteria Decision Making (MCDM) methods help greatly in dealing with this problem. This paper focuses on the application of two promising fuzzy MCDM methods, COPRAS-G and EVAMIX, and on a comparative study between them based on some rarely considered criteria. Each of the two methods is a powerful evaluation tool in its own right. Although the two methods perform at almost the same level, their approaches are quite distinct. This distinction is illustrated through a numerical example of AMT selection.

  20. Advancing Physically-Based Flow Simulations of Alluvial Systems Through Atmospheric Noble Gases and the Novel 37Ar Tracer Method

    Science.gov (United States)

    Schilling, Oliver S.; Gerber, Christoph; Partington, Daniel J.; Purtschert, Roland; Brennwald, Matthias S.; Kipfer, Rolf; Hunkeler, Daniel; Brunner, Philip

    2017-12-01

    To provide a sound understanding of the sources, pathways, and residence times of groundwater in alluvial river-aquifer systems, a combined multitracer and modeling experiment was carried out in an important alluvial drinking water wellfield in Switzerland. 222Rn, 3H/3He, atmospheric noble gases, and the novel 37Ar method were used to quantify residence times and mixing ratios of water from different sources. With a half-life of 35.1 days, 37Ar successfully closed a critical observational time gap between 222Rn and 3H/3He for residence times of weeks to months. Covering the entire range of residence times of groundwater in alluvial systems revealed that, to quantify the fractions of water from different sources in such systems, atmospheric noble gases and helium isotopes are tracers suited for end-member mixing analysis. A comparison between the tracer-based mixing ratios and mixing ratios simulated with a fully integrated, physically based flow model showed that models which are only calibrated against hydraulic heads cannot reliably reproduce mixing ratios or residence times of alluvial river-aquifer systems. However, the tracer-based mixing ratios allowed the identification of an appropriate flow model parametrization. Consequently, for alluvial systems, we recommend combining multitracer studies that cover all relevant residence times with fully coupled, physically based flow modeling to better characterize the complex interactions of river-aquifer systems.
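
    A minimal Python sketch of tracer-based end-member mixing analysis is given below (hypothetical end-member signatures and sample values, not the study's data): the mixing fractions of several water sources are estimated so that their combined tracer signature matches the sample, with the fractions constrained to sum to one.

      # End-member mixing analysis by constrained least squares (illustrative numbers only).
      import numpy as np

      # Hypothetical end-member signatures: rows = tracers, columns = water sources
      # (e.g. river infiltrate, regional groundwater, hillslope water).
      end_members = np.array([
          [1.0, 4.0, 2.5],
          [0.2, 1.5, 0.9],
          [8.0, 1.0, 3.0],
      ])
      sample = np.array([2.2, 0.8, 4.5])                 # tracer signature measured in the well

      # Enforce sum(fractions) = 1 by appending a heavily weighted constraint row.
      w = 1e3
      A = np.vstack([end_members, w * np.ones(end_members.shape[1])])
      b = np.append(sample, w * 1.0)
      fractions, *_ = np.linalg.lstsq(A, b, rcond=None)

      # In practice the fractions should also be constrained to be non-negative
      # (e.g. with a non-negative least-squares solver) and uncertainties propagated.
      print("estimated mixing fractions:", np.round(fractions, 3), "sum =", round(fractions.sum(), 3))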

  1. Advances in agronomic management of phytoremediation: methods and results from a 10-year study of metal-polluted soils

    Directory of Open Access Journals (Sweden)

    Teofilo Vamerali

    2012-11-01

    Among green technologies addressed to metal pollution, phytoextraction has received increasing attention in recent years as an alternative to physical and chemical methods of decontamination. Since 1998, as part of an Italian multidisciplinary research team on phytoremediation, we have been carrying out several agronomic investigations with field crops in agricultural soil and pyrite waste, both markedly contaminated by heavy metals. Phytoextraction was rarely an efficient process, requiring a long time even to remove merely the bioavailable metal fraction, but the great metal stock in roots suggests exploring the effectiveness of long-term in planta stabilisation. Poor above-ground productivity was the main factor constraining metal removals, especially in wastes. Without assisting the process, only zinc (Zn), manganese (Mn) and copper (Cu) were harvested by the canopy in substantial amounts, with an estimated maximum of ~8 kg of metals per hectare with rapeseed in the agricultural soil and only 0.33 kg with fodder radish in pyrite waste. Root growth was a key trait in species and genotype selection, in view of the close relationship between root length and metal uptake. The auxins, humic acids and chelators tested on the model plant fodder radish generally increased metal concentrations in plant tissues, but reduced growth and removals. It is currently concluded that phytoremediation efficiency with crop species may be improved through increased productivity by suitable soil management, involving mineral and organic fertilisation, contaminant dilution, soil capping, and metal immobilisation with inorganics and biochar.

  2. Technical Advance: New in vitro method for assaying the migration of primary B cells using an endothelial monolayer as substrate.

    Science.gov (United States)

    Stewart-Hutchinson, Phillip J; Szasz, Taylor P; Jaeger, Emily R; Onken, Michael D; Cooper, John A; Morley, Sharon Celeste

    2017-09-01

    Migration of B cells supports their development and recruitment into functional niches. Therefore, defining factors that control B cell migration will lead to a better understanding of adaptive immunity. In vitro cell migration assays with B cells have been limited by poor adhesion of cells to glass coated with adhesion molecules. We have developed a technique using monolayers of endothelial cells as the substrate for B cell migration and used this technique to establish a robust in vitro assay for B cell migration. We use TNF-α to up-regulate surface expression of the adhesion molecule VCAM-1 on endothelial cells. The ligand VLA-4 is expressed on B cells, allowing them to interact with the endothelial monolayer and migrate on its surface. We tested our new method by examining the role of L-plastin (LPL), an F-actin-bundling protein, in B cell migration. LPL-deficient (LPL -/- ) B cells displayed decreased speed and increased arrest coefficient compared with wild-type (WT) B cells, following chemokine stimulation. However, the confinement ratios for WT and LPL -/- B cells were similar. Thus, we demonstrate how the use of endothelial monolayers as a substrate will support future interrogation of molecular pathways essential to B cell migration. © Society for Leukocyte Biology.
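
    The motility statistics named above (speed, arrest coefficient, confinement ratio) can be computed from tracked cell positions as in the following Python sketch (synthetic track and an assumed arrest-speed threshold; not the authors' analysis code).

      # Track statistics from an (x, y) trajectory sampled at fixed intervals.
      import numpy as np

      def track_metrics(xy, dt, arrest_speed=2.0):
          """xy: (n, 2) positions in um; dt: frame interval in min; arrest_speed in um/min."""
          steps = np.diff(xy, axis=0)
          step_len = np.linalg.norm(steps, axis=1)
          speeds = step_len / dt
          mean_speed = speeds.mean()
          arrest_coefficient = np.mean(speeds < arrest_speed)      # fraction of "stopped" frames
          net_displacement = np.linalg.norm(xy[-1] - xy[0])
          path_length = step_len.sum()
          confinement_ratio = net_displacement / path_length if path_length > 0 else 0.0
          return mean_speed, arrest_coefficient, confinement_ratio

      # Synthetic example track (hypothetical data), 30 s frame interval.
      rng = np.random.default_rng(1)
      track = np.cumsum(rng.normal(0.0, 1.5, size=(40, 2)), axis=0)
      print(track_metrics(track, dt=0.5))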

  3. Natural circulation data and methods for advanced water cooled nuclear power plant designs. Proceedings of a technical committee meeting

    International Nuclear Information System (INIS)

    2002-04-01

    The complex set of physical phenomena that occur in a gravity environment when a geometrically distinct heat sink and heat source are connected by a fluid flow path can be identified as natural circulation (NC). No external sources of mechanical energy for the fluid motion are involved when NC is established. Within the present context, natural convection is used to identify the phenomena that occur when a heat source is put in contact with a fluid. Therefore, natural convection characterizes a heat transfer regime that constitutes a subset of NC phenomena. This report contains the papers presented, and summarizes the discussions, at an IAEA Technical Committee Meeting (TCM) on Natural Circulation Data and Methods for Innovative Nuclear Power Plant Design. While the planned scope of the TCM involved all types of reactor designs (light water reactors, heavy water reactors, gas-cooled reactors and liquid metal-cooled reactors), the meeting participants and papers addressed only light water reactors (LWRs) and heavy water reactors (HWRs). Furthermore, the papers and discussion addressed both evolutionary and innovative water cooled reactors, as defined by the IAEA. The accomplishment of the objectives of achieving a high safety level and reducing the cost through the reliance on NC mechanisms requires a thorough understanding of those mechanisms. Natural circulation systems are usually characterized by smaller driving forces than systems that use an external source of energy for the fluid motion. For instance, pressure drops caused by vertical bends and siphons in a given piping system, or heat losses to the environment, are a secondary design consideration when a pump is installed and drives the flow. On the contrary, a significant influence upon the overall system performance may be expected from the same pressure drops and thermal power release to the environment when natural circulation produces the coolant flow. Therefore, the level of knowledge for

  4. Unmet Supportive Care Needs of Men With Locally Advanced and Metastatic Prostate Cancer on Hormonal Treatment: A Mixed Methods Study.

    Science.gov (United States)

    Paterson, Catherine; Kata, Sławomir Grzegorz; Nandwani, Ghulam; Das Chaudhury, Debi; Nabi, Ghulam

    Men affected by prostate cancer who are undergoing hormone therapy can endure a range of symptoms that can adversely affect quality of life. Little research has been conducted to date to understand the specific unmet supportive care needs of this patient group within the context of current service delivery. The aim of this study was to understand the experiences of unmet supportive care needs of men affected by prostate cancer on hormone therapy in the United Kingdom. This mixed methods study recruited 31 men with ≥T3 prostate cancer or worse treated by hormone therapy. A small cross-sectional survey (European Organization for Research and Treatment of Cancer [EORTC] C30 and PR25, Self-Management Self-Efficacy Scale, and the Supportive Care Needs Survey) was used to inform the interview schedule. Semi-structured interviews were conducted, and a framework approach was used to analyze the data. Complex unmet supportive care needs related to physical, psychological/emotional, intimacy/sexual, practical, health system/informational, existential, and patient-clinician communication domains were experienced. Men articulated that current healthcare delivery is failing to provide a holistic, person-centered model of care. This is one of the few studies that have identified the unmet supportive care needs of men receiving hormone therapy for ≥T3 prostate cancer or worse. The needs are multiple and far-ranging. Despite national cancer reforms, unmet supportive care needs persist. The findings from this study may be central in the re-design of future services to optimize men's quality of life and satisfaction with care. Clinicians are encouraged to use these findings to help them optimize care delivery and individual quality of life.

  5. A study on the advanced methods for on-line signal processing by using artificial intelligence in nuclear power plants

    International Nuclear Information System (INIS)

    Kim, Wan Joo

    1993-02-01

    signals in a certain time interval for reducing the loads of the fusion part. The simulation results of LOCA in the simulator are demonstrated for the classification of the signal trend. The demonstration is performed for the transient states of a steam generator. Using the fuzzy memberships, the pre-processors classify the trend types in each time interval into three classes: increase, decrease, and steady, which are fuzzy to classify. The result, compared with an artificial neural network that has no pre-processor, shows that the training time is reduced and the outputs are seldom influenced by noise. Because most of the knowledge of human operators includes fuzzy concepts and words, a method like this is very helpful for computerizing the human expert's knowledge

  6. Advances in analytical methods and occurrence of organic UV-filters in the environment — A review

    Energy Technology Data Exchange (ETDEWEB)

    Ramos, Sara; Homem, Vera, E-mail: vhomem@fe.up.pt; Alves, Arminda; Santos, Lúcia

    2015-09-01

    UV-filters are a group of compounds designed mainly to protect skin against UVA and UVB radiation, but they are also included in plastics, furniture, etc., to protect products from light damage. Their massive use in sunscreens for skin protection has been increasing due to the awareness of the chronic and acute effects of UV radiation. Some organic UV-filters have raised significant concerns in the past few years for their continuous usage, persistent input and potential threat to ecological environment and human health. UV-filters end up in wastewater and because wastewater treatment plants are not efficient in removing them, lipophilic compounds tend to sorb onto sludge and hydrophilics end up in river water, contaminating the existing biota. To better understand the risk associated with UV-filters in the environment a thorough review regarding their physicochemical properties, toxicity and environmental degradation, analytical methods and their occurrence was conducted. Higher UV-filter concentrations were found in rivers, reaching 0.3 mg/L for the most studied family, the benzophenone derivatives. Concentrations in the ng to μg/L range were also detected for the p-aminobenzoic acid, cinnamate, crylene and benzoyl methane derivatives in lake and sea water. Although at lower levels (few ng/L), UV-filters were also found in tap and groundwater. Swimming pool water is also a sink for UV-filters and its chlorine by-products, at the μg/L range, highlighting the benzophenone and benzimidazole derivatives. Soils and sediments are not frequently studied, but concentrations in the μg/L range have already been found especially for the benzophenone and crylene derivatives. Aquatic biota is frequently studied and UV-filters are found in the ng/g-dw range with higher values for fish and mussels. It has been concluded that more information regarding UV-filter degradation studies both in water and sediments is necessary and environmental occurrences should be monitored more

  7. Advances in analytical methods and occurrence of organic UV-filters in the environment--A review.

    Science.gov (United States)

    Ramos, Sara; Homem, Vera; Alves, Arminda; Santos, Lúcia

    2015-09-01

    UV-filters are a group of compounds designed mainly to protect skin against UVA and UVB radiation, but they are also included in plastics, furniture, etc., to protect products from light damage. Their massive use in sunscreens for skin protection has been increasing due to the awareness of the chronic and acute effects of UV radiation. Some organic UV-filters have raised significant concerns in the past few years for their continuous usage, persistent input and potential threat to ecological environment and human health. UV-filters end up in wastewater and because wastewater treatment plants are not efficient in removing them, lipophilic compounds tend to sorb onto sludge and hydrophilics end up in river water, contaminating the existing biota. To better understand the risk associated with UV-filters in the environment a thorough review regarding their physicochemical properties, toxicity and environmental degradation, analytical methods and their occurrence was conducted. Higher UV-filter concentrations were found in rivers, reaching 0.3mg/L for the most studied family, the benzophenone derivatives. Concentrations in the ng to μg/L range were also detected for the p-aminobenzoic acid, cinnamate, crylene and benzoyl methane derivatives in lake and sea water. Although at lower levels (few ng/L), UV-filters were also found in tap and groundwater. Swimming pool water is also a sink for UV-filters and its chlorine by-products, at the μg/L range, highlighting the benzophenone and benzimidazole derivatives. Soils and sediments are not frequently studied, but concentrations in the μg/L range have already been found especially for the benzophenone and crylene derivatives. Aquatic biota is frequently studied and UV-filters are found in the ng/g-dw range with higher values for fish and mussels. It has been concluded that more information regarding UV-filter degradation studies both in water and sediments is necessary and environmental occurrences should be monitored more

  8. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    International Nuclear Information System (INIS)

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations

  9. An Evaluation of the Accuracy of the Subtraction Method Used for Determining Platelet Counts in Advanced Platelet-Rich Fibrin and Concentrated Growth Factor Preparations

    Directory of Open Access Journals (Sweden)

    Taisuke Watanabe

    2017-01-01

    Platelet concentrates should be quality-assured for purity and identity prior to clinical use. Unlike for the liquid form of platelet-rich plasma, platelet counts cannot be directly determined in solid fibrin clots and are instead calculated by subtracting the counts in other liquid or semi-clotted fractions from those in whole blood samples. Having long suspected the validity of this method, we herein examined the possible loss of platelets in the preparation process. Blood samples collected from healthy male donors were immediately centrifuged for advanced platelet-rich fibrin (A-PRF) and concentrated growth factors (CGF) according to recommended centrifugal protocols. Blood cells in liquid and semi-clotted fractions were directly counted. Platelets aggregated on clot surfaces were observed by scanning electron microscopy. A higher centrifugal force increased the numbers of platelets and platelet aggregates in the liquid red blood cell fraction and the semi-clotted red thrombus in the presence and absence of the anticoagulant, respectively. Nevertheless, the calculated platelet counts in A-PRF/CGF preparations were much higher than expected, rendering the currently accepted subtraction method inaccurate for determining platelet counts in fibrin clots. To ensure the quality of solid types of platelet concentrates chairside in a timely manner, a simple and accurate platelet-counting method should be developed immediately.
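
    The logic criticised in the abstract can be made concrete with a small worked example (hypothetical platelet numbers): because the subtraction method credits every uncounted platelet to the fibrin clot, any platelets lost to aggregates or surfaces inflate the calculated clot content.

      # Toy arithmetic for the subtraction method (all numbers hypothetical).
      whole_blood_platelets = 2.0e9        # total platelets in the drawn sample
      counted_in_liquid_fractions = 0.6e9  # fractions that can be counted directly
      lost_to_aggregates = 0.5e9           # aggregates on surfaces / discarded fractions, not countable

      calculated_in_clot = whole_blood_platelets - counted_in_liquid_fractions
      actually_in_clot = whole_blood_platelets - counted_in_liquid_fractions - lost_to_aggregates
      overestimate = calculated_in_clot / actually_in_clot
      print(f"subtraction method: {calculated_in_clot:.2e}, true clot content: {actually_in_clot:.2e}")
      print(f"overestimation factor ~ {overestimate:.2f}")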

  10. A new primary cleft lip repair technique tailored for Asian patients that combines three surgical concepts: Comparison with rotation-advancement and straight-line methods.

    Science.gov (United States)

    Funayama, Emi; Yamamoto, Yuhei; Furukawa, Hiroshi; Murao, Naoki; Shichinohe, Ryuji; Hayashi, Toshihiko; Oyama, Akihiko

    2016-01-01

    Various techniques have been described for unilateral cleft lip repair. These may be broadly classified into three types of procedure/concept: the straight-line method (SL; Rose-Thompson effect); rotation-advancement (RA; upper-lip Z-plasty); and the triangular flap method (TA; lower-lip Z-plasty). Based on these procedures, cleft lip repair has evolved in recent decades. The cleft lip repair method in our institution has also undergone several changes. However, we have found that further modifications are needed for Asian patients who have wider philtral dimples and columns than Caucasians, while following the principles of the original techniques mentioned above. Here, we have incorporated the advantages of each procedure and propose a refined hybrid operating technique, seeking a more appropriate procedure for Asian patients. To evaluate our new technique, a comparison study was performed to evaluate RA, SL, and our technique. We have used our new technique to treat 137 consecutive cleft lip cases of all types and degrees of severity, with or without a cleft palate, since 2009. In the time since we adopted the hybrid technique, we have observed improved esthetics of the repaired lip. Our technique demonstrated higher glance impression average scores than RA/SL. Copyright © 2015 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  11. Parameter Identification with the Random Perturbation Particle Swarm Optimization Method and Sensitivity Analysis of an Advanced Pressurized Water Reactor Nuclear Power Plant Model for Power Systems

    Directory of Open Access Journals (Sweden)

    Li Wang

    2017-02-01

    The ability to obtain appropriate parameters for an advanced pressurized water reactor (PWR) unit model is of great significance for power system analysis. Such a model involves nonlinear relationships, long transition times and intercoupled parameters that are difficult to obtain from practical tests, which makes parameter identification complex and difficult. In this paper, a model and a parameter identification method for the PWR primary loop system were investigated. A parameter identification process was proposed, using a particle swarm optimization (PSO) algorithm based on random perturbation (RP-PSO). The identification process included model variable initialization based on the differential equations of each sub-module and a program setting method, parameter obtainment through sub-module identification in the Matlab/Simulink software (MathWorks Inc., Natick, MA, USA), as well as adaptation analysis for an integrated model. A large amount of parameter identification work was carried out, the results of which verified the effectiveness of the method. It was found that changes in some parameters, such as the fuel temperature and coolant temperature feedback coefficients, changed the model gain, and their trajectory sensitivities were not zero; obtaining appropriate values for them therefore had significant effects on the simulation results. The trajectory sensitivities of some parameters in the core neutron dynamics module were interrelated, making those parameters difficult to identify. Model parameter sensitivity could also differ with the model input conditions, reflecting how difficult a parameter is to identify under various input conditions.
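
    The following Python sketch shows generic particle swarm optimization with a simple random-perturbation step used for parameter identification, fitting the gain and time constant of a first-order model to a simulated step response; it illustrates the general approach only, not the authors' RP-PSO implementation or the PWR model.

      # Generic PSO with a random-perturbation step for a two-parameter identification problem.
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 30.0, 121)
      true_K, true_T = 1.8, 6.0
      y_meas = true_K * (1.0 - np.exp(-t / true_T)) + rng.normal(0.0, 0.02, t.size)

      def cost(params):
          K, T = params
          y = K * (1.0 - np.exp(-t / max(T, 1e-6)))
          return np.mean((y - y_meas) ** 2)

      n_particles, n_iters = 30, 200
      lo, hi = np.array([0.1, 0.5]), np.array([5.0, 20.0])
      pos = rng.uniform(lo, hi, (n_particles, 2))
      vel = np.zeros_like(pos)
      pbest = pos.copy()
      pbest_cost = np.array([cost(p) for p in pos])
      gbest = pbest[pbest_cost.argmin()].copy()

      for it in range(n_iters):
          r1, r2 = rng.random((2, n_particles, 1))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, lo, hi)
          # Random perturbation: occasionally re-seed a few particles to escape local minima.
          if it % 20 == 0:
              idx = rng.choice(n_particles, size=3, replace=False)
              pos[idx] = rng.uniform(lo, hi, (3, 2))
          c = np.array([cost(p) for p in pos])
          improved = c < pbest_cost
          pbest[improved], pbest_cost[improved] = pos[improved], c[improved]
          gbest = pbest[pbest_cost.argmin()].copy()

      print("identified [K, T] ~", np.round(gbest, 3), " true:", [true_K, true_T])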

  12. Review of pipe-break probability assessment methods and data for applicability to the advanced neutron source project for Oak Ridge National Laboratory

    International Nuclear Information System (INIS)

    Fullwood, R.R.

    1989-04-01

    The Advanced Neutron Source (ANS) (Difilippo, 1986; Gamble, 1986; West, 1986; Selby, 1987) will be the world's best facility for low energy neutron research. This performance requires the highest flux density of all non-pulsed reactors, with concomitant low thermal inertia and fast response to upset conditions. One of the primary concerns is that a flow cessation of the order of a second may result in fuel damage. Such a flow stoppage could be the result of a break in the primary piping. This report is a review of methods for assessing pipe break probabilities based on historical operating experience in power reactors, scaling methods, fracture mechanics and fracture growth models. The goal of this work is to develop parametric guidance for the ANS design to make the event highly unlikely. A further goal is to review and select methods that may be used in an interactive IBM-PC tool providing fast and reasonably accurate models to aid the ANS designers in achieving the safety requirements. 80 refs., 7 figs
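
    One of the simplest approaches in the reviewed family, estimation from historical operating experience, can be sketched as a Bayesian Poisson/Gamma rate estimate in Python (hypothetical event counts and exposure, not data from the report).

      # Pipe-break frequency from pooled operating experience with a Jeffreys prior.
      import numpy as np

      observed_breaks = 2          # large-break events seen in the pooled experience (assumed)
      reactor_years = 5.0e3        # pooled observation time (assumed)

      # With a Jeffreys prior, the posterior for the occurrence rate is
      # Gamma(shape = k + 0.5, rate = T); sample it to get an uncertainty interval.
      rng = np.random.default_rng(0)
      samples = rng.gamma(shape=observed_breaks + 0.5, scale=1.0 / reactor_years, size=200_000)

      print(f"mean frequency      ~ {samples.mean():.2e} per reactor-year")
      print(f"90% interval        ~ {np.percentile(samples, 5):.2e} .. {np.percentile(samples, 95):.2e}")
      print(f"P(break within 1 y) ~ {np.mean(1.0 - np.exp(-samples)):.2e}")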

  13. Advanced Ceramics

    International Nuclear Information System (INIS)

    1989-01-01

    The First Florida-Brazil Seminar on Materials and the Second State Meeting on new materials in Rio de Janeiro State show the specific technical contributions in the advanced ceramics sector. The other main topics discussed for the development of the country are the advanced ceramics programs, the market, national technical-scientific capability, advanced ceramics patents, etc. (C.G.C.) [pt

  14. Advanced methods in twin studies.

    Science.gov (United States)

    Kaprio, Jaakko; Silventoinen, Karri

    2011-01-01

    While twin studies have been used to estimate the heritability of different traits and disorders since the beginning of the twentieth century, statistical developments over the past 20 years and more extensive and systematic data collection have greatly expanded the scope of twin studies. This chapter reviews selected possibilities of twin study designs to address specific hypotheses regarding the role of both genetic and environmental factors in the development of traits and diseases. In addition to modelling latent genetic influences, current models permit inclusion of information on specific genetic variants, measured environmental factors and their interactive effects. Examples from studies of anthropometric traits are used to illustrate such approaches.

  15. Sentinel lymph node biopsy using dye alone method is reliable and accurate even after neo-adjuvant chemotherapy in locally advanced breast cancer - a prospective study

    Directory of Open Access Journals (Sweden)

    Mishra Ashwani

    2011-02-01

    Background: Sentinel lymph node biopsy (SLNB) is now considered a standard of care in early breast cancers with N0 axillae; however, its role in locally advanced breast cancer (LABC) after neo-adjuvant chemotherapy (NACT) is still being debated. The present study assessed the feasibility, efficacy and accuracy of SLNB using the "dye alone" (methylene blue) method in patients with LABC following NACT. Materials and methods: Thirty biopsy-proven cases of LABC that had received three cycles of neo-adjuvant chemotherapy (cyclophosphamide, adriamycin, 5-fluorouracil) were subjected to SLNB (using methylene blue dye) followed by complete axillary lymph node dissection (levels I-III). The sentinel node(s) and the axilla were individually assessed histologically. The SLN accuracy parameters were calculated employing standard definitions. The SLN identification rate in the present study was 100%. The sensitivity of SLNB was 86.6% while the accuracy was 93.3%, which were comparable with other studies done using the dual lymphatic mapping method. The SLN was found at level I in all cases and no untoward reaction to methylene blue dye was observed. Conclusions: This study confirms that SLNB using methylene blue dye as a sole mapping agent is reasonably safe and almost as accurate as the dual agent mapping method. It is likely that in the near future, SLNB may become the standard of care and provide a less morbid alternative to routine axillary lymph node dissection even in patients with LABC that have received NACT.

  16. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications, Phase 1. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.K.; Tracey, D.M.; Foley, M.R.; Paille, N.I.; Pelletier, P.J.; Sales, L.C.; Wilkens, C.A.; Yeckley, R.L. [Norton Co., Northboro, MA (United States)

    1993-08-01

    The program goals were to develop and demonstrate significant improvements in processing methods, process controls and non-destructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1,370°C. The program focused on a Si₃N₄-4% Y₂O₃ high temperature ceramic composition and hot-isostatic-pressing as the method of densification. Stage I had as major objectives: (1) comparing injection molding and colloidal consolidation process routes, and selecting one route for subsequent optimization, (2) comparing the performance of water milled and alcohol milled powder and selecting one on the basis of performance data, and (3) adapting several NDE methods to the needs of ceramic processing. The NDE methods considered were microfocus X-ray radiography, computed tomography, ultrasonics, NMR imaging, NMR spectroscopy, fluorescent liquid dye penetrant and X-ray diffraction residual stress analysis. The colloidal consolidation process route was selected and approved as the forming technique for the remainder of the program. The material produced by the final Stage II optimized process has been given the designation NCX 5102 silicon nitride. According to plan, a large number of specimens were produced and tested during Stage III to establish a statistically robust room temperature tensile strength database for this material. Highlights of the Stage III process demonstration and resultant database are included in the main text of the report, along with a synopsis of the NCX-5102 aqueous based colloidal process. The R and D accomplishments for Stage I are discussed in Appendices 1-4, while the tensile strength-fractography database for the Stage III NCX-5102 process demonstration is provided in Appendix 5. 4 refs., 108 figs., 23 tabs.

  17. Obstructive Form of Hypertrophic Cardiomyopathy-Left Ventricular Outflow Tract Gradient: Novel Methods of Provocation, Monitoring of Biomarkers, and Recent Advances in the Treatment

    Directory of Open Access Journals (Sweden)

    Pawel Petkow Dimitrow

    2016-01-01

    Dynamic (latent and/or labile) obstruction of the left ventricular outflow tract (LVOT) was recognized from the earliest clinical descriptions of hypertrophic cardiomyopathy (HCM) and has proved to be a complex phenomenon, as well as arguably the most audible ("visible") pathophysiological hallmark of this heterogeneous disease. The aim of the current review is focused on two novel issues in a subgroup of obstructive HCM. Firstly, an important methodological problem in HCM is the examination of the subgroup of patients with nonobstructive hypertrophy under resting conditions but with obstruction that is hard, yet possible, to provoke. Recently, investigators have proposed a physiological stress test (with a double combined stimulus) to disclose such patients. The upright exercise test is described in the ESC guideline on hypertrophic cardiomyopathy from 2014 and may emerge as a candidate for the gold standard provocation test. The second novel area of interest is associated with elevated levels of signaling biomarkers: hypercoagulation, hemolysis, acquired von Willebrand disease type 2A, and enhanced oxidative stress. The accelerated and turbulent flow within the narrow LVOT may be responsible for these biochemical disturbances. The most recent advances in the treatment of obstructive HCM are related to nonpharmacological methods of LVOT gradient reduction. This report extensively discusses these novel methods.

  18. Advanced methods for the risk, vulnerability and resilience assessment of safety-critical engineering components, systems and infrastructures, in the presence of uncertainties

    International Nuclear Information System (INIS)

    Pedroni, Nicolas

    2016-01-01

    Safety-critical industrial installations (e.g., nuclear plants) and infrastructures (e.g., power transmission networks) are complex systems composed by a multitude and variety of heterogeneous 'elements', which are highly interconnected and mutually dependent. In addition, such systems are affected by large uncertainties in the characterization of the failure and recovery behavior of their components, interconnections and interactions. Such characteristics raise concerns with respect to the system risk, vulnerability and resilience properties, which have to be accurately and precisely assessed for decision making purposes. In general, this entails the following main steps: (1) representation of the system to capture its main features; (2) construction of a mathematical model of the system; (3) simulation of the behavior of the system under various uncertain conditions to evaluate the relevant risk, vulnerability and resilience metrics by propagating the uncertainties through the mathematical model; (4) decision making to (optimally) determine the set of protective actions to effectively reduce (resp., increase) the system risk and vulnerability (resp., resilience). New methods to address these issues have been developed in this dissertation. Specifically, the research works have been carried out along two main axes: (1) the study of approaches for uncertainty modeling and quantification; (2) the development of advanced computational methods for the efficient system modeling, simulation and analysis in the presence of uncertainties. (author)

  19. Advanced Stellar Compass

    DEFF Research Database (Denmark)

    Madsen, Peter Buch; Jørgensen, John Leif; Thuesen, Gøsta

    1997-01-01

    This document describes all interface properties for the Advanced Stellar Compass, developed for the German Research Satellite "CHAMP". Basic operations, modes, software protocol, calibration methods and closed loop test strategies are described.

  20. Application of advanced machine learning methods on resting-state fMRI network for identification of mild cognitive impairment and Alzheimer's disease.

    Science.gov (United States)

    Khazaee, Ali; Ebrahimzadeh, Ata; Babajani-Feremi, Abbas

    2016-09-01

    The study of brain networks by resting-state functional magnetic resonance imaging (rs-fMRI) is a promising method for distinguishing patients with dementia from healthy controls (HC). Using graph theory, different aspects of the brain network can be efficiently characterized by calculating measures of integration and segregation. In this study, we combined a graph theoretical approach with advanced machine learning methods to study the brain network in 89 patients with mild cognitive impairment (MCI), 34 patients with Alzheimer's disease (AD), and 45 age-matched HC. The rs-fMRI connectivity matrix was constructed using a brain parcellation based on 264 putative functional areas. Using the optimal features extracted from the graph measures, we were able to accurately classify the three groups (i.e., HC, MCI, and AD) with an accuracy of 88.4%. We also investigated the performance of our proposed method for a binary classification of a group (e.g., MCI) from the two other groups (e.g., HC and AD). The classification accuracies for identifying HC from AD and MCI, AD from HC and MCI, and MCI from HC and AD, were 87.3%, 97.5%, and 72.0%, respectively. In addition, results based on the parcellation of 264 regions were compared to those obtained with the automated anatomical labeling atlas (AAL), which consists of 90 regions. The accuracy of classification of the three groups using AAL degraded to 83.2%. Our results show that combining the graph measures with the machine learning approach, on the basis of the rs-fMRI connectivity analysis, may assist in the diagnosis of AD and MCI.
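
    A schematic version of this pipeline, thresholding a connectivity matrix, extracting graph measures of integration and segregation, and classifying groups, is sketched below in Python with networkx and scikit-learn on purely synthetic data (the threshold, features and group sizes are illustrative assumptions, not those of the study).

      # Graph measures from synthetic "connectivity matrices" followed by a standard classifier.
      import numpy as np
      import networkx as nx
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)

      def graph_features(conn, thr=0.45):
          """Binarise a connectivity matrix and compute simple integration/segregation measures."""
          A = (conn >= thr).astype(int)
          np.fill_diagonal(A, 0)
          G = nx.from_numpy_array(A)
          return [nx.global_efficiency(G),                  # integration
                  nx.average_clustering(G),                 # segregation
                  nx.density(G),
                  np.mean([d for _, d in G.degree()])]

      def synthetic_subject(shift, n_nodes=40):
          """Toy symmetric matrix standing in for an rs-fMRI correlation matrix."""
          c = rng.normal(0.3 + shift, 0.1, (n_nodes, n_nodes))
          return (c + c.T) / 2.0

      # Two synthetic groups with slightly different mean connectivity (purely illustrative).
      X = np.array([graph_features(synthetic_subject(0.00)) for _ in range(30)] +
                   [graph_features(synthetic_subject(0.05)) for _ in range(30)])
      y = np.array([0] * 30 + [1] * 30)

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
      print("cross-validated accuracy:", cross_val_score(clf, X, y, cv=5).mean())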

  1. A survey of analytical methods employed for monitoring of Advanced Oxidation/Reduction Processes for decomposition of selected perfluorinated environmental pollutants.

    Science.gov (United States)

    Trojanowicz, Marek; Bobrowski, Krzysztof; Szostek, Bogdan; Bojanowska-Czajka, Anna; Szreder, Tomasz; Bartoszewicz, Iwona; Kulisa, Krzysztof

    2018-01-15

    The monitoring of Advanced Oxidation/Reduction Processes (AO/RPs) for the evaluation of the yield and mechanisms of decomposition of perfluorinated compounds (PFCs) is often a more difficult task than their determination in environmental, biological or food samples with complex matrices. This is mostly due to the formation of hundreds, or even thousands, of both intermediate and final products. The considered AO/RPs, involving free radical reactions, include photolytic and photocatalytic processes, Fenton reactions, sonolysis, ozonation, the application of ionizing radiation and several wet oxidation processes. The main attention is paid to the PFCs most commonly occurring in the environment, namely PFOA and PFOS. The most powerful and widely exploited method for this purpose is without a doubt LC/MS/MS, which allows the identification and trace quantitation of all species, with detectability and resolution power depending on the particular instrumental configuration. GC/MS is often employed for the monitoring of volatile fluorocarbons, confirming the formation of radicals in the processes of C‒C and C‒S bond cleavage. For the direct monitoring of radicals participating in the reactions of PFC decomposition, molecular spectroscopy is employed, especially electron paramagnetic resonance (EPR). UV/Vis spectrophotometry as a detection method is of special importance in the evaluation of the kinetics of radical reactions with the use of pulse radiolysis methods. The method most commonly employed for determining the yield of PFC mineralization is ion chromatography, but potentiometry with an ion-selective electrode and measurements of general parameters such as Total Organic Carbon and Total Organic Fluoride are also used. The presented review is based on about 100 original papers published in both analytical and environmental journals. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Prototype Demonstration of Gamma-Blind Tensioned Metastable Fluid Neutron/Multiplicity/Alpha Detector – Real Time Methods for Advanced Fuel Cycle Applications

    Energy Technology Data Exchange (ETDEWEB)

    McDeavitt, Sean M. [Texas A & M Univ., College Station, TX (United States)

    2016-12-20

    This report summarizes a multi-year effort to develop prototype detection equipment using the Tensioned Metastable Fluid Detector (TMFD) technology developed by Taleyarkhan [1]. The context of this development effort was the creation of new methods and instrumentation for safeguarding nuclear materials at various stages of the fuel cycle, especially in material balance areas (MBAs) and during reprocessing of used nuclear fuel. One of the challenges related to the implementation of any type of MBA and/or reprocessing technology (e.g., PUREX or UREX) is the real-time quantification and control of the transuranic (TRU) isotopes as they move through the process. Monitoring of higher actinides from their neutron emission (including multiplicity) and alpha signatures during transit in MBAs and in aqueous separations is a critical research area. On-line, real-time materials accountability makes diversion of the materials much more difficult. The Tensioned Metastable Fluid Detector (TMFD) is a transformational technology that is uniquely capable of both alpha and neutron spectroscopy while being "blind" to the intense gamma field that typically accompanies used fuel – simultaneously with the ability to provide multiplicity information as well [1-3]. The TMFD technology was proven (lab-scale) as part of a 2008 NERI-C program [1-7]. The bulk of this report describes the advancements and demonstrations made in TMFD technology. One final point to present before turning to the TMFD demonstrations is the context for discussing real-time monitoring of SNM. It is useful to review the spectrum of isotopes generated within nuclear fuel during reactor operations. Used nuclear fuel (UNF) from a light water reactor (LWR) contains fission products as well as TRU elements formed through neutron absorption/decay chains. The majority of the fission products are gamma and beta emitters and they represent the

  3. Method for an integrated and extended vulnerability assessment. Conceptual and methodological fundamentals and exemplary implementation for the water balance, power generation and energetic use of wood under climate change; Methode einer integrierten und erweiterten Vulnerabilitaetsbewertung. Konzeptionell-methodische Grundlagen und exemplarische Umsetzung fuer Wasserhaushalt, Stromerzeugung und energetische Nutzung von Holz unter Klimawandel

    Energy Technology Data Exchange (ETDEWEB)

    Weisz, Helga; Koch, Hagen; Lasch, Petra [Potsdam-Institut fuer Klimafolgenforschung e.V. (Germany)] [and others]

    2013-07-15

    Currently, there are more than a hundred investigations in Germany on the consequences of climate change. It is difficult to evaluate the vulnerability of Germany to climate change. Against this background, the authors of the contributions report on a method for an integrated and extended vulnerability assessment: conceptual and methodological fundamentals and an exemplary implementation for the water balance, power generation and the energetic use of wood under climate change.

  4. Evaluation of the cryogenic helium recovery process from natural gas based on flash separation by advanced exergy cost method - Linde modified process

    Science.gov (United States)

    Ansarinasab, Hojat; Mehrpooya, Mehdi; Parivazh, Mohammad Mehdi

    2017-10-01

    In this paper, the exergy cost analysis method is used to evaluate a new cryogenic helium recovery process from natural gas based on flash separation. An advanced exergoeconomic analysis was also carried out to determine the avoidable exergy destruction cost of the process components. The proposed process can extract helium from a feed gas stream with better efficiency than other existing processes. The results indicate that, according to the avoidable endogenous exergy destruction cost, the C-4 (287.2/hr), C-5 (257.3/hr) and C-6 (181.6/hr) compressors should be modified first, in that order. According to the endogenous investment and exergy destruction costs, the interactions between the process components are not strong. In the compressors, a high proportion of the exergy destruction cost is avoidable, while the investment costs are unavoidable. In the heat exchangers and air coolers, a high proportion of the exergy destruction cost is unavoidable, while the investment costs are avoidable. Finally, three different strategies are suggested to improve the performance of each component, and the sensitivity of the exergoeconomic factor and of the exergy destruction cost to the operating variables of the process is studied.
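
    The four-way cost split used in advanced exergoeconomic analysis can be illustrated with the following Python sketch (hypothetical component costs and fractions, not the paper's results; in a full analysis the unavoidable and endogenous parts come from separate simulations rather than assumed fractions).

      # Split each component's exergy destruction cost into avoidable/unavoidable and
      # endogenous/exogenous parts, then rank by the avoidable endogenous share.
      components = {
          # name: (total exergy destruction cost per hour, unavoidable fraction, endogenous fraction)
          "C-4 compressor": (350.0, 0.15, 0.92),
          "C-5 compressor": (310.0, 0.18, 0.90),
          "HX-1 exchanger": (280.0, 0.70, 0.60),
      }

      ranking = []
      for name, (cost, unavoid_frac, endo_frac) in components.items():
          avoidable = cost * (1.0 - unavoid_frac)
          avoidable_endogenous = avoidable * endo_frac     # caused by the component itself
          avoidable_exogenous = avoidable - avoidable_endogenous
          ranking.append((avoidable_endogenous, name, avoidable_exogenous))

      for av_endo, name, av_exo in sorted(ranking, reverse=True):
          print(f"{name}: avoidable endogenous ~ {av_endo:.1f}/hr, avoidable exogenous ~ {av_exo:.1f}/hr")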

  5. Development of improved processing and evaluation methods for high reliability structural ceramics for advanced heat engine applications Phase II. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Pujari, V.J.; Tracey, D.M.; Foley, M.R. [and others]

    1996-02-01

    The research program had as goals the development and demonstration of significant improvements in processing methods, process controls, and nondestructive evaluation (NDE) which can be commercially implemented to produce high reliability silicon nitride components for advanced heat engine applications at temperatures to 1370°C. In Phase I of the program a process was developed that resulted in a silicon nitride - 4 w% yttria HIP'ed material (NCX 5102) that displayed unprecedented strength and reliability. An average tensile strength of 1 GPa and a strength distribution following a 3-parameter Weibull distribution were demonstrated by testing several hundred buttonhead tensile specimens. The Phase II program focused on the development of methodology for colloidal consolidation producing green microstructure which minimizes downstream process problems such as drying, shrinkage, cracking, and part distortion during densification. Furthermore, the program focused on the extension of the process to gas pressure sinterable (GPS) compositions. Excellent results were obtained for the HIP composition processed for minimal density gradients, both with respect to room-temperature strength and high-temperature creep resistance. Complex component fabricability of this material was demonstrated by producing engine-vane prototypes. Strength data for the GPS material (NCX-5400) suggest that it ranks very high relative to other silicon nitride materials in terms of tensile/flexure strength ratio, a measure of volume quality. This high quality was derived from the closed-loop colloidal process employed in the program.

  6. TRAC-P1: an advanced best estimate computer program for PWR LOCA analysis. I. Methods, models, user information, and programming details

    Energy Technology Data Exchange (ETDEWEB)

    1978-05-01

    The Transient Reactor Analysis Code (TRAC) is being developed at the Los Alamos Scientific Laboratory (LASL) to provide an advanced "best estimate" predictive capability for the analysis of postulated accidents in light water reactors (LWRs). TRAC-P1 provides this analysis capability for pressurized water reactors (PWRs) and for a wide variety of thermal-hydraulic experimental facilities. It features a three-dimensional treatment of the pressure vessel and associated internals; two-phase nonequilibrium hydrodynamics models; flow-regime-dependent constitutive equation treatment; reflood tracking capability for both bottom flood and falling film quench fronts; and consistent treatment of entire accident sequences including the generation of consistent initial conditions. The TRAC-P1 User's Manual is composed of two separate volumes. Volume I gives a description of the thermal-hydraulic models and numerical solution methods used in the code. Detailed programming and user information is also provided. Volume II presents the results of the developmental verification calculations.

  7. A capture method based on the VC1 domain reveals new binding properties of the human receptor for advanced glycation end products (RAGE)

    Directory of Open Access Journals (Sweden)

    Genny Degani

    2017-04-01

    The Advanced Glycation and Lipoxidation End products (AGEs and ALEs) are a heterogeneous class of compounds derived from the non-enzymatic glycation of proteins or their adduction by lipoxidation breakdown products. The receptor for AGEs (RAGE) is involved in the progression of chronic diseases based on a persistent inflammatory state and oxidative stress. RAGE is a pattern recognition receptor (PRR), and the inhibition of its interaction with its ligands, or of ligand accumulation, has a potential therapeutic effect. The N-terminal domain of RAGE, the V domain, is the major site of AGE binding and is stabilized by the adjacent C1 domain. In this study, we set up an affinity assay relying on the extremely specific biological interaction that AGE ligands have with the VC1 domain. A glycosylated form of VC1, produced in the yeast Pichia pastoris, was attached to magnetic beads and used as an insoluble affinity matrix (VC1-resin). The VC1 interaction assay was employed to isolate specific VC1 binding partners from in vitro generated AGE-albumins, and the modifications were identified and localized by mass spectrometry analysis. Interestingly, this method also led to the isolation of ALEs produced by malondialdehyde treatment of albumins. Computational studies provided a rational interpretation of the contacts established by specific modified residues and amino acids of the V domain. The validation of the VC1-resin in capturing AGE-albumins from complex biological mixtures such as plasma and milk may lead to the identification of new RAGE ligands potentially involved in pro-inflammatory and pro-fibrotic responses, independently of their structures or physical properties, and without the use of any covalent derivatization process. In addition, the method can be applied to the identification of antagonists of the RAGE-ligand interaction.

  8. Computational Aerodynamic Simulations of an 840 ft/sec Tip Speed Advanced Ducted Propulsor Fan System Model for Acoustic Methods Assessment and Development

    Science.gov (United States)

    Tweedt, Daniel L.

    2014-01-01

    Computational aerodynamic simulations of an 840 ft/sec tip speed Advanced Ducted Propulsor fan system were performed at five different operating points on the fan operating line in order to provide detailed internal flow field information for use with fan acoustic prediction methods presently being developed, assessed and validated. The fan system is a sub-scale, low-noise research fan/nacelle model that has undergone extensive experimental testing in the 9- by 15-foot Low Speed Wind Tunnel at the NASA Glenn Research Center, resulting in high-quality, detailed aerodynamic and acoustic measurement data. Details of the fan geometry, the computational fluid dynamics methods, the computational grids, and various computational parameters relevant to the numerical simulations are discussed. Flow field results for three of the five operating conditions simulated are presented in order to provide a representative look at the computed solutions. Each of the five fan aerodynamic simulations involved the entire fan system, excluding a long core duct section downstream of the core inlet guide vane. As a result, only fan rotational speed and system bypass ratio, set by specifying the static pressure downstream of the core inlet guide vane row, were adjusted in order to set the fan operating point, leading to operating points that lie on a fan operating line and making mass flow rate a fully dependent parameter. The resulting mass flow rates are in good agreement with measured values. The computed blade row flow fields for all five fan operating points are, in general, aerodynamically healthy. Rotor blade and fan exit guide vane flow characteristics are good, including incidence and deviation angles, chordwise static pressure distributions, blade surface boundary layers, secondary flow structures, and blade wakes. Examination of the computed flow fields reveals no excessive boundary layer separations or related secondary-flow problems. A few spanwise comparisons between

  9. Advanced predictive methods for wine age prediction: Part I - A comparison study of single-block regression approaches based on variable selection, penalized regression, latent variables and tree-based ensemble methods.

    Science.gov (United States)

    Rendall, Ricardo; Pereira, Ana Cristina; Reis, Marco S

    2017-08-15

    In this paper we test and compare advanced predictive approaches for estimating wine age in the context of the production of a high-quality fortified wine, Madeira wine. We consider four different data sets, namely volatiles, polyphenols, organic acids and the UV-vis spectra. Each of these data sets contains chemical information of a different nature and presents a different data structure, namely a different dimensionality, level of collinearity and degree of sparsity. These different aspects may imply the use of different modelling approaches in order to better explore each data set's information content, namely its predictive potential for wine age. This is so because different regression methods make different prior assumptions regarding the predictors, the response variable(s) and the data-generating mechanism, which may or may not adhere well to the case study under analysis. In order to cover a wide range of modelling domains, we have incorporated in this work methods belonging to four very distinct classes of approaches that cover most applications found in practice: linear regression with variable selection, penalized regression, latent variable regression and tree-based ensemble methods. We have also developed a rigorous comparison framework based on a double Monte Carlo cross-validation scheme in order to perform the relative assessment of the performance of the various methods. Upon comparison, models built using the polyphenols and volatile composition data sets led to better wine age predictions, showing lower errors under testing conditions. Furthermore, the results obtained for the polyphenols data set suggest a sparser structure that can be further explored in order to reduce the number of measured variables. In terms of regression methods, tree-based methods, and boosted regression trees in particular, presented the best results for the polyphenols, volatile and organic acid data sets, suggesting a possible presence of a
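
    For readers unfamiliar with the kind of comparison framework described above, the sketch below runs a simplified, single-loop version of a Monte Carlo train/test evaluation over representatives of three of the four model classes mentioned: penalized regression, latent-variable regression, and a boosted tree ensemble. The data and variable names are placeholders, not the authors' Madeira wine data or code.

```python
# Sketch: Monte Carlo cross-validation comparison of regression families,
# loosely following the classes of methods discussed above (not the authors' code).
import numpy as np
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LassoCV
from sklearn.cross_decomposition import PLSRegression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_squared_error

# Placeholder data standing in for a chemical-composition matrix and wine age.
X, y = make_regression(n_samples=150, n_features=40, noise=5.0, random_state=0)

models = {
    "penalized (Lasso)": LassoCV(cv=5),
    "latent variables (PLS)": PLSRegression(n_components=5),
    "tree ensemble (boosting)": GradientBoostingRegressor(random_state=0),
}

n_repetitions = 20  # outer Monte Carlo loop: repeated random train/test splits
errors = {name: [] for name in models}
for rep in range(n_repetitions):
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=rep)
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        pred = np.ravel(model.predict(X_te))  # PLS returns a 2-D array
        errors[name].append(mean_squared_error(y_te, pred) ** 0.5)

for name, errs in errors.items():
    print(f"{name}: RMSE = {np.mean(errs):.2f} +/- {np.std(errs):.2f}")
```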

  10. Method

    Directory of Open Access Journals (Sweden)

    Ling Fiona W.M.

    2017-01-01

    Rapid prototyping of microchannels has gained considerable attention from researchers along with the rapid development of microfluidic technology. Conventional methods carry several disadvantages, such as high cost, long fabrication times, the need for high operating pressure and temperature, and the expertise required to operate the equipment. In this work, a new method adapting xurography is introduced to replace conventional microchannel fabrication. The novelty of this study is replacing the adhesive film with a clear plastic film, used to cut the microchannel design, as this material is more suitable for fabricating more complex microchannel designs. The microchannel was then molded using polydimethylsiloxane (PDMS) and bonded to a clean glass slide to produce a closed microchannel. The microchannel produced had a clean edge, indicating that a good master mold was produced with the cutting plotter, and the bonding between the PDMS and glass was good, with no leakage observed. The materials used in this method are cheap and the total fabrication time is less than 5 hours, making the method suitable for rapid prototyping of microchannels.

  11. Advanced instrumentation and analysis methods for in-pile thermal and nuclear measurements: from out-of-pile studies to irradiation campaigns

    International Nuclear Information System (INIS)

    Reynard-Carette, C.; Lyoussi, A.

    2015-01-01

    Research and development on nuclear fuel behavior under irradiation and on accelerated ageing of structural materials is a key issue for sustainable nuclear energy, in order to meet specific needs while keeping the best level of safety. A new Material Testing Reactor (MTR), the Jules Horowitz Reactor (JHR), currently under construction in the South of France at the CEA Cadarache research centre, will offer a real opportunity to perform R and D programs and hence will crucially contribute to the selection, optimization and qualification of innovative materials and fuels. To perform such programs, advanced, accurate and innovative experiments and irradiation devices containing material and fuel samples are required to be set up inside or beside the reactor core. These experiments need sophisticated in situ, online measurements to accurately determine specific parameters such as thermal and fast neutron fluxes, nuclear heating and temperature conditions, so as to precisely monitor and control the conducted assays. Consequently, since 2009 CEA and Aix-Marseille University have collaborated to design and develop a new multi-sensor device dedicated to measuring profiles of such conditions inside the experimental channels of the JHR. This work is performed in the framework of two complementary joint research programs called MAHRI-BETHY and INCORE. These programs couple experimental studies carried out both outside nuclear fluxes (in the laboratory) and under irradiation conditions (in the OSIRIS MTR in France and the MARIA MTR in Poland) with numerical work based on thermal simulations (CAST3M code) and Monte Carlo simulations (MCNP code). The programs have three main aims. The first corresponds to the design and/or test of new in-pile instrumentation. The second concerns the development of advanced calibration procedures, in particular for one specific sensor: a differential calorimeter used to quantify

  12. Advanced development of the boundary element method for elastic and inelastic thermal stress analysis. Ph.D. Thesis, 1987 Final Report

    Science.gov (United States)

    Henry, Donald P., Jr.

    1991-01-01

    The focus of this dissertation is on advanced development of the boundary element method for elastic and inelastic thermal stress analysis. New formulations for the treatment of body forces and nonlinear effects are derived. These formulations, which are based on particular integral theory, eliminate the need for volume integrals or extra surface integrals to account for these effects. The formulations are presented for axisymmetric, two- and three-dimensional analysis. Also in this dissertation, two-dimensional and axisymmetric formulations for elastic and inelastic, inhomogeneous stress analysis are introduced. The derivations account for inhomogeneities due to spatially dependent material parameters and for thermally induced inhomogeneities. The nonlinear formulations of the present work are based on an incremental initial-stress approach. Two inelastic solution algorithms are implemented: an iterative approach and a variable-stiffness approach. The von Mises yield criterion with variable hardening and the associated flow rules are adopted in these algorithms. All formulations are implemented in a general-purpose, multi-region computer code with the capability of local definition of boundary conditions. Quadratic, isoparametric shape functions are used to model the geometry and field variables of the boundary (and domain) of the problem. The multi-region implementation permits a body to be modeled in substructured parts, thus dramatically reducing the cost of analysis. Furthermore, it allows a body consisting of regions of different (homogeneous) materials to be studied. To test the program, results obtained for simple test cases are checked against their analytic solutions. Thereafter, a range of problems of practical interest is analyzed. In addition to displacement and traction loads, problems with body forces due to self-weight, centrifugal, and thermal loads are considered.
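
    As a rough illustration of the kind of incremental inelastic update referred to above (von Mises yield with hardening inside a predictor/corrector scheme), the sketch below implements a standard one-dimensional radial-return step with linear isotropic hardening. It is a generic textbook algorithm under assumed material parameters, not the dissertation's boundary element code.

```python
# Minimal sketch: 1-D radial-return (predictor/corrector) step for von Mises
# plasticity with linear isotropic hardening. Material values are assumed for
# illustration only; this is not the dissertation's BEM implementation.
E = 200e3        # Young's modulus [MPa]
H = 10e3         # linear isotropic hardening modulus [MPa]
sigma_y0 = 250.0 # initial yield stress [MPa]

def return_map(eps_total, eps_plastic, alpha):
    """One strain-driven update; returns (stress, new plastic strain, new hardening variable)."""
    sigma_trial = E * (eps_total - eps_plastic)          # elastic predictor
    f_trial = abs(sigma_trial) - (sigma_y0 + H * alpha)  # yield function
    if f_trial <= 0.0:
        return sigma_trial, eps_plastic, alpha           # elastic step
    dgamma = f_trial / (E + H)                           # plastic corrector
    sign = 1.0 if sigma_trial >= 0.0 else -1.0
    sigma = sigma_trial - E * dgamma * sign
    return sigma, eps_plastic + dgamma * sign, alpha + dgamma

# Drive the model through a few increasing strain increments.
eps_p, alpha = 0.0, 0.0
for eps in [0.0005, 0.001, 0.002, 0.004]:
    sigma, eps_p, alpha = return_map(eps, eps_p, alpha)
    print(f"strain={eps:.4f}  stress={sigma:7.1f} MPa  plastic strain={eps_p:.5f}")
```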

  13. Advanced Crash Avoidance Technologies (ACAT) Program - Final Report of the Volvo-Ford-UMTRI Project: Safety Impact Methodology for Lane Departure Warning - Method Development and Estimation of Benefits

    Science.gov (United States)

    2010-10-01

    The Volvo-Ford-UMTRI project: Safety Impact Methodology (SIM) for Lane Departure Warning is part of the U.S. Department of Transportation's Advanced Crash Avoidance Technologies (ACAT) program. The project developed a basic analytical framework for e...

  14. Advances in atomic spectroscopy

    CERN Document Server

    Sneddon, J

    1997-01-01

    This series describes selected advances in the area of atomic spectroscopy. It is primarily intended for readers who have a background in atomic spectroscopy, but it is suitable for both the novice and the expert. Atomic spectroscopy is a widely used and accepted method for metal and non-metal analysis in a variety of complex samples, and Advances in Atomic Spectroscopy covers a wide range of materials. Each chapter completely covers an area of atomic spectroscopy in which rapid development has occurred.

  15. Method

    Directory of Open Access Journals (Sweden)

    L. M. Kimball

    2002-01-01

    This paper presents an interior point algorithm to solve the multiperiod hydrothermal economic dispatch (HTED) problem. The multiperiod HTED is a large-scale nonlinear programming problem. Various optimization methods have been applied to the multiperiod HTED, but most neglect important network characteristics or require decomposition into thermal and hydro subproblems. The algorithm described here exploits the special bordered block-diagonal structure and sparsity of the Newton system for the first-order necessary conditions, resulting in a fast, efficient algorithm that can account for all network aspects. Applying this new algorithm challenges a conventional method for the use of available hydro resources known as the peak-shaving heuristic.
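
    To make the structural point above concrete, the sketch below assembles a small bordered block-diagonal system (one block per time period plus coupling rows, in the spirit of a multiperiod Newton system) and solves it with a sparse LU factorization. The matrix sizes and values are arbitrary placeholders; this only illustrates why exploiting that structure pays off and is not the paper's algorithm.

```python
# Sketch: assembling and solving a bordered block-diagonal system with sparse LU.
# Block sizes/values are arbitrary and illustrative only.
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import splu

rng = np.random.default_rng(0)
n_periods, block_size, n_coupling = 4, 5, 2   # e.g. 4 time periods, 2 coupling constraints

# Diagonal blocks: one well-conditioned block per time period.
blocks = [sp.csc_matrix(np.eye(block_size) * 10 + rng.random((block_size, block_size)))
          for _ in range(n_periods)]
A = sp.block_diag(blocks, format="csc")

# Border: coupling rows/columns linking all periods (e.g. reservoir continuity).
B = sp.csc_matrix(rng.random((n_coupling, n_periods * block_size)))
C = sp.csc_matrix(np.eye(n_coupling) * 5.0)
K = sp.bmat([[A, B.T], [B, C]], format="csc")  # bordered block-diagonal matrix

rhs = rng.random(K.shape[0])
x = splu(K).solve(rhs)                          # sparse LU keeps fill-in local to the blocks
print("residual norm:", np.linalg.norm(K @ x - rhs))
```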

  16. Advance Directives

    Science.gov (United States)

    ... about advance directives. Two well-known resources are mentioned, including Aging With Dignity, a national organization ...

  17. AdvancED Flex 4

    CERN Document Server

    Tiwari, Shashank; Schulze, Charlie

    2010-01-01

    AdvancED Flex 4 makes advanced Flex 4 concepts and techniques easy. Ajax, RIA, Web 2.0, mashups, mobile applications, the most sophisticated web tools, and the coolest interactive web applications are all covered with practical, visually oriented recipes. Completely updated for the new tools in Flex 4, the book demonstrates how to use Flex 4 to create robust and scalable enterprise-grade Rich Internet Applications and teaches you to build high-performance web applications with interactivity that really engages your users. What you'll learn: practiced beginners and intermediate users of Flex, especially

  18. Advance in structural bioinformatics

    CERN Document Server

    Wei, Dongqing; Zhao, Tangzhen; Dai, Hao

    2014-01-01

    This text examines in detail mathematical and physical modeling, computational methods and systems for obtaining and analyzing biological structures, using pioneering research cases as examples. As such, it emphasizes programming and problem-solving skills. It provides information on structural bioinformatics at various levels, with individual chapters covering introductory to advanced aspects, from fundamental methods and guidelines on acquiring and analyzing genomics and proteomics sequences, the structures of proteins, DNA and RNA, to the basics of physical simulations and methods for conform

  19. Advanced calculus

    CERN Document Server

    Nickerson, HK; Steenrod, NE

    2011-01-01

    ""This book is a radical departure from all previous concepts of advanced calculus,"" declared the Bulletin of the American Mathematics Society, ""and the nature of this departure merits serious study of the book by everyone interested in undergraduate education in mathematics."" Classroom-tested in a Princeton University honors course, it offers students a unified introduction to advanced calculus. Starting with an abstract treatment of vector spaces and linear transforms, the authors introduce a single basic derivative in an invariant form. All other derivatives - gradient, divergent, curl,

  20. Compensation methods to support generic graph editing: A case study in automated verification of schema requirements for an advanced transaction model

    NARCIS (Netherlands)

    Even, S.J.; Spelt, D.