WorldWideScience

Sample records for system analysis analyzing

  1. AnalyzeThis: An Analysis Workflow-Aware Storage System

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Hyogi [ORNL]; Kim, Youngjae [ORNL]; Vazhkudai, Sudharshan S [ORNL]; Tiwari, Devesh [ORNL]; Anwar, Ali [Virginia Tech, Blacksburg, VA]; Butt, Ali R [Virginia Tech, Blacksburg, VA]; Ramakrishnan, Lavanya [Lawrence Berkeley National Laboratory (LBNL)]

    2015-01-01

    The need for novel data analysis is urgent in the face of a data deluge from modern applications. Traditional approaches to data analysis incur significant data movement costs, moving data back and forth between the storage system and the processor. Emerging Active Flash devices enable processing on the flash, where the data already resides. An array of such Active Flash devices allows us to revisit how analysis workflows interact with storage systems. By seamlessly blending together the flash storage and data analysis, we create an analysis workflow-aware storage system, AnalyzeThis. Our guiding principle is that analysis-awareness be deeply ingrained in each and every layer of the storage, elevating data analyses as first-class citizens, and transforming AnalyzeThis into a potent analytics-aware appliance. We implement the AnalyzeThis storage system atop an emulation platform of the Active Flash array. Our results indicate that AnalyzeThis is viable, expediting workflow execution and minimizing data movement.

  2. An image analyzer system for the analysis of nuclear traces

    International Nuclear Information System (INIS)

    Cuapio O, A.

    1990-10-01

    Within the project on nuclear tracks and their application techniques, methods are being developed for detecting nuclear reactions of small cross section (not detectable by conventional methods), for studying accidental and personal neutron dosimeters, and for other purposes. All these studies are based on the fact that charged particles leave latent tracks in dielectric materials which, when etched with appropriate chemical solutions, are revealed and become visible under an optical microscope. From the analysis of the different track shapes it is possible to obtain the characteristic parameters of the incident particles (charge, mass and energy). From the track density it is possible to obtain the flux of the incident radiation and, consequently, the received dose. To carry out this analysis, different systems have been designed and coupled, which has allowed diverse problems to be solved. Nevertheless, it has become clear that making this activity more versatile requires an image analyzer system that allows images to be digitized, processed and displayed more rapidly. The present document presents a proposal to acquire the components necessary to assemble an image analyzing system in support of the aforementioned project. (Author)

  3. Data acquisition and analysis system for the ion microprobe mass analyzer

    International Nuclear Information System (INIS)

    Darby, D.M.; Cristy, S.S.

    1979-02-01

    A computer was interfaced to an ion microprobe mass analyzer for more rapid data acquisition and analysis. The interface is designed to allow data acquisition independent of the computer. A large data analysis package was developed and implemented. Performance of the computerized system was evaluated and compared to manual operation.

  4. Molecular Detection of Bladder Cancer by Fluorescence Microsatellite Analysis and an Automated Genetic Analyzing System

    Directory of Open Access Journals (Sweden)

    Sarel Halachmi

    2007-01-01

    To investigate the ability of an automated fluorescent analyzing system to detect microsatellite alterations in patients with bladder cancer, we investigated 11 patients with pathology-proven bladder transitional cell carcinoma (TCC) for microsatellite alterations in blood, urine, and tumor biopsies. DNA was prepared by standard methods from blood, urine and resected tumor specimens and used for microsatellite analysis. After the primers were fluorescently labeled, the DNA was amplified by PCR. The PCR products were loaded into the automated genetic analyzer (ABI Prism 310, Perkin Elmer, USA) and subjected to fluorescent scanning with an argon-ion laser beam. From the fluorescent signal intensity, the genetic analyzer determined the product size in base pairs. Using fluorescent microsatellite analysis and the automated analyzing system, we found loss of heterozygosity (LOH) or microsatellite alterations (a loss or gain of nucleotides that alters the original normal locus size) in all patients. In each case the genetic changes found in urine samples were identical to those found in the resected tumor sample. The studies demonstrated the ability to detect bladder tumors non-invasively by fluorescent microsatellite analysis of urine samples. Our study supports the worldwide trend toward non-invasive methods of detecting bladder cancer. We have overcome major obstacles that prevented the clinical use of an experimental system; with our newly tested system, microsatellite analysis can be done more cheaply, faster, more easily and with higher scientific accuracy.
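The LOH call itself is typically made by comparing the two allele peak intensities of an informative marker between normal and tumor DNA. A minimal sketch of that ratio-of-ratios test follows; the 1.5-fold threshold is a commonly used convention, not necessarily the cutoff used in this study:

```python
def loh_detected(normal_peaks, tumor_peaks, threshold=1.5):
    """Flag loss of heterozygosity from the two allele peak intensities
    of an informative (heterozygous) microsatellite marker."""
    n1, n2 = normal_peaks
    t1, t2 = tumor_peaks
    ratio = (n1 / n2) / (t1 / t2)     # ratio of allele ratios
    if ratio < 1.0:
        ratio = 1.0 / ratio           # direction of the allele loss is irrelevant
    return ratio >= threshold

print(loh_detected((1000, 950), (900, 300)))  # True: one tumor allele is reduced
print(loh_detected((1000, 950), (800, 760)))  # False: allele balance preserved
```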

  5. Computational methods for analyzing the transmission characteristics of a beta particle magnetic analysis system

    Science.gov (United States)

    Singh, J. J.

    1979-01-01

    Computational methods were developed to study the trajectories of beta particles (positrons) through a magnetic analysis system as a function of the spatial distribution of the radionuclides in the beta source, the size and shape of the source collimator, and the strength of the analyzer magnetic field. On the basis of these methods, the particle flux, the energy spectrum, and source-to-target transit times have been calculated for Na-22 positrons as a function of the analyzer magnetic field and the size and location of the target. These data are useful in studies requiring parallel beams of positrons of uniform energy, such as measurement of the moisture distribution in composite materials. Computer programs for obtaining various trajectories are included.
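The trajectory computation can be illustrated with a simple charged-particle integrator in a uniform field. The sketch below is a generic non-relativistic Boris rotation in arbitrary units, not the paper's actual programs; it reproduces the expected gyroradius v/((q/m)B):

```python
import numpy as np

def trace_positron(v0, q_over_m, B, dt, steps):
    """Integrate dv/dt = (q/m) v x B with the Boris rotation (uniform field B)."""
    t = 0.5 * q_over_m * np.asarray(B, float) * dt   # half-step rotation vector
    s = 2.0 * t / (1.0 + t @ t)
    pos = np.zeros(3)
    v = np.asarray(v0, float)
    traj = [pos.copy()]
    for _ in range(steps):
        v_prime = v + np.cross(v, t)
        v = v + np.cross(v_prime, s)                 # energy-conserving rotation
        pos = pos + v * dt
        traj.append(pos.copy())
    return np.array(traj)

# Unit charge-to-mass ratio, unit field along z, unit speed along x:
# the particle should circle with gyroradius 1 around the point (0, -1, 0).
traj = trace_positron([1.0, 0.0, 0.0], 1.0, [0.0, 0.0, 1.0], 0.01, 1000)
```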

  6. Analyzing Information Systems Development: A Comparison and Analysis of Eight IS Development Approaches.

    Science.gov (United States)

    Iivari, Juhani; Hirschheim, Rudy

    1996-01-01

    Analyzes and compares eight information systems (IS) development approaches: Information Modelling, Decision Support Systems, the Socio-Technical approach, the Infological approach, the Interactionist approach, the Speech Act-based approach, Soft Systems Methodology, and the Scandinavian Trade Unionist approach. Discusses the organizational roles…

  7. Analyzing Chaos Systems and Fine Spectrum Sensing Using Detrended Fluctuation Analysis Algorithm

    Directory of Open Access Journals (Sweden)

    Javier S. González-Salas

    2016-01-01

    A numerical study that applies the detrended fluctuation analysis (DFA) algorithm to time series obtained from linear and nonlinear dynamical systems is presented. The behavior of the DFA algorithm toward periodic and chaotic signals is investigated and the effect of the time scale under analysis is discussed. The results show that the DFA algorithm response is invariant (stable in performance) with respect to initial conditions and chaotic system parameters. An initial idea for implementing the DFA algorithm for fine spectrum sensing (SS) is proposed under a two-stage spectrum sensor approach, with a test statistic based on the value of the scaling exponent. The outcomes demonstrate a promising new SS technique that can alleviate several imperfections such as noise power uncertainty and spatial correlation between adjacent antenna array elements.
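For reference, the core of the DFA algorithm (integrate the series, detrend each window, fit the log-log slope of the fluctuation function) can be sketched as follows. The scale choices and the first-order (linear) detrending are assumptions, not necessarily those of the paper:

```python
import numpy as np

def dfa_alpha(x, scales):
    """Detrended fluctuation analysis: return the scaling exponent alpha."""
    y = np.cumsum(x - np.mean(x))              # integrated profile
    flucts = []
    for n in scales:
        n_win = len(y) // n                    # non-overlapping windows of size n
        segments = y[:n_win * n].reshape(n_win, n)
        t = np.arange(n)
        rms = []
        for seg in segments:
            coef = np.polyfit(t, seg, 1)       # local linear trend (DFA-1)
            rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
        flucts.append(np.mean(rms))
    # alpha is the slope of log F(n) versus log n
    return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

rng = np.random.default_rng(0)
alpha = dfa_alpha(rng.standard_normal(4096), [16, 32, 64, 128, 256])
print(alpha)  # approximately 0.5 for uncorrelated noise
```

A two-stage sensor would then threshold alpha (or a statistic derived from it) to decide whether structure beyond noise is present in the band.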

  8. Arsenic monitoring in intensive bovine production systems by neutron activation analysis

    International Nuclear Information System (INIS)

    Armelin, M.J.A.; Piasentin, R.M.; Primavesi, O.

    2000-01-01

    Neutron activation analysis was applied to determine arsenic in samples of several kinds of soils collected at two depths (0-20 and 20-40 cm), and in the roots and leaves of grasses cultivated on them, in order to check the final level of arsenic in leaves used for animal feeding. The results showed that the arsenic from limestone, fertilizers and agrochemicals applied to the soil to increase its fertility presents only a remote potential to harm cattle health. (author)

  9. Dual analyzer system for surface analysis dedicated for angle-resolved photoelectron spectroscopy at liquid surfaces and interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Niedermaier, Inga; Kolbeck, Claudia; Steinrück, Hans-Peter; Maier, Florian, E-mail: florian.maier@fau.de [Lehrstuhl für Physikalische Chemie II, FAU Universität Erlangen-Nürnberg, Egerlandstraße 3, 91058 Erlangen (Germany)

    2016-04-15

    The investigation of liquid surfaces and interfaces with the powerful toolbox of ultra-high vacuum (UHV)-based surface science techniques generally has to overcome the issue of liquid evaporation within the vacuum system. In the last decade, however, new classes of liquids with negligible vapor pressure at room temperature—in particular, ionic liquids (ILs)—have emerged for surface science studies. It has been demonstrated that particularly angle-resolved X-ray Photoelectron Spectroscopy (ARXPS) allows for investigating phenomena that occur at gas-liquid and liquid-solid interfaces on the molecular level. The results are not only relevant for IL systems but also for liquids in general. In all of these previous ARXPS studies, the sample holder had to be tilted in order to change the polar detection angle of emitted photoelectrons, which restricted the liquid systems to very thin viscous IL films coating a flat solid support. We now report on the concept and realization of a new and unique laboratory “Dual Analyzer System for Surface Analysis (DASSA)” which enables fast ARXPS, UV photoelectron spectroscopy, imaging XPS, and low-energy ion scattering at the horizontal surface plane of macroscopically thick non-volatile liquid samples. It comprises a UHV chamber equipped with two electron analyzers mounted for simultaneous measurements in 0° and 80° emission relative to the surface normal. The performance of DASSA on a first macroscopic liquid system will be demonstrated.
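The two fixed emission angles exploit the angle dependence of the XPS information depth, commonly approximated as 3λ·cos θ, where λ is the inelastic mean free path. A back-of-the-envelope sketch with an illustrative λ (not a value from the paper):

```python
import math

def xps_info_depth(imfp_nm, theta_deg):
    """Approximate 95% information depth of XPS: 3 * lambda * cos(theta)."""
    return 3.0 * imfp_nm * math.cos(math.radians(theta_deg))

# an illustrative inelastic mean free path of 3 nm:
print(round(xps_info_depth(3.0, 0), 2))   # 9.0  -> near-normal emission probes the bulk
print(round(xps_info_depth(3.0, 80), 2))  # 1.56 -> grazing emission is surface sensitive
```

This is why mounting one analyzer at 0° and one at 80° lets bulk- and surface-sensitive spectra be recorded simultaneously without tilting the liquid sample.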

  10. Dual analyzer system for surface analysis dedicated for angle-resolved photoelectron spectroscopy at liquid surfaces and interfaces

    International Nuclear Information System (INIS)

    Niedermaier, Inga; Kolbeck, Claudia; Steinrück, Hans-Peter; Maier, Florian

    2016-01-01

    The investigation of liquid surfaces and interfaces with the powerful toolbox of ultra-high vacuum (UHV)-based surface science techniques generally has to overcome the issue of liquid evaporation within the vacuum system. In the last decade, however, new classes of liquids with negligible vapor pressure at room temperature—in particular, ionic liquids (ILs)—have emerged for surface science studies. It has been demonstrated that particularly angle-resolved X-ray Photoelectron Spectroscopy (ARXPS) allows for investigating phenomena that occur at gas-liquid and liquid-solid interfaces on the molecular level. The results are not only relevant for IL systems but also for liquids in general. In all of these previous ARXPS studies, the sample holder had to be tilted in order to change the polar detection angle of emitted photoelectrons, which restricted the liquid systems to very thin viscous IL films coating a flat solid support. We now report on the concept and realization of a new and unique laboratory “Dual Analyzer System for Surface Analysis (DASSA)” which enables fast ARXPS, UV photoelectron spectroscopy, imaging XPS, and low-energy ion scattering at the horizontal surface plane of macroscopically thick non-volatile liquid samples. It comprises a UHV chamber equipped with two electron analyzers mounted for simultaneous measurements in 0° and 80° emission relative to the surface normal. The performance of DASSA on a first macroscopic liquid system will be demonstrated.

  11. Analyzing the Biology on the System Level

    OpenAIRE

    Tong, Wei

    2016-01-01

    Although various genome projects have provided us with enormous amounts of static sequence information, understanding sophisticated biology still requires integrating computational modeling, systems analysis, technology development for experiments, and quantitative experiments, all together, to analyze the biological architecture on various levels, which is precisely the origin of the systems biology discipline. This review discusses the object, its characteristics, and research attentions in systems biology,...

  12. MACSSA (Macintosh Safeguards Systems Analyzer)

    International Nuclear Information System (INIS)

    Argentesi, F.; Costantini, L.; Kohl, M.

    1986-01-01

    This paper discusses MACSSA, a fully interactive menu-driven software system for the accountancy of nuclear safeguards systems, written for the Apple Macintosh. Plant inventory and inventory change records can be entered interactively or downloaded from a mainframe database. Measurement procedures and instrument parameters can be defined. Partial or total statistics on propagated errors are computed and shown in tabular or graphic form.
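The propagated-error statistics presumably rest on the standard rule of combining independent measurement uncertainties in quadrature; a minimal sketch of that rule (not MACSSA's internal algorithm):

```python
import math

def propagated_sigma(sigmas):
    """Combine independent measurement uncertainties in quadrature:
    sigma_total = sqrt(sum of sigma_i squared)."""
    return math.sqrt(sum(s * s for s in sigmas))

# e.g. two independent measurement errors of 3 and 4 units combine to 5:
print(propagated_sigma([3.0, 4.0]))  # 5.0
```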

  13. Computer-based radionuclide analyzer system

    International Nuclear Information System (INIS)

    Ohba, Kengo; Ishizuka, Akira; Kobayashi, Akira; Ohhashi, Hideaki; Tsuruoka, Kimitoshi.

    1978-01-01

    Radionuclide analysis in nuclear power plants, practiced for the purposes of monitoring the quality of the primary loop water, confirming the performance of the reactor cleanup system, and monitoring radioactive waste effluent, is an important job. Important as it is, it requires considerable labor from experts, because the samples to be analyzed are multifarious and very large in number, and the job depends heavily on manual work. With a view to saving labor, simplifying and standardizing the work, reducing radiation exposure, and automating the analysis, a computerized analyzer system has been worked out. The results of its performance test at an operating power plant have proved that the development has largely accomplished its objectives and that the system is quite useful. The developmental work was carried out in cooperation between The Tokyo Electric Power Co. and Toshiba over about 4 years starting in 1974. (auth.)

  14. Analyzing the Pension System of the USSR

    Directory of Open Access Journals (Sweden)

    Aleksei V. Pudovkin

    2015-01-01

    The article deals with numerous aspects of the development of the pension system of the former USSR. Since the improvement of the Russian pension system is presently high on the agenda, the author believes that analyzing the country's own historical experience is essential in order to create a sound and efficient pension system in Russia. The study presented in the article aims to execute an in-depth analysis of legislation on the Soviet pension system, with a view to recreating the architecture of the pension system of the USSR. In addition, the study reflects on the official statistics for the period in question in order to draw a qualified and fundamental conclusion about the efficiency of the Soviet pension system. The evolution of the pension system, traced through statistical data, demonstrates its efficiency. It is highly recommended that the positive aspects of the Soviet pension system be taken into consideration when reforming the present pension system of the Russian Federation.

  15. Automated Root Tracking with "Root System Analyzer"

    Science.gov (United States)

    Schnepf, Andrea; Jin, Meina; Ockert, Charlotte; Bol, Roland; Leitner, Daniel

    2015-04-01

    Crucial factors for plant development are water and nutrient availability in soils. Thus, root architecture is a main aspect of plant productivity and needs to be accurately considered when describing root processes. Images of root architecture contain a huge amount of information, and image analysis helps to recover parameters describing certain root architectural and morphological traits. The majority of imaging systems for root systems are designed for two-dimensional images, such as RootReader2, GiA Roots, SmartRoot, EZ-Rhizo, and Growscreen, but most of them are semi-automated and require mouse clicks on each root by the user. "Root System Analyzer" is a new, fully automated approach for recovering root architectural parameters from two-dimensional images of root systems. Individual roots can still be corrected manually in a user interface if required. The algorithm starts with a sequence of segmented two-dimensional images showing the dynamic development of a root system. For each image, morphological operators are used for skeletonization. Based on this, a graph representation of the root system is created. A dynamic root architecture model helps to determine which edges of the graph belong to an individual root. The algorithm elongates each root at the root tip and simulates growth confined within the already existing graph representation. The increment of root elongation is calculated assuming a constant growth rate. For each root, the algorithm finds all possible paths and elongates the root in the direction of the optimal path. In this way, each edge of the graph is assigned to one or more coherent roots. Image sequences of root systems are handled in such a way that the previous image is used as a starting point for the current image. The algorithm is implemented in a set of Matlab m-files. The output of Root System Analyzer is a data structure that includes for each root an identification number, the branching order, the time of emergence, the parent
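The "finds all possible paths" step can be illustrated with a toy graph search. This is a generic Python sketch over a hypothetical skeleton graph, not the authors' Matlab implementation: it enumerates every simple path from a root base to the leaf nodes of the skeleton, among which an optimal growth direction would then be chosen.

```python
from collections import deque

def find_paths(graph, start):
    """All simple paths from start to leaf nodes in a skeleton graph (BFS)."""
    paths = []
    queue = deque([[start]])
    while queue:
        path = queue.popleft()
        nxt = [n for n in graph[path[-1]] if n not in path]
        if not nxt:                     # dead end: a complete candidate path
            paths.append(path)
        for n in nxt:
            queue.append(path + [n])
    return paths

# hypothetical skeleton: node 1 is a branching point with two tips (2 and 4)
skeleton = {0: [1], 1: [0, 2, 3], 2: [1], 3: [1, 4], 4: [3]}
print(find_paths(skeleton, 0))  # [[0, 1, 2], [0, 1, 3, 4]]
```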

  16. Isotopic abundance analysis of carbon, nitrogen and sulfur with a combined elemental analyzer-mass spectrometer system

    International Nuclear Information System (INIS)

    Pichlmayer, F.; Blochberger, K.

    1988-01-01

    Stable isotope ratio measurements of carbon, nitrogen and sulfur are of growing interest as analytical tool in many fields of research, but applications were somewhat hindered in the past by the fact that cumbersome sample preparation was necessary. A method has therefore been developed, consisting in essential of coupling an elemental analyzer with an isotope mass spectrometer, enabling fast and reliable conversion of C-, N- and S-compounds in any solid or liquid sample into the measuring gases carbon dioxide, nitrogen and sulfur dioxide for on-line isotopic analysis. The experimental set-up and the main characteristics are described in short and examples of application in environmental research, food analysis and clinical diagnosis are given. (orig.)
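Stable isotope ratios of this kind are conventionally reported in delta notation, per mil relative to an international standard. A minimal sketch of the conversion (the VPDB 13C/12C reference ratio used below is the commonly quoted value, an assumption rather than a number from this record):

```python
def delta_permil(r_sample, r_standard):
    """Stable isotope ratio in delta notation (per mil vs. a standard)."""
    return (r_sample / r_standard - 1.0) * 1000.0

# 13C/12C of a sample vs. the commonly quoted VPDB reference ratio 0.0112372:
print(round(delta_permil(0.0109, 0.0112372), 1))  # -30.0
```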

  17. Safety Analysis in Large Volume Vacuum Systems Like Tokamak: Experiments and Numerical Simulation to Analyze Vacuum Ruptures Consequences

    Directory of Open Access Journals (Sweden)

    A. Malizia

    2014-01-01

    Large-volume vacuum systems are used in many industrial operations and research laboratories. Accidents in these systems can have a significant economic and safety impact. A loss of vacuum accident (LOVA) due to a failure of the main vacuum vessel can result in fast pressurization of the vessel and the consequent mobilization and dispersion of hazardous internal material through the breaches. It is clear that the influence of the flow fields that follow accidents like LOVA on dust resuspension is a key safety issue. To develop this analysis, an experimental facility, STARDUST, has been developed. This facility has been used to improve knowledge about LOVA and to replicate conditions closer to the relevant operative conditions of tokamaks. From the experimental data, boundary conditions have been extrapolated to provide the proper input for 2D thermofluid-dynamic numerical simulations, performed with a commercial CFD code. Benchmarking the numerical results against the experimental ones has been used to validate and tune the 2D thermofluid-dynamic numerical model that the authors developed to replicate the LOVA conditions inside STARDUST. In the present work, the facility, materials, numerical model, and relevant results are presented.

  18. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Kubar, T. L.; Li, J.; Zhang, J.; Wang, W.

    2015-12-01

    Both the National Research Council Decadal Survey and the latest Intergovernmental Panel on Climate Change Assessment Report stressed the need for the comprehensive and innovative evaluation of climate models with the synergistic use of global satellite observations in order to improve our weather and climate simulation and prediction capabilities. The abundance of satellite observations for fundamental climate parameters and the availability of coordinated model outputs from CMIP5 for the same parameters offer a great opportunity to understand and diagnose model biases in climate models. In addition, the Obs4MIPs efforts have created several key global observational datasets that are readily usable for model evaluations. However, a model diagnostic evaluation process requires physics-based multi-variable comparisons that typically involve large-volume and heterogeneous datasets, making them both computationally and data intensive. In response, we have developed a novel methodology to diagnose model biases in contemporary climate models and implemented the methodology as a web-service based, cloud-enabled, provenance-supported climate-model evaluation system. The evaluation system is named Climate Model Diagnostic Analyzer (CMDA), which is the product of the research and technology development investments of several current and past NASA ROSES programs. The current technologies and infrastructure of CMDA are designed and selected to address several technical challenges that the Earth science modeling and model analysis community faces in evaluating and diagnosing climate models. In particular, we have three key technology components: (1) diagnostic analysis methodology; (2) web-service based, cloud-enabled technology; (3) provenance-supported technology. The diagnostic analysis methodology includes random forest feature importance ranking, conditional probability distribution function, conditional sampling, and time-lagged correlation map. We have implemented the
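The conditional-sampling diagnostic mentioned above can be sketched generically: select the values of one physical variable where a second variable falls in a given range. The variable names and the synthetic relationship below are illustrative, not CMDA's actual data:

```python
import numpy as np

def conditional_sample(x, cond, lo, hi):
    """Return the values of x where the conditioning variable falls in [lo, hi)."""
    return x[(cond >= lo) & (cond < hi)]

# synthetic example: precipitation conditioned on sea-surface temperature
rng = np.random.default_rng(1)
sst = rng.uniform(270.0, 305.0, 1000)                      # kelvin
precip = 0.1 * (sst - 270.0) + rng.normal(0.0, 0.5, 1000)  # made-up relationship
warm = conditional_sample(precip, sst, 300.0, 305.0)
print(warm.mean() > precip.mean())  # True: wetter over the warmest water
```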

  19. Climate Model Diagnostic Analyzer Web Service System

    Science.gov (United States)

    Lee, S.; Pan, L.; Zhai, C.; Tang, B.; Jiang, J. H.

    2014-12-01

    We have developed a cloud-enabled web-service system that empowers physics-based, multi-variable model performance evaluations and diagnoses through the comprehensive and synergistic use of multiple observational data, reanalysis data, and model outputs. We have developed a methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks. The web-service system, called Climate Model Diagnostic Analyzer (CMDA), currently supports (1) all the observational datasets from Obs4MIPs and a few ocean datasets from NOAA and Argo, which can serve as observation-based reference data for model evaluation, (2) many of the CMIP5 model outputs covering a broad range of atmosphere, ocean, and land variables from the CMIP5 specific historical runs and AMIP runs, and (3) ECMWF reanalysis outputs for several environmental variables in order to supplement observational datasets. Analysis capabilities currently supported by CMDA are (1) the calculation of annual and seasonal means of physical variables, (2) the calculation of the time evolution of the means in any specified geographical region, (3) the calculation of correlation between two variables, (4) the calculation of difference between two variables, and (5) the conditional sampling of one physical variable with respect to another variable. A web user interface was chosen for CMDA because it not only lowers the learning curve and removes the adoption barrier of the tool but also enables instantaneous use, avoiding the hassle of local software installation and environment incompatibility. CMDA will be used as an educational tool for the summer school organized by JPL's Center for Climate Science in 2014. In order to support 30+ simultaneous users during the school, we have deployed CMDA to the Amazon cloud environment. The cloud-enabled CMDA will provide each student with a virtual machine, while the user interaction with the system will remain the same.
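Of the listed capabilities, the correlation between two variables can be extended over a range of time lags, a diagnostic commonly used in such evaluations. A generic sketch (not CMDA's code), defining the lag-k statistic as the Pearson correlation of x[t] with y[t+k]:

```python
import numpy as np

def lagged_correlation(x, y, max_lag):
    """Pearson correlation of x[t] with y[t + lag] for each lag in [-max_lag, max_lag]."""
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            a, b = x[:-lag], y[lag:]
        elif lag < 0:
            a, b = x[-lag:], y[:lag]
        else:
            a, b = x, y
        out[lag] = np.corrcoef(a, b)[0, 1]
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(500)
y = np.roll(x, 2)                 # y lags x by two steps
c = lagged_correlation(x, y, 3)
print(round(c[2], 2))             # 1.0: the correlation peaks at the true lag
```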

  20. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    Energy Technology Data Exchange (ETDEWEB)

    Tang, A; Samost, A [Massachusetts Institute of Technology, Cambridge, Massachusetts (United States); Viswanathan, A; Cormack, R; Damato, A [Dana-Farber Cancer Institute - Brigham and Women’s Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. 
These results suggest that STPA can be successfully used to analyze safety in

  1. WE-G-BRA-07: Analyzing the Safety Implications of a Brachytherapy Process Improvement Project Utilizing a Novel System-Theory-Based Hazard-Analysis Technique

    International Nuclear Information System (INIS)

    Tang, A; Samost, A; Viswanathan, A; Cormack, R; Damato, A

    2015-01-01

    Purpose: To investigate the hazards in cervical-cancer HDR brachytherapy using a novel hazard-analysis technique, System Theoretic Process Analysis (STPA). The applicability and benefit of STPA to the field of radiation oncology is demonstrated. Methods: We analyzed the tandem and ring HDR procedure through observations, discussions with physicists and physicians, and the use of a previously developed process map. Controllers and their respective control actions were identified and arranged into a hierarchical control model of the system, modeling the workflow from applicator insertion through initiating treatment delivery. We then used the STPA process to identify potentially unsafe control actions. Scenarios were then generated from the identified unsafe control actions and used to develop recommendations for system safety constraints. Results: 10 controllers were identified and included in the final model. From these controllers 32 potentially unsafe control actions were identified, leading to more than 120 potential accident scenarios, including both clinical errors (e.g., using outdated imaging studies for planning), and managerial-based incidents (e.g., unsafe equipment, budget, or staffing decisions). Constraints identified from those scenarios include common themes, such as the need for appropriate feedback to give the controllers an adequate mental model to maintain safe boundaries of operations. As an example, one finding was that the likelihood of the potential accident scenario of the applicator breaking during insertion might be reduced by establishing a feedback loop of equipment-usage metrics and equipment-failure reports to the management controller. Conclusion: The utility of STPA in analyzing system hazards in a clinical brachytherapy system was demonstrated. This technique, rooted in system theory, identified scenarios both technical/clinical and managerial in nature. 
These results suggest that STPA can be successfully used to analyze safety in

  2. An image analyzer system for the analysis of nuclear traces; Un sistema analizador de imagenes para el analisis de trazas nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Cuapio O, A

    1990-10-15

    Within the project on nuclear tracks and their application techniques, methods are being developed for detecting nuclear reactions of small cross section (not detectable by conventional methods), for studying accidental and personal neutron dosimeters, and for other purposes. All these studies are based on the fact that charged particles leave latent tracks in dielectric materials which, when etched with appropriate chemical solutions, are revealed and become visible under an optical microscope. From the analysis of the different track shapes it is possible to obtain the characteristic parameters of the incident particles (charge, mass and energy). From the track density it is possible to obtain the flux of the incident radiation and, consequently, the received dose. To carry out this analysis, different systems have been designed and coupled, which has allowed diverse problems to be solved. Nevertheless, it has become clear that making this activity more versatile requires an image analyzer system that allows images to be digitized, processed and displayed more rapidly. The present document presents a proposal to acquire the components necessary to assemble an image analyzing system in support of the aforementioned project. (Author)

  3. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-10-01

    The Nuclear Plant Analyzer (NPA) is being developed as the US Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures during off-normal operation, for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses is the topic of this paper.

  4. CRISP90 - SOFTWARE DESIGN ANALYZER SYSTEM

    Science.gov (United States)

    Tausworthe, R. C.

    1994-01-01

    The CRISP90 Software Design Analyzer System, an update of CRISP-80, is a set of programs forming a software design and documentation tool which supports top-down, hierarchic, modular, structured design and programming methodologies. The quality of a computer program can often be significantly influenced by the design medium in which the program is developed. The medium must foster the expression of the programmer's ideas easily and quickly, and it must permit flexible and facile alterations, additions, and deletions to these ideas as the design evolves. The CRISP90 software design analyzer system was developed to provide the PDL (Programmer Design Language) programmer with such a design medium. A program design using CRISP90 consists of short, English-like textual descriptions of data, interfaces, and procedures that are embedded in a simple, structured, modular syntax. The display is formatted into two-dimensional, flowchart-like segments for a graphic presentation of the design. Together with a good interactive full-screen editor or word processor, the CRISP90 design analyzer becomes a powerful tool for the programmer. In addition to being a text formatter, the CRISP90 system prepares material that would be tedious and error prone to extract manually, such as a table of contents, module directory, structure (tier) chart, cross-references, and a statistics report on the characteristics of the design. Referenced modules are marked by schematic logic symbols to show conditional, iterative, and/or concurrent invocation in the program. A keyword usage profile can be generated automatically and glossary definitions inserted into the output documentation. Another feature is the capability to detect changes that were made between versions. Thus, "change-bars" can be placed in the output document along with a list of changed pages and a version history report. Also, items may be marked as "to be determined" and each will appear on a special table until the item is

  5. Analyzing endocrine system conservation and evolution.

    Science.gov (United States)

    Bonett, Ronald M

    2016-08-01

    Analyzing variation in rates of evolution can provide important insights into the factors that constrain trait evolution, as well as those that promote diversification. Metazoan endocrine systems exhibit apparent variation in evolutionary rates of their constituent components at multiple levels, yet relatively few studies have quantified these patterns and analyzed them in a phylogenetic context. This may be in part due to historical and current data limitations for many endocrine components and taxonomic groups. However, recent technological advancements such as high-throughput sequencing provide the opportunity to collect large-scale comparative data sets for even non-model species. Such ventures will produce a fertile data landscape for evolutionary analyses of nucleic acid and amino acid based endocrine components. Here I summarize evolutionary rate analyses that can be applied to categorical and continuous endocrine traits, and also those for nucleic acid and protein-based components. I emphasize analyses that could be used to test whether other variables (e.g., ecology, ontogenetic timing of expression, etc.) are related to patterns of rate variation and endocrine component diversification. The application of phylogenetic-based rate analyses to comparative endocrine data will greatly enhance our understanding of the factors that have shaped endocrine system evolution. Copyright © 2016 Elsevier Inc. All rights reserved.

  6. Nuclear plant analyzer development and analysis applications

    International Nuclear Information System (INIS)

    Laats, E.T.

    1984-01-01

    The Nuclear Plant Analyzer (NPA) is being developed as the U.S. Nuclear Regulatory Commission's (NRC's) state-of-the-art safety analysis and engineering tool to address key nuclear plant safety issues. The NPA integrates the NRC's computerized reactor behavior simulation codes such as RELAP5 and TRAC-BWR, both of which are well-developed computer graphics programs, and large repositories of reactor design and experimental data. Utilizing the complex reactor behavior codes as well as the experiment data repositories enables simulation applications of the NPA that are generally not possible with more simplistic, less mechanistic reactor behavior codes. These latter codes are used in training simulators or with other NPA-type software packages and are limited to displaying calculated data only. This paper describes four applications of the NPA in assisting reactor safety analyses. Two analyses evaluated reactor operating procedures, during off-normal operation, for a pressurized water reactor (PWR) and a boiling water reactor (BWR), respectively. The third analysis was performed in support of a reactor safety experiment conducted in the Semiscale facility. The final application demonstrated the usefulness of atmospheric dispersion computer codes for site emergency planning purposes. An overview of the NPA and how it supported these analyses are the topics of this paper.

  7. Modeling technical change in energy system analysis: analyzing the introduction of learning-by-doing in bottom-up energy models

    International Nuclear Information System (INIS)

    Berglund, Christer; Soederholm, Patrik

    2006-01-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aimed at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and embeds important policy implications, not the least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options-which is absent in many top-down models-they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they often fail in capturing strategic technology diffusion behavior in the energy sector as well as the energy sector's endogenous responses to policy, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). Some suggestions on how innovation and diffusion modeling in bottom-up analysis can be improved are put forward.

  8. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and embeds important policy implications, not the least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options - which is absent in many top-down models - they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail in capturing strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). For these reasons bottom-up and top-down models with induced technical change should not be viewed as substitutes but rather as complements.
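
The learning curves referred to in these two records are typically one-factor experience curves, in which unit cost falls by a fixed fraction with every doubling of cumulative production. A minimal sketch of that standard formulation (illustrative, not code from the papers):

```python
import math

def learning_curve_cost(c0, x, learning_rate):
    """One-factor experience curve: unit cost after cumulative production x
    (expressed in multiples of the reference cumulative production).

    c0            -- cost at the reference cumulative production
    learning_rate -- fractional cost reduction per doubling of cumulative
                     production (0.2 means each doubling cuts cost by 20%)
    """
    b = -math.log2(1.0 - learning_rate)   # experience index
    return c0 * x ** (-b)

# Two doublings at a 20% learning rate: 1000 -> 800 -> 640.
cost_at_4x = learning_curve_cost(1000.0, 4.0, 0.20)
```

Bottom-up models embed this relation per technology, which is what makes early deployment subsidies look attractive: costs fall endogenously with installed capacity.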

  9. An inline ion-exchange system in a chemiluminescence-based analyzer for direct analysis of N-nitrosamines in treated wastewater.

    Science.gov (United States)

    Kodamatani, Hitoshi; Roback, Shannon L; Plumlee, Megan H; Ishida, Kenneth P; Masunaga, Hiroto; Maruyama, Noboru; Fujioka, Takahiro

    2018-04-13

    A newly developed, ion exchange-based inline pretreatment system was used to mitigate the effect of background constituents in natural water and treated wastewater to achieve rapid, reliable, and sensitive analysis of N-nitrosamines. The pretreatment system (anion exchange module, AEM) was incorporated into a high-performance liquid chromatograph (HPLC) coupled with a photochemical reactor (PR) and chemiluminescence (CL) detector (HPLC-PR-CL), which can analyze four hydrophilic N-nitrosamines at ng/L levels. This system requires neither pre-concentration of the water sample nor the use of deuterated surrogates, unlike other conventional N-nitrosamine analytical techniques. The AEM converted anions in the eluent to hydroxide ions after HPLC separation and increased eluent pH, allowing for the subsequent photochemical reactions, which are otherwise achieved by pH conditioning with an additional dosing pump of basic chemical. The AEM also removed anionic interfering compounds (e.g. nitrate) from the samples, allowing for improved N-nitrosamine analysis in treated wastewater. The operating conditions of the AEM and PR were optimized to obtain sensitive and stable analytical performance. As a result, the lowest-concentration minimum reporting levels of N-nitrosodimethylamine, N-nitrosomorpholine, N-nitrosomethylethylamine, and N-nitrosopyrrolidine using the optimized system were 0.42, 0.54, 0.58, and 1.4 ng/L, respectively. The improved analytical method was validated by comparing the results with a conventional method based on gas chromatography coupled with a mass spectrometric ion trap detector. These results indicated that HPLC-PR-CL equipped with an inline AEM can be competitively applied as a rapid analytical technique for the determination of N-nitrosamines in various water matrices. Copyright © 2018 Elsevier B.V. All rights reserved.
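
The record quotes minimum reporting levels down to 0.42 ng/L. One common way such detection/reporting limits are derived (assumed here for illustration; the abstract does not state the paper's exact procedure) is from the spread of replicate low-level spike measurements, EPA-MDL style:

```python
import statistics

def method_detection_limit(replicates, t_value=3.143):
    """EPA-style method detection limit from replicate low-level spikes:
    MDL = t * s, with s the sample standard deviation and t the one-sided
    99% Student's t for n-1 degrees of freedom (3.143 for n = 7)."""
    return t_value * statistics.stdev(replicates)

# Hypothetical NDMA spike recoveries in ng/L (illustrative values only,
# not data from the paper).
ndma_replicates = [0.45, 0.52, 0.38, 0.49, 0.41, 0.47, 0.44]
mdl = method_detection_limit(ndma_replicates)
```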

  10. Systems Analyze Water Quality in Real Time

    Science.gov (United States)

    2010-01-01

    A water analyzer developed under Small Business Innovation Research (SBIR) contracts with Kennedy Space Center now monitors treatment processes at water and wastewater facilities around the world. Originally designed to provide real-time detection of nutrient levels in hydroponic solutions for growing plants in space, the ChemScan analyzer, produced by ASA Analytics Inc., of Waukesha, Wisconsin, utilizes spectrometry and chemometric algorithms to automatically analyze multiple parameters in the water treatment process with little need for maintenance, calibration, or operator intervention. The company has experienced a compound annual growth rate of 40 percent over its 15-year history as a direct result of the technology's success.

  11. Giessen polarization facility. III. Multi-detector analyzing system

    Energy Technology Data Exchange (ETDEWEB)

    Krause, H H; Stock, R; Arnold, W; Berg, H; Huttel, E; Ulbricht, J; Clausnitzer, G [Giessen Univ. (Germany, F.R.). Strahlenzentrum

    1977-06-15

    An analyzing system with a PDP 11 computer and a digital multiplexer is described. It accepts signals from 16 detectors with individual ADCs simultaneously. For measurements of analyzing powers, the polarization of the ion beam can be switched to zero at a frequency of 1 kHz. The switching operation additionally controls the handling of the detector pulses. The software contains special programs for the analysis of polarization experiments.

  12. SINDA, Systems Improved Numerical Differencing Analyzer

    Science.gov (United States)

    Fink, L. C.; Pan, H. M. Y.; Ishimoto, T.

    1972-01-01

    A computer program has been written to analyze a group of 100-node areas and then sum any number of 100-node areas to obtain a temperature profile. SINDA program options offer the user a variety of methods for the solution of thermal analog models presented in network format.
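
SINDA solves thermal analog models expressed as networks of nodes (capacitances) and conductors. A minimal sketch of one explicit time step for such a lumped-parameter network (illustrative of the model class only, not SINDA's own solution routines):

```python
def step_thermal_network(T, C, G, Q, dt):
    """One explicit (forward-Euler) step of a lumped-parameter thermal
    network: C_i * dT_i/dt = Q_i + sum_j G_ij * (T_j - T_i).

    T  -- node temperatures (K);  C -- node capacitances (J/K)
    G  -- {(i, j): conductance W/K}, each undirected pair listed once
    Q  -- external heat loads per node (W);  dt -- time step (s)
    """
    dTdt = [q / c for q, c in zip(Q, C)]
    for (i, j), g in G.items():
        flow = g * (T[j] - T[i])        # heat flowing from node j into node i
        dTdt[i] += flow / C[i]
        dTdt[j] -= flow / C[j]
    return [t + dt * d for t, d in zip(T, dTdt)]

# Two nodes joined by one conductor and carrying no external load relax
# toward a common temperature while total energy is conserved.
T = [300.0, 400.0]
for _ in range(10_000):
    T = step_thermal_network(T, [10.0, 10.0], {(0, 1): 1.0}, [0.0, 0.0], 0.1)
```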

  13. Multi-faceted data gathering and analyzing system

    International Nuclear Information System (INIS)

    Gustavson, D.B.; Rich, K.

    1977-10-01

    A low-cost, general-purpose data gathering and analyzing system based on a microprocessor, an interface to CAMAC, and a phone link to a time-sharing system was implemented. The parts cost for the microprocessor system was about $6000. The microprocessor buffers the data so that the variable response of the time-sharing system is acceptable for performing real-time data acquisition. Once this buffering problem is solved, the full power and flexibility of the time-sharing system excel at the task of on-line data analysis. 4 figures

  14. Statistical network analysis for analyzing policy networks

    DEFF Research Database (Denmark)

    Robins, Garry; Lewis, Jenny; Wang, Peng

    2012-01-01

    To analyze social network data using standard statistical approaches is to risk incorrect inference. The dependencies among observations implied in a network conceptualization undermine standard assumptions of the usual general linear models. One of the most quickly expanding areas of social and policy network methodology is the development of statistical modeling approaches that can accommodate such dependent data. In this article, we review three network statistical methods commonly used in the current literature: quadratic assignment procedures, exponential random graph models (ERGMs), and stochastic actor-oriented models. We focus most attention on ERGMs by providing an illustrative example of a model for a strategic information network within a local government. We draw inferences about the structural role played by individuals recognized as key innovators and conclude that such an approach...
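
The ERGM family reviewed in this record assigns each graph a probability proportional to exp(theta . s(G)), where s(G) is a vector of network statistics. A minimal sketch of the unnormalized weight for an edges-plus-triangles model (toy example with arbitrary parameters, not the authors' local-government model):

```python
import itertools
import math

def ergm_weight(edges, n, theta_edges, theta_triangles):
    """Unnormalized ERGM probability exp(theta . s(G)) for an undirected
    graph on n nodes, with statistics s(G) = (edge count, triangle count)."""
    adj = {frozenset(e) for e in edges}
    n_edges = len(adj)
    n_tri = sum(
        1 for a, b, c in itertools.combinations(range(n), 3)
        if {frozenset((a, b)), frozenset((b, c)), frozenset((a, c))} <= adj
    )
    return math.exp(theta_edges * n_edges + theta_triangles * n_tri)

# With equal edge counts, a positive triangle parameter up-weights the
# closed triad relative to an open 3-path (theta values chosen arbitrarily).
theta = (-1.0, 0.5)
w_triangle = ergm_weight([(0, 1), (1, 2), (0, 2)], 4, *theta)  # 3 edges, 1 triangle
w_path = ergm_weight([(0, 1), (1, 2), (2, 3)], 4, *theta)      # 3 edges, 0 triangles
```

Estimating theta requires approximating the intractable normalizing constant, which is what packages such as statnet's `ergm` do via MCMC.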

  15. Analyzing Innovation Systems (Burkina Faso) | IDRC - International ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Under the supervision of the national centre for scientific and technological research (CNRST), the forum on scientific research and technological innovation (FRSIT) will identify the principal players in the national system of ...

  16. Model for analyzing decontamination process systems

    International Nuclear Information System (INIS)

    Boykin, R.F.; Rolland, C.W.

    1979-06-01

    Selecting equipment and designing a new facility to minimize cost and maximize capacity is a problem managers face many times in the operations of a manufacturing organization. This paper deals with the actual analysis of equipment and facility design for a decontamination operation. Discussions of the equipment selection method and the development of the facility design criteria are presented, along with insight into the problems encountered in the equipment analysis for a new decontamination facility. The presentation also includes a review of the transition from the old facility to the new facility and the process used to minimize the cost and conveyance problems of the transition.

  17. Analyzing Trust Perceptions in System Implementations

    DEFF Research Database (Denmark)

    Schlichter, Bjarne Rerup; Rose, Jeremy

    2009-01-01

    To analyze trust perceptions in a system implementation, we develop a framework based on Giddens' theory of modernity. The framework theorizes dynamic elements of the evolution of trust, not previously investigated in this context. The data collection involves 4 actors interviewed twice in 2006 and 2007, and the data analysis strategy is content analysis using Nvivo software. We find that actors' perceptions of trust relations influence future actions, and in this way have both negative and positive consequences. We also conclude that Giddens' theories of trust provide a promising insight into the dynamic aspects of trust relations in implementation projects, which go further than trust theories...

  18. Revealing and analyzing networks of environmental systems

    Science.gov (United States)

    Eveillard, D.; Bittner, L.; Chaffron, S.; Guidi, L.; Raes, J.; Karsenti, E.; Bowler, C.; Gorsky, G.

    2015-12-01

    Understanding the interactions between microbial communities and their environment well enough to be able to predict diversity on the basis of physicochemical parameters is a fundamental pursuit of microbial ecology that still eludes us. However, modeling microbial communities is a complicated task, because (i) communities are complex, (ii) most are described qualitatively, and (iii) quantitative understanding of the way communities interact with their surroundings remains incomplete. Within this seminar, we will illustrate two complementary approaches that aim to overcome these points in different manners. First, we will present a network analysis that focuses on the biological carbon pump in the global ocean. The biological carbon pump is the process by which photosynthesis transforms CO2 to organic carbon that sinks to the deep ocean as particles, where it is sequestered. While the intensity of the pump correlates with plankton community composition, the underlying ecosystem structure and interactions driving this process remain largely uncharacterized. Here we use environmental and metagenomic data gathered during the Tara Oceans expedition to improve understanding of these drivers. We show that specific plankton communities correlate with carbon export and highlight unexpected and overlooked taxa such as Radiolaria, alveolate parasites and bacterial pathogens, as well as Synechococcus and their phages, as key players in the biological pump. Additionally, we show that the abundances of just a few bacterial and viral genes predict most of the variability in the global ocean's carbon export. Together these findings help elucidate ecosystem drivers of the biological carbon pump and present a case study for scaling from genes to ecosystems. Second, we will show preliminary results on a probabilistic modeling approach that predicts microbial community structure across observed physicochemical data, from a putative network and partial quantitative knowledge. This modeling shows that, despite
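
Co-occurrence networks like those underlying such plankton analyses are often built by thresholding pairwise rank correlations of taxon abundances across samples. A minimal sketch (hypothetical taxa and abundances, not the study's data or its actual inference method):

```python
def ranks(values):
    """1-based ranks; ties receive the average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        for k in range(i, j + 1):
            r[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rank correlation (Pearson correlation of the ranks)."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

def cooccurrence_edges(abundance, threshold=0.8):
    """Taxon pairs whose abundance profiles across samples have
    |Spearman rho| >= threshold -- the simplest co-occurrence network."""
    taxa = list(abundance)
    return [(a, b, spearman(abundance[a], abundance[b]))
            for i, a in enumerate(taxa) for b in taxa[i + 1:]
            if abs(spearman(abundance[a], abundance[b])) >= threshold]

# Hypothetical relative abundances across five samples (illustrative only).
abundance = {
    "Synechococcus": [1, 2, 3, 4, 5],
    "cyanophage":    [2, 3, 5, 7, 11],
    "Radiolaria":    [5, 1, 4, 2, 3],
}
edges = cooccurrence_edges(abundance)
```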

  19. Experimental analysis of a new retarding field energy analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Wei, Yu-Xiang [Shanghai Institute of Mechanical and Electrical Engineering, No. 3888, Yuanjiang Road, Minhang District, Shanghai 201109 (China); Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Shu-Qing; Li, Xian-Xia; Shen, Hong-Li; Huang, Ming-Guang [Institute of Electronics, Chinese Academy of Sciences, No. 19, North 4th Ring Road West, Haidian District, Beijing 100190 (China); Liu, Pu-Kun, E-mail: pkliu@pku.edu.cn [School of Electronics Engineering and Computer Science, Peking University, No. 5, Yiheyuan Road, Haidian District, Beijing 100871 (China)

    2015-06-11

    In this paper, a new compact retarding field energy analyzer (RFEA) is designed for diagnosing the electron beams of a K-band space travelling-wave tube (TWT). This analyzer has an aperture plate to sample electron beams and a cylindrical electrode to overcome defocusing effects. The front end of the analyzer, constructed as a multistage depression collector (MDC) structure, is intended to shape the field to prevent electrons from being accelerated enough to escape. The direct-current (DC) beams of the K-band space TWTs, with the MDC removed, can be investigated on the beam measurement system. The current density distribution of the DC beams is determined by the analyzer while the anode voltage and helix voltage of the TWTs are 7000 V and 6850 V, respectively. The slope of the current curve caused by the reflection of secondary electrons on the copper collector of the analyzer is discussed. The experimental analysis shows that this RFEA has good energy resolution, satisfying the requirements of beam measurement. - Highlights: • A new retarding field energy analyzer (RFEA) is designed to diagnose the electron beam of a K-band space TWT. • The current density distribution of the direct-current beam is determined. • The reflection effect of secondary electrons on the copper collector of the analyzer is discussed.
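
A retarding field analyzer recovers the beam energy spectrum by differentiating the collector current with respect to the retarding voltage: I(V) integrates the distribution above eV, so -dI/dV approximates the spectrum. A minimal sketch on synthetic data (the 6850 V center matches the quoted helix voltage; the curve shape and width are assumptions for illustration):

```python
import math

def energy_spectrum(voltages, currents):
    """Central-difference estimate of -dI/dV on a (possibly non-uniform)
    retarding-voltage grid; returns (voltage, spectrum value) pairs."""
    out = []
    for k in range(1, len(voltages) - 1):
        dIdV = (currents[k + 1] - currents[k - 1]) / (voltages[k + 1] - voltages[k - 1])
        out.append((voltages[k], -dIdV))
    return out

# Synthetic I-V curve: a logistic roll-off centered on 6850 V.
volts = [6800.0 + 5.0 * i for i in range(21)]
curve = [1.0 / (1.0 + math.exp((v - 6850.0) / 10.0)) for v in volts]
peak_v = max(energy_spectrum(volts, curve), key=lambda p: p[1])[0]
```

Real data would additionally need smoothing before differentiation, since the derivative amplifies measurement noise.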

  20. Development of automatic nuclear plate analyzing system equipped with TV measuring unit and its application to analysis of elementary particle reaction, 1

    International Nuclear Information System (INIS)

    Ushida, Noriyuki

    1987-01-01

    Various improvements have been made to an analysis system that was previously reported. Twenty-five emulsion plates, each with a decreased size of 3 cm x 3 cm, are mounted on a single acrylic resin sheet to reduce the required measurement time. An interface called New DOMS (digitized on-line microscope) is designed to reduce the analysis time and to improve the reliability of the analysis. The newly developed analysis system consists of five blocks: a stage block (with a measuring range of 170 mm along the x and y axes and 2 mm along the z axis and an accuracy of 1 μm for each axis), DG-M10 host computer (with external storage of a 15M byte hard disk and 368k byte minifloppy disk), DOMS interface (for control of the stage, operation of the graphic image and control of the CCD TV measuring unit), CCD TV measuring unit (equipped with a CCD TV camera to display the observed emulsion on a TV monitor for measuring the grain position), and measurement terminal (consisting of a picture monitor, video terminal module and keyboards). This report also shows a DOMS system function block diagram (crate controller and I/O, phase converter, motor controller, sub CPU for display, graphic memory, ROM writer, power supply), describes the CCD TV measuring unit hardware (CCD TV camera, sync. separator, window generator, darkest point detector, mixer, focus counter), and outlines the connections among the components. (Nogami, K.)

  1. A multichannel analyzer computer system for simultaneously measuring 64 spectra

    International Nuclear Information System (INIS)

    Jin Yuheng; Wan Yuqing; Zhang Jiahong; Li Li; Chen Guozhu

    2000-01-01

    The author introduces a multichannel analyzer computer system for simultaneously measuring 64 spectra with 64 coded independent inputs. The system was developed for a double-chopper neutron scattering time-of-flight spectrometer. The system structure, coding method, operating principle and performance are presented. The system can also be used for other nuclear physics experiments that need a multichannel analyzer with independent coded inputs.

  2. Development of a PDA Based Portable Pulse Height Analyzer System

    International Nuclear Information System (INIS)

    Mankheed, Panuphong; Ngernvijit, Narippawaj; Thong-Aram, Decho

    2007-08-01

    Full text: In this research a portable pulse height analyzer system was developed by combining a Personal Digital Assistant (PDA), a Palm Tungsten T model, with the Single Chip SCA developed by the Department of Nuclear Technology, Chulalongkorn University, to be used for education and research work. The developed system can measure both the energy and the average count rate of gamma rays. The results of this research showed that, with a 2″ x 2″ NaI(Tl) detector, the gamma energy spectrum analysis of the developed system could display the photopeak of Cs-137 at channel 57 and the photopeaks of Co-60 at channels 103 and 117. The energy resolution was found to be 7.14% at the 661.66 keV energy of Cs-137.
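
The quoted channels and isotopes imply a roughly linear energy calibration. A sketch that fits it and restates the resolution figure (the Cs-137 and Co-60 peak energies are standard tabulated values, assumed here; they are not stated in the record):

```python
def linear_calibration(points):
    """Least-squares line E = a*channel + b through (channel, energy) peaks."""
    n = len(points)
    sx = sum(c for c, _ in points)
    sy = sum(e for _, e in points)
    sxx = sum(c * c for c, _ in points)
    sxy = sum(c * e for c, e in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    return a, (sy - a * sx) / n

# Channels from the record; energies (keV) are the standard photopeak
# values for Cs-137 (661.66) and Co-60 (1173.23, 1332.49).
a, b = linear_calibration([(57, 661.66), (103, 1173.23), (117, 1332.49)])

# The quoted 7.14% resolution at 661.66 keV corresponds to a FWHM of
# roughly 47 keV.
fwhm_kev = 0.0714 * 661.66
```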

  3. DMET-analyzer: automatic analysis of Affymetrix DMET data.

    Science.gov (United States)

    Guzzi, Pietro Hiram; Agapito, Giuseppe; Di Martino, Maria Teresa; Arbitrio, Mariamena; Tassone, Pierfrancesco; Tagliaferri, Pierosandro; Cannataro, Mario

    2012-10-05

    Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data, aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship between variation in patient genomes and drug metabolism, detecting SNPs (Single Nucleotide Polymorphisms) on genes related to drug metabolism. This may make it possible, for instance, to find genetic variants in patients who present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow testing the association of the presence of SNPs with the response to drugs. We developed DMET-Analyzer, a tool for the automatic association analysis between variation in patient genomes and the clinical conditions of patients, i.e. the different responses to drugs. The proposed system allows: (i) automating the workflow of analysis of DMET-SNP data, avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNPs with pathways through searches in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by the Affymetrix DMET-Console in an interactive way. The effectiveness and ease of use of DMET-Analyzer are demonstrated through different case studies regarding the analysis of
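
The SNP-vs-drug-response association that DMET-Analyzer tests is, at its core, a contingency-table analysis. A minimal sketch of one such test (a Pearson chi-square on hypothetical counts; the tool's actual statistical procedure may differ):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction) for:

                      responder   non-responder
        SNP present       a             b
        SNP absent        c             d
    """
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Hypothetical cohort: 30 of 40 SNP carriers respond to the drug,
# versus 10 of 40 non-carriers (illustrative numbers only).
chi2 = chi_square_2x2(30, 10, 10, 30)   # far above the 3.84 critical
                                        # value at p = 0.05 with 1 df
```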

  4. DMET-Analyzer: automatic analysis of Affymetrix DMET Data

    Directory of Open Access Journals (Sweden)

    Guzzi Pietro

    2012-10-01

    Full Text Available Abstract Background Clinical Bioinformatics is currently growing and is based on the integration of clinical and omics data aiming at the development of personalized medicine. Thus the introduction of novel technologies able to investigate the relationship among clinical states and biological machineries may help the development of this field. For instance the Affymetrix DMET platform (drug metabolism enzymes and transporters) is able to study the relationship among the variation of the genome of patients and drug metabolism, detecting SNPs (Single Nucleotide Polymorphism) on genes related to drug metabolism. This may allow for instance to find genetic variants in patients which present different drug responses, in pharmacogenomics and clinical studies. Despite this, there is currently a lack in the development of open-source algorithms and tools for the analysis of DMET data. Existing software tools for DMET data generally allow only the preprocessing of binary data (e.g. the DMET-Console provided by Affymetrix) and simple data analysis operations, but do not allow to test the association of the presence of SNPs with the response to drugs. Results We developed DMET-Analyzer a tool for the automatic association analysis among the variation of the patient genomes and the clinical conditions of patients, i.e. the different response to drugs. The proposed system allows: (i) to automatize the workflow of analysis of DMET-SNP data avoiding the use of multiple tools; (ii) the automatic annotation of DMET-SNP data and the search in existing databases of SNPs (e.g. dbSNP); (iii) the association of SNP with pathway through the search in PharmaGKB, a major knowledge base for pharmacogenomic studies. DMET-Analyzer has a simple graphical user interface that allows users (doctors/biologists) to upload and analyse DMET files produced by Affymetrix DMET-Console in an interactive way. The effectiveness and easy use of DMET Analyzer is demonstrated through different

  5. Radiometric flow injection analysis with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    Radiometric Flow Injection Analysis of a radioactive (¹³¹I) sample is described. For the analysis, an ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used. (author) 5 refs.; 3 figs.

  6. Multi channel analyzer system addible to personal computer

    International Nuclear Information System (INIS)

    Ramirez J, F.J.; Garcia R, R.; Ramirez N, R.; Torres B, M.A.

    1996-01-01

    A 4096-channel multichannel analyzer system that attaches to a personal computer has been developed for use in nuclear radiation work, such as X-ray fluorescence analysis and neutron activation analysis, where the energy distribution of the radiation is of interest. The system has three modules: a fast 12-bit analog-to-digital converter (total conversion time of 6 μs) that uses a successive-approximation technique with linearity correction by the sliding-scale method; a digital card with a microprocessor that serves as the interface with the computer for acquisition, data storage and process control; and a computer program that makes extensive use of graphics and is friendly to the user, with options to find peaks, expand a region of interest, store information in a format compatible with spectrum analysis programs, etc. Twenty-five units of this system have been constructed and distributed to the ARCAL member countries, and another 10 units were distributed within the National Institute of Nuclear Research. The system has also found other applications in which the information can be converted to pulses and the variable of interest is represented by the pulse amplitude. (Author)

  7. AnalyzeHOLE: An Integrated Wellbore Flow Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    Keith J. Halford

    2009-10-01

    Conventional interpretation of flow logs assumes that hydraulic conductivity is directly proportional to flow change with depth. However, well construction can significantly alter the expected relation between changes in fluid velocity and hydraulic conductivity. Strong hydraulic conductivity contrasts between lithologic intervals can be masked in continuously screened wells. Alternating intervals of screen and blank casing also can greatly complicate the relation between flow and hydraulic properties. More permeable units are not necessarily associated with rapid fluid-velocity increases. Thin, highly permeable units can be misinterpreted as thick and less permeable intervals or not identified at all. These conditions compromise standard flow-log interpretation because vertical flow fields are induced near the wellbore. AnalyzeHOLE, an integrated wellbore analysis tool for simulating flow and transport in wells and aquifer systems, provides a better alternative for simulating and evaluating complex well-aquifer system interaction. A pumping well and adjacent aquifer system are simulated with an axisymmetric, radial geometry in a two-dimensional MODFLOW model. Hydraulic conductivities are distributed by depth and estimated with PEST by minimizing squared differences between simulated and measured flows and drawdowns. Hydraulic conductivity can vary within a lithology but variance is limited with regularization. Transmissivity of the simulated system also can be constrained to estimates from single-well pumping tests. Water-quality changes in the pumping well are simulated with simple mixing models between zones of differing water quality. These zones are differentiated by backtracking thousands of particles from the well screens with MODPATH. An Excel spreadsheet is used to interface the various components of AnalyzeHOLE by (1) creating model input files, (2) executing MODFLOW, MODPATH, PEST, and supporting FORTRAN routines, and (3) importing and graphically
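
The "conventional interpretation" the abstract argues against — interval conductivity taken proportional to the flow gained over that interval — can be sketched directly (hypothetical numbers; AnalyzeHOLE itself replaces this with MODFLOW/PEST inversion):

```python
def flowlog_conductivities(depths, flows, total_T):
    """Conventional flow-log interpretation:
        K_i = total_T * (dQ_i / Q_total) / dz_i

    depths  -- interval boundaries, shallow to deep (length n+1)
    flows   -- upward borehole flow measured at each boundary (length n+1)
    total_T -- transmissivity of the whole screened section (m^2/d)
    """
    q_total = flows[0] - flows[-1]       # total inflow over the logged section
    ks = []
    for i in range(len(depths) - 1):
        dq = flows[i] - flows[i + 1]     # inflow gained within interval i
        dz = depths[i + 1] - depths[i]
        ks.append(total_T * (dq / q_total) / dz)
    return ks

# Hypothetical log: two 10 m intervals contributing 60% and 40% of inflow,
# with 50 m^2/d total transmissivity from a single-well pumping test.
ks = flowlog_conductivities([10.0, 20.0, 30.0], [100.0, 40.0, 0.0], 50.0)
```

The paper's point is that well construction induces vertical flow near the bore, so this proportionality can badly misattribute conductivity; hence the model-based alternative.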

  8. Agent-based Modeling Methodology for Analyzing Weapons Systems

    Science.gov (United States)

    2015-03-26

technique involve model structure, system representation and the degree of validity, coupled with the simplicity, of the overall model. ABM is best suited... system representation of the air combat system. We feel that a simulation model that combines ABM with equation-based representation of weapons and... AGENT-BASED MODELING METHODOLOGY FOR ANALYZING WEAPONS SYSTEMS THESIS Casey D. Connors, Major, USA

  9. Point-of-care, portable microfluidic blood analyzer system

    Science.gov (United States)

    Maleki, Teimour; Fricke, Todd; Quesenberry, J. T.; Todd, Paul W.; Leary, James F.

    2012-03-01

Recent advances in MEMS technology have provided an opportunity to develop microfluidic devices with enormous potential for portable, point-of-care, low-cost medical diagnostic tools. Hand-held flow cytometers will soon be used in disease diagnosis and monitoring. Despite much interest in miniaturizing commercially available cytometers, they remain costly, bulky, and require expert operation. In this article, we report progress on the development of a battery-powered handheld blood analyzer that will quickly and automatically process a drop of whole human blood by real-time, on-chip magnetic separation of white blood cells (WBCs), fluorescence analysis of labeled WBC subsets, and counting a reproducible fraction of the red blood cells (RBCs) by light scattering. The whole blood (WB) analyzer is composed of a micro-mixer, a special branching/separation system, an optical detection system, and electronic readout circuitry. A droplet of un-processed blood is mixed with the reagents, i.e. magnetic beads and fluorescent stain, in the micro-mixer. Valve-less sorting is achieved by magnetic deflection of magnetic microparticle-labeled WBCs. LED excitation in combination with an avalanche photodiode (APD) detection system is used for counting fluorescent WBC subsets using several colors of immune-Qdots, while counting a reproducible fraction of red blood cells (RBCs) is performed using a laser light scattering measurement with a photodiode. Optimized branching/channel width is achieved using Comsol Multi-Physics™ simulation. To accommodate full portability, all required power supplies (40 V, +/-10 V, and +3 V) are provided via step-up voltage converters from one battery. A simple onboard lock-in amplifier is used to increase the sensitivity/resolution of the pulse counting circuitry.

  10. Analyzing Software Errors in Safety-Critical Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1994-01-01

This paper analyzes the root causes of safety-related software faults. Faults identified as potentially hazardous to the system are distributed somewhat differently over the set of possible error causes than non-safety-related software faults.

  11. A Framework for Modeling and Analyzing Complex Distributed Systems

    National Research Council Canada - National Science Library

    Lynch, Nancy A; Shvartsman, Alex Allister

    2005-01-01

    Report developed under STTR contract for topic AF04-T023. This Phase I project developed a modeling language and laid a foundation for computational support tools for specifying, analyzing, and verifying complex distributed system designs...

  12. Information Security Analysis: A Study to Analyze the Extent to Which Information Security Systems Can Be Utilized to Prevent Intoxicated Individuals from Driving

    Science.gov (United States)

    Pierre, Joseph D.

    2011-01-01

    Information security systems (ISS) have been designed to protect assets from damages and from unauthorized access internally as well as externally. This research is promising similar protection from ISS methods that could prevent intoxicated individuals under the influence of alcohol from driving. However, previous research has shown significant…

  13. Analyzing Systemic Risk in the Chinese Banking System

    NARCIS (Netherlands)

    Huang, Qiubin; de Haan, Jakob; Scholtens, Bert

    We examine systemic risk in the Chinese banking system by estimating the conditional value at risk (CoVaR), the marginal expected shortfall (MES), the systemic impact index (SII) and the vulnerability index (VI) for 16 listed banks in China. Although these measures show different patterns, our
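
One of the measures named in this record, the marginal expected shortfall (MES), is straightforward to compute: it is a bank's average return on the days when the market return falls in its worst tail. A sketch on synthetic returns (the 5% tail cutoff, the beta of 1.2, and all return series are assumptions for illustration, not the paper's data):

```python
import numpy as np

rng = np.random.default_rng(4)
market = rng.normal(0.0, 0.01, 1000)                 # synthetic market returns
bank = 1.2 * market + rng.normal(0.0, 0.005, 1000)   # bank with assumed beta 1.2

cutoff = np.quantile(market, 0.05)     # threshold for the worst-5% market days
mes = bank[market <= cutoff].mean()    # bank's mean return on those days
```

A more negative MES indicates a bank that loses more, on average, exactly when the market is in distress — the sense in which the abstract's measures rank systemic importance.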

  14. Analysis and discussion on the experimental data of electrolyte analyzer

    Science.gov (United States)

    Dong, XinYu; Jiang, JunJie; Liu, MengJun; Li, Weiwei

    2018-06-01

In the subsequent verification of electrolyte analyzers, we found that the instruments achieve good repeatability and stability in repeated measurements over a short period of time, in line with the requirements of the verification regulation for linear error and cross-contamination rate. However, large indication errors are very common, and the measurement results of different manufacturers differ greatly. To find and solve this problem, help enterprises improve product quality, and obtain accurate and reliable measurement data, we conducted an experimental evaluation of electrolyte analyzers and analyzed the data statistically.

  15. Computer program analyzes and monitors electrical power systems (POSIMO)

    Science.gov (United States)

    Jaeger, K.

    1972-01-01

Requirements to monitor and/or simulate electric power distribution, power balance, and charge budget are discussed. A computer program to analyze the power system and generate a set of characteristic power system data is described. Its application to status indicators denoting different exclusive conditions is presented.

  16. Wind energy system time-domain (WEST) analyzers

    Science.gov (United States)

    Dreier, M. E.; Hoffman, J. A.

    1981-01-01

    A portable analyzer which simulates in real time the complex nonlinear dynamics of horizontal axis wind energy systems was constructed. Math models for an aeroelastic rotor featuring nonlinear aerodynamic and inertial terms were implemented with high speed digital controllers and analog calculation. This model was combined with other math models of elastic supports, control systems, a power train and gimballed rotor kinematics. A stroboscopic display system graphically depicting distributed blade loads, motion, and other aerodynamic functions on a cathode ray tube is included. Limited correlation efforts showed good comparison between the results of this analyzer and other sophisticated digital simulations. The digital simulation results were successfully correlated with test data.

  17. Analyze of the Measuring Performance for Artificially Business Intelligent Systems

    OpenAIRE

    Vatuiu, Teodora

    2007-01-01

This paper analyzes the measuring performance of artificially business intelligent systems. Thousands of person-years have been devoted to research and development in the various aspects of artificially intelligent systems. Much progress has been attained. However, there has been no means of evaluating the progress of the field. How can we assess the current state of the science? Most business intelligent systems are beginning to be deployed commercially. How can a commercial buyer ...

  18. Regional modeling approach for analyzing harmonic stability in radial power electronics based power system

    DEFF Research Database (Denmark)

    Yoon, Changwoo; Bai, Haofeng; Wang, Xiongfei

    2015-01-01

Stability analysis of a distributed power generation system becomes complex when there are many grid inverters in the system. In order to analyze system stability, the overall network impedance is lumped and needs to be analyzed one by one. However, using a unified bulky transfer-fu...... and then it is expanded for generalizing its concept to an overall radial structured network....

  19. On-line analyzers to distributed control system linking

    Energy Technology Data Exchange (ETDEWEB)

    Peterson, S.F.; Buchanan, B.R.; Sanders, M.A.

    1990-01-01

The Analytical Development Section (ADS) of the Savannah River Laboratory is developing on-line analyzers to monitor various site processes. Data from some of the on-line analyzers (OLA's) will be used for process control by distributed control systems (DCS's) such as the Fisher PRoVOX. A problem in the past has been finding an efficient and cost-effective way to get analyzer data onto the DCS data highway. ADS is developing a system to accomplish the linking of OLA's to PRoVOX DCS's. The system is described, and results of operation in a research and development environment are given. Plans for installation in the production environment are discussed.

  20. The security system analyzer: An application of the Prolog language

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Seeman, S.E.

    1986-01-01

    The Prolog programming language and entity-relationship modeling techniques were used to demonstrate a methodology for security system applications. A knowledge base was built that consists of statements modeling a generic building and surrounding area, including security fences and intrusion detectors (sensors and TV cameras). Declarative Prolog statements have the capability to use the knowledge base information in a routine manner to provide descriptive information about sensors, to dynamically update the knowledge base to provide on-line recording of changes in detector status or maintenance history, and to analyze the configuration of the building, surrounding area, and intrusion detector layout and current operability status in order to determine all the pathways from one specified point to another specified point which result in the detection probability being less than some specified value (i.e., find the ''weakest paths''). This ''search'' capability, which is the heart of the SECURITY program, allows the program to perform a CAD (computer aided design) function, and to provide a real-time security degradation analysis if intrusion detectors become inoperable. 2 refs., 3 figs
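
The "weakest path" search this record describes — finding routes whose overall detection probability falls below a threshold — maps onto a classic shortest-path problem: maximizing the product of per-sensor miss probabilities is the same as minimizing the sum of their negative logarithms. A sketch on an invented facility graph (the record's actual implementation is a Prolog knowledge base, not this Python code):

```python
import heapq
import math

# Invented facility graph: each edge carries the detection probability of the
# sensors guarding that move between zones.
edges = {
    "gate":  [("yard", 0.6), ("fence", 0.2)],
    "fence": [("yard", 0.4)],
    "yard":  [("door", 0.3)],
    "door":  [("vault", 0.9)],
}

def weakest_path(src, dst):
    """Return (detection probability, path) of the least-detected route."""
    # Dijkstra on cost = -log(miss probability); lowest cost = weakest path.
    pq = [(0.0, src, [src])]
    best = {}
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return 1.0 - math.exp(-cost), path
        if best.get(node, float("inf")) <= cost:
            continue
        best[node] = cost
        for nxt, p_det in edges.get(node, []):
            heapq.heappush(pq, (cost - math.log(1.0 - p_det), nxt, path + [nxt]))
    return None

prob, path = weakest_path("gate", "vault")
```

With these invented numbers the route through the fence is "weaker" (detection probability 0.9664) than the direct route (0.972), so it is the path a security analysis would flag first.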

  1. Magnetic systems for wide-aperture neutron polarizers and analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Gilev, A.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Pleshanov, N.K., E-mail: pnk@pnpi.spb.ru [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Bazarov, B.A.; Bulkin, A.P.; Schebetov, A.F. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Syromyatnikov, V.G. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation); Physical Department, St. Petersburg State University, Ulyanovskaya, 1, Petrodvorets, St. Petersburg 198504 (Russian Federation); Tarnavich, V.V.; Ulyanov, V.A. [Neutron Research Department, Petersburg Nuclear Physics Institute, NRC “Kurchatov Institute”, Orlova Roscha, Gatchina, St. Petersburg 188300 (Russian Federation)

    2016-10-11

    Requirements on the field uniformity in neutron polarizers are analyzed in view of the fact that neutron polarizing coatings have been improved during the past decade. The design of magnetic systems that meet new requirements is optimized by numerical simulations. Magnetic systems for wide-aperture multichannel polarizers and analyzers are represented, including (a) the polarizer to be built at channel 4-4′ of the reactor PIK (Gatchina, Russia) for high-flux experiments with a 100×150 mm{sup 2} beam of polarized cold neutrons; (b) the fan analyzer covering a 150×100 mm{sup 2} window of the detector at the Magnetism Reflectometer (SNS, ORNL, USA); (c) the polarizer and (d) the fan analyzer covering a 220×110 mm{sup 2} window of the detector at the reflectometer NERO, which is transferred to PNPI (Russia) from HZG (Germany). Deviations of the field from the vertical did not exceed 2°. The polarizing efficiency of the analyzer at the Magnetism Reflectometer reached 99%, a record level for wide-aperture supermirror analyzers.

  2. Developing an Approach for Analyzing and Verifying System Communication

    Science.gov (United States)

    Stratton, William C.; Lindvall, Mikael; Ackermann, Chris; Sibol, Deane E.; Godfrey, Sally

    2009-01-01

This slide presentation reviews a project for developing an approach for analyzing and verifying intersystem communications. The motivation for the study was that software systems in the aerospace domain are inherently complex and operate under tight resource constraints, so systems of systems must communicate with each other to fulfill their tasks. Such systems of systems require reliable communications. The technical approach was to develop a system, DynSAVE, that detects communication problems among the systems. The project enhanced the proven Software Architecture Visualization and Evaluation (SAVE) tool to create Dynamic SAVE (DynSAVE). The approach monitors and records low-level network traffic, converts the low-level traffic into meaningful messages, and displays the messages in a way that allows issues to be detected.

  3. Evaluation of system codes for analyzing naturally circulating gas loop

    International Nuclear Information System (INIS)

    Lee, Jeong Ik; No, Hee Cheon; Hejzlar, Pavel

    2009-01-01

Steady-state natural circulation data obtained in a 7 m-tall experimental loop with carbon dioxide and nitrogen are presented in this paper. The loop was originally designed to encompass the operating range of a prototype gas-cooled fast reactor passive decay heat removal system, but the results and conclusions are applicable to any natural circulation loop operating in regimes having buoyancy and acceleration parameters within the ranges validated in this loop. Natural circulation steady-state data are compared to numerical predictions by two system analysis codes: GAMMA and RELAP5-3D. GAMMA is a computational tool for predicting various transients which can potentially occur in a gas-cooled reactor. The code is capable of analyzing multi-dimensional, multi-component mixtures and includes models for friction, heat transfer, chemical reaction, and multi-component molecular diffusion. Natural circulation data with two gases show that the loop operates in the deteriorated turbulent heat transfer (DTHT) regime, which exhibits substantially reduced heat transfer coefficients compared to forced turbulent flow. The GAMMA code with its original heat transfer package predicted conservative results in terms of peak wall temperature. However, the estimated peak location did not successfully match the data. Even though GAMMA's original heat transfer package included the mixed-convection regime, which is a part of the DTHT regime, the results showed that the original heat transfer package could not reproduce the data with sufficient accuracy. After implementing a recently developed correlation and corresponding heat transfer regime map into GAMMA to cover the whole range of the DTHT regime, we obtained better agreement with the data. RELAP5-3D results are discussed in parallel.

  4. A computerised EEG-analyzing system for small laboratory animals

    NARCIS (Netherlands)

    Kropveld, D.; Chamuleau, R. A.; Popken, R. J.; Smith, J.

    1983-01-01

    The experimental setup, including instrumentation and software packaging, is described for the use of a minicomputer as an on-line analyzing system of the EEG in rats. Complete fast Fourier transformation of the EEG sampled in 15 episodes of 10 s each is plotted out within 7 min after the start of
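
The core computation this record describes — a fast Fourier transform of an EEG sampled in short episodes — can be sketched for a single episode (the sampling rate, the 8 Hz "alpha" test signal, and the noise level are synthetic stand-ins; the actual system processed 15 episodes of 10 s each):

```python
import numpy as np

fs = 128                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)               # one 10-s episode
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 8 * t) + 0.1 * rng.normal(size=t.size)  # synthetic EEG

spectrum = np.abs(np.fft.rfft(eeg)) ** 2   # power spectrum of the episode
freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]  # dominant rhythm, skipping DC
```

Repeating this per episode and plotting the 15 spectra reproduces, in outline, the kind of on-line output the minicomputer system delivered.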

  5. Development of the static analyzer ANALYSIS/EX for FORTRAN programs

    International Nuclear Information System (INIS)

    Osanai, Seiji; Yokokawa, Mitsuo

    1993-08-01

The static analyzer 'ANALYSIS' is a software tool for statically analyzing the tree structure and COMMON regions of a FORTRAN program. With the installation of the new FORTRAN compiler, FORTRAN77EX(V12), on the computer system at JAERI, a new version of ANALYSIS, 'ANALYSIS/EX', has been developed to enhance its analysis functions. In addition to the conventional functions of ANALYSIS, ANALYSIS/EX is capable of analyzing FORTRAN programs written in the FORTRAN77EX(V12) language grammar, such as large-scale nuclear codes. The COMMON-region analysis function is also improved so as to obtain the relations between variables in COMMON regions in more detail. In this report, the results of these improvements and the enhanced functions of the static analyzer ANALYSIS/EX are presented. (author)

  6. Analyzing Software Requirements Errors in Safety-Critical, Embedded Systems

    Science.gov (United States)

    Lutz, Robyn R.

    1993-01-01

    This paper analyzes the root causes of safety-related software errors in safety-critical, embedded systems. The results show that software errors identified as potentially hazardous to the system tend to be produced by different error mechanisms than non- safety-related software errors. Safety-related software errors are shown to arise most commonly from (1) discrepancies between the documented requirements specifications and the requirements needed for correct functioning of the system and (2) misunderstandings of the software's interface with the rest of the system. The paper uses these results to identify methods by which requirements errors can be prevented. The goal is to reduce safety-related software errors and to enhance the safety of complex, embedded systems.

  7. A Fuzzy Logic System to Analyze a Student's Lifestyle

    OpenAIRE

    Ghosh, Sourish; Boob, Aaditya Sanjay; Nikhil, Nishant; Vysyaraju, Nayan Raju; Kumar, Ankit

    2016-01-01

A college student's life can be primarily categorized into domains such as education, health, social and other activities, which may include daily chores and travelling time. Time management is crucial for every student. A self-realisation of one's daily time expenditure in various domains is therefore essential to maximize one's effective output. This paper presents how a mobile application using Fuzzy Logic and the Global Positioning System (GPS) analyzes a student's lifestyle and provides recom...

  8. Systems and methods for modeling and analyzing networks

    Science.gov (United States)

    Hill, Colin C; Church, Bruce W; McDonagh, Paul D; Khalil, Iya G; Neyarapally, Thomas A; Pitluk, Zachary W

    2013-10-29

The systems and methods described herein utilize a probabilistic modeling framework for reverse engineering an ensemble of causal models from data, and then forward simulating the ensemble of models to analyze and predict the behavior of the network. In certain embodiments, the systems and methods described herein include data-driven techniques for developing causal models for biological networks. Causal network models include computational representations of the causal relationships between independent variables, such as a compound of interest, and dependent variables, such as measured DNA alterations and changes in mRNA, protein, and metabolites, to phenotypic readouts of efficacy and toxicity.

  9. A new grading system for analyzing pediatric cholesteatoma

    International Nuclear Information System (INIS)

    Kodama, Akira; Ashimori, Naoki; Tsurita, Minako; Ban, Akihiro

    2007-01-01

We developed a new grading system to understand the complicated pathological changes of cholesteatoma in comparison to those of chronic otitis media. This grading system, based on the extent of the cholesteatoma and the surrounding pathological changes, is able to simply express the entire pathological condition of the ear with cholesteatoma. Using this grading system, we analyzed the ears of 48 children with cholesteatoma who underwent tympanoplasty over the past ten years. Their ages ranged from 2 to 15 years, with an average age of 8.5 years. The attic and mesotympanic cholesteatomas were associated with greater pathological changes than those observed in postero-superior quadrant cholesteatomas. The degree of the pathological change in the area surrounding the cholesteatoma appears to correlate with the degree of extension of the cholesteatoma. This system is thus considered to be useful for evaluating the improvement in the pathological conditions before and after surgery in patients with cholesteatoma. (author)

  10. Total-System Analyzer for performance assessment of Yucca Mountain

    International Nuclear Information System (INIS)

    Wilson, M.L.; Lauffer, F.C.; Cummings, J.C.; Zieman, N.B.

    1990-01-01

    The Total-System Analyzer is a modular computer program for probabilistic total-system performance calculations. The code employs stratified random sampling from model parameter distribution functions to generate multiple realizations of the system. The results of flow and transport calculations for each realization are combined into a probability distribution function of the system response as indicated by the performance measure. We give a detailed description of the code and present results for four example problems simulating the release of radionuclides from a proposed high-level-waste repository at Yucca Mountain, Nevada. The example simulations illustrate the impact of significant variation of percolation flux and sorption on radionuclide releases. We discuss the effects of numerical sampling error and of correlations among the model parameters. 20 refs., 7 figs., 2 tabs
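
The stratified sampling scheme this record describes draws one point from each of n equal-probability bins of a parameter distribution, runs the model on each realization, and assembles the responses into a probability distribution. A minimal sketch (the exponential flux distribution, its mean, and the placeholder "release" model are invented for illustration, not taken from the Yucca Mountain analysis):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100
# One sample per stratum: a uniform point inside each of n equal-probability
# bins, mapped through the inverse CDF of the parameter distribution
# (here an exponential percolation flux with an assumed mean of 2 mm/yr).
u = (np.arange(n) + rng.uniform(size=n)) / n
flux = -2.0 * np.log(1.0 - u)             # inverse-CDF transform

release = 0.5 * flux                      # invented placeholder "model"
ccdf_x = np.sort(release)                 # sorted response values
ccdf_p = 1.0 - np.arange(1, n + 1) / n    # exceedance probability of each
```

The pairs `(ccdf_x, ccdf_p)` form the empirical complementary CDF of the performance measure, the kind of curve total-system performance assessments report; stratification makes the sample mean converge much faster than plain random sampling.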

  11. Analyzing availability using transfer function models and cross spectral analysis

    International Nuclear Information System (INIS)

    Singpurwalla, N.D.

    1980-01-01

    The paper shows how the methods of multivariate time series analysis can be used in a novel way to investigate the interrelationships between a series of operating (running) times and a series of maintenance (down) times of a complex system. Specifically, the techniques of cross spectral analysis are used to help obtain a Box-Jenkins type transfer function model for the running times and the down times of a nuclear reactor. A knowledge of the interrelationships between the running times and the down times is useful for an evaluation of maintenance policies, for replacement policy decisions, and for evaluating the availability and the readiness of complex systems
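
A first step toward the kind of transfer function model this record describes is to look for lead/lag structure between the two series via cross-correlation. A sketch on synthetic data (the lag of 3, the coefficients, and both series are invented; the paper works with actual reactor running and down times):

```python
import numpy as np

rng = np.random.default_rng(2)
run = rng.normal(size=200)                                 # "running time" series
down = 0.8 * np.roll(run, 3) + 0.2 * rng.normal(size=200)  # lags run by 3 steps

lags = np.arange(-10, 11)
xcorr = [np.corrcoef(run, np.roll(down, -k))[0, 1] for k in lags]
best_lag = int(lags[np.argmax(xcorr)])     # lag with the strongest association
```

The lag at which the cross-correlation peaks suggests the delay term of a Box-Jenkins transfer function; cross-spectral analysis refines this picture frequency by frequency.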

  12. Protocol Analysis as a Method for Analyzing Conversational Data.

    Science.gov (United States)

    Aleman, Carlos G.; Vangelisti, Anita L.

    Protocol analysis, a technique that uses people's verbal reports about their cognitions as they engage in an assigned task, has been used in a number of applications to provide insight into how people mentally plan, assess, and carry out those assignments. Using a system of networked computers where actors communicate with each other over…

  13. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. 
Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data
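
The internode radiation interchange data TRASYS produces are what R-C network thermal analyzers consume; the simplest instance of such an interchange calculation is the net flux between two large parallel gray plates. A sketch with assumed emissivities and node temperatures (not values from any TRASYS model):

```python
# Net radiative interchange between two large parallel gray plates, the
# simplest case of the internode interchange data TRASYS computes.
sigma = 5.670374419e-8        # Stefan-Boltzmann constant, W/(m^2 K^4)
eps1, eps2 = 0.8, 0.9         # surface emissivities (assumed)
T1, T2 = 350.0, 300.0         # node temperatures in K (assumed)

# Gray-body network for parallel plates:
# q = sigma * (T1^4 - T2^4) / (1/eps1 + 1/eps2 - 1)
q = sigma * (T1**4 - T2**4) / (1 / eps1 + 1 / eps2 - 1)   # W/m^2
```

For general spacecraft geometries the view factors replace the parallel-plate assumption, which is exactly the form-factor data listed among the TRASYS inputs above.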

  14. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (DEC VAX VERSION WITHOUT NASADIG)

    Science.gov (United States)

    Vogt, R. A.

    1994-01-01

    The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. 
Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data

  15. TRASYS - THERMAL RADIATION ANALYZER SYSTEM (CRAY VERSION WITH NASADIG)

    Science.gov (United States)

    Anderson, G. E.

    1994-01-01

    The Thermal Radiation Analyzer System, TRASYS, is a computer software system with generalized capability to solve the radiation related aspects of thermal analysis problems. TRASYS computes the total thermal radiation environment for a spacecraft in orbit. The software calculates internode radiation interchange data as well as incident and absorbed heat rate data originating from environmental radiant heat sources. TRASYS provides data of both types in a format directly usable by such thermal analyzer programs as SINDA/FLUINT (available from COSMIC, program number MSC-21528). One primary feature of TRASYS is that it allows users to write their own driver programs to organize and direct the preprocessor and processor library routines in solving specific thermal radiation problems. The preprocessor first reads and converts the user's geometry input data into the form used by the processor library routines. Then, the preprocessor accepts the user's driving logic, written in the TRASYS modified FORTRAN language. In many cases, the user has a choice of routines to solve a given problem. Users may also provide their own routines where desirable. In particular, the user may write output routines to provide for an interface between TRASYS and any thermal analyzer program using the R-C network concept. Input to the TRASYS program consists of Options and Edit data, Model data, and Logic Flow and Operations data. Options and Edit data provide for basic program control and user edit capability. The Model data describe the problem in terms of geometry and other properties. This information includes surface geometry data, documentation data, nodal data, block coordinate system data, form factor data, and flux data. Logic Flow and Operations data house the user's driver logic, including the sequence of subroutine calls and the subroutine library. 
Output from TRASYS consists of two basic types of data: internode radiation interchange data, and incident and absorbed heat rate data

  16. Research and analyze of physical health using multiple regression analysis

    Directory of Open Access Journals (Sweden)

    T. S. Kyi

    2014-01-01

This paper presents research that attempts to create a mathematical model of the "healthy person" using regression analysis. The factors are the physical parameters of the person (such as heart rate, lung capacity, blood pressure, breath holding, weight-height coefficient, flexibility of the spine, muscles of the shoulder belt, abdominal muscles, squatting, etc.), and the response variable is an indicator of physical working capacity. Multiple regression analysis yielded useful models that can predict the physical performance of boys aged fourteen to seventeen. The paper presents the development of the regression model for sixteen-year-old boys and analyzes the results.
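
The fitting step this record describes can be sketched with ordinary least squares (the two predictors, the "true" coefficients, and the sample are synthetic stand-ins, not the paper's data, which used many more physical parameters):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50
heart_rate = rng.normal(70, 8, n)          # invented predictor values
lung_cap = rng.normal(3.5, 0.5, n)
# Synthetic "working capacity" with assumed true coefficients -0.4 and 10:
capacity = 120 - 0.4 * heart_rate + 10 * lung_cap + rng.normal(0, 1, n)

X = np.column_stack([np.ones(n), heart_rate, lung_cap])   # design matrix
beta, *_ = np.linalg.lstsq(X, capacity, rcond=None)       # OLS coefficients
```

The recovered slopes approximate the generating coefficients; with real measurements, the same fit yields a predictive model of working capacity from the physical parameters.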

  17. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    Science.gov (United States)

    Majidi, Keivan; Li, Jun; Muehleman, Carol; Brankov, Jovan G.

    2014-04-01

The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many of the modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions like the source intensity (flux), measurement angular positions, object properties, and the estimation method. In this paper, we use the Cramér-Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, different analyzer-crystal angular positions and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance that one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is directly proportional to the source intensity and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform) are required to get the best parametric images. Further angular measurements only spread the total dose across the measurements without improving or worsening the CRLB, although the added measurements may improve parametric images by reducing estimation bias. Next, using CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging and scatter diffraction enhanced imaging estimation techniques
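
The paper's central quantity, the CRLB, is the inverse of the Fisher information. For Poisson-distributed counts with mean mu(theta), the information matrix is I_jk = sum_i (d mu_i / d theta_j)(d mu_i / d theta_k) / mu_i. A sketch with an assumed Gaussian rocking curve parameterized by transmission and refraction angle (the functional form and all numbers are illustrative, not taken from the paper):

```python
import numpy as np

width = 1.0                                # rocking-curve width (assumed)
T, dtheta, N0 = 0.8, 0.1, 1e4              # transmission, refraction, photons

x = np.linspace(-2, 2, 11)                 # 11 analyzer-crystal positions
mu = N0 * T * np.exp(-0.5 * ((x - dtheta) / width) ** 2)   # mean counts

# Partial derivatives of the mean counts w.r.t. (T, dtheta):
dmu_dT = mu / T
dmu_dd = mu * (x - dtheta) / width**2
# Fisher information for Poisson data: I_jk = sum(dmu_j * dmu_k / mu)
J = np.array([[np.sum(a * b / mu) for b in (dmu_dT, dmu_dd)]
              for a in (dmu_dT, dmu_dd)])
crlb = np.linalg.inv(J)                    # minimum covariance of (T, dtheta)
```

The diagonal of `crlb` lower-bounds the variance of any unbiased estimator of the two parameters; repeating the computation for different sets of analyzer positions is how one compares uniform versus optimized angular sampling, as the paper does.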

  18. Noise and analyzer-crystal angular position analysis for analyzer-based phase-contrast imaging

    International Nuclear Information System (INIS)

    Majidi, Keivan; Brankov, Jovan G; Li, Jun; Muehleman, Carol

    2014-01-01

    The analyzer-based phase-contrast x-ray imaging (ABI) method is emerging as a potential alternative to conventional radiography. Like many modern imaging techniques, ABI is a computed imaging method (meaning that images are calculated from raw data). ABI can simultaneously generate a number of planar parametric images containing information about the absorption, refraction, and scattering properties of an object. These images are estimated from raw data acquired by measuring (sampling) the angular intensity profile of the x-ray beam passed through the object at different angular positions of the analyzer crystal. The noise in the estimated ABI parametric images depends upon imaging conditions such as the source intensity (flux), the measurement angular positions, the object properties, and the estimation method. In this paper, we use the Cramér–Rao lower bound (CRLB) to quantify the noise properties in parametric images and to investigate the effect of source intensity, analyzer-crystal angular positions, and object properties on this bound, assuming a fixed radiation dose delivered to an object. The CRLB is the minimum bound for the variance of an unbiased estimator and defines the best noise performance one can obtain regardless of which estimation method is used to estimate ABI parametric images. The main result of this paper is that the variance (hence the noise) in parametric images is inversely proportional to the source intensity, and only a limited number of analyzer-crystal angular measurements (eleven for uniform and three for optimal non-uniform sampling) are required to obtain the best parametric images. Additional angular measurements beyond these only spread the total dose across the measurements without improving or worsening the CRLB, although the added measurements may improve parametric images by reducing estimation bias. Next, using the CRLB we evaluate the multiple-image radiography, diffraction enhanced imaging, and scatter diffraction enhanced imaging estimation techniques
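As a rough illustration of how such a bound is computed (not the paper's actual model): assuming independent Poisson counts and a Gaussian rocking curve, the CRLB is the diagonal of the inverse Fisher information matrix. The curve width, flux, and angular grid below are invented for illustration.

```python
import numpy as np

def fisher_poisson(model, params, thetas):
    """Fisher information matrix for independent Poisson counts with means
    lam_k = model(theta_k, params): F_ij = sum_k (dlam_k/dp_i)(dlam_k/dp_j)/lam_k."""
    p = np.asarray(params, dtype=float)
    lam = model(thetas, p)
    grads = []
    for i in range(len(p)):
        step = 1e-6 * max(1.0, abs(p[i]))      # central-difference step per parameter
        dp = np.zeros_like(p)
        dp[i] = step
        grads.append((model(thetas, p + dp) - model(thetas, p - dp)) / (2 * step))
    G = np.array(grads)                        # shape (n_params, n_angles)
    return (G / lam) @ G.T

def rocking_curve(theta, p):
    # Assumed Gaussian angular intensity profile, width 2 microradians (illustrative).
    amplitude, center = p
    return amplitude * np.exp(-(theta - center) ** 2 / (2 * 2.0 ** 2))

thetas = np.linspace(-5.0, 5.0, 11)   # eleven uniform angular positions (microradians)
F = fisher_poisson(rocking_curve, (1.0e4, 0.0), thetas)
crlb = np.diag(np.linalg.inv(F))      # variance lower bounds for (amplitude, center)
```

Doubling the flux doubles the Fisher information contributed by the center parameter, so its variance bound halves, consistent with the inverse dependence on source intensity.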

  19. Diagnosis of cystic fibrosis with chloride meter (Sherwood M926S chloride analyzer®) and sweat test analysis system (CFΔ collection system®) compared to the Gibson Cooke method.

    Science.gov (United States)

    Emiralioğlu, Nagehan; Özçelik, Uğur; Yalçın, Ebru; Doğru, Deniz; Kiper, Nural

    2016-01-01

    Sweat testing with the Gibson Cooke (GC) method is the diagnostic gold standard for cystic fibrosis (CF). Recently, alternative methods have been introduced to simplify both the collection and analysis of sweat samples. Our aim was to compare sweat chloride values obtained by the GC method with those from other sweat test methods, both in patients diagnosed with CF and in patients in whom CF had been ruled out. We wanted to determine whether the other sweat test methods could reliably identify patients with CF and differentiate them from healthy subjects. Chloride concentration was measured with the GC method, a chloride meter, and a sweat test analysis system; conductivity was also determined with the sweat test analysis system. Forty-eight patients with CF and 82 patients without CF underwent the sweat test, showing median sweat chloride values of 98.9 mEq/L with the GC method, 101 mmol/L with the chloride meter, and 87.8 mmol/L with the sweat test analysis system. In the non-CF group, median sweat chloride values were 16.8 mEq/L with the GC method, 10.5 mmol/L with the chloride meter, and 15.6 mmol/L with the sweat test analysis system. The median conductivity value was 107.3 mmol/L in the CF group and 32.1 mmol/L in the non-CF group. There was a statistically significant strong positive correlation between the GC method and the other sweat test methods (r=0.85) in all subjects. Sweat chloride concentration and conductivity measured by the other sweat test methods correlate highly with the GC method. We think that the other sweat test equipment can be used as reliably as the classic GC method to diagnose or exclude CF.
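The reported between-method agreement can be checked with an ordinary Pearson correlation. The paired readings below are made-up illustrative numbers, not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two paired measurement series."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

# Hypothetical paired sweat-chloride readings (mmol/L), illustrative only:
gc    = [98, 16, 101, 15, 95, 20]   # Gibson Cooke method
meter = [101, 11, 104, 12, 97, 18]  # chloride meter
r = pearson_r(gc, meter)
```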

  20. Governance of Aquatic Agricultural Systems: Analyzing Representation, Power, and Accountability

    Directory of Open Access Journals (Sweden)

    Blake D. Ratner

    2013-12-01

    Aquatic agricultural systems in developing countries face increasing competition from multiple stakeholders over rights to access and use natural resources, land, water, wetlands, and fisheries, essential to rural livelihoods. A key implication is the need to strengthen governance to enable equitable decision making amidst competition that spans sectors and scales, building capacities for resilience, and for transformations in institutions that perpetuate poverty. In this paper we provide a simple framework to analyze the governance context for aquatic agricultural system development focused on three dimensions: stakeholder representation, distribution of power, and mechanisms of accountability. Case studies from Cambodia, Bangladesh, Malawi/Mozambique, and Solomon Islands illustrate the application of these concepts to fisheries and aquaculture livelihoods in the broader context of intersectoral and cross-scale governance interactions. Comparing these cases, we demonstrate how assessing governance dimensions yields practical insights into opportunities for transforming the institutions that constrain resilience in local livelihoods.

  1. Full Life Cycle of Data Analysis with Climate Model Diagnostic Analyzer (CMDA)

    Science.gov (United States)

    Lee, S.; Zhai, C.; Pan, L.; Tang, B.; Zhang, J.; Bao, Q.; Malarout, N.

    2017-12-01

    We have developed a system that supports the full life cycle of a data analysis process: from data discovery, to data customization, to analysis, to reanalysis, to publication, and to reproduction. The system, called the Climate Model Diagnostic Analyzer (CMDA), is designed to demonstrate that the full life cycle of data analysis can be supported within one integrated system for climate model diagnostic evaluation with global observational and reanalysis datasets. CMDA has four subsystems that are highly integrated to support the analysis life cycle: the Data System manages datasets used by CMDA analysis tools; the Analysis System manages the CMDA analysis tools, which are all web services; the Provenance System manages the metadata of CMDA datasets and the provenance of CMDA analysis history; and the Recommendation System extracts knowledge from CMDA usage history and recommends datasets and analysis tools to users. These four subsystems are not only highly integrated but also easily expandable: new datasets can be added to the Data System and scanned to become visible to the other subsystems, and new analysis tools can be registered to become available in the Analysis System and Provenance System. With CMDA, a user can start a data analysis process by discovering datasets relevant to their research topic using the Recommendation System. Next, the user can customize the discovered datasets for their scientific use (e.g., anomaly calculation, regridding) with tools in the Analysis System, and then carry out their analysis with tools in the Analysis System (e.g., conditional sampling, time averaging, spatial averaging). The user can also reanalyze the datasets based on previously stored analysis provenance in the Provenance System. Further, they can publish their analysis process and results to the Provenance System to share with other users. Finally, any user can reproduce the published analysis process and results. By supporting the full life cycle of climate data analysis

  2. Systems and methods for analyzing liquids under vacuum

    Science.gov (United States)

    Yu, Xiao-Ying; Yang, Li; Cowin, James P.; Iedema, Martin J.; Zhu, Zihua

    2013-10-15

    Systems and methods for supporting a liquid against a vacuum pressure in a chamber can enable analysis of the liquid surface using vacuum-based chemical analysis instruments. No electrical or fluid connections are required to pass through the chamber walls. The systems can include a reservoir, a pump, and a liquid flow path. The reservoir contains a liquid-phase sample. The pump drives flow of the sample from the reservoir, through the liquid flow path, and back to the reservoir. The flow of the sample is not substantially driven by a differential between pressures inside and outside of the liquid flow path. An aperture in the liquid flow path exposes a stable portion of the liquid-phase sample to the vacuum pressure within the chamber. The radius, or size, of the aperture is less than or equal to a critical value required to support a meniscus of the liquid-phase sample by surface tension.
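The critical aperture size described above follows from the Young-Laplace relation: a meniscus of surface tension γ can hold off a pressure difference Δp across an aperture of radius up to r_c = 2γ/Δp. A minimal sketch with assumed values for water against one atmosphere:

```python
def critical_aperture_radius(surface_tension, pressure_diff):
    """Largest aperture radius (m) whose meniscus can support the pressure
    difference by surface tension: Young-Laplace, dp = 2*gamma/r => r_c = 2*gamma/dp."""
    return 2.0 * surface_tension / pressure_diff

# Assumed values: water surface tension at ~25 C, vacuum vs. one atmosphere.
gamma = 0.072     # N/m
dp = 101325.0     # Pa
r_c = critical_aperture_radius(gamma, dp)   # a micron-scale hole
```

This is why the aperture in such systems must be on the order of a micron: any larger and the meniscus collapses into the vacuum chamber.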

  3. Mathematical techniques for analyzing concurrent and probabilistic systems

    CERN Document Server

    Rutten, J J M M; Panangaden, Prakash; Panangaden, Prakash; Breugel, Franck van

    2004-01-01

    The book consists of two sets of lecture notes devoted to slightly different methods of analysis of concurrent and probabilistic computational systems. The first set of lectures develops a calculus of streams (a generalization of the set of natural numbers) based on the coinduction principle coming from the theory of coalgebras. It is now well understood that the interplay between algebra (for describing structure) and coalgebra (for describing dynamics) is crucial for understanding concurrent systems. There is a striking analogy between streams and formula calculus, reminiscent of those appearing in quantum calculus. These lecture notes will appeal to anyone working in concurrency theory but also to algebraists and logicians. The other set of lecture notes focuses on methods for automatically verifying probabilistic systems using techniques of model checking. The unique aspect of these lectures is the coverage of both theory and practice. The authors have been responsible for one of the most successful experi...

  4. EPRI compact analyzer: A compact, interactive and color-graphics based simulator for power plant analysis

    International Nuclear Information System (INIS)

    Ipakchi, A.; Khadem, M.; Chen, H.; Colley, R.W.

    1986-01-01

    This paper presents the results of an EPRI-sponsored project (RP2395-2) for the design and development of an interactive, color-graphics-based simulator for power plant analysis. The system, called the Compact Analyzer, can be applied to engineering and training applications in the utility industry. The Compact Analyzer's software and system design are described. Results of two demonstration systems, for a nuclear plant and a fossil plant, are presented, and the application of the Compact Analyzer to operating-procedure evaluation is discussed.

  5. Analyzing Cyber Security Threats on Cyber-Physical Systems Using Model-Based Systems Engineering

    Science.gov (United States)

    Kerzhner, Aleksandr; Pomerantz, Marc; Tan, Kymie; Campuzano, Brian; Dinkel, Kevin; Pecharich, Jeremy; Nguyen, Viet; Steele, Robert; Johnson, Bryan

    2015-01-01

    The spectre of cyber attacks on aerospace systems can no longer be ignored given that many of the components and vulnerabilities that have been successfully exploited by the adversary on other infrastructures are the same as those deployed and used within the aerospace environment. An important consideration with respect to the mission/safety critical infrastructure supporting space operations is that an appropriate defensive response to an attack invariably involves the need for high precision and accuracy, because an incorrect response can trigger unacceptable losses involving lives and/or significant financial damage. A highly precise defensive response, considering the typical complexity of aerospace environments, requires a detailed and well-founded understanding of the underlying system where the goal of the defensive response is to preserve critical mission objectives in the presence of adversarial activity. In this paper, a structured approach for modeling aerospace systems is described. The approach includes physical elements, network topology, software applications, system functions, and usage scenarios. We leverage Model-Based Systems Engineering methodology by utilizing the Object Management Group's Systems Modeling Language to represent the system being analyzed and also utilize model transformations to change relevant aspects of the model into specialized analyses. A novel visualization approach is utilized to visualize the entire model as a three-dimensional graph, allowing easier interaction with subject matter experts. The model provides a unifying structure for analyzing the impact of a particular attack or a particular type of attack. Two different example analysis types are demonstrated in this paper: a graph-based propagation analysis based on edge labels, and a graph-based propagation analysis based on node labels.
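A minimal sketch of a graph-based propagation analysis over edge labels, in the spirit described above (the system model, node names, and label names are invented for illustration): starting from a compromised node, walk only edges whose label the attack can traverse, and collect the potentially impacted nodes.

```python
from collections import deque

def propagate(edges, start, traversable):
    """Edge-label propagation: from `start`, follow only edges whose label is in
    `traversable`; return the set of reachable (potentially impacted) nodes."""
    adj = {}
    for u, v, label in edges:
        adj.setdefault(u, []).append((v, label))
    seen, queue = {start}, deque([start])
    while queue:
        u = queue.popleft()
        for v, label in adj.get(u, []):
            if label in traversable and v not in seen:
                seen.add(v)
                queue.append(v)
    return seen

# Hypothetical system model: (source, target, edge label).
edges = [("radio", "cmd_handler", "network"),
         ("cmd_handler", "flight_sw", "software"),
         ("flight_sw", "thruster", "hardware"),
         ("ground", "radio", "network")]
impacted = propagate(edges, "radio", {"network", "software"})
```

Here the attack reaches the flight software but stops at the hardware edge, since "hardware" is not in the traversable label set.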

  6. Oxytetracycline analysis in honey using a specific portable analyzer

    Science.gov (United States)

    Chen, Guoying; Schwartz, Daniel; Braden, S.; Nunez, Alberto

    2007-09-01

    Oxytetracycline (OTC) residue in honey is detected using a portable analyzer designed to specifically target tetracycline (TC) drugs based on europium-sensitized luminescence (ESL). A 385 nm light-emitting diode (LED) is used as the excitation source and a photomultiplier tube as the light detector. OTC is extracted from honey and cleaned up by solid-phase extraction (SPE) using Strata X-WC weak cation exchange cartridges. Eu(III) is added to the eluate to form a Eu-TC chelate at pH 8.5. Efficient intrachelate energy transfer allows sensitive OTC detection at λex=385 nm and λem=610 nm. After a 25-µs time delay, the ESL signal is integrated over a 25-1000 µs interval. The signal intensity shows a linear relationship (R²=0.972) with OTC concentration in the 10-200 ng/g range. The limit of detection is 6.7 ng/g with an average 5.8% relative standard deviation. The background signal corresponds to ~10 ppb. This combination of instrumentation and method enables field analysis that is especially useful for the beekeeping industry.
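The linear calibration and 3σ detection limit described above can be sketched as follows; the calibration points and blank standard deviation are invented for illustration, not the paper's data.

```python
import numpy as np

def calibration(conc, signal):
    """Ordinary least-squares calibration line: signal = slope*conc + intercept."""
    slope, intercept = np.polyfit(conc, signal, 1)
    return slope, intercept

# Hypothetical ESL calibration points over the 10-200 ng/g range (made-up numbers):
conc   = np.array([10, 25, 50, 100, 150, 200], float)        # ng/g OTC
signal = np.array([120, 300, 610, 1190, 1820, 2390], float)  # integrated counts
slope, intercept = calibration(conc, signal)

sd_blank = 25.0                 # assumed standard deviation of blank measurements
lod = 3 * sd_blank / slope      # classic 3-sigma limit of detection, ng/g
```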

  7. The expanding universe of mass analyzer configurations for biological analysis.

    Science.gov (United States)

    Calvete, Juan J

    2014-01-01

    Mass spectrometry (MS) is an analytical technique that measures the mass-to-charge ratio of electrically charged gas-phase particles. All mass spectrometers combine ion formation, mass analysis, and ion detection. Although mass analyzers can be regarded as sophisticated devices that manipulate ions in space and time, the rich diversity of possible ways to combine ion separation, focusing, and detection in dynamic mass spectrometers accounts for the large number of instrument designs. A historical perspective on the progress in mass spectrometry that, from 1965 until today, has contributed to positioning this technique as an indispensable tool for biological research has recently been given by a privileged witness of this golden age of MS (Gelpí J. Mass Spectrom 43:419-435, 2008; Gelpí J. Mass Spectrom 44:1137-1161, 2008). The aim of this chapter is to highlight the view that the operational principles of mass spectrometry can be understood in a simple mathematical language, and that an understanding of the basic concepts of mass spectrometry is necessary to get the most out of this versatile technique.
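As one worked example of the simple mathematics behind an analyzer (the chapter surveys many designs; this is only the linear time-of-flight case): an ion of charge z accelerated through voltage V acquires kinetic energy zeV, so its flight time over a drift length L is t = L·sqrt(m/(2zeV)), i.e. proportional to sqrt(m/z). The drift length and voltage below are assumed, illustrative values.

```python
import math

E = 1.602176634e-19       # elementary charge, C
AMU = 1.66053906660e-27   # atomic mass unit, kg

def tof_flight_time(mz, length=1.0, voltage=20000.0):
    """Linear time-of-flight: kinetic energy z*e*V gives t = L*sqrt(m/(2*z*e*V)).
    `mz` is the mass-to-charge ratio in Da per charge; length (m), voltage (V)
    are assumed instrument parameters."""
    m_over_q = mz * AMU / E              # kg per coulomb
    return length * math.sqrt(m_over_q / (2.0 * voltage))

# Two ions differing 4x in m/z arrive at times differing 2x (sqrt scaling):
t_1000 = tof_flight_time(1000.0)
t_4000 = tof_flight_time(4000.0)
```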

  8. The Tobacco Use Management System: Analyzing Tobacco Control From a Systems Perspective

    Science.gov (United States)

    Young, David; Coghill, Ken; Zhang, Jian Ying

    2010-01-01

    We use systems thinking to develop a strategic framework for analyzing the tobacco problem and we suggest solutions. Humans are vulnerable to nicotine addiction, and the most marketable form of nicotine delivery is the most harmful. A tobacco use management system has evolved out of governments’ attempts to regulate tobacco marketing and use and to support services that provide information about tobacco's harms and discourage its use. Our analysis identified 5 systemic problems that constrain progress toward the elimination of tobacco-related harm. We argue that this goal would be more readily achieved if the regulatory subsystem had dynamic power to regulate tobacco products and the tobacco industry as well as a responsive process for resourcing tobacco use control activities. PMID:20466970

  9. Quality Performance of Drugs Analyzed in the Drug Analysis and ...

    African Journals Online (AJOL)

    ICT TEAM

    performance of drug samples analyzed therein. Previous reports have ... wholesalers, non-governmental organizations, hospitals, analytical ... a dispute concerning discharge of waste water ... Healthcare Industry in Kenya, December 2008.

  10. A model system for analyzing the interrelations between the energy sector and the whole economy

    International Nuclear Information System (INIS)

    Strubegger, M.; Messner, S.

    1989-01-01

    This paper introduces an instrument for the analysis of interactions between economy, private consumption and the energy sector. It is realized as an integrated model system that helps to analyze scenarios concerning future developments of the energy system and the economy for their internal consistency. This model system could provide a methodological basis for rationalizing the present debates on the economic consequences of different energy supply strategies. (author). 24 refs, 2 figs, 7 tabs

  11. An Activity Theory Approach to Analyze Barriers to a Virtual Management Information Systems (MIS) Curriculum

    Science.gov (United States)

    Jaradat, Suhair; Qablan, Ahmad; Barham, Areej

    2011-01-01

    This paper explains how activity theory is used as a framework to analyze the barriers to a virtual Management Information Systems (MIS) curriculum in Jordanian schools, from both the sociocultural and pedagogical perspectives. Taking the activity system as a unit of analysis, this study documents the processes by which activities shape and are…

  12. Analysis of detection performance of multi band laser beam analyzer

    Science.gov (United States)

    Du, Baolin; Chen, Xiaomei; Hu, Leili

    2017-10-01

    Compared with microwave radar, laser radar has higher resolution, stronger anti-interference capability, and better concealment, so it has become a focus of engineering applications of laser technology. A large-scale laser radar cross section (LRCS) measurement system is designed and experimentally tested. First, the boundary conditions are measured and the long-range laser echo power is estimated according to the actual requirements. The estimation results show that the echo power is greater than the detector's response power. Second, a large-scale LRCS measurement system is designed according to the demonstration and estimation. The system mainly consists of laser shaping, a beam emitting device, a laser echo receiving device, and an integrated control device. Finally, using the designed LRCS measurement system, the scattering cross section of the target is simulated and tested. The simulation results essentially agree with the test results, demonstrating the correctness of the system.
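A back-of-the-envelope echo-power estimate of the kind the abstract describes can be sketched with a simplified monostatic laser-radar equation; the isotropic-scattering assumption and every numeric value below are illustrative, not the system's actual parameters.

```python
import math

def echo_power(p_t, beam_div, sigma, rx_area, rng, eta=1.0):
    """Simplified monostatic laser-radar range equation (assumed form):
    irradiance at the target = p_t / spot area of the diverging beam; the target
    of cross-section `sigma` (m^2) is assumed to scatter isotropically over
    4*pi sr; a receiver of area `rx_area` (m^2) at range `rng` (m) collects its
    share. Optical efficiencies are lumped into `eta`."""
    spot_area = math.pi * (beam_div * rng / 2.0) ** 2   # beam footprint, m^2
    irradiance = p_t / spot_area                        # W/m^2 on target
    scattered = irradiance * sigma                      # W re-radiated
    return eta * scattered * rx_area / (4.0 * math.pi * rng ** 2)

# Illustrative numbers only: 10 W laser, 1 mrad divergence, 0.5 m^2 LRCS,
# 0.05 m^2 receiver aperture, 2 km range -> nanowatt-scale echo.
p_r = echo_power(p_t=10.0, beam_div=1e-3, sigma=0.5, rx_area=0.05, rng=2000.0)
```

Note the characteristic 1/R⁴ dependence: doubling the range cuts the echo power by a factor of 16, which is why the detector's response power sets the maximum measurement range.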

  13. Analyzing coastal environments by means of functional data analysis

    Science.gov (United States)

    Sierra, Carlos; Flor-Blanco, Germán; Ordoñez, Celestino; Flor, Germán; Gallego, José R.

    2017-07-01

    Here we used Functional Data Analysis (FDA) to examine particle-size distributions (PSDs) in a beach/shallow marine sedimentary environment in Gijón Bay (NW Spain). The work involved both Functional Principal Components Analysis (FPCA) and Functional Cluster Analysis (FCA). The grain size of the sand samples was characterized by means of laser dispersion spectroscopy. Within this framework, FPCA was used as a dimension-reduction technique to explore and uncover patterns in grain-size frequency curves. This procedure proved useful for describing variability in the structure of the data set. Moreover, an alternative approach, FCA, was applied to identify clusters and to interpret their spatial distribution. Results obtained with this latter technique were compared with those obtained by means of two vector approaches that combine PCA with CA (Cluster Analysis). The first method, based on the probability density function (PDF), was employed after fitting a log-normal distribution to each PSD and summarizing each density function by its mean, sorting, skewness, and kurtosis. The second applied a centered log-ratio (clr) transform to the original data. PCA was then applied to the transformed data, and finally CA to the retained principal component scores. The study revealed functional data analysis, specifically FPCA and FCA, to be a suitable alternative with considerable advantages over traditional vector analysis techniques in sedimentary geology studies.
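On a dense common grid, FPCA reduces in practice to ordinary PCA of the discretized curves. The sketch below builds synthetic PSD-like curves (not the Gijón Bay data) and extracts principal components of variation via an SVD of the centered curve matrix.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "particle-size distribution" curves on a common size grid; a shared
# Gaussian-shaped base curve with random amplitude plus pointwise noise.
grid = np.linspace(0, 1, 50)
base = np.exp(-(grid - 0.4) ** 2 / 0.02)
curves = np.array([base * (1 + 0.2 * rng.standard_normal())
                   + 0.05 * rng.standard_normal(grid.size)
                   for _ in range(30)])

# FPCA on a dense grid ~ PCA of the discretized curves:
centered = curves - curves.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
explained = S ** 2 / np.sum(S ** 2)   # variance fraction per component
scores = U * S                        # per-sample component scores (for clustering)
```

The rows of `Vt` are the discretized principal component functions; the `scores` are what a subsequent cluster analysis (the FCA step) would operate on.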

  14. Analyzing the security of an existing computer system

    Science.gov (United States)

    Bishop, M.

    1986-01-01

    Most work concerning secure computer systems has dealt with the design, verification, and implementation of provably secure computer systems, or has explored ways of making existing computer systems more secure. The problem of locating security holes in existing systems has received considerably less attention; methods generally rely on thought experiments as a critical step in the procedure. The difficulty is that such experiments require that a large amount of information be available in a format that makes correlating the details of various programs straightforward. This paper describes a method of providing such a basis for the thought experiment by writing a special manual for parts of the operating system, system programs, and library subroutines.

  15. Analyzing traffic layout using dynamic social network analysis.

    Science.gov (United States)

    2014-07-12

    It is essential to build, maintain, and use our transportation systems in a manner that meets our current needs while addressing the social and economic needs of future generations. In today's world, transportation congestion causes serious neg...

  16. Coupling detrended fluctuation analysis for analyzing coupled nonstationary signals

    Science.gov (United States)

    Hedayatifar, L.; Vahabi, M.; Jafari, G. R.

    2011-08-01

    When many variables are coupled to each other, a single case study cannot give us thorough and precise information. When these time series are stationary, different methods of random matrix analysis and complex networks can be used. But in nonstationary cases, the multifractal detrended cross-correlation analysis (MF-DXA) method was introduced for just two coupled time series. In this article, we have extended MF-DXA to the method of coupling detrended fluctuation analysis (CDFA) for the case when more than two series are correlated with each other. Here, we have calculated the multifractal properties of the coupled time series, and by comparing the CDFA results of the original series with those of the shuffled and surrogate series, we can estimate the source of multifractality and the extent to which our series are coupled to each other. We illustrate the method with selected examples from air pollution and foreign exchange rates.
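The detrending step shared by DFA, MF-DXA, and CDFA can be sketched with plain single-series DFA-1 (the simplest, monofractal member of this family; the multi-series extensions follow the same pattern). White noise serves as a sanity check, since its scaling exponent F(s) ~ s^alpha should come out near 0.5.

```python
import numpy as np

def dfa(x, scales):
    """Detrended fluctuation analysis (DFA-1): integrate the mean-removed
    series into a profile, split it into non-overlapping windows of each scale,
    subtract a linear fit per window, and return the RMS fluctuation F(s)."""
    x = np.asarray(x, float)
    profile = np.cumsum(x - x.mean())
    fluctuations = []
    for s in scales:
        n = len(profile) // s
        f2 = []
        t = np.arange(s)
        for i in range(n):
            seg = profile[i * s:(i + 1) * s]
            coeffs = np.polyfit(t, seg, 1)          # linear detrend per window
            f2.append(np.mean((seg - np.polyval(coeffs, t)) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    return np.array(fluctuations)

rng = np.random.default_rng(1)
x = rng.standard_normal(4096)                       # white noise test signal
scales = np.array([8, 16, 32, 64, 128])
F = dfa(x, scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0] # scaling exponent
```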

  17. Modeling and Analyzing Real-Time Multiprocessor Systems

    NARCIS (Netherlands)

    Wiggers, M.H.; Thiele, Lothar; Lee, Edward A.; Schlieker, Simon; Bekooij, Marco Jan Gerrit

    2010-01-01

    Researchers have proposed approaches to verify that real-time multiprocessor systems meet their timeliness constraints. These approaches make assumptions on the model of computation, the load placed on the multiprocessor system, and the faults that can arise. This heterogeneous set of assumptions

  18. Development of expert systems for analyzing electronic documents

    Science.gov (United States)

    Abeer Yassin, Al-Azzawi; Shidlovskiy, S.; Jamal, A. A.

    2018-05-01

    The paper analyses a Database Management System (DBMS). Expert systems, databases, and database technology have become an essential component of everyday life in modern society. As databases are widely used in every organization with a computer system, data resource control and data management are very important [1]. A DBMS is the most significant tool developed to serve multiple users in a database environment, consisting of programs that enable users to create and maintain a database. This paper focuses on the development of a database management system for the General Directorate for Education of Diyala in Iraq (GDED) using CLIPS, Java NetBeans, and Alfresco, together with system components previously developed at Tomsk State University at the Faculty of Innovative Technology.

  19. Analyzing Integrated Cost-Schedule Risk for Complex Product Systems R&D Projects

    Directory of Open Access Journals (Sweden)

    Zhe Xu

    2014-01-01

    The vast majority of research efforts in project risk management tend to assess cost risk and schedule risk independently. However, project cost and time are related in reality, and the relationship between them should be analyzed directly. We propose an integrated cost and schedule risk assessment model for complex product systems R&D projects. The graphical evaluation and review technique (GERT), Monte Carlo simulation, and probability distribution theory are utilized to establish the model. In addition, statistical analysis and regression analysis techniques are employed to analyze the simulation outputs. Finally, a complex product systems R&D project is modeled as an example using the proposed approach, and the simulation outputs are analyzed to illustrate the effectiveness of the risk assessment model. It seems that integrating cost and schedule risk assessment can provide more reliable risk estimation results.
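A toy version of the integrated cost-schedule Monte Carlo idea (three serial activities with triangular durations and cost tied to duration; all numbers invented, and no GERT branching) shows how a joint simulation yields correlated cost and schedule outcomes and percentile risk estimates:

```python
import random

random.seed(42)

def simulate(n=20000):
    """Joint cost-schedule Monte Carlo for a toy 3-activity serial project.
    Each activity's duration is triangular(low, high, mode); its cost is a rate
    times the duration plus noise, so cost and schedule risk are correlated by
    construction rather than assessed independently."""
    runs = []
    for _ in range(n):
        total_t, total_c = 0.0, 0.0
        for low, mode, high, rate in [(4, 6, 10, 2.0),
                                      (8, 10, 15, 1.5),
                                      (3, 5, 9, 3.0)]:
            d = random.triangular(low, high, mode)   # sampled duration
            total_t += d
            total_c += rate * d + random.gauss(0, 1) # duration-driven cost
        runs.append((total_t, total_c))
    return runs

runs = simulate()
durations = sorted(t for t, _ in runs)
p80_duration = durations[int(0.8 * len(durations))]  # 80th-percentile schedule
```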

  20. Visual and intelligent transients and accidents analyzer based on thermal-hydraulic system code

    International Nuclear Information System (INIS)

    Meng Lin; Rui Hu; Yun Su; Ronghua Zhang; Yanhua Yang

    2005-01-01

    Full text of publication follows: Many thermal-hydraulic system codes, such as RELAP5, RETRAN, and ATHLET, were developed in the past twenty years. Because of their general and advanced thermal-hydraulic computation features, they are widely used throughout the world to analyze transients and accidents. However, most of these original thermal-hydraulic system codes share the following disadvantages. Firstly, because models are built through input decks, the input files are complex and non-figurative, and the style of the input decks varies between users and models. Secondly, results are presented as off-line data files, which is inconvenient for analysts who may pay more attention to trends and changes in dynamic parameters. Thirdly, these codes offer few interfaces to other programs, which restricts their extension. The subject of this paper is to develop a powerful analyzer, based on these thermal-hydraulic system codes, to analyze transients and accidents more simply, accurately, and quickly. Firstly, modeling is visual and intelligent: users build the thermal-hydraulic system model from component objects according to their needs, without having to face bare input decks. The input decks created automatically by the analyzer have a unified style and can be easily read by others. Secondly, parameters of interest to the analyst can be displayed, or even changed, dynamically. Thirdly, the analyzer provides an interface between the thermal-hydraulic system code and other programs, making parallel computation between them possible. In conclusion, through visual and intelligent methods, an analyzer based on general and advanced thermal-hydraulic system codes can be used to analyze transients and accidents more effectively.
The main purpose of this paper is to present developmental activities, assessment and application results of the visual and intelligent

  1. On Analyzing LDPC Codes over Multiantenna MC-CDMA System

    Directory of Open Access Journals (Sweden)

    S. Suresh Kumar

    2014-01-01

    Multiantenna multicarrier code-division multiple access (MC-CDMA) technique has been attracting much attention for designing future broadband wireless systems. In addition, low-density parity-check (LDPC) code, a promising near-optimal error correction code, is also being widely considered in next generation communication systems. In this paper, we propose a simple method to construct a regular quasicyclic low-density parity-check (QC-LDPC) code to improve the transmission performance over the precoded MC-CDMA system with limited feedback. Simulation results show that the coding gain of the proposed QC-LDPC codes is larger than that of the Reed-Solomon codes, and the performance of the multiantenna MC-CDMA system can be greatly improved by these QC-LDPC codes when the data rate is high.
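A regular QC-LDPC parity-check matrix is assembled from circulant permutation blocks, each an identity matrix cyclically shifted by an entry of a small base matrix. The base matrix of shift values below is invented for illustration and is not a code from the paper.

```python
import numpy as np

def circulant(size, shift):
    """size x size identity matrix with columns cyclically shifted by `shift`;
    shift = -1 denotes the all-zero block (a common QC-LDPC convention)."""
    if shift < 0:
        return np.zeros((size, size), dtype=int)
    return np.roll(np.eye(size, dtype=int), shift, axis=1)

def qc_ldpc_H(shifts, size):
    """Assemble a quasi-cyclic parity-check matrix H by expanding each entry of
    the base matrix `shifts` into a circulant block of the given size."""
    return np.block([[circulant(size, s) for s in row] for row in shifts])

# Hypothetical 2x4 base matrix with circulant size 5 (illustrative only):
base = [[0, 1, 2, 3],
        [4, 3, -1, 1]]
H = qc_ldpc_H(base, 5)   # 10 x 20 binary parity-check matrix
```

The quasi-cyclic structure is what makes encoding and hardware implementation cheap: the whole H is specified by the small base matrix of shifts.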

  2. Failure Analysis of Nonvolatile Residue (NVR) Analyzer Model SP-1000

    Science.gov (United States)

    Potter, Joseph C.

    2011-01-01

    National Aeronautics and Space Administration (NASA) subcontractor Wiltech contacted the NASA Electrical Lab (NE-L) and requested a failure analysis of a Solvent Purity Meter, model SP-1000, produced by the VerTis Instrument Company. The meter, used to measure the contaminant in a solvent to determine the relative contamination on spacecraft flight hardware and ground servicing equipment, had been inoperable and in storage for an unknown amount of time. NE-L was asked to troubleshoot the unit and make a determination of what would be required to make the unit operational. Through the use of general troubleshooting processes and the review of a unit in service at the time of analysis, the unit was found to be repairable but to need the replacement of multiple components.

  3. Evaluation of cell count and classification capabilities in body fluids using a fully automated Sysmex XN equipped with high-sensitive Analysis (hsA) mode and DI-60 hematology analyzer system.

    Science.gov (United States)

    Takemura, Hiroyuki; Ai, Tomohiko; Kimura, Konobu; Nagasaka, Kaori; Takahashi, Toshihiro; Tsuchiya, Koji; Yang, Haeun; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Tabe, Yoko; Ohsaka, Akimichi

    2018-01-01

    The XN series automated hematology analyzer is equipped with a body fluid (BF) mode to count and differentiate leukocytes in BF samples, including cerebrospinal fluid (CSF). However, its diagnostic accuracy is not reliable for CSF samples with low cell concentrations at the border between normal and pathologic levels. To overcome this limitation, a new flow cytometry-based technology, termed "high-sensitive analysis (hsA) mode," has been developed. In addition, the XN series analyzer has been equipped with the automated digital cell imaging analyzer DI-60 to classify cell morphology, including normal leukocyte differentials and detection of abnormal malignant cells. Using various BF samples, we evaluated the performance of the XN-hsA mode and DI-60 compared to manual microscopic examination. The reproducibility of the XN-hsA mode was good for samples with low cell densities (coefficient of variation, %CV: 7.8% at 6 cells/μL). The linearity of the XN-hsA mode was established up to 938 cells/μL. Cell numbers obtained using the XN-hsA mode correlated highly with the corresponding microscopic examinations. Good correlation was also observed between the DI-60 analyses and manual microscopic classification for all leukocyte types except monocytes. In conclusion, the combined use of cell counting with the XN-hsA mode and automated morphological analysis using the DI-60 mode is potentially useful for the automated analysis of BF cells.
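The reproducibility figure quoted above (%CV) is simply the sample standard deviation over the mean of replicate counts. The replicate values below are hypothetical, chosen to mimic a low-density CSF control near 6 cells/µL.

```python
import statistics

def cv_percent(counts):
    """Coefficient of variation: %CV = 100 * sample standard deviation / mean,
    the reproducibility figure reported for repeated cell counts."""
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

# Hypothetical replicate counts of a ~6 cells/uL control sample:
replicates = [6, 5, 6, 7, 6, 6, 5, 7, 6, 6]
cv = cv_percent(replicates)
```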

  4. A Low Cost Shading Analyzer and Site Evaluator Design to Determine Solar Power System Installation Area

    Directory of Open Access Journals (Sweden)

    Selami Kesler

    2015-01-01

    Full Text Available Shading analyzer systems are necessary for selecting the most suitable installation site to sustain enough solar power. Changes in solar data throughout the year must then be evaluated, along with identification of the obstructions surrounding the installation site, in order to analyze shading effects on the productivity of the solar power system. In this study, existing shading analysis tools are briefly introduced, and a new device is developed and explained for analyzing the shading effects of environmental obstructions at the site on which the solar power system will be established. In this way, the exposure duration of the PV panels to sunlight can be measured effectively. The device is demonstrated with an application at a pilot installation site in Denizli, Turkey.

  5. Analyzing the capability of a radio telescope in a bistatic space debris observation system

    International Nuclear Information System (INIS)

    Zhao Zhe; Zhao You; Gao Peng-Qi

    2013-01-01

    A bistatic space debris observation system using a radio telescope as the receiving part is introduced. The detection capability of the system at different working frequencies is analyzed based on real instruments. The detection range of targets with a fixed radar cross section and the detection ability of small space debris at a fixed range are discussed. The simulations of this particular observation system at different transmitting powers are also implemented and the detection capability is discussed. The simulated results approximately match the actual experiments. The analysis in this paper provides a theoretical basis for developing a space debris observation system that can be built in China.
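    Detection-range analyses of this kind typically start from the bistatic radar equation; a generic textbook form (illustrative, not reproduced from the paper) relates received power to the transmitter-to-target and target-to-receiver ranges:

```latex
P_r = \frac{P_t \, G_t \, G_r \, \lambda^2 \, \sigma_B}{(4\pi)^3 \, R_t^2 \, R_r^2 \, L}
```

    where P_t is the transmitted power, G_t and G_r the transmit and receive antenna gains, λ the wavelength, σ_B the bistatic radar cross section, R_t and R_r the two ranges, and L the system losses. Setting P_r equal to the receiver's minimum detectable power and solving for the range product R_t R_r gives the detection capability at a given transmitting power.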

  6. Analyzing Innovation Systems (Burkina Faso) | CRDI - Centre de ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project aims to improve the efficiency of the nascent innovation system in Burkina Faso by strengthening exchanges between researchers, inventors and innovators and public ...

  7. An engineering code to analyze hypersonic thermal management systems

    Science.gov (United States)

    Vangriethuysen, Valerie J.; Wallace, Clark E.

    1993-01-01

    Thermal loads on current and future aircraft are increasing and as a result are stressing the energy collection, control, and dissipation capabilities of current thermal management systems and technology. The thermal loads for hypersonic vehicles will be no exception. In fact, with their projected high heat loads and fluxes, hypersonic vehicles are a prime example of systems that will require thermal management systems (TMS) that have been optimized and integrated with the entire vehicle to the maximum extent possible during the initial design stages. This will be necessary not only to meet operational requirements, but also to fulfill weight and performance constraints in order for the vehicle to take off and complete its mission successfully. To meet this challenge, the TMS can no longer be two or more entirely independent systems, nor can thermal management be an afterthought in the design process, as has typically been the approach in the past. Instead, a TMS that is integrated throughout the entire vehicle and subsequently optimized will be required. To accomplish this, a method that iteratively optimizes the TMS throughout the vehicle will be not only highly desirable but also advantageous in reducing the man-hours normally required to conduct the necessary tradeoff studies and comparisons. A thermal management engineering computer code that is under development and being managed at Wright Laboratory, Wright-Patterson AFB, is discussed. The primary goal of the code is to aid in the development of a hypersonic vehicle TMS that has been optimized and integrated on a total vehicle basis.

  8. On Modeling and Analyzing Cost Factors in Information Systems Engineering

    NARCIS (Netherlands)

    Mutschler, B.B.; Reichert, M.U.

    Introducing enterprise information systems (EIS) is usually associated with high costs. It is therefore crucial to understand those factors that determine or influence these costs. Though software cost estimation has received considerable attention during the last decades, it is difficult to apply

  9. Look Together: Analyzing Gaze Coordination with Epistemic Network Analysis

    Directory of Open Access Journals (Sweden)

    Sean eAndrist

    2015-07-01

    Full Text Available When conversing and collaborating in everyday situations, people naturally and interactively align their behaviors with each other across various communication channels, including speech, gesture, posture, and gaze. Having access to a partner's referential gaze behavior has been shown to be particularly important in achieving collaborative outcomes, but the process in which people's gaze behaviors unfold over the course of an interaction and become tightly coordinated is not well understood. In this paper, we present work to develop a deeper and more nuanced understanding of coordinated referential gaze in collaborating dyads. We recruited 13 dyads to participate in a collaborative sandwich-making task and used dual mobile eye tracking to synchronously record each participant's gaze behavior. We used a relatively new analysis technique—epistemic network analysis—to jointly model the gaze behaviors of both conversational participants. In this analysis, network nodes represent gaze targets for each participant, and edge strengths convey the likelihood of simultaneous gaze to the connected target nodes during a given time-slice. We divided collaborative task sequences into discrete phases to examine how the networks of shared gaze evolved over longer time windows. We conducted three separate analyses of the data to reveal (1) properties and patterns of how gaze coordination unfolds throughout an interaction sequence, (2) optimal time lags of gaze alignment within a dyad at different phases of the interaction, and (3) differences in gaze coordination patterns for interaction sequences that lead to breakdowns and repairs. In addition to contributing to the growing body of knowledge on the coordination of gaze behaviors in joint activities, this work has implications for the design of future technologies that engage in situated interactions with human users.
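    The network construction described above can be illustrated with a toy sketch: given two time-aligned gaze-target sequences, each edge weight records how often the two participants' gaze targets co-occur within a time slice. This is a simplified stand-in for epistemic network analysis, and the target names and data below are hypothetical:

```python
from collections import Counter

def gaze_network(gaze_a, gaze_b):
    """Build a simple shared-gaze network from two aligned gaze-target
    sequences (one entry per time slice). The edge (u, v) gets weight
    equal to the fraction of slices in which participant A looked at u
    while participant B looked at v."""
    assert len(gaze_a) == len(gaze_b), "sequences must be time-aligned"
    counts = Counter(zip(gaze_a, gaze_b))
    n = len(gaze_a)
    return {edge: c / n for edge, c in counts.items()}

# Hypothetical time-sliced gaze targets during a sandwich-making task
a = ["bread", "bread", "knife", "partner", "knife"]
b = ["bread", "knife", "knife", "partner", "knife"]
net = gaze_network(a, b)  # e.g. ("knife", "knife") carries weight 0.4
```

    Comparing such edge-weight dictionaries computed per task phase is one way to see how gaze coordination evolves over an interaction.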

  10. Monitoring and analyzing features of electrical power quality system performance

    OpenAIRE

    Genci Sharko; Nike Shanku

    2010-01-01

    Power quality is a set of boundaries that allows electrical systems to function in their intended manner without significant loss of performance or life. The term is used to describe electric power that drives an electrical load and the load's ability to function properly with that electric power. Without the proper quality of the power, an electrical device may malfunction, fail prematurely or not operate at all. There are many reasons why the electric power can be of poor quality and many m...

  11. Analyzing the errors of DFT approximations for compressed water systems

    International Nuclear Information System (INIS)

    Alfè, D.; Bartók, A. P.; Csányi, G.; Gillan, M. J.

    2014-01-01

    We report an extensive study of the errors of density functional theory (DFT) approximations for compressed water systems. The approximations studied are based on the widely used PBE and BLYP exchange-correlation functionals, and we characterize their errors before and after correction for 1- and 2-body errors, the corrections being performed using the methods of Gaussian approximation potentials. The errors of the uncorrected and corrected approximations are investigated for two related types of water system: first, the compressed liquid at temperature 420 K and density 1.245 g/cm³, where the experimental pressure is 15 kilobars; second, thermal samples of compressed water clusters from the trimer to the 27-mer. For the liquid, we report four first-principles molecular dynamics simulations, two generated with the uncorrected PBE and BLYP approximations and a further two with their 1- and 2-body corrected counterparts. The errors of the simulations are characterized by comparing with experimental data for the pressure, with neutron-diffraction data for the three radial distribution functions, and with quantum Monte Carlo (QMC) benchmarks for the energies of sets of configurations of the liquid in periodic boundary conditions. The DFT errors of the configuration samples of compressed water clusters are computed using QMC benchmarks. We find that the 2-body and beyond-2-body errors in the liquid are closely related to similar errors exhibited by the clusters. For both the liquid and the clusters, beyond-2-body errors of DFT make a substantial contribution to the overall errors, so that correction for 1- and 2-body errors does not suffice to give a satisfactory description. For BLYP, a recent representation of 3-body energies due to Medders, Babin, and Paesani [J. Chem. Theory Comput. 9, 1103 (2013)] gives a reasonably good way of correcting for beyond-2-body errors, after which the remaining errors are typically 0.5 mE_h ≃ 15 meV/monomer for the liquid and the
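    The 1-body, 2-body, and beyond-2-body errors referred to above are defined through the standard many-body expansion of the total energy of an N-monomer system:

```latex
E_{\mathrm{tot}} = \sum_{i} E^{(1)}(i) \;+\; \sum_{i<j} \Delta E^{(2)}(i,j) \;+\; \sum_{i<j<k} \Delta E^{(3)}(i,j,k) \;+\; \cdots
```

    where E^(1)(i) is the energy of monomer i in isolation and each ΔE^(n) term is the non-additive correction for an n-monomer subset. Correcting a DFT approximation for its 1- and 2-body errors therefore leaves exactly the beyond-2-body terms, which the study finds to be a substantial part of the overall error for compressed water.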

  12. Development and evaluation of a computer-aided system for analyzing human error in railway operations

    International Nuclear Information System (INIS)

    Kim, Dong San; Baek, Dong Hyun; Yoon, Wan Chul

    2010-01-01

    As human error has been recognized as one of the major contributors to accidents in safety-critical systems, there has been a strong need for techniques that can analyze human error effectively. Although many techniques have been developed so far, much room for improvement remains. As human error analysis is a cognitively demanding and time-consuming task, it is particularly necessary to develop a computerized system supporting this task. This paper presents a computer-aided system for analyzing human error in railway operations, called Computer-Aided System for Human Error Analysis and Reduction (CAS-HEAR). It supports analysts to find multiple levels of error causes and their causal relations by using predefined links between contextual factors and causal factors as well as links between causal factors. In addition, it is based on a complete accident model; hence, it helps analysts to conduct a thorough analysis without missing any important part of human error analysis. A prototype of CAS-HEAR was evaluated by nine field investigators from six railway organizations in Korea. Its overall usefulness in human error analysis was confirmed, although development of its simplified version and some modification of the contextual factors and causal factors are required in order to ensure its practical use.

  13. Fault tree and reliability relationships for analyzing noncoherent two-state systems

    International Nuclear Information System (INIS)

    Alesso, H.P.; Benson, H.J.

    1980-01-01

    Recently, there has been interest in analyzing the noncoherent interactions that result from adversary theft of special nuclear material from reprocessing facilities. The actions of the adversary, acting in conflict with the reprocessing facility's material control and accounting system, may be viewed as a single noncoherent structure. This paper develops a basis for analyzing noncoherent structures by decomposing them into coherent subsystems. Both reliability and fault tree structure functions are used for this analysis. In addition, a bounding criterion is established for the reliability of statistically dependent noncoherent structures. (orig.)
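    A structure function is noncoherent when it is not monotone in every component. A minimal illustrative example (not taken from the paper) is an adversary-vs-safeguards scenario whose "success" requires one subsystem to work and another to fail:

```python
def theft_success(attempt_made, detection_works):
    """Noncoherent structure function over binary arguments (0/1):
    the adversary succeeds only if a theft attempt occurs
    (attempt_made = 1) AND the material control and accounting
    system fails to detect it (detection_works = 0). The function is
    increasing in attempt_made but DECREASING in detection_works,
    which is what makes the overall structure noncoherent."""
    return attempt_made * (1 - detection_works)

# Decomposition into coherent pieces: attempt_made enters directly,
# while detection_works enters only through its complement
# (1 - detection_works), mirroring the idea of decomposing a
# noncoherent structure into coherent subsystems.
```

    Bounding the reliability of such a structure then reduces to bounding the reliabilities of the coherent pieces, which is the approach the paper develops.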

  14. Research on the Method of Big Data Collecting, Storing and Analyzing of Tongue Diagnosis System

    Science.gov (United States)

    Chen, Xiaowei; Wu, Qingfeng

    2018-03-01

    This paper analyzes the content of clinical tongue-diagnosis data in TCM (Traditional Chinese Medicine) and puts forward a method to collect, store, and analyze these data. Under the guidance of the TCM theory of syndrome differentiation and treatment, the method builds on Hadoop, a distributed computing platform with strong scalability, and integrates functions for analyzing and converting big data from clinical tongue diagnosis. At the same time, the consistency, scalability, and security of tongue-diagnosis big data are ensured.

  15. An immersive simulation system for provoking and analyzing cataplexy.

    Science.gov (United States)

    Augustine, Kurt; Cameron, Bruce; Camp, Jon; Krahn, Lois; Robb, Richard

    2002-01-01

    medication. We believe this is a novel and innovative approach to a difficult problem. CatNAP is a compelling example of the potentially effective application of virtual reality technology to an important clinical problem that has resisted previous approaches. Preliminary results suggest that an immersive simulation system like CatNAP will be able to reliably induce cataplexy in a controlled environment. The project is continuing through a final stage of refinement prior to conducting a full clinical study.

  16. IndElec: A Software for Analyzing Party Systems and Electoral Systems

    Directory of Open Access Journals (Sweden)

    Francisco Ocaña

    2011-08-01

    Full Text Available IndElec is a software package for computing a wide range of indices from electoral data, intended for analyzing both party systems and electoral systems in political studies. Further, IndElec can calculate such indices from electoral data at several levels of aggregation, even when the acronyms of some political parties change across districts. As the amount of information provided by IndElec may be considerable, the software also aids the user in the analysis of electoral data through three capabilities. First, IndElec automatically elaborates preliminary descriptive statistical reports of computed indices. Second, IndElec saves the computed information into text files in data matrix format, which can be loaded directly by any statistical software to facilitate more sophisticated statistical studies. Third, IndElec provides results in several file formats (text, CSV, HTML, R) to facilitate their visualization and management in a wide range of applications (word processors, spreadsheets, web browsers, etc.). Finally, a graphical user interface is provided for IndElec to manage calculation processes, but no visualization facility is available in this environment. In fact, both the inputs and outputs of IndElec are arranged in files with the aforementioned formats.
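    As a concrete example of the kind of party-system index such software computes, the Laakso–Taagepera "effective number of parties" can be derived from vote shares alone. This is a generic implementation of the well-known formula, not IndElec's own code:

```python
def effective_number_of_parties(shares):
    """Laakso-Taagepera effective number of parties:
    N_eff = 1 / sum(p_i^2), where p_i are the vote (or seat) shares.
    Shares are normalized first, so raw vote counts also work."""
    total = sum(shares)
    return 1.0 / sum((s / total) ** 2 for s in shares)

# Hypothetical example: four parties with vote shares 45%, 30%, 15%, 10%
n_eff = effective_number_of_parties([0.45, 0.30, 0.15, 0.10])
# n_eff is about 3.08: the system behaves like roughly three equal parties
```

    Computed on seat shares instead of vote shares, the same formula measures parliamentary rather than electoral fragmentation, which is why such tools accept both.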

  17. Modematic: a fast laser beam analyzing system for high power CO2-laser beams

    Science.gov (United States)

    Olsen, Flemming O.; Ulrich, Dan

    2003-03-01

    The performance of an industrial laser depends very much upon the characteristics of the laser beam. The ISO standards 11146 and 11154, describing test methods for laser beam parameters, have been approved. Implementing these methods in industry is difficult, and especially for infrared laser sources such as the CO2 laser, the available analyzing systems are slow, difficult to apply, and of limited reliability due to the nature of the detection methods. In a EUREKA project, the goal was defined to develop a laser beam analyzing system dedicated to high-power CO2 lasers that could fulfill the demands for an entire analyzing system: automating the time-consuming pre-alignment and beam-conditioning work required before a beam mode analysis, automating the analyzing sequences and the data analysis required to determine the laser beam caustics, and, last but not least, delivering reliable close-to-real-time data to the operator. The results of this project work are described in this paper. The research project has led to the development of the Modematic laser beam analyzer, which is ready for the market.

  18. Analysis of Few-Mode Multi-Core Fiber Splice Behavior Using an Optical Vector Network Analyzer

    DEFF Research Database (Denmark)

    Rommel, Simon; Mendinueta, Jose Manuel Delgado; Klaus, Werner

    2017-01-01

    The behavior of splices in a 3-mode 36-core fiber is analyzed using optical vector network analysis. Time-domain response analysis confirms splices may cause significant mode-mixing, while frequency-domain analysis shows splices may affect system-level mode-dependent loss both positively and negatively.

  19. A formulation to analyze system-of-systems problems: A case study of airport metroplex operations

    Science.gov (United States)

    Ayyalasomayajula, Sricharan Kishore

    A system-of-systems (SoS) can be described as a collection of multiple, heterogeneous, distributed, independent components interacting to achieve a range of objectives. A generic formulation was developed to model component interactions in an SoS to understand their influence on overall SoS performance. The formulation employs a lexicon to aggregate components into hierarchical interaction networks and understand how their topological properties affect the performance of the aggregations. Overall SoS performance is evaluated by monitoring the changes in stakeholder profitability due to changes in component interactions. The formulation was applied to a case study in air transportation focusing on operations at airport metroplexes. Metroplexes are geographical regions with two or more airports in close proximity to one another. The case study explored how metroplex airports interact with one another, what dependencies drive these interactions, and how these dependencies affect metroplex throughput and capacity. Metrics were developed to quantify runway dependencies at a metroplex and were correlated with its throughput and capacity. Operations at the New York/New Jersey metroplex (NYNJ) airports were simulated to explore the feasibility of operating very large aircraft (VLA), such as the Airbus A380, as a delay-mitigation strategy at these airports. The proposed formulation was employed to analyze the impact of this strategy on different stakeholders in the national air transportation system (ATS), such as airlines and airports. The analysis results and their implications were used to compare the pros and cons of operating VLAs at NYNJ from the perspectives of airline profitability, and flight delays at NYNJ and across the ATS.

  20. Design of multi-channel analyzer's monitoring system based on embedded system

    International Nuclear Information System (INIS)

    Yang Tao; Wei Yixiang

    2007-01-01

    A new multi-channel analyzer monitoring system based on an ARM9 embedded system is introduced in this paper. Solutions to problems encountered during design, installation, and debugging on Linux are also discussed. The monitoring system is developed using MiniGUI and the Linux system API, with functions for collecting, displaying, and controlling I/O for 1024-channel data. All functions run in real time, and the system has the merits of low cost, small size, and portability. This lays the foundation for developing domestically produced digital portable nuclear spectrometers. (authors)

  1. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well suited to that task, MSI has seen limited application in comparing thousands of spatially defined spotted samples. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. This paper presents the OpenMSI Arrayed Analysis Toolkit (OMAAT), a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated by screening the metabolic activities of different-sized soil particles, including hydrolysis of sugars, revealing a pattern of size-dependent activities. These results establish OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  2. Comparative study of gas-analyzing systems designed for continuous monitoring of TPP emissions

    Science.gov (United States)

    Kondrat'eva, O. E.; Roslyakov, P. V.

    2017-06-01

    Determining the composition of combustion products is important in terms of both control of emissions into the atmosphere from thermal power plants and optimization of fuel combustion processes in electric power plants. For this purpose, the concentrations of oxygen, carbon monoxide, and nitrogen and sulfur oxides in flue gases are monitored; in the case of solid fuel combustion, fly ash concentration is monitored as well. According to the new nature conservation law in Russia, all large TPPs shall be equipped with continuous emission monitoring and measurement systems (CEMMS) for emissions into the atmosphere. To ensure continuous monitoring of pollutant emissions, direct round-the-clock measurements are conducted using either domestically produced or imported gas analyzers and analysis systems, whose operation is based on various physicochemical methods and which can generally be used when introducing CEMMS. Depending on the type and purposes of measurement, various kinds of instruments having different features may be used. This article presents a comparative study of gas-analysis systems for measuring the content of polluting substances in exhaust gases based on various physical and physicochemical analysis methods. It lists the basic characteristics of the methods commonly applied in gas analysis. It is shown that, considering the need for long-term, continuous operation of gas analyzers for monitoring and measurement of pollutant emissions into the atmosphere, as well as the requirements for reliability and independence from aggressive components and the temperature of the gas flow, optical gas analyzers are preferable for these purposes. To reduce the cost of CEMMS equipment at a TPP and to optimize combustion processes, electrochemical and thermomagnetic gas analyzers may also be used.

  3. ITHNA.SYS: An Integrated Thermal Hydraulic and Neutronic Analyzer SYStem for NUR research reactor

    Energy Technology Data Exchange (ETDEWEB)

    Mazidi, S., E-mail: samirmazidi@gmail.com [Division Physique et Applications Nucléaires, Centre de Recherche Nucléaire de Draria (CRND), BP 43 Sebala, Draria, Alger (Algeria); Meftah, B., E-mail: b_meftah@yahoo.com [Division Physique et Applications Nucléaires, Centre de Recherche Nucléaire de Draria (CRND), BP 43 Sebala, Draria, Alger (Algeria); Belgaid, M., E-mail: belgaidm@yahoo.com [Faculté de Physique, Université Houari Boumediene, USTHB, BP 31, Bab Ezzouar, Alger (Algeria); Letaim, F., E-mail: fletaim@yahoo.fr [Faculté des Sciences et Technologies, Université d’El-oued, PO Box 789, El-oued (Algeria); Halilou, A., E-mail: hal_rane@yahoo.fr [Division Réacteur NUR, Centre de Recherche Nucléaire de Draria, BP 43 Sebala, Draria, Alger (Algeria)

    2015-08-15

    Highlights: • We develop a neutronic and thermal hydraulic MTR reactor analyzer. • The analyzer allows a rapid determination of the reactor core parameters. • Some NUR reactor parameters have been analyzed. - Abstract: This paper introduces the Integrated Thermal Hydraulic and Neutronic Analyzer SYStem (ITHNA.SYS) that has been developed for the Algerian research reactor NUR. It is used both as an operating aid tool and as a core physics engineering analysis tool. The system embeds three modules of the MTR-PC software package developed by INVAP SE: the cell calculation code WIMSD, the core calculation code CITVAP and the program TERMIC for thermal hydraulic analysis of a material testing reactor (MTR) core in forced convection. ITHNA.SYS operates both in on-line and off-line modes. In the on-line mode, the system is linked, via the computer parallel port, to the data acquisition console of the reactor control room and allows real-time monitoring of major physical and safety parameters of the NUR core. PC-based ITHNA.SYS provides a viable and convenient way of using an accumulated and often complex stock of reactor physics knowledge and frees the user from the intricacy of adequate reactor core modeling. This guarantees an accurate yet rapid determination of a variety of neutronic and thermal hydraulic parameters of importance for the operation and safety analysis of the NUR research reactor. Instead of the several hours usually required, the processing time for the determination of such parameters is now reduced to a few seconds. Validation of the system was performed with respect to experimental measurements and to calculations using reference codes. ITHNA.SYS can easily be adapted to accommodate other kinds of MTR reactors.

  4. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
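    The hidden-Markov fitting that such tools automate can be sketched in miniature: the Viterbi algorithm below recovers the most likely state path for a discretized two-state trace. All parameter values are illustrative, and this is not SMART's actual API:

```python
import numpy as np

def viterbi(obs, log_start, log_trans, log_emit):
    """Most likely hidden-state path for an observation sequence.
    log_start[s]: log P(first state = s); log_trans[a, b]: log P(a -> b);
    log_emit[s, o]: log P(observe symbol o | state s)."""
    n_states = log_start.shape[0]
    T = len(obs)
    dp = np.empty((T, n_states))          # best log-probability per state
    back = np.zeros((T, n_states), dtype=int)  # backpointers
    dp[0] = log_start + log_emit[:, obs[0]]
    for t in range(1, T):
        for s in range(n_states):
            scores = dp[t - 1] + log_trans[:, s]
            back[t, s] = int(np.argmax(scores))
            dp[t, s] = scores[back[t, s]] + log_emit[s, obs[t]]
    path = [int(np.argmax(dp[-1]))]
    for t in range(T - 1, 0, -1):
        path.append(back[t, path[-1]])
    return path[::-1]

# Sticky two-state model: state 0 mostly emits symbol 0, state 1 symbol 1
log_start = np.log([0.5, 0.5])
log_trans = np.log([[0.9, 0.1], [0.1, 0.9]])
log_emit = np.log([[0.9, 0.1], [0.1, 0.9]])
path = viterbi([0, 0, 0, 1, 1, 1], log_start, log_trans, log_emit)
```

    Fitting the model parameters themselves (rather than just decoding the path) is what packages like SMART add on top, along with data management and visualization.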

  5. System analysis and design

    International Nuclear Information System (INIS)

    Son, Seung Hui

    2004-02-01

    This book covers information technology and business processes; information system architecture; methods of system development; system development planning, including problem analysis and feasibility analysis; cases of system development; understanding and analyzing user requirements, both with traditional analysis methods and with an integrated information system architecture; system design using an integrated information system architecture; system implementation; and system maintenance.

  6. A scalable, self-analyzing digital locking system for use on quantum optics experiments.

    Science.gov (United States)

    Sparkes, B M; Chrzanowski, H M; Parrain, D P; Buchler, B C; Lam, P K; Symul, T

    2011-07-01

    Digital control of optics experiments has many advantages over analog control systems, specifically in terms of the scalability, cost, flexibility, and the integration of system information into one location. We present a digital control system, freely available for download online, specifically designed for quantum optics experiments that allows for automatic and sequential re-locking of optical components. We show how the inbuilt locking analysis tools, including a white-noise network analyzer, can be used to help optimize individual locks, and verify the long term stability of the digital system. Finally, we present an example of the benefits of digital locking for quantum optics by applying the code to a specific experiment used to characterize optical Schrödinger cat states.

  7. A WebGIS-based system for analyzing and visualizing air quality data for Shanghai Municipality

    Science.gov (United States)

    Wang, Manyi; Liu, Chaoshun; Gao, Wei

    2014-10-01

    An online visual analytical system based on Java Web and WebGIS for air quality data for Shanghai Municipality was designed and implemented to quantitatively analyze and qualitatively visualize air quality data. After analyzing the architectures of WebGIS and Java Web, we first designed the overall system architecture, then specified the software and hardware environment and determined the main function modules of the system. The visual system was built with the DIV + CSS layout method combined with JSP, JavaScript, and other programming languages in the Java environment. Moreover, the Struts, Spring, and Hibernate frameworks (SSH) were integrated into the system for ease of maintenance and extension. To provide mapping services and spatial analysis functions, we selected ArcGIS for Server as the GIS server. We also used an Oracle database and an ESRI file geodatabase to store spatial and non-spatial data and to ensure data security. In addition, the response data from the Web server are resampled to enable rapid visualization in the browser. Experimental results indicate that the system responds quickly to user requests and efficiently returns accurate processing results.

  8. Analyzing the behavior and reliability of voting systems comprising tri-state units using enumerated simulation

    International Nuclear Information System (INIS)

    Yacoub, Sherif

    2003-01-01

    Voting is a common technique used in combining results from peer experts, for multiple purposes, and in a variety of domains. In distributed decision-making systems, voting mechanisms are used to obtain a decision by incorporating the opinions of multiple units. Voting systems have many applications in fault-tolerant systems, mutual exclusion in distributed systems, and replicated databases. We are specifically interested in voting systems as used in decision-making applications. In this paper, we describe a synthetic experimental procedure to study the behavior of a variety of voting system configurations using a simulator to analyze the state of each expert, apply a voting mechanism, and analyze the voting results. We introduce an enumerated-simulation approach and compare it to existing mathematical approaches. The paper studies the following behaviors of a voting system: (1) the reliability of the voting system, R; (2) the probability of reaching a consensus, P_c; (3) the certainty index, T; and (4) the confidence index, C. The configuration parameters controlling the analysis are: (1) the number of participating experts, N; (2) the possible output states of an expert; and (3) the probability distribution of each expert's states. We illustrate the application of this approach to a voting system that consists of N units, each of which has three states: correct (success), wrong (failed), and abstain (did not produce an output). The final output of the decision-making (voting) system is correct if a consensus is reached on a correct unit output, abstain if all units abstain from voting, and wrong otherwise. We show that, using the proposed approach, we can easily conduct studies to reveal several behaviors of a decision-making system with tri-state experts.
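    The enumerated-simulation idea can be sketched as follows: enumerate every joint state of N tri-state experts, weight each joint state by its probability, and classify the outcome. This is a toy sketch in which a simple plurality rule stands in for whatever consensus rule the paper actually uses:

```python
from itertools import product

def voting_outcomes(expert_probs):
    """Enumerate all joint states of N tri-state experts.
    expert_probs[i] = (p_correct, p_wrong, p_abstain) for expert i.
    Returns (P_correct, P_abstain, P_wrong) under a plurality rule:
    the system is correct if more experts are correct than wrong,
    abstains if ALL experts abstain, and is wrong otherwise."""
    p_correct = p_abstain = p_wrong = 0.0
    n = len(expert_probs)
    for states in product(range(3), repeat=n):  # 0=correct, 1=wrong, 2=abstain
        prob = 1.0
        for expert, s in zip(expert_probs, states):
            prob *= expert[s]
        if states.count(0) > states.count(1):
            p_correct += prob
        elif states.count(2) == n:
            p_abstain += prob
        else:
            p_wrong += prob
    return p_correct, p_abstain, p_wrong

# Three identical experts, each correct 80%, wrong 10%, abstaining 10%
pc, pa, pw = voting_outcomes([(0.8, 0.1, 0.1)] * 3)
```

    Enumeration is exact but grows as 3^N, which is why the paper compares it with closed-form mathematical approaches for larger configurations.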

  9. [Modeling and implementation method for the automatic biochemistry analyzer control system].

    Science.gov (United States)

    Wang, Dong; Ge, Wan-cheng; Song, Chun-lin; Wang, Yun-guang

    2009-03-01

    The automatic biochemistry analyzer is a necessary instrument for clinical diagnostics. In this paper, the system structure is first analyzed. The system problem description and the fundamental principles of dispatch are then brought forward. The paper puts its emphasis on modeling the automatic biochemistry analyzer control system: an object model and a communications model are put forward. Finally, the implementation method is designed. Results indicate that a system based on this model has good performance.

  10. Analyzing farming systems diversity: a case study in south-western France

    Energy Technology Data Exchange (ETDEWEB)

    Choisis, J. P.; Thevenet, C.; Girbon, A.

    2012-11-01

    The huge changes in agricultural activities, which may be amplified by the forthcoming Common Agriculture Policy reform, call the future of crop-livestock systems into question, and hence the impact of these changes on landscapes and biodiversity. We analyzed relationships between agriculture, landscape and biodiversity in south-western France. The study area covered about 4,000 ha and included four villages. We conducted a survey of 56 farms. Multivariate analyses (multiple factor analysis and cluster analysis) were used to analyze relationships between 25 variables and to build a typology. The type of farming (beef and/or dairy cattle, cash crops), size (area and workforce) and cultivation practices, among others, were revealed as differentiating factors of farms. Six farming types were identified: (1) hillside mixed crop-livestock farms, (2) large corporate farms, (3) extensive cattle farms, (4) large intensive farms on the valley sides, (5) small multiple-job holdings, and (6) hobby farms. The diversity of farming systems revealed the variable impact of the main drivers of change affecting agricultural development, particularly the enlargement and modernization of farms along with the demography of agricultural holdings. (Author) 41 refs.
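
    The "ordination followed by clustering" workflow used above can be sketched with ordinary principal component analysis and a tiny k-means, as a simplified stand-in for the multiple factor analysis and cluster analysis the authors applied. Everything below is illustrative: the toy data, the four indicator variables, and k = 2 are assumptions, not the paper's 25-variable survey.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the survey: 56 farms x 4 indicators (area, workforce,
# livestock units, share of cash crops). The real study used 25 variables.
farms = np.vstack([
    rng.normal([30, 1, 40, 0.1], [8, 0.3, 10, 0.05], (28, 4)),    # "livestock" profile
    rng.normal([120, 3, 5, 0.8], [25, 0.8, 4, 0.10], (28, 4)),    # "cash crop" profile
])

# 1) Standardize so variables on different scales weigh equally.
z = (farms - farms.mean(0)) / farms.std(0)

# 2) PCA via eigen-decomposition of the correlation matrix.
cov = np.cov(z, rowvar=False)
eigval, eigvec = np.linalg.eigh(cov)
order = np.argsort(eigval)[::-1]
scores = z @ eigvec[:, order[:2]]        # farms projected on the first 2 axes

# 3) Tiny k-means on the factor scores to build the typology (k = 2 here).
centers = scores[[0, -1]].copy()         # initialize at two extreme farms
for _ in range(20):
    labels = np.argmin(((scores[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(0) for k in range(2)])
```

    In a real typology study the number of clusters would be chosen from a dendrogram or an inertia criterion rather than fixed in advance.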

  11. Hardware-software system for simulating and analyzing earthquakes applied to civil structures

    Directory of Open Access Journals (Sweden)

    J. P. Amezquita-Sanchez

    2012-01-01

    The occurrence of recent strong earthquakes, the incessant worldwide movement of tectonic plates and the continuous ambient vibrations caused by traffic and wind have increased researchers' interest in improving the capacity of energy dissipation to avoid damage to civil structures. Experimental testing of structural systems is essential for understanding physical behaviors and building appropriate analytic models, and for exposing difficulties that may not have been considered in analytical studies. This paper presents a hardware-software system for exciting, monitoring and simultaneously analyzing a structure under earthquake signals and other types of signals in real time. The effectiveness of the proposed system has been validated by experimental case studies, and it has been found to be a useful tool in the analysis of earthquake effects on structures.

  12. Research on application of technique for analyzing system reliability, GO-FLOW

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Fukuto, Junji; Sugasawa, Shinobu; Mitomo, Nobuo; Miyazaki, Keiko; Hirao, Yoshihiro; Kobayashi, Michiyuki

    1997-01-01

    As the method of evaluation, probabilistic safety assessment (PSA) has been introduced in nuclear power field, and began to play important role in plant design and safety examination. In the Ship Research Institute, as the technique for analyzing system reliability which takes the main part of PSA, the research on developing the GO-FLOW technique which has various advanced functions has been carried out. In this research, the functions of the GO-FLOW technique are improved, and the function of the dynamic behavior analysis for systems and the analysis function for the combination of the physical behavior of systems and the change of probabilistic events are developed, further, the function of extracting main accident sequence by utilizing the GO-FLOW technique is prepared. As for the analysis of dynamic behavior, the sample problem on hold-up tank was investigated. As to the extraction of main accident sequence, the fundamental part of the function of event tree analysis was consolidated, and the function of setting branching probability was given. As to the indication of plant behavior, the simulator for improved marine reactor MRX was developed. (K.I.)

  13. Analyzing and Comparing Biomass Feedstock Supply Systems in China: Corn Stover and Sweet Sorghum Case Studies

    Directory of Open Access Journals (Sweden)

    Lantian Ren

    2015-06-01

    This paper analyzes the rural Chinese biomass supply system and models supply chain operations according to U.S. concepts of logistical unit operations: harvest and collection, storage, transportation, preprocessing, and handling and queuing. In this paper, we quantify the logistics cost of corn stover and sweet sorghum in China under different scenarios. We analyze three scenarios of corn stover logistics in northeast China and three scenarios of sweet sorghum stalk logistics in Inner Mongolia. The case study estimates the logistics cost of corn stover and sweet sorghum stalk to be $52.95/dry metric ton and $52.64/dry metric ton, respectively, for the current labor-based biomass logistics system. However, if the feedstock logistics operation is mechanized, the cost of corn stover and sweet sorghum stalk decreases to $36.01/dry metric ton and $35.76/dry metric ton, respectively. The study also includes a sensitivity analysis to identify the cost factors that cause logistics cost variation. Results of the sensitivity analysis show that labor price has the most influence on the logistics cost of corn stover and sweet sorghum stalk, with a variation of $6 to $12/dry metric ton.
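
    A labor-price sensitivity of the kind reported above can be sketched as a simple cost build-up over the five unit operations. Only the $52.95/dry metric ton total comes from the abstract; the split among operations and the 60% labor share are assumptions made for illustration.

```python
# Illustrative cost build-up per dry metric ton of corn stover; the split
# among unit operations is assumed, only the $52.95 total comes from the text.
base = {
    "harvest_collection": 20.00,
    "storage": 6.00,
    "transportation": 14.95,
    "preprocessing": 8.00,
    "handling_queuing": 4.00,
}

LABOR_SHARE = 0.60  # assumed fraction of each operation's cost that is labor

def total_cost(components, labor_multiplier=1.0):
    """Scale the labor portion of every unit operation; returns $/dry t."""
    return sum(c * ((1 - LABOR_SHARE) + LABOR_SHARE * labor_multiplier)
               for c in components.values())

baseline = total_cost(base)        # $52.95/dry t at current labor price
plus_20 = total_cost(base, 1.20)   # same system with labor price up 20%
```

    Sweeping `labor_multiplier` over a plausible range is what produces the $6 to $12/dry metric ton variation band the study reports.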

  14. Quantitative analysis of thorium-containing materials using an Industrial XRF analyzer

    International Nuclear Information System (INIS)

    Hasikova, J.; Titov, V.; Sokolov, A.

    2014-01-01

    Thorium (Th) as a nuclear fuel is clean and safe and offers significant advantages over uranium. The technology for several types of thorium reactors is proven but must still be developed on a commercial scale. If thorium nuclear reactors are commercialized, thorium raw materials will be in demand. Mining and processing companies producing Th and rare earth elements will then require prompt and reliable methods and instrumentation for quantitative on-line Th analysis. The potential applicability of the CON-X series X-ray fluorescence conveyor analyzer is discussed for quantitative or semi-quantitative on-line measurement of Th in several types of Th-bearing materials. A laboratory study of several minerals (zircon sands and limestone as unconventional Th resources; monazite concentrate as an associated Th resource; and uranium ore residues after extraction as a waste product) was performed, and the analyzer was tested for on-line quantitative measurement of Th content along with other major and minor components. The Th concentration range in zircon sand is 50-350 ppm; the detection limit at this level is estimated at 25-50 ppm in a 5-minute measurement, depending on the type of material. An on-site test of the CON-X analyzer for continuous analysis of thorium traces along with other elements in zircon sand showed that the accuracy of Th measurements is within 20% relative. When the Th content is higher than 1%, as in monazite ore concentrate (5-8% ThO2), the accuracy of Th determination is within 1% relative. Although a preliminary on-site test is recommended in order to address system feasibility at a large scale, the results show that the CON-X series industrial conveyor XRF analyzer can be effectively used for analytical control of mining and processing streams of Th-bearing materials. (author)

  15. The development of 'Macro analyzer' and its application to steel analysis

    International Nuclear Information System (INIS)

    Kitamura, Koichi; Kawashima, Katsuhiro; Soga, Hiromu; Ogawa, Hiroyuki; Saeki, Tsuyoshi; Sato, Mitsuyoshi; Kaneko, Jiro.

    1984-01-01

    A rapid two-dimensional quantitative analysis method has long been desired for accurate evaluation of segregation and inclusions in steel. A new electron probe large-area mapping analyzer ('Macro analyzer'), which permits rapid and accurate two-dimensional analysis of elements even for large samples (300 x 100 mm) with the undulation inherent to them, has been developed. This has been achieved by the development of long-focus electron optics, a broad-view X-ray spectrometer, and a high-speed sample stage. The system is operated automatically by a control computer, and the acquired two-dimensional data are simultaneously processed by a data processing computer and shown on a color graphic display. With the development of this system, a new quantitative evaluation method for segregation and inclusions in continuously cast slabs and steel materials was established. In this paper, as an example of the use of this system, the quantitative characterization of small segregation spots in a lamellar-tear-resistant steel is studied. (author)

  16. Polymerase chain reaction system using magnetic beads for analyzing a sample that includes nucleic acid

    Science.gov (United States)

    Nasarabadi, Shanavaz [Livermore, CA

    2011-01-11

    A polymerase chain reaction system for analyzing a sample containing nucleic acid includes providing magnetic beads; providing a flow channel having a polymerase chain reaction chamber, a pre polymerase chain reaction magnet position adjacent the polymerase chain reaction chamber, and a post polymerase chain reaction magnet position adjacent the polymerase chain reaction chamber. The nucleic acid is bound to the magnetic beads. The magnetic beads with the nucleic acid flow to the pre polymerase chain reaction magnet position in the flow channel. The magnetic beads and the nucleic acid are washed with ethanol. The nucleic acid in the polymerase chain reaction chamber is amplified. The magnetic beads and the nucleic acid are separated into a waste stream containing the magnetic beads and a post polymerase chain reaction mix containing the nucleic acid. The reaction mix containing the nucleic acid flows to an analysis unit in the channel for analysis.

  17. Magnetic Signature Analysis & Validation System

    National Research Council Canada - National Science Library

    Vliet, Scott

    2001-01-01

    The Magnetic Signature Analysis and Validation (MAGSAV) System is a mobile platform that is used to measure, record, and analyze the perturbations to the earth's ambient magnetic field caused by objects such as armored vehicles...

  18. 40 CFR 1065.309 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers...

    Science.gov (United States)

    2010-07-01

    ... not apply to any processing of individual analyzer signals that are time aligned to their t 50 times... for water removed from the sample done in post-processing according to § 1065.659 and it does not... used during emission testing. You may not use interpolation or filtering to alter the recorded values...

  19. Different approaches to analyze the dipolar interaction effects on diluted and concentrated granular superparamagnetic systems

    Energy Technology Data Exchange (ETDEWEB)

    Moscoso-Londoño, O., E-mail: omoscoso@ifi.unicamp.br [Instituto de Física ‘Gleb Wataghin’, Universidade Estadual de Campinas (UNICAMP), CEP13083-859 Campinas, São Paulo (Brazil); Tancredi, P. [Laboratorio de Sólidos Amorfos, INTECIN, Facultad de Ingeniería, Universidad de Buenos Aires (UBA), CONICET, C1063ACV Buenos Aires (Argentina); Muraca, D. [Instituto de Física ‘Gleb Wataghin’, Universidade Estadual de Campinas (UNICAMP), CEP13083-859 Campinas, São Paulo (Brazil); Centro de Ciencias Naturais e Humanas, Universidade Federal do ABC (UFABC), Av. Dos Estados, 5001, Santo André, SP (Brazil); Mendoza Zélis, P.; Coral, D.; Fernández van Raap, M.B. [Instituto de Física, Universidad Nacional de La Plata (UNLP), CONICET, CC.67, 1900 La Plata, Buenos Aires (Argentina); Wolff, U.; Neu, V.; Damm, C. [IFW Dresden, Leibniz Institute for Solid State and Materials Research, Dresden, Helmholtzstrasse 20, 01069 Dresden (Germany); Oliveira, C.L.P. de [Instituto de Física, Universidade de São Paulo, São Paulo 05314970 (Brazil); Pirota, K.R. [Instituto de Física ‘Gleb Wataghin’, Universidade Estadual de Campinas (UNICAMP), CEP13083-859 Campinas, São Paulo (Brazil); and others

    2017-04-15

    Controlled magnetic granular materials with different concentrations of magnetite nanoparticles immersed in a non-conducting polymer matrix were synthesized, and their macroscopic magnetic observables were analyzed in order to advance towards a better understanding of magnetic dipolar interactions and their effects on the obtained magnetic parameters. First, an accurate study of the structural properties was carried out by means of X-ray diffraction, transmission electron microscopy, small-angle X-ray scattering and X-ray absorption fine structure. Then, the magnetic properties were analyzed by means of different models, including those that treat the magnetic interactions through long-range dipolar forces: the Interacting Superparamagnetic (ISP) model and the Vogel-Fulcher (V-F) law. In systems with larger nanoparticle concentrations, the magnetic results clearly indicate that the dipolar interactions affect the magnetic properties, which can yield magnetic and structural parameters without physical meaning. Magnetic parameters such as the effective anisotropy constant, the magnetic moment relaxation time and the mean blocking temperature, extracted from the application of the ISP model and the V-F law, were used to simulate the zero-field-cooling (ZFC) and field-cooling (FC) curves. A comparative analysis of the simulated, fitted and experimental ZFC/FC curves suggests that the current models do indeed describe our dilute granular systems. Notwithstanding, for concentrated samples, the ISP model interprets clustered nanoparticles as single entities of larger magnetic moment and volume, an effect that is apparently related to a collective and complex magnetic moment dynamics within the cluster. - Highlights: • Nanoparticle architecture within matrices determines the composite magnetic response. • Magnetically diluted or compacted systems are useful for studying magnetism at the nanoscale. • Particle aggregation within the matrices was examined
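
    The Vogel-Fulcher law mentioned above relates the relaxation time to temperature as tau = tau0 * exp(Ea / (kB (T - T0))), where T0 parametrizes the strength of the dipolar interactions and T0 = 0 recovers the non-interacting Neel-Arrhenius limit. The sketch below inverts this relation for the blocking temperature; all parameter values (tau0, Ea, T0, measurement time) are illustrative, not fitted values from the paper.

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def vogel_fulcher_tau(T, tau0, Ea, T0):
    """Relaxation time tau = tau0 * exp(Ea / (kB * (T - T0)));
    T0 = 0 recovers the non-interacting Neel-Arrhenius law."""
    return tau0 * math.exp(Ea / (KB * (T - T0)))

def blocking_temperature(tau_m, tau0, Ea, T0):
    """Invert the V-F law for the temperature at which tau equals the
    measurement time tau_m (i.e. the observed blocking temperature)."""
    return T0 + Ea / (KB * math.log(tau_m / tau0))

# Illustrative numbers only: tau0 = 1e-10 s, Ea = 3e-21 J, tau_m = 100 s.
tb_free = blocking_temperature(100, 1e-10, 3e-21, T0=0)    # non-interacting
tb_dip = blocking_temperature(100, 1e-10, 3e-21, T0=15)    # with interactions
```

    Note that in this form the dipolar interactions simply shift the apparent blocking temperature upward by T0, which is one reason a naive Neel-Arrhenius fit to interacting data returns unphysical tau0 and Ea values.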

  20. SSYST. A code system to analyze LWR fuel rod behavior under accident conditions

    International Nuclear Information System (INIS)

    Gulden, W.; Meyder, R.; Borgwaldt, H.

    1982-01-01

    SSYST (Safety SYSTem) is a modular system for analyzing the behavior of light water reactor fuel rods and fuel rod simulators under accident conditions. It has been developed in close cooperation between Kernforschungszentrum Karlsruhe (KfK) and the Institut fuer Kerntechnik und Energiewandlung (IKE), University of Stuttgart, under contract to Projekt Nukleare Sicherheit (PNS) at KfK. Although originally aimed at single-rod analysis, features are available to calculate effects such as blockage ratios of bundles and whole cores. A number of in-pile and out-of-pile experiments were used to assess the system. The main differences versus codes like FRAP-T with similar applications are (1) an open-ended modular code organisation, (2) the availability of modules of different levels of sophistication for the same physical processes, and (3) a preference for simple models wherever possible. The first feature makes SSYST a very flexible tool, easily adapted to changing requirements; the second enables the user to select computational models adequate to the significance of the physical process. Together with the third feature, this leads to short execution times: the analysis of transient rod behavior under LOCA boundary conditions, for example, takes 2 min of CPU time (IBM-3033), so that extensive parametric studies become possible

  1. Reverse radiometric flow injection analysis (RFIA) of radioactive waste-waters with an ASIA (Ismatec) analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Myint, U; Win, N; San, K; Han, B; Myoe, K M [Yangon Univ. (Myanmar). Dept. of Chemistry; Toelgyessy, J [Slovak Technical Univ., Bratislava (Slovakia). Dept. of Environmental Science

    1994-07-01

    A new application of reverse radiometric flow injection analysis (RFIA) is described. RFIA was used for the analysis of radioactive waste-waters. An ASIA (Ismatec) analyzer with a NaI(Tl) scintillation detector was used in the study of the analysis of 131I-containing waste-water. (author) 4 refs.; 3 figs.

  2. Risk analysis for working with an analyzer for gamma cameras diagnostic

    International Nuclear Information System (INIS)

    Oramas Polo, Ivon; Figueroa del Valle, Diana G.

    2014-01-01

    In this work, an analysis of the risks of working with an analyzer for gamma camera diagnostics was made. The method employed is based on determining the Hazard Rating Number (HRN). The results showed that the risks with the highest HRN values are electrocution, with 100, and touching the source container with the hands, with 75. These risks were classified as 'Very High' and 'High', respectively. The following risks were classified as 'Important': fall of the source container (HRN = 25), high dose of the sample in the container (HRN = 20) and fracture of the glass detector (HRN = 30). Wrong shielding of the source container (HRN = 10) is a risk that was classified as 'Low'. Safety rules for use of the system are indicated. An action plan for risk management is also presented. (author)
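
    The HRN method scores each hazard as a product of factors and then maps the product to a risk band. The sketch below uses the common four-factor formulation (likelihood of occurrence, frequency of exposure, degree of possible harm, number of people); the factor names and the band thresholds are assumptions, chosen only to be consistent with the classifications reported in the abstract.

```python
# Assumed band thresholds, consistent with the abstract's classifications:
# 100 -> 'Very High', 75 -> 'High', 20-30 -> 'Important', 10 -> 'Low'.
BANDS = [(100, "Very High"), (50, "High"), (15, "Important"), (0, "Low")]

def hazard_rating_number(lo, fe, dph, num_people):
    """HRN as a product of the four scoring factors (common formulation)."""
    return lo * fe * dph * num_people

def classify(hrn):
    """Map an HRN value to the first band whose threshold it reaches."""
    for threshold, label in BANDS:
        if hrn >= threshold:
            return label
    return "Low"

risks = {"electrocution": 100, "touch source container": 75,
         "fall of source container": 25, "glass detector fracture": 30,
         "wrong shielding": 10}
classified = {name: classify(v) for name, v in risks.items()}
```

    Keeping the thresholds in a single table makes it easy to audit the action plan: every hazard above a chosen band can be flagged for mitigation automatically.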

  3. Integrated system for testing, investigation and analyzing of nuclear materials, TIAMAT-N

    International Nuclear Information System (INIS)

    Roth, Maria; Pitigoi, Vasile; Ionescu, Viorel; Constantin, Mihai; Babusi, Octavian

    2010-01-01

    The paper presents the results obtained in the framework of a project carried out as part of the National Program PNII, Modulus Capacities I, Competition 2008, concerning the performance of the Testing, Investigation and Analyzing System used in the nuclear materials field. The system will ensure the evaluation of nuclear structures, including thermo-mechanical behaviour, in connection with physical-chemical analysis, microstructure and nondestructive investigations. Using last-generation equipment interconnected to an IT system for monitoring, acquisition and data storage, the project aims to implement the investigation methodologies applied in the nuclear area and to harmonize working practices according to standards and procedures at the European and international level. In addition, the system helps to develop a continuously updated database of the materials investigated in the different types of tests and specific analyses. The project achievements will be exploited at the national level, sustaining the R and D studies of the National Nuclear Plan as well as European and international programs, including EURATOM projects and Networks of Excellence, collaboration with AECL and COG Canada, and participation in the IAEA Program. (authors)

  4. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    …processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside…

  5. Configurations of power relations in the Brazilian emergency care system: analyzing a context of visible practices.

    Science.gov (United States)

    Velloso, Isabela; Ceci, Christine; Alves, Marilia

    2013-09-01

    In this paper, we make explicit the changing configurations of power relations that currently characterize the Brazilian Emergency Care System (SAMU) team in Belo Horizonte, Brazil. The SAMU is a recent innovation in Brazilian healthcare service delivery. A qualitative case study methodology was used to explore SAMU's current organizational arrangements, specifically the power relations that have developed and that demonstrate internal team struggles over space and the defense of particular occupational interests. The argument advanced in this paper is that these professionals carry out their work in conditions of exposure, that is, they are always being observed by someone, and that such observational exposure provides the conditions whereby everyday emergency care practices are enacted, such that practice is shaped by, as well as shapes, particular yet recognizable power relationships. Data were collected through observation of the SAMU's work processes and through semi-structured interviews. Research materials were analyzed using discourse analysis. In the emergency care work process, visibility is actually embedded in the disciplinary context and can thus be analyzed as a technique applied to produce disciplined individuals through the simple mechanisms elaborated by Foucault, such as hierarchical surveillance, normalizing judgment, and the examination. © 2012 John Wiley & Sons Ltd.

  6. Development of the RETRAN input model for Ulchin 3/4 visual system analyzer

    International Nuclear Information System (INIS)

    Lee, S. W.; Kim, K. D.; Lee, Y. J.; Lee, W. J.; Chung, B. D.; Jeong, J. J.; Hwang, M. K.

    2004-01-01

    As a part of the Long-Term Nuclear R and D program, KAERI has developed the so-called Visual System Analyzer (ViSA) based on best-estimate codes. The MARS and RETRAN codes are used as the best-estimate codes for ViSA. Of these two codes, the RETRAN code is used for realistic analysis of non-LOCA transients and small-break loss-of-coolant accidents with break sizes of less than 3 inches in diameter. It is therefore necessary to develop the RETRAN input model for the Ulchin 3/4 plants (KSNP). In recognition of this, the RETRAN input model for the Ulchin 3/4 plants has been developed. This report includes the input model requirements and the calculation note for the input data generation (see the Appendix). In order to confirm the validity of the input data, calculations were performed for a steady state at the 100% power operation condition, an inadvertent reactor trip and an RCP trip. The results of the steady-state calculation agree well with the design data. The results of the other transient calculations are reasonable and consistent with those of other best-estimate calculations. Therefore, the RETRAN input data can be used as a base input deck for the RETRAN transient analyzer for Ulchin 3/4. Moreover, it was found that the Core Protection Calculator (CPC) module, which was modified by the Korea Electric Power Research Institute (KEPRI), is well adapted to ViSA

  7. Preliminary PANSAT ground station software design and use of an expert system to analyze telemetry

    Science.gov (United States)

    Lawrence, Gregory W.

    1994-03-01

    The Petite Amateur Navy Satellite (PANSAT) is a communications satellite designed to be used by civilian amateur radio operators. A master ground station is being built at the Naval Postgraduate School. This computer system performs satellite commands, displays telemetry, trouble-shoots problems, and passes messages. The system also controls an open-loop tracking antenna. This paper concentrates on telemetry display, decoding, and interpretation through artificial intelligence (AI). The telemetry is displayed in an easily interpretable format, so that any user can understand the current health of the satellite and be cued as to any problems and possible solutions. Only the master ground station has the ability to receive all telemetry and send commands to the spacecraft; civilian ham users do not have access to this information. The telemetry data are decommutated and analyzed before being displayed, so that ground users do not have to interpret the raw data. The analysis uses CLIPS embedded in the code and derives its inputs from the telemetry decommutation. The program is an expert system using a forward-chaining set of rules based on the expected operation and parameters of the satellite. By building the rules during the construction and design of the satellite, the telemetry can be well understood and interpreted after the satellite is launched, when the designers may no longer be available to provide input to the problem.
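
    The forward-chaining idea described above (CLIPS-style: rules fire on matching facts, asserted conclusions can trigger further rules) can be sketched in plain Python. Everything in the sketch is hypothetical: the telemetry channel names, the limits, and the advice strings are invented for illustration, not PANSAT's actual rule base.

```python
# Minimal forward-chaining sketch standing in for a CLIPS rule base;
# telemetry names, limits, and advice strings are hypothetical.
RULES = [
    {"if": lambda f: f["battery_v"] < 11.0,
     "then": ("low_battery", "Battery below 11 V: reduce transmitter duty cycle")},
    {"if": lambda f: f["temp_c"] > 50.0,
     "then": ("overtemp", "EPS temperature high: check eclipse/sun timeline")},
    {"if": lambda f: f.get("low_battery") and f.get("overtemp"),
     "then": ("safe_mode", "Multiple faults: recommend commanding safe mode")},
]

def diagnose(telemetry):
    """Fire rules repeatedly until no new facts are asserted (forward chaining)."""
    facts = dict(telemetry)
    advice = []
    changed = True
    while changed:
        changed = False
        for rule in RULES:
            fact, message = rule["then"]
            try:
                fires = rule["if"](facts)
            except KeyError:
                continue            # premise refers to a fact not yet known
            if fires and fact not in facts:
                facts[fact] = True  # assert the conclusion as a new fact
                advice.append(message)
                changed = True
    return advice

advice = diagnose({"battery_v": 10.4, "temp_c": 55.2})
```

    The third rule only becomes satisfiable after the first two assert their conclusions, which is exactly the chaining behavior a CLIPS inference engine provides.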

  8. Construction of prototype of on-line analyzer detection system for coal on belt conveyor using neutron activation technique

    International Nuclear Information System (INIS)

    Rony Djokorayono; Agus Cahyono; MP Indarzah; SG Usep; Sukandar

    2015-01-01

    The use of an on-line neutron activation technique for coal analysis is proposed as an alternative to analysis based on sampling. Compared to the conventional sampling technique, the on-line neutron activation technique offers a much shorter analysis time and more accurate results. The construction of the detection system prototype for the on-line analyzer is described in this paper. The on-line analyzer consists of a detection system, a data acquisition system, and a computer console. The detection system comprises several modules, i.e. a NaI(Tl) scintillation detector with a photomultiplier tube (PMT), a pre-amplifier, a single channel analyzer (SCA), and an analog signal transmitter and pulse counter processor. The construction of these four modules included the development of the configuration block and layout and the selection of electronic components. The modules have been integrated and tested. The detection system was tested using a Zn-65 radioactive source with a gamma energy of 1115.5 keV and an activity of 1 μCi. The test results show that the prototype of the on-line analyzer detection system functions as expected. (author)

  9. Building of the system for managing and analyzing the hyperspectral data of drilling core

    International Nuclear Information System (INIS)

    Huang Yanju; Zhang Jielin; Wang Junhu

    2010-01-01

    Drilling core logging is very important for geological exploration, and hyperspectral detection provides a totally new method for it. To use and analyze drilling core data more easily, and especially to store them permanently, a system was built for analyzing and managing the hyperspectral data. The system provides a convenient way to sort the core data and extract the spectral characteristics, which is the basis for subsequent mineral identification. (authors)

  10. Distribution system modeling and analysis

    CERN Document Server

    Kersting, William H

    2001-01-01

    For decades, distribution engineers did not have the sophisticated tools developed for analyzing transmission systems; often they had only their instincts. Things have changed, and we now have computer programs that allow engineers to simulate, analyze, and optimize distribution systems. Powerful as these programs are, however, without a real understanding of the operating characteristics of a distribution system, engineers using them can easily make serious errors in their designs and operating procedures. Distribution System Modeling and Analysis helps prevent those errors. It gives readers a basic understanding of the modeling and operating characteristics of the major components of a distribution system. One by one, the author develops and analyzes each component as a stand-alone element, then puts them all together to analyze a distribution system comprising the various shunt and series devices for power-flow and short-circuit studies. He includes the derivation of all models and many num...
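
    A staple of radial distribution power-flow analysis of the kind this book covers is the forward/backward sweep ("ladder") technique. The sketch below applies it to a toy single-phase two-segment feeder; the source voltage, line impedances, and loads are made-up illustrative values, and a real feeder model would be three-phase and unbalanced.

```python
# Forward/backward sweep ("ladder") power flow for a toy single-phase radial
# feeder; impedances and loads are made-up illustrative values.
SOURCE_V = 7200 + 0j                      # volts at the substation bus
LINES = [0.3 + 0.6j, 0.5 + 1.0j]          # ohms, series impedance per segment
LOADS = [200e3 + 100e3j, 400e3 + 150e3j]  # VA, constant-PQ load at each node

def sweep(tol=1e-6, max_iter=50):
    v = [SOURCE_V] * len(LOADS)           # flat start at downstream nodes
    for _ in range(max_iter):
        # Backward sweep: load currents I = conj(S/V), then branch currents.
        i_load = [(s / vn).conjugate() for s, vn in zip(LOADS, v)]
        i_branch = list(i_load)
        for k in range(len(LINES) - 2, -1, -1):
            i_branch[k] += i_branch[k + 1]
        # Forward sweep: recompute node voltages from the source down.
        v_new, upstream = [], SOURCE_V
        for z, ib in zip(LINES, i_branch):
            upstream = upstream - z * ib
            v_new.append(upstream)
        if max(abs(a - b) for a, b in zip(v, v_new)) < tol:
            return v_new
        v = v_new
    return v

voltages = sweep()
```

    For typical distribution feeders the sweep converges in a handful of iterations, which is why it is preferred over general Newton power flow for radial networks.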

  11. Actinide isotopic analysis systems

    International Nuclear Information System (INIS)

    Koenig, Z.M.; Ruhter, W.D.; Gunnink, R.

    1990-01-01

    This manual provides instructions and procedures for using the Lawrence Livermore National Laboratory's two-detector actinide isotopic analysis system to measure plutonium samples that may contain other actinides (including uranium, americium, and neptunium) by gamma-ray spectrometry. The computer program that controls the system and analyzes the gamma-ray spectral data is driven by a menu of one-, two-, or three-letter options chosen by the operator. Provided in this manual are descriptions of these options and their functions, plus detailed instructions (operator dialog) for choosing among them. Also provided are general instructions for calibrating the actinide isotopic analysis system and for monitoring its performance. The inventory measurement of a sample's total plutonium and other actinide content is determined by two nondestructive measurements. One is a calorimetry measurement of the sample's heat or power output, and the other is a gamma-ray spectrometry measurement of its relative isotopic abundances. The isotopic measurements needed to interpret the observed calorimetric power measurement are the relative abundances of the various plutonium and uranium isotopes and americium-241. The actinide analysis system carries out these measurements. 8 figs
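
    The way the two measurements combine can be sketched as follows: the gamma-spectrometry isotopic fractions give an effective specific power Peff = sum_i f_i * p_i, and the sample mass follows from the calorimeter reading as m = P / Peff. The specific powers below are approximate nominal values quoted for illustration, and the isotopic fractions and the 2.5 W reading are hypothetical; a real analysis uses certified nuclear data.

```python
# Specific powers in W per gram of isotope (approximate nominal values,
# for illustration only; a real analysis uses certified data).
SPECIFIC_POWER = {
    "Pu238": 0.567, "Pu239": 0.00193, "Pu240": 0.00707,
    "Pu241": 0.00324, "Pu242": 0.00012, "Am241": 0.1142,
}

def effective_specific_power(mass_fractions):
    """Peff = sum_i f_i * p_i over the measured isotopic fractions (W/g)."""
    return sum(f * SPECIFIC_POWER[i] for i, f in mass_fractions.items())

def plutonium_mass(power_watts, mass_fractions):
    """Invert P = m * Peff for the sample mass in grams."""
    return power_watts / effective_specific_power(mass_fractions)

# Hypothetical gamma-spectrometry result for a weapons-grade-like sample:
fractions = {"Pu238": 0.0002, "Pu239": 0.938, "Pu240": 0.058,
             "Pu241": 0.0035, "Pu242": 0.0003, "Am241": 0.0005}
mass_g = plutonium_mass(2.5, fractions)   # 2.5 W measured by calorimetry
```

    Note how sensitive the result is to the minor heat-producing isotopes: Pu-238 and Am-241 contribute heat far out of proportion to their abundance, which is why accurate relative abundances are essential to interpreting the calorimeter reading.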

  12. The Social Process of Analyzing Real Water Resource Systems Plans and Management Policies

    Science.gov (United States)

    Loucks, Daniel

    2016-04-01

    Developing and applying systems analysis methods for improving the development and management of real world water resource systems, I have learned, is primarily a social process. This talk is a call for more recognition of this reality in the modeling approaches we propose in the papers and books we publish. The mathematical models designed to inform planners and managers of water systems that we see in many of our journals often seem more complex than they need be. They also often seem less connected to reality than they could be. While it may be easier to publish descriptions of complex models than simpler ones, and while adding complexity to models might make them better able to mimic the actual complexity of the real physical and/or social systems or processes being analyzed, the usefulness of such models can often be an illusion. Sometimes the important features of reality that are of concern or interest to those who make decisions can be adequately captured using relatively simple models. Finding the right balance for the particular issues being addressed or the particular decisions that need to be made is an art. When applied to real world problems or issues in specific basins or regions, systems modeling projects often involve more attention to the social aspects than the mathematical ones. Mathematical models addressing connected, interacting, interdependent components of complex water systems are in fact some of the most useful methods we have to study and better understand the systems around us. They can help us identify and evaluate possible alternative solutions to problems facing humanity today. The study of real world systems of interacting components using mathematical models is commonly called applied systems analysis. Performing such analyses with decision makers rather than of decision makers is critical if the needed trust between project personnel and their clients is to be developed.
Using examples from recent and ongoing

  13. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng

    2015-01-07

    A hybrid electromagnetics (EM)-circuit simulator for analyzing complex systems consisting of EM devices loaded with nonlinear multi-port lumped circuits is described. The proposed scheme splits the computational domain into two subsystems: EM and circuit subsystems, where field interactions are modeled using Maxwell and Kirchhoff equations, respectively. Maxwell equations are discretized using a discontinuous Galerkin time domain (DGTD) scheme while Kirchhoff equations are discretized using a modified nodal analysis (MNA)-based scheme. The coupling between the EM and circuit subsystems is realized at the lumped ports, where related EM fields and circuit voltages and currents are allowed to “interact” via numerical flux. To account for nonlinear lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded with single and multiport linear/nonlinear circuit networks are presented to demonstrate the accuracy, efficiency, and applicability of the proposed solver.
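    As a toy illustration of the per-time-step Newton-Raphson solve the abstract describes, consider a hypothetical lumped port: a resistive source loaded by a single diode. The component values and function names are illustrative only, not taken from the paper.

```python
import math

def solve_diode_port(v_src, r_src, i_sat=1e-12, v_t=0.025, tol=1e-12, max_iter=100):
    """Newton-Raphson iteration for a resistive source driving a diode:
    solve f(v) = (v_src - v)/r_src - i_sat*(exp(v/v_t) - 1) = 0 for the
    port voltage v. Analogous iterations run at every time step when the
    lumped circuit is nonlinear."""
    v = 0.6  # initial guess near a silicon diode's forward drop
    for _ in range(max_iter):
        f = (v_src - v) / r_src - i_sat * (math.exp(v / v_t) - 1.0)
        df = -1.0 / r_src - (i_sat / v_t) * math.exp(v / v_t)
        step = f / df
        v -= step
        if abs(step) < tol:
            break
    return v

v = solve_diode_port(5.0, 1000.0)  # converges in a handful of iterations
```

    In the full hybrid solver the "residual" couples field unknowns and circuit unknowns at the port; the scalar case above only shows the shape of the Newton update.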

  14. Green Buildings in Singapore; Analyzing a Frontrunner’s Sectoral Innovation System

    Directory of Open Access Journals (Sweden)

    Vidushini Siva

    2017-05-01

    Full Text Available The building sector in Singapore consumes up to half of the nation’s total energy. The government has therefore been urging the transformation of the industry by targeting 80% of all buildings to be green-certified by 2030. Thus far, Singapore has done relatively well and is widely viewed as a frontrunner in this respect. This paper addresses the question: what are the benefits and limitations of Singapore’s sectoral innovation system in spurring an energy transition in the building sector, in particular by up-scaling the use of green building technology? The Sectoral Innovation Systems (SIS) theoretical framework was used to analyze the Singapore case. Four SIS components were assessed: technological regime, market demand, actor interactions and networks, and institutional framework. The benefits of Singapore’s sectoral innovation system identified in the analysis concern aspects of all four elements of SIS. Particular success factors were the launch of an integrated strategy to support green building innovations (i.e., the Green Mark policy scheme), the implementation of support policies, and the setting up of test beds. Furthermore, a masterplan to engage and educate end-users was implemented, knowledge exchange platforms were set up, regulations on the use of efficient equipment in buildings were issued, and standards and a certification system were adopted. The results also shed light on key barriers, namely the reluctance of building users to change their habits, ineffective stakeholder collaboration, and green building innovation support coming from the government only. Measures in place have been moderately effective.

  15. A System Evaluation Theory Analyzing Value and Results Chain for Institutional Accreditation in Oman

    Science.gov (United States)

    Paquibut, Rene Ymbong

    2017-01-01

    Purpose: This paper aims to apply the system evaluation theory (SET) to analyze the institutional quality standards of the Oman Academic Accreditation Authority using the results chain and value chain tools. Design/methodology/approach: In systems thinking, the institutional standards are connected as input, process, output and feedback, and lead to…

  16. Wind Energy System Time-domain (WEST) analyzers using hybrid simulation techniques

    Science.gov (United States)

    Hoffman, J. A.

    1979-01-01

    Two stand-alone analyzers constructed for real-time simulation of the complex dynamic characteristics of horizontal-axis wind energy systems are described. Mathematical models for an aeroelastic rotor, including nonlinear aerodynamic and elastic loads, are implemented with high-speed digital and analog circuitry. Models for elastic supports, a power train, a control system, and a rotor gimbal system are also included. Limited correlation efforts show good agreement between results produced by the analyzers and results produced by a large digital simulation. The digital simulation results correlate well with test data.

  17. On the Use of an Algebraic Signature Analyzer for Mixed-Signal Systems Testing

    Directory of Open Access Journals (Sweden)

    Vadim Geurkov

    2014-01-01

    We propose an approach to the design of an algebraic signature analyzer that can be used for mixed-signal systems testing. The analyzer does not contain carry-propagating circuitry, which improves its performance as well as its fault tolerance. The common design technique for a signature analyzer for mixed-signal systems is based on the rules of an arithmetic finite field. Applying this technique to systems with an arbitrary radix is a challenging task, and the resulting devices have high hardware complexity. The proposed technique is simple, applicable to systems of any size and radix, and low in hardware complexity. The technique can also be used in arithmetic/algebraic coding and cryptography.
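    A minimal sketch of the carry-free, XOR-only arithmetic such a signature register relies on. The 4-bit register width and the polynomial x^4 + x + 1 are arbitrary illustrative choices, not the paper's design.

```python
def signature(bits, poly=0b0011, width=4):
    """Serial-input signature register over GF(2): polynomial division of
    the input bit stream by x^4 + x + 1 using only shifts and XORs, so no
    carry chain ever propagates between register stages."""
    mask = (1 << width) - 1
    reg = 0
    for b in bits:
        feedback = ((reg >> (width - 1)) & 1) ^ b  # GF(2) addition is XOR
        reg = (reg << 1) & mask
        if feedback:
            reg ^= poly
    return reg
```

    Because every stage updates independently of its neighbours, the critical path does not grow with register width, which is the performance point the abstract makes.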

  18. SNAP/SOS: a package for simulating and analyzing safeguards systems

    International Nuclear Information System (INIS)

    Grant, F.H. III; Polito, J.; Sabuda, J.

    1983-01-01

    The effective analysis of safeguards systems at nuclear facilities requires significant effort. The Safeguards Network Analysis Procedure (SNAP) and the SNAP Operating System (SOS) reduce that effort to a manageable level. SNAP provides a detailed analysis of site safeguards for tactical evaluation, while SOS helps the analyst organize and manage the SNAP effort effectively. SOS provides a database for model storage, automatic model generation, and computer graphics. The SOS/SNAP combination is a working example of a simulation system that includes executive-level control, a database system, and facilities for model creation, editing, and output analysis.

  19. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    National Research Council Canada - National Science Library

    Edge, Kenneth S

    2007-01-01

    .... Attack trees by themselves do not provide enough decision support to system defenders. This research develops the concept of using protection trees to offer a detailed risk analysis of a system...

  20. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Hensley, W.K.; Denton, M.M.; Garcia, S.R.

    1981-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. The system and its mode of operation for a large reconnaissance survey are described

  1. End loss analyzer system for measurements of plasma flux at the C-2U divertor electrode

    Energy Technology Data Exchange (ETDEWEB)

    Griswold, M. E., E-mail: mgriswold@trialphaenergy.com; Korepanov, S.; Thompson, M. C. [Tri Alpha Energy, P.O. Box 7010, Rancho Santa Margarita, California 92688 (United States)

    2016-11-15

    An end loss analyzer system consisting of electrostatic, gridded retarding-potential analyzers and pyroelectric crystal bolometers was developed to characterize the plasma loss along open field lines to the divertors of C-2U. The system measures the current and energy distribution of escaping ions as well as the total power flux to enable calculation of the energy lost per escaping electron/ion pair. Special care was taken in the construction of the analyzer elements so that they can be directly mounted to the divertor electrode. An attenuation plate at the entrance to the gridded retarding-potential analyzer reduces plasma density by a factor of 60 to prevent space charge limitations inside the device, without sacrificing its angular acceptance of ions. In addition, all of the electronics for the measurement are isolated from ground so that they can float to the bias potential of the electrode, 2 kV below ground.

  2. Implementation of inter-unit analysis for C and C++ languages in a source-based static code analyzer

    Directory of Open Access Journals (Sweden)

    A. V. Sidorin

    2015-01-01

    The proliferation of automated testing capabilities gives rise to a need for thorough testing of large software systems, including their inter-component interfaces. The objective of this research is to build a method for inter-procedural, inter-unit analysis that allows us to analyze large and complex software systems, including multi-architecture projects (like Android OS), and to support projects with complex build systems. Since the selected Clang Static Analyzer uses source code directly as its input, we need to develop a special technique to enable inter-unit analysis for such an analyzer. This problem is of a special nature because C and C++ language features assume and encourage the separate compilation of project files. We describe the build and analysis system that was implemented around Clang Static Analyzer to enable inter-unit analysis and consider problems related to the support of complex projects. We also consider the task of merging the abstract syntax trees of translation units and its related problems, such as handling conflicting definitions and supporting complex build systems and complex projects, including multi-architecture projects, with examples. We consider both issues related to language design and human-related mistakes (which may be intentional). We describe some heuristics that were used in this work to make the merging process faster. The developed system was tested using Android OS as input to show that it is applicable even to such complicated projects. The system does not depend on the inter-procedural analysis method and allows an arbitrary change of its algorithm.
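    The definition-merging step can be pictured with a toy symbol-table merge. This is purely illustrative: the real system merges Clang ASTs across translation units, not string bodies.

```python
def merge_units(units):
    """Merge per-translation-unit symbol tables into one index, flagging
    conflicting definitions (same name, different body) -- the kind of
    check a cross-unit analysis must make before reasoning across files.

    `units` is a list of {symbol_name: definition} dicts, one per unit."""
    merged, conflicts = {}, []
    for unit in units:
        for name, body in unit.items():
            if name in merged and merged[name] != body:
                conflicts.append(name)  # ODR-style violation
            else:
                merged.setdefault(name, body)
    return merged, conflicts
```

    Identical redefinitions (the normal case for a header included by several units) merge silently; genuinely divergent ones are reported rather than silently overwritten.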

  3. A high-performance data acquisition system for computer-based multichannel analyzer

    International Nuclear Information System (INIS)

    Zhou Xinzhi; Bai Rongsheng; Wen Liangbi; Huang Yanwen

    1996-01-01

    A high-performance data acquisition system for a multichannel analyzer was designed around a single-chip microcomputer. The paper presents the principle and method of realizing simultaneous data acquisition, data pre-processing, and fast bidirectional data transfer by means of direct memory access based on dual-port RAM. The dead time and live time of the ADC system can also be measured efficiently with this design.
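    Functionally, the acquisition core of any multichannel analyzer reduces to histogramming ADC codes; a schematic sketch of that step (not the paper's firmware, which does this in hardware with dual-port RAM):

```python
def accumulate(adc_samples, n_channels=1024):
    """Core of a multichannel analyzer: each ADC conversion increments one
    histogram bin; the accumulated bin counts are the pulse-height
    spectrum. Out-of-range codes are discarded."""
    spectrum = [0] * n_channels
    for code in adc_samples:
        if 0 <= code < n_channels:
            spectrum[code] += 1
    return spectrum
```

    The dual-port RAM in the paper's design lets this increment loop run concurrently with readout, which is what makes the bidirectional transfer fast.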

  4. ANALYZING THE POSTPONEMENT OF TIME PRODUCTION SYSTEMS IN MAKE-TO-STOCK AND SEASONAL DEMAND

    Directory of Open Access Journals (Sweden)

    Paulo Cesar Chagas Rodrigues

    2011-12-01

    The supply chain management, postponement, and demand management functions are of strategic importance to the economic success of organizations because they influence the production process; when viewed in isolation and only empirically, their behavior may be hard to understand. The aim of this paper is to analyze the influence of postponement on an enterprise production system with make-to-stock production and seasonal demand. The research method used was a case study; the instruments of data collection were semi-structured interviews, documentary analysis, and site visits. The research is restricted to analyzing the influence that different levels of postponement and the company's position in the supply chain have on the practice of demand management in the graphics production segment (spiral notebooks), and to a geographical focus on the region of the state of São Paulo, in which managers and directors were interviewed. To support the analysis of the case study and the final considerations, the following topics are discussed: supply chain management, postponement, demand management, and the make-to-stock production system. Demand management can be understood as a practice that allows the supply chain to be managed and coordinated in reverse, i.e., from the consumer to the supplier, in which consumers trigger actions for the supply of products, making the process more efficient. The purpose of supply chain management is to add value and exceed consumer expectations, which requires developing win-win relationships with suppliers and customers. The postponement strategy must fit the characteristics of a turbulent market environment, with demands for a variety of customized products and services at reasonable cost, aiming to support decision making. The postponement of time can be a way to soften the increase in inventory of finished product in the company…

  5. Analyzing recommender systems for health promotion using a multidisciplinary taxonomy: A scoping review.

    Science.gov (United States)

    Hors-Fraile, Santiago; Rivera-Romero, Octavio; Schneider, Francine; Fernandez-Luque, Luis; Luna-Perejon, Francisco; Civit-Balcells, Anton; de Vries, Hein

    2018-06-01

    Recommender systems are information retrieval systems that provide users with relevant items (e.g., through messages). Despite their extensive use in the e-commerce and leisure domains, their application in healthcare is still in its infancy. These systems may be used to create tailored health interventions, thus reducing the cost of healthcare and fostering a healthier lifestyle in the population. This paper identifies, categorizes, and analyzes the existing knowledge in terms of the literature published over the past 10 years on the use of health recommender systems for patient interventions. The aim of this study is to understand the scientific evidence generated about health recommender systems, to identify gaps in this field with respect to the United Nations Sustainable Development Goal 3 (SDG3) (namely, "Ensure healthy lives and promote well-being for all at all ages"), and to suggest possible reasons for these gaps as well as to propose some solutions. We conducted a scoping review, which consisted of a keyword search of the literature related to health recommender systems for patients in the following databases: ScienceDirect, PsycInfo, Association for Computing Machinery, IEEExplore, and PubMed. Further, we limited our search to English-language journal articles published in the last 10 years. The reviewing process comprised three researchers who filtered the results simultaneously. The quantitative synthesis was conducted in parallel by two researchers, who classified each paper in terms of four aspects (the domain; the methodological and procedural aspects; the health promotion theoretical factors and behavior change theories; and the technical aspects) using a new multidisciplinary taxonomy. Nineteen papers met the inclusion criteria and were included in the data analysis, for which thirty-three features were assessed. The nine features associated with the health promotion theoretical factors and behavior change theories were not observed in…

  6. Automated 13CO2 analyzing system for the 13C breath test

    International Nuclear Information System (INIS)

    Suehiro, Makiko; Kuroda, Akira; Maeda, Masahiro; Hinaga, Kou; Watanabe, Hiroyuki.

    1987-01-01

    An automated 13CO2 analyzing system for the 13C breath test was designed, built, and evaluated. The system, which is controlled by a microcomputer, includes CO2 purification, 13CO2 abundance measurement, data processing, and data filing. This article describes the whole system with flow charts. The system has proved to work well, and it can process 5 to 6 CO2 samples per hour. With such a system, the 13C breath test can be carried out much more easily and should gain much greater popularity. (author)
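    The measured 13CO2 abundance is conventionally reported in delta notation; a sketch of that standard formula (the PDB reference ratio shown is a commonly quoted value, not a number taken from the paper):

```python
R_PDB = 0.0112372  # commonly quoted 13C/12C ratio of the PDB standard

def delta13c(r_sample, r_standard=R_PDB):
    """delta-13C in per-mil: relative deviation of the sample's 13C/12C
    ratio from the standard, the quantity a 13C breath test tracks over
    time after the labeled substrate is administered."""
    return (r_sample / r_standard - 1.0) * 1000.0
```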

  7. A conceptual framework to analyze and mitigate aging effects of a system in nuclear power plants

    International Nuclear Information System (INIS)

    Ahmed, S.

    1985-01-01

    A conceptual framework is developed to analyze, characterize, and mitigate the degradation of a system in a nuclear power plant due to aging. The system is evaluated, based on an aging-specific system decision model, to understand and implement decisions pertaining to surveillance, maintenance, and replacement. Decisions on other corrective measures to mitigate the effects of aging of a number of equipment items, interconnections (relationships with other systems), and interfaces (relationships within the system) can also be made based on the proposed approach

  8. Conducting Qualitative Data Analysis: Reading Line-by-Line, but Analyzing by Meaningful Qualitative Units

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the first of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail points out the challenges of determining units to analyze qualitatively when dealing with text. He acknowledges that although we may read a document word-by-word or line-by-line, we need to adjust our focus when processing the text for purposes of…

  9. Analyzing Developing Country Market Integration using Incomplete Price Data and Cluster Analysis

    NARCIS (Netherlands)

    Ansah, I.G.; Gardebroek, Koos; Ihle, R.; Jaletac, M.

    2015-01-01

    Recent global food price developments have spurred renewed interest in analyzing the integration of local markets with global markets. A popular approach to quantifying market integration is cointegration analysis. However, local market price data often have missing values, outliers, or short and incomplete…

  10. Computerized ECT data analysis system

    International Nuclear Information System (INIS)

    Miyake, Y.; Fukui, S.; Iwahashi, Y.; Matsumoto, M.; Koyama, K.

    1988-01-01

    As an analytical method for the eddy current testing (ECT) of steam generator tubes in nuclear power plants, the authors have developed a computerized ECT data analysis system using a large-scale computer with a high-resolution color graphic display. The system can store acquired ECT data for up to 15 steam generators, and ECT data can be analyzed immediately on the monitor in dialogue with the computer. Analyzed ECT results are stored and registered in a database. The system enables an analyst to sort and collect data under various conditions, obtain the results automatically, and plan tube repair work. The system has completed its test run and has been used for data analysis in the annual inspections of domestic plants. This paper describes an outline, features, and examples of the computerized eddy current data analysis system for steam generator tubes in PWR nuclear power plants.

  11. An Artificial Neural Network for Analyzing Overall Uniformity in Outdoor Lighting Systems

    Directory of Open Access Journals (Sweden)

    Antonio del Corte-Valiente

    2017-02-01

    Street lighting installations are an essential service for modern life due to their capability of creating a welcoming feeling at nighttime. Nevertheless, several studies have highlighted that it is possible to improve the quality of the light significantly by improving the uniformity of the illuminance. The main difficulty arises when trying to improve some of the installation’s characteristics based only on statistical analysis of the light distribution. This paper presents a new algorithm that is able to obtain the overall illuminance uniformity in order to improve this sort of installation. To develop this algorithm it was necessary to perform a detailed study of all the elements that are part of street lighting installations. Because classification is one of the most important tasks in the application areas of artificial neural networks, we compared the performance of six types of training algorithms in a feed-forward neural network for analyzing the overall uniformity in outdoor lighting systems. We found that the algorithm that best minimizes the error is Levenberg-Marquardt back-propagation, which approximates the desired output of the training pattern. By means of this kind of algorithm, it is possible to help lighting professionals optimize the quality of street lighting installations.
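    The quantity being predicted is simple to state; a sketch of the overall illuminance uniformity metric as usually defined in lighting practice (U0 = Emin/Eavg over a grid of measurement points; this is the standard definition, not code from the paper):

```python
def overall_uniformity(illuminances):
    """Overall illuminance uniformity U0 = E_min / E_avg over a grid of
    measurement points: 1.0 means perfectly even lighting, values near 0
    mean pronounced dark spots between luminaires."""
    e_min = min(illuminances)
    e_avg = sum(illuminances) / len(illuminances)
    return e_min / e_avg
```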

  12. Analyzing dynamic fault trees derived from model-based system architectures

    International Nuclear Information System (INIS)

    Dehlinger, Josh; Dugan, Joanne Bechta

    2008-01-01

    Dependability-critical systems, such as digital instrumentation and control systems in nuclear power plants, necessitate engineering techniques and tools to provide assurances of their safety and reliability. Determining system reliability at the architectural design phase is important since it may guide design decisions and provide crucial information for trade-off analysis and estimating system cost. Despite this, reliability and systems engineering remain separate disciplines and engineering processes, so the dependability analysis results may not represent the designed system. In this article we provide an overview and application of our approach to build architecture-based, dynamic system models for dependability-critical systems and then automatically generate Dynamic Fault Trees (DFTs) for comprehensive, tool-supported reliability analysis. Specifically, we use the Architectural Analysis and Design Language (AADL) to model the structural, behavioral, and failure aspects of the system in a composite architecture model. From the AADL model, we derive the DFT(s) and use Galileo's automated reliability analyses to estimate system reliability. This approach alleviates the expertise gap between dependability engineering and systems engineering, integrates the dependability and systems engineering design and development processes, and enables a more formal, automated, and consistent DFT construction. We illustrate this work using an example based on a dynamic digital feed-water control system for a nuclear reactor.
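    For intuition, the bottom-up probability calculation for *static* fault tree gates can be sketched as below. Dynamic gates (PAND, spare gates) require state-based models such as the Markov analysis Galileo performs, so this is only a simplified illustration with independent basic events:

```python
def gate_prob(gate):
    """Bottom-up failure probability for a static fault tree.
    gate is ('BASIC', p), ('AND', children), or ('OR', children),
    with all basic events assumed independent."""
    kind = gate[0]
    if kind == 'BASIC':
        return gate[1]
    probs = [gate_prob(child) for child in gate[1]]
    if kind == 'AND':  # all children must fail
        out = 1.0
        for p in probs:
            out *= p
        return out
    # OR: fails unless every child survives
    survive = 1.0
    for p in probs:
        survive *= (1.0 - p)
    return 1.0 - survive

top = ('OR', [('AND', [('BASIC', 0.1), ('BASIC', 0.2)]), ('BASIC', 0.05)])
```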

  13. Analyzing Effect of System Inertia on Grid Frequency Forecasting Using Two Stage Neuro-Fuzzy System

    Science.gov (United States)

    Chourey, Divyansh R.; Gupta, Himanshu; Kumar, Amit; Kumar, Jitesh; Kumar, Anand; Mishra, Anup

    2018-04-01

    Frequency forecasting is an important aspect of power system operation. The system frequency varies with the load-generation imbalance, and the variation depends on various parameters, including system inertia. System inertia determines the rate of fall of frequency after a disturbance in the grid. Nevertheless, system inertia is usually not considered while forecasting the frequency of a power system during planning and operation, which leads to significant forecasting errors. In this paper, the effect of inertia on frequency forecasting is analyzed for a particular grid system, and a parameter equivalent to system inertia is introduced. This parameter is used to forecast the frequency of a typical power grid for any instant of time. The system gives appreciable results with reduced error.
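    The physics linking inertia to the rate of frequency fall can be illustrated with the classic swing-equation estimate of the initial rate of change of frequency (a textbook relation, not the paper's neuro-fuzzy model):

```python
def rocof(delta_p_pu, h_sec, f0=50.0):
    """Initial rate of change of frequency (Hz/s) after a load-generation
    imbalance, from the swing equation: df/dt = -dP * f0 / (2H), where
    dP is the power deficit in per unit on the system base and H is the
    system inertia constant in seconds. Larger H means a slower fall."""
    return -delta_p_pu * f0 / (2.0 * h_sec)
```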

  14. Deep graphs—A general framework to represent and analyze heterogeneous complex systems across scales

    Science.gov (United States)

    Traxl, Dominik; Boers, Niklas; Kurths, Jürgen

    2016-06-01

    Network theory has proven to be a powerful tool in describing and analyzing systems by modelling the relations between their constituent objects. Particularly in recent years, great progress has been made by augmenting "traditional" network theory in order to account for the multiplex nature of many networks, multiple types of connections between objects, the time-evolution of networks, networks of networks, and other intricacies. However, existing network representations still lack crucial features needed to serve as a general data analysis tool. These include, most importantly, an explicit association of information with possibly heterogeneous types of objects and relations, and a conclusive representation of the properties of groups of nodes as well as the interactions between such groups on different scales. In this paper, we introduce a collection of definitions resulting in a framework that, on the one hand, entails and unifies existing network representations (e.g., networks of networks and multilayer networks), and on the other hand, generalizes and extends them by incorporating the above features. To implement these features, we first specify the nodes and edges of a finite graph as sets of properties (which are permitted to be arbitrary mathematical objects). Second, the mathematical concept of partition lattices is transferred to network theory in order to demonstrate how partitioning the node and edge set of a graph into supernodes and superedges allows us to aggregate, compute, and allocate information on and between arbitrary groups of nodes. The derived partition lattice of a graph, which we denote a deep graph, constitutes a concise yet comprehensive representation that enables the expression and analysis of heterogeneous properties, relations, and interactions on all scales of a complex system in a self-contained manner. Furthermore, to be able to utilize existing network-based methods and models, we derive different representations of…
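    The supernode/superedge aggregation idea can be sketched with plain dictionaries; this is an illustrative toy, not the deep-graph implementation (which supports arbitrary property objects and full partition lattices):

```python
from collections import defaultdict

def partition_graph(nodes, edges, key):
    """Collapse a property graph into supernodes and superedges: nodes
    that share the value of property `key` merge into one supernode
    (carrying the group size), and edge weights between groups are summed
    into superedges. One level of the partition lattice."""
    supernodes = defaultdict(int)
    for node, props in nodes.items():
        supernodes[props[key]] += 1
    superedges = defaultdict(float)
    for (u, v), weight in edges.items():
        gu, gv = nodes[u][key], nodes[v][key]
        superedges[(gu, gv)] += weight
    return dict(supernodes), dict(superedges)
```

    Repartitioning by a different property, or by a coarser grouping of the same property, yields the other levels of the lattice.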

  15. On variable geometric factor systems for top-hat electrostatic space plasma analyzers

    International Nuclear Information System (INIS)

    Collinson, Glyn A; Kataria, Dhiren O

    2010-01-01

    Even in the relatively small region of space that is the Earth's magnetosphere, ion and electron fluxes can vary by several orders of magnitude. Top-hat electrostatic analyzers currently do not possess the dynamic range required to sample plasma under all conditions. The purpose of this study was to compare, through computer simulation, three new electrostatic methods that would allow the sensitivity of a sensor to be varied through control of its geometric factor (GF) (much like an aperture on a camera). The methods studied were inner filter plates, split hemispherical analyzer (SHA) and top-cap electrode. This is the first discussion of the filter plate concept and also the first study where all three systems are studied within a common analyzer design, so that their relative merits could be fairly compared. Filter plates were found to have the important advantage that they facilitate the reduction in instrument sensitivity whilst keeping all other instrument parameters constant. However, it was discovered that filter plates have numerous disadvantages that make such a system impracticable for a top-hat electrostatic analyzer. It was found that both the top-cap electrode and SHA are promising variable geometric factor system (VGFS) concepts for implementation into a top-hat electrostatic analyzer, each with distinct advantages over the other

  16. Development of a Spatial Decision Support System for Analyzing Changes in Hydro-meteorological Risk

    Science.gov (United States)

    van Westen, Cees

    2013-04-01

    In the framework of the EU FP7 Marie Curie ITN Network "CHANGES: Changing Hydro-meteorological Risks, as Analyzed by a New Generation of European Scientists (http://www.changes-itn.eu)", a spatial decision support system is under development with the aim to analyze the effect of risk reduction planning alternatives on reducing risk now and in the future, and to support decision makers in selecting the best alternatives. The SDSS is one of the main outputs of the CHANGES network, which will develop an advanced understanding of how global changes, related to environmental and climate change as well as socio-economic change, may affect the temporal and spatial patterns of hydro-meteorological hazards and associated risks in Europe, and of how these changes can be assessed, modeled, and incorporated in sustainable risk management strategies, focusing on spatial planning, emergency preparedness, and risk communication. The CHANGES network consists of 11 full partners and 6 associate partners, of which 5 are private companies, representing 10 European countries. The network has hired 12 Early Stage Researchers (ESRs) and is currently hiring 3-6 more researchers for the implementation of the SDSS. The Spatial Decision Support System will be composed of a number of integrated components. The risk assessment component allows spatial risk analysis to be carried out with different degrees of complexity, ranging from simple exposure (overlay of hazard and asset maps) to quantitative analysis (using different hazard types, temporal scenarios, and vulnerability curves), resulting in risk curves. The platform does not include a component to calculate hazard maps; existing hazard maps are used as input data for the risk component. The second component of the SDSS is a risk reduction planning component, which forms the core of the platform. This component includes the definition of risk reduction alternatives (related to disaster response planning, risk reduction measures and…
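    The quantitative end of the risk assessment component can be sketched as a hazard-asset overlay plus risk-curve integration. The helper names and the simple sum approximation are illustrative assumptions, a simplification of what such an SDSS computes:

```python
def expected_loss(hazard_cells, asset_values, vulnerability):
    """Grid overlay of a hazard map and an asset map: per-cell loss is
    the asset value times the vulnerability (a 0..1 damage fraction as a
    function of hazard intensity), summed over all cells."""
    return sum(value * vulnerability(h)
               for h, value in zip(hazard_cells, asset_values))

def annualized_risk(scenarios):
    """Risk-curve integration over scenarios, each an (annual probability,
    loss) pair, approximated here as sum(p_i * loss_i)."""
    return sum(p * loss for p, loss in scenarios)
```

    Comparing annualized risk with and without a planning alternative (which changes the hazard cells or the vulnerability curve) is the comparison the SDSS is meant to support.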

  17. Computer-controlled system for plasma ion energy auto-analyzer

    International Nuclear Information System (INIS)

    Wu Xianqiu; Chen Junfang; Jiang Zhenmei; Zhong Qinghua; Xiong Yuying; Wu Kaihua

    2003-01-01

    A computer-controlled system for a plasma ion energy auto-analyzer was studied for rapid, online measurement of the plasma ion energy distribution. The system intelligently controls all the equipment via an RS-232 port, a printer port, and a home-built circuit. The software, written in the LabVIEW G language, automatically performs tasks such as system initialization, scanning-voltage adjustment, weak-current measurement, data processing, and graphic export. Using the system, the whole ion energy distribution can be acquired in a few minutes, rapidly providing important parameters for plasma processing techniques used in semiconductor devices and microelectronics.

  18. Analyzing Systems Integration Best Practices and Assessment in DoD Space Systems Acquisition

    Science.gov (United States)

    2009-12-01

    AFIT/GLM/LAL/93S-1, 1993. 2. Adoption of ISO/IEC 15288:2002 Systems Engineering – System Life Cycle Process. Institute of Electrical and Electronic...Air Force. 40. Pennel, L.W. and Knight, F.L (Eds). TOR-2005(8583)-3, Systems Engineering. The Aerospace Corporation, 2005. 41. Roche, James G

  19. Remote sensing, geographical information systems, and spatial modeling for analyzing public transit services

    Science.gov (United States)

    Wu, Changshan

    Public transit service is a promising transportation mode because of its potential to address urban sustainability. Current ridership of public transit, however, is very low in most urban regions, particularly those in the United States. This woeful transit ridership can be attributed to many factors, among which poor service quality is key. Given this, there is a need for transit planning and analysis to improve service quality. Traditionally, spatially aggregate data are utilized in transit analysis and planning. Examples include data associated with the census, zip codes, states, etc. Few studies, however, address the influences of spatially aggregate data on transit planning results. In this research, previous studies in transit planning that use spatially aggregate data are reviewed. Next, problems associated with the utilization of aggregate data, the so-called modifiable areal unit problem (MAUP), are detailed and the need for fine resolution data to support public transit planning is argued. Fine resolution data is generated using intelligent interpolation techniques with the help of remote sensing imagery. In particular, impervious surface fraction, an important socio-economic indicator, is estimated through a fully constrained linear spectral mixture model using Landsat Enhanced Thematic Mapper Plus (ETM+) data within the metropolitan area of Columbus, Ohio in the United States. Four endmembers, low albedo, high albedo, vegetation, and soil, are selected to model heterogeneous urban land cover. Impervious surface fraction is estimated by analyzing low and high albedo endmembers. With the derived impervious surface fraction, three spatial interpolation methods, spatial regression, dasymetric mapping, and cokriging, are developed to interpolate detailed population density. Results suggest that cokriging applied to impervious surface is a better alternative for estimating fine resolution population density. With the derived fine resolution data, a multiple…
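    The fully constrained unmixing step can be illustrated in the special case of two endmembers, where the nonnegativity and sum-to-one constraints reduce the fit to one clipped scalar least-squares problem (a deliberate simplification of the four-endmember model in the paper):

```python
def unmix_two(e1, e2, pixel):
    """Two-endmember fully constrained linear unmixing: with fractions
    f1 + f2 = 1 and f1, f2 >= 0, substitute f2 = 1 - f1 so the model is
    pixel ~ f1*e1 + (1-f1)*e2, solve the resulting scalar least-squares
    problem for f1, and clip to [0, 1]. Spectra are lists of band values."""
    d = [a - b for a, b in zip(e1, e2)]            # e1 - e2
    num = sum(di * (pi - bi) for di, pi, bi in zip(d, pixel, e2))
    den = sum(di * di for di in d)
    f1 = min(1.0, max(0.0, num / den))
    return f1, 1.0 - f1
```

    With four endmembers the same constraints no longer collapse to a scalar problem and a constrained least-squares solver is needed, but the clipped projection above shows what "fully constrained" buys: physically interpretable cover fractions.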

  20. Summary of α-FLOW, a general purpose three-dimensional fluid analyzing system. Han'yo sanjigen ryutai kaiseki system α-FLOW no gaiyo

    Energy Technology Data Exchange (ETDEWEB)

    Koike, H [Fuji Research Institute Corporation, Tokyo (Japan)

    1992-08-01

    α-FLOW is a three-dimensional fluid analysis software package developed through cooperation among the research institutes of private companies and universities in Japan, with assistance from the Ministry of International Trade and Industry. This paper describes its outline and features. The system is a discrete system utilizing a supercomputer and a workstation. The analysis modules incorporated in the system include those for incompressible fluid analysis, compressible fluid analysis, analysis of incompressible flows with free surfaces, analysis of flows involving combustion and chemical reactions, substance migration analysis, and heat transfer analysis. Thanks to an expert system developed to support the numerical analysis, even non-specialists can carry out fluid analyses easily. The input data preparation system allows the workstation to handle everything from shape modeling and grid generation, through input of the analysis conditions, to flow calculation and output of the results, all in dialogue mode. An open architecture was adopted. 27 refs., 7 figs., 10 tabs.

  1. The software for the USB-based multi-channel analyzer system

    International Nuclear Information System (INIS)

    Zhou Tong; Wei Yixiang

    2002-01-01

    A new type of multi-channel analyzer system is introduced that uses the Universal Serial Bus to communicate with a computer, giving it high speed, universality, and Plug-and-Play capability. The authors discuss the framework of the system, its primary functions, the display of spectrum data, and the way it communicates with the hardware. The program is developed in Visual Basic 6.0.

  2. Analyzing Web Server Logs to Improve a Site's Usage. The Systems Librarian

    Science.gov (United States)

    Breeding, Marshall

    2005-01-01

    This column describes ways to streamline and optimize how a Web site works in order to improve both its usability and its visibility. The author explains how to analyze logs and other system data to measure the effectiveness of the Web site design and search engine.
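As a concrete illustration of the kind of log analysis the column describes, the short Python sketch below parses Apache-style Common Log Format lines and tallies the most-requested pages and the paths producing 404 errors. The sample log lines are invented.

```python
import re
from collections import Counter

# Apache Common Log Format: host ident user [time] "request" status bytes
LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) (?P<bytes>\S+)'
)

def summarize(lines):
    """Return (hit count per path, paths that produced 404 responses)."""
    hits, missing = Counter(), Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip malformed lines rather than failing
        hits[m['path']] += 1
        if m['status'] == '404':
            missing[m['path']] += 1
    return hits, missing

sample = [
    '10.0.0.1 - - [01/Mar/2005:10:00:00 -0500] "GET /index.html HTTP/1.0" 200 5120',
    '10.0.0.2 - - [01/Mar/2005:10:00:01 -0500] "GET /catalog HTTP/1.0" 200 2048',
    '10.0.0.3 - - [01/Mar/2005:10:00:02 -0500] "GET /old-page HTTP/1.0" 404 512',
    '10.0.0.1 - - [01/Mar/2005:10:00:03 -0500] "GET /index.html HTTP/1.0" 200 5120',
]
hits, missing = summarize(sample)
```

The 404 tally is exactly the kind of signal the column recommends acting on: frequently requested missing pages point to broken links or stale search-engine indexes.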

  3. Developing an Indicator System for Monitoring, Analyzing, and Assessing Airport Sustainability

    NARCIS (Netherlands)

    Janic, M.

    2010-01-01

    This paper deals with developing an indicator system for monitoring, analyzing, and assessing the sustainability of airports. Sustainability implies simultaneously increasing the overall socio-economic benefits while the negative impacts increase at a slower rate, stagnate, and/or diminish.

  4. EDAS-manual. SATAN - system to analyze tremendous amounts of nuclear data. Vol. 2

    International Nuclear Information System (INIS)

    Goeringer, H.; Gralla, S.; Malzacher, P.; Richter, M.; Schall, D.; Winkelmann, K.

    1988-09-01

    The system to analyze tremendous amounts of nuclear data (SATAN) shows the different steps of a special experiment data evaluation called 'Linearisation'. The report contains the EDAS manual, covering EDAS commands, TSO commands, macros, and procedures. The syntax and usage of EDAS macros are explained. (DG)

  5. Requirements for a system to analyze HEP events using database computing

    International Nuclear Information System (INIS)

    May, E.; Lifka, D.; Lusk, E.; Price, L.E.; Day, C.T.; Loken, S.; MacFarlane, J.F.; Baden, A.

    1992-01-01

    We describe the requirements for the design and prototyping of an object-oriented database designed to analyze data in high energy physics. Our goal is to satisfy the data processing and analysis needs of a generic high energy physics experiment to be proposed for the Superconducting Super Collider (SSC), which requires the collection and analysis of between 10 and 100 million sets of vectors (events), each approximately one megabyte in length. We sketch how this analysis would proceed using an object-oriented database that supports the basic data types used in HEP.

  6. A distributed system for visualizing and analyzing multivariate and multidisciplinary data

    Science.gov (United States)

    Jacobson, Allan S.; Allen, Mark; Bailey, Michael; Blom, Ronald; Blume, Leo; Elson, Lee

    1993-01-01

    The Linked Windows Interactive Data System (LinkWinds) is being developed with NASA support. The objective of this proposal is to adapt and apply that system in a complex network environment containing elements to be found by scientists working in multidisciplinary teams on very large-scale and distributed data sets. The proposed three-year program will develop specific visualization and analysis tools, to be exercised locally and remotely in the LinkWinds environment, to demonstrate visual data analysis, interdisciplinary data analysis, and cooperative, interactive televisualization and analysis of data by geographically separated science teams. These demonstrators will involve at least two science disciplines, with the aim of producing publishable results.

  7. Analyzing the topological, electrical and reliability characteristics of a power transmission system for identifying its critical elements

    International Nuclear Information System (INIS)

    Zio, E.; Golea, L.R.

    2012-01-01

    The subject of this paper is the analysis of an electrical transmission system with the objective of identifying its most critical elements with respect to failures and attacks. The methodological approach undertaken is based on graph-theoretical (topological) network analysis. Four different perspectives of analysis are considered within the formalism of weighted networks, adding to the purely topological analysis of the system the reliability and electrical characteristics of its components. In each phase of the analysis: i) a graph-theoretical representation is offered to highlight the structure of the most important system connections according to the particular characteristics examined (topological, reliability, electrical, or electrical-reliability); ii) the classical degree index of a network node is extended to account for the different characteristics considered. The application of these concepts to an electrical transmission system from the literature confirms the importance of taking different perspectives in the analysis of such a critical infrastructure. - Highlights: ► We analyze a power system from topological, reliability and electrical perspectives. ► We rank critical components within a vulnerability assessment framework. ► We compute an extended degree to rank critical energy paths. ► We compare several analytical approaches and provide a table for choosing among them. ► We suggest network changes to increase the reliability of highly loaded energy paths.
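The paper's extended degree index is not spelled out in the abstract, but one standard weighted-network generalization of node degree is node strength: the sum of the weights of a node's incident edges. A small self-contained sketch follows; the toy network and its edge weights (e.g. line reliabilities) are invented, not data from the paper.

```python
from collections import defaultdict

def node_strength(edges):
    """Weighted generalization of node degree: for each node, sum the
    weights of its incident edges (its 'strength')."""
    strength = defaultdict(float)
    for u, v, w in edges:
        strength[u] += w
        strength[v] += w
    return dict(strength)

# Toy transmission network: (node, node, weight), where the weight could
# encode a line's reliability or capacity -- values here are invented.
lines = [("A", "B", 0.99), ("B", "C", 0.95), ("A", "C", 0.90), ("C", "D", 0.85)]
s = node_strength(lines)
ranked = sorted(s, key=s.get, reverse=True)  # highest-strength node first
```

Ranking nodes by strength is the weighted analogue of ranking by degree, which is the kind of criticality ordering the paper builds its four perspectives around.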

  8. When Workflow Management Systems and Logging Systems Meet: Analyzing Large-Scale Execution Traces

    Energy Technology Data Exchange (ETDEWEB)

    Gunter, Daniel

    2008-07-31

    This poster shows the benefits of integrating a workflow management system with logging and log mining capabilities. By combining two existing, mature technologies, Pegasus-WMS and NetLogger, we are able to efficiently process execution logs of earthquake science workflows consisting of hundreds of thousands to one million tasks. In particular, we show results of processing logs of CyberShake, a workflow application running on the TeraGrid. Client-side tools allow scientists to quickly gather statistics about a workflow run and find out which tasks executed, where they were executed, what their runtimes were, etc. These statistics can be used to understand the performance characteristics of a workflow and help tune the execution parameters of the workflow management system. The poster shows the scalability of the system by presenting results of uploading task execution records into the system and of querying the system for overall workflow performance information.

  9. An embedded EEG analyzing system based on µC/OS-II.

    Science.gov (United States)

    Liu, Boqiang; Zhang, Yanyan; Liu, Zhongguo; Yin, Cong

    2007-01-01

    An EEG analyzing system based on Advanced RISC Machines (ARM) and the µC/OS-II real-time operating system is discussed in this paper. The detailed system design, including the generation of event signals and the synchronization between event signals and EEG signals, is described. The system design also covers data acquisition, data preprocessing, data transmission through USB, and system configuration. The paper discusses the design of the high-capability amplifier and the software of the embedded subsystem, as well as the multi-task design under µC/OS-II, the communication protocol between the PC and the equipment, and the detailed USB configuration. The final test shows that the filter behavior of this equipment is satisfactory.

  10. Modeling, Control and Analyze of Multi-Machine Drive Systems using Bond Graph Technique

    Directory of Open Access Journals (Sweden)

    J. Belhadj

    2006-03-01

    Full Text Available In this paper, a system-viewpoint method is investigated to study and analyze complex systems using the Bond Graph technique. These systems are multi-machine, multi-inverter systems based on Induction Machines (IM), widely used in industries such as rolling mills, textiles, and railway traction. They span multiple domains and time scales and present very strong internal and external couplings, with non-linearities and a high model order. The classical analytic model is difficult to manipulate and is limited to a few performance measures. In this study, a "systemic approach" is presented to design these kinds of systems, using an energetic representation based on the Bond Graph formalism. Three types of multi-machine systems are studied together with their control strategies. The modeling is carried out with Bond Graphs and the results are discussed to show the performance of this methodology.

  11. Infrared analyzers for breast milk analysis: fat levels can influence the accuracy of protein measurements.

    Science.gov (United States)

    Kwan, Celia; Fusch, Gerhard; Bahonjic, Aldin; Rochow, Niels; Fusch, Christoph

    2017-10-26

    Currently, there is a growing interest in lacto-engineering in the neonatal intensive care unit, using infrared milk analyzers to rapidly measure the macronutrient content in breast milk before processing and feeding it to preterm infants. However, there is an overlap in the spectral information of different macronutrients, which can potentially affect the robustness of the measurement. In this study, we investigate whether the measurement of protein depends on the level of fat present when using an infrared milk analyzer. Breast milk samples (n=25) were measured for fat and protein content before and after being completely defatted by centrifugation, using chemical reference methods and a near-infrared milk analyzer (Unity SpectraStar) with two different calibration algorithms provided by the manufacturer (released 2009 and 2015). While the protein content remained unchanged, as measured by elemental analysis, measurements by the infrared milk analyzer show a difference in protein readings dependent on fat content; high fat content can lead to falsely high protein readings. This difference is less pronounced when measured using the more recent calibration algorithm. Milk analyzer users must be cautious of their devices' measurements, especially if they are changing the matrix of breast milk using more advanced lacto-engineering.

  12. Design and development of VHDL based IP core for coincidence analyzer for FPGA based TDCR system

    International Nuclear Information System (INIS)

    Agarwal, Shivam; Gupta, Ashutosh; Chaudhury, Probal; Sharma, M.K.; Kulkarni, M.S.

    2018-01-01

    The coincidence counting technique is used in activity measurement methods to determine the activity of radionuclides, e.g., the 4πβ-γ method and the Triple to Double Coincidence Ratio (TDCR) method. The 4πβ-γ method requires a two-input Coincidence Analyzer (CA), whereas the TDCR method requires a three-input CA. A VHDL (Very High Speed Integrated Circuit Hardware Description Language) based IP (Intellectual Property) core for a coincidence analyzer has been designed and implemented in an FPGA (Field Programmable Gate Array) for a TDCR system. The developed IP core not only enables simultaneous coincidence counting on three channels but also provides an extendable dead-time feature.
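To illustrate the counting logic such an IP core implements (in software rather than VHDL), the sketch below tallies double and triple coincidences from per-event channel hits and forms the triple-to-double ratio. The event list is invented; in the TDCR convention used here, the "doubles" count is the logical sum of pairwise coincidences, so triple events are counted among the doubles as well.

```python
def tdcr_counts(events):
    """Count double and triple coincidences from per-event channel hits.
    Each event is the set of channels (0, 1, 2) that fired within the
    coincidence resolving window."""
    doubles = sum(1 for hit in events if len(hit) >= 2)  # any 2-of-3 (or 3)
    triples = sum(1 for hit in events if len(hit) == 3)
    ratio = triples / doubles if doubles else 0.0
    return triples, doubles, ratio

# Invented event list: which photomultiplier channels fired per decay event.
events = [{0, 1, 2}, {0, 1}, {1, 2}, {0, 1, 2}, {0}, {0, 2}, {0, 1, 2}]
triples, doubles, ratio = tdcr_counts(events)  # 3 triples, 6 doubles
```

The hardware core does the same classification in parallel logic, channel by channel, with an extendable dead time gating the counters.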

  13. The visual and remote analyzing software for a Linux-based radiation information acquisition system

    International Nuclear Information System (INIS)

    Fan Zhaoyang; Zhang Li; Chen Zhiqiang

    2003-01-01

    A visual and remote analyzing software package for radiation information, which has the merits of universality and reliability, has been developed based on the Linux operating system and the TCP/IP network protocol. The software is used to visually debug and monitor, in real time, the high-speed radiation information acquisition system, so that safe, direct, and timely control can be assured. The paper describes the design concept of the software, which provides a reference for other software with the same purpose for similar systems.

  14. New Theoretical Analysis of the LRRM Calibration Technique for Vector Network Analyzers

    OpenAIRE

    Purroy Martín, Francesc; Pradell i Cara, Lluís

    2001-01-01

    In this paper, a new theoretical analysis of the four-standards line-reflect-reflect-match (LRRM) vector network-analyzer (VNA) calibration technique is presented. As a result, it is shown that the reference-impedance (to which the LRRM calibration is referred) cannot generally be defined whenever nonideal standards are used. Based on this consideration, a new algorithm to determine the on-wafer match standard is proposed that improves the LRRM calibration accuracy. Experimental verification ...

  15. Method and system for formation and withdrawal of a sample from a surface to be analyzed

    Science.gov (United States)

    Van Berkel, Gary J.; Kertesz, Vilmos

    2017-10-03

    A method and system for formation and withdrawal of a sample from a surface to be analyzed utilizes a collection instrument having a port through which a liquid solution is conducted onto the surface to be analyzed. The port is positioned adjacent the surface to be analyzed, and the liquid solution is conducted onto the surface through the port so that the liquid solution conducted onto the surface interacts with material comprising the surface. An amount of material is thereafter withdrawn from the surface. Pressure control can be utilized to manipulate the solution balance at the surface to thereby control the withdrawal of the amount of material from the surface. Furthermore, such pressure control can be coordinated with the movement of the surface relative to the port of the collection instrument within the X-Y plane.

  16. System, apparatus and methods to implement high-speed network analyzers

    Science.gov (United States)

    Ezick, James; Lethin, Richard; Ros-Giralt, Jordi; Szilagyi, Peter; Wohlford, David E

    2015-11-10

    Systems, apparatus and methods for the implementation of high-speed network analyzers are provided. A set of high-level specifications is used to define the behavior of the network analyzer emitted by a compiler. An optimized inline workflow to process regular expressions is presented without sacrificing the semantic capabilities of the processing engine. An optimized packet dispatcher implements a subset of the functions implemented by the network analyzer, providing a fast and slow path workflow used to accelerate specific processing units. Such dispatcher facility can also be used as a cache of policies, wherein if a policy is found, then packet manipulations associated with the policy can be quickly performed. An optimized method of generating DFA specifications for network signatures is also presented. The method accepts several optimization criteria, such as min-max allocations or optimal allocations based on the probability of occurrence of each signature input bit.
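The fast/slow-path dispatcher with a policy cache described above can be illustrated schematically. Everything in the sketch is an invented stand-in: the flow key, the cached policies, and the "slow path" classifier are hypothetical, not the patented implementation.

```python
def make_dispatcher(classify_slow):
    """Return a packet dispatcher with a policy cache: the fast path is a
    dict lookup on the flow key; on a miss, the slow path classifies the
    packet and caches the resulting policy for later packets."""
    cache = {}
    stats = {"hits": 0, "misses": 0}

    def dispatch(packet):
        key = (packet["src"], packet["dst"], packet["port"])
        policy = cache.get(key)
        if policy is None:
            stats["misses"] += 1
            policy = classify_slow(packet)   # expensive full analysis
            cache[key] = policy
        else:
            stats["hits"] += 1
        return policy

    return dispatch, stats

# Invented slow-path rule: drop traffic to port 23, accept otherwise.
dispatch, stats = make_dispatcher(lambda p: "drop" if p["port"] == 23 else "accept")
pkts = [{"src": "a", "dst": "b", "port": 80},
        {"src": "a", "dst": "b", "port": 80},
        {"src": "a", "dst": "b", "port": 23}]
decisions = [dispatch(p) for p in pkts]
```

The second packet of the same flow takes the fast path, which is the point of the cache: repeated flows skip the expensive classification entirely.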

  17. Automated activation-analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.; Denton, M.M.

    1982-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day

  18. Microgamma Scan System for analyzing radial isotopic profiles of irradiated transmutation fuels

    International Nuclear Information System (INIS)

    Hilton, Bruce A.; McGrath, Christopher A.

    2008-01-01

    The U. S. Global Nuclear Energy Partnership / Advanced Fuel Cycle Initiative (GNEP/AFCI) is developing metallic transmutation alloys as a fuel form to transmute the long-lived transuranic actinide isotopes contained in spent nuclear fuel into shorter-lived fission products. A micro-gamma scan system is being developed to analyze the radial distribution of fission products, such as Cs-137, Cs-134, Ru-106, and Zr-95, in irradiated fuel cross-sections. The micro-gamma scan system consists of a precision linear stage with integrated sample holder and a tungsten alloy collimator, which interfaces with the Idaho National Laboratory (INL) Analytical Laboratory Hot Cell (ALHC) Gamma Scan System high purity germanium detector, multichannel analyzer, and removable collimators. A simplified model of the micro-gamma scan system was developed in MCNP (Monte-Carlo N-Particle Transport Code) and used to investigate the system performance and to interpret data from the scoping studies. Preliminary measurements of the micro-gamma scan system are discussed. (authors)

  19. Analyzing systemic risk using non-linear marginal expected shortfall and its minimum spanning tree

    Science.gov (United States)

    Song, Jae Wook; Ko, Bonggyun; Chang, Woojin

    2018-02-01

    The aim of this paper is to propose a new theoretical framework for analyzing systemic risk using the marginal expected shortfall (MES) and its correlation-based minimum spanning tree (MST). First, we develop two parametric models of MES with closed-form solutions based on the Capital Asset Pricing Model. Our models are derived from a non-symmetric quadratic form, which allows them to capture the non-linear relationship between stock and market returns. Second, we find evidence of the utility of our models and of a possible association between the non-linear relationship and the emergence of severe systemic risk, using the US financial system as a benchmark. In this context, the evolution of MES can also be regarded as a reasonable proxy of systemic risk. Lastly, we analyze the structural properties of systemic risk using the MST based on the computed series of MES. The topology of the MST conveys the presence of sectoral clustering and strong co-movements of systemic risk led by a few hubs during the crisis. Specifically, we find that the Depositories are the majority sector leading the connections during the non-crisis period, whereas the Broker-Dealers are the majority during the crisis period.
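Alongside the paper's parametric models, MES has a simple empirical, model-free form: the average firm loss on days when the market return falls at or below a tail threshold (e.g. the market's 5% quantile). A minimal sketch with invented return series:

```python
import numpy as np

def marginal_expected_shortfall(firm, market, threshold):
    """Empirical MES: mean loss of the firm on days the market return is
    at or below the threshold. Returned as a positive loss number."""
    bad_days = market <= threshold
    return -firm[bad_days].mean()

# Invented daily returns for one firm and the market index.
market = np.array([-0.05, -0.04, 0.01, 0.02, 0.00, -0.01, 0.03])
firm   = np.array([-0.08, -0.02, 0.02, 0.01, 0.00, -0.03, 0.04])

# Threshold chosen by hand here; in practice it would be a market quantile.
mes = marginal_expected_shortfall(firm, market, threshold=-0.03)
```

A series of such MES values per institution is what the paper feeds into the correlation-based MST to study the cross-sectional structure of systemic risk.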

  20. Transient analyzer

    International Nuclear Information System (INIS)

    Muir, M.D.

    1975-01-01

    The design and design philosophy of a high-performance, extremely versatile transient analyzer are described. This sub-system was designed to be controlled through the data acquisition computer system, which allows hands-off operation. Thus it may be placed on the experiment side of the high-voltage safety break between the experimental device and the control room. This analyzer provides control features which are extremely useful for data acquisition from PPPL diagnostics. These include dynamic sample-rate changing, which may be intermixed with multiple post-trigger operations with variable-length blocks using normal, peak-to-peak, or integrate modes. The discussion includes general remarks on the advantages of adding intelligence to transient analyzers, a detailed description of the characteristics of the PPPL transient analyzer, a description of its hardware, firmware, control language, and operation, and general remarks on future trends in this type of instrumentation, both at PPPL and in general.

  1. An Application of Multiplier Analysis in Analyzing the Role of Mining Sectors on Indonesian National Economy

    Science.gov (United States)

    Subanti, S.; Hakim, A. R.; Hakim, I. M.

    2018-03-01

    The purpose of the current study is to analyze multipliers for the mining sector in Indonesia. The mining sector is defined to comprise coal and metals; crude oil, natural gas, and geothermal energy; and other mining and quarrying. The multiplier analysis is based on input-output analysis and is divided into income multipliers and output multipliers. The results show that (1) the Indonesian mining sector ranks 6th, contributing 6.81% of national total output; (2) based on total gross value added, the sector contributes 12.13%, ranking 4th; (3) the income multiplier is 0.7062 and the output multiplier is 1.2426.
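Output multipliers of the kind reported here come from the Leontief inverse of the input-output coefficient matrix: the output multiplier of sector j is the j-th column sum of (I - A)^(-1). A two-sector toy example with invented coefficients:

```python
import numpy as np

# Technical coefficient matrix A: a[i, j] = input from sector i needed
# per unit of output of sector j (invented two-sector economy).
A = np.array([[0.2, 0.3],
              [0.1, 0.4]])

# Leontief inverse: total (direct + indirect) output requirements
# per unit of final demand.
L = np.linalg.inv(np.eye(2) - A)

# Output multiplier of sector j = j-th column sum of the Leontief inverse.
output_multipliers = L.sum(axis=0)
```

An output multiplier above 1 (as with the paper's 1.2426 for mining) means each unit of final demand for the sector generates more than one unit of economy-wide output once indirect input requirements are counted.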

  2. A Systems Approach to Analyzing Cyber-Physical Threats in the Smart Grid

    Energy Technology Data Exchange (ETDEWEB)

    Almajali, Anas; Rice, Eric; Viswanathan, Arun; Tan, Kymie; Neuman, Clifford

    2016-10-01

    This paper presents a systems analysis approach to characterizing the risk of a Smart Grid to a load-drop attack. A characterization of the risk is necessary for the design of detection and remediation strategies to address the consequences of such attacks. Using concepts from systems health management and system engineering, this work (a) first identifies metrics that can be used to generate constraints for security features, and (b) lays out an end-to-end integrated methodology using separate network and power simulations to assess system risk. We demonstrate our approach by performing a systems-style analysis of a load-drop attack implemented over the AMI subsystem and targeted at destabilizing the underlying power grid.

  3. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.

  4. [Morphometric analysis of the middle cerebral artery system in 13-15-week fetuses].

    Science.gov (United States)

    Macovei, Georgeta Nataşa; Varlam, H; St Antohe, D

    2002-01-01

    The tele-encephalization process is accompanied by the appearance and progressive elaboration of the middle cerebral artery system. The aim of our study is to analyze the morphometric parameters of the middle cerebral artery branches at the beginning of the development of this system. We used 162 cerebral hemispheres from 88 fetuses aged 13-15 weeks. The middle cerebral artery system was injected with a gelatin-China ink mixture and images were recorded by means of a Zeiss surgical microscope. Parameter evaluation (length, proximal and distal diameters, external surface, volume, angles of bifurcation) was carried out with the KS-300 program. At this early age the middle cerebral artery system has only 4-5 generations of branches, usually resulting from acute-angle bifurcations.

  5. Development of ultracold neutron detectors and a polarization analyzing system for the measurement of the neutron electric dipole moment

    International Nuclear Information System (INIS)

    Rogel, Gwendal

    2009-01-01

    This thesis was performed in the context of a project aiming to measure the electric dipole moment (EDM) of the neutron at the Paul Scherrer Institute. Two aspects have been studied: the detection and the polarization analysis of ultracold neutrons. Three types of detectors have been tested at the Institut Laue-Langevin (ILL): the Cascade-U (GEM technology), the 3He gas detector, and 6Li-doped glass scintillators (GS family). Their detection efficiency and their background sensitivity have been measured. The GS10 scintillator is competitive with the 3He gas detector under the conditions realized with the EDM spectrometer. A GS3/GS20 scintillator stack improved the neutron/gamma discrimination; it was found to be 20% less efficient than the 3He gas detector under the EDM spectrometer. The Cascade-U detector was observed to be 20% less efficient than a 500 micron thick GS10 glass, as confirmed by simulations. A new system for simultaneous spin analysis is presented. It consists of two independent detection systems (arms), each made of an adiabatic spin flipper, a spin analyzer, and a detector. The arms detect opposite spin components, allowing the simultaneous counting of both neutron spin orientations. A prototype mounted in horizontal configuration has been tested at the ILL. The analyzing power of both arms has been measured to be 80%. The transmission of the system without spin analyzers has been found to be 50%. (author) [fr
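The analyzing power quoted above has a standard operational definition: the asymmetry between the counts detected with the neutron spin parallel and antiparallel to the analyzer axis. A minimal sketch, with counts invented so the asymmetry comes out at the reported 80%:

```python
def analyzing_power(n_parallel, n_antiparallel):
    """Spin-analysis asymmetry: (N+ - N-) / (N+ + N-)."""
    return (n_parallel - n_antiparallel) / (n_parallel + n_antiparallel)

# Invented counts for one arm of the simultaneous spin-analysis system.
a = analyzing_power(n_parallel=900, n_antiparallel=100)  # -> 0.8
```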

  6. Systems engineering and analysis

    CERN Document Server

    Blanchard, Benjamin S

    2010-01-01

    For senior-level undergraduate and first and second year graduate systems engineering and related courses. A total life-cycle approach to systems and their analysis. This practical introduction to systems engineering and analysis provides the concepts, methodologies, models, and tools needed to understand and implement a total life-cycle approach to systems and their analysis. The authors focus first on the process of bringing systems into being--beginning with the identification of a need and extending that need through requirements determination, functional analysis and allocation, design synthesis, evaluation, and validation, operation and support, phase-out, and disposal. Next, the authors discuss the improvement of systems currently in being, showing that by employing the iterative process of analysis, evaluation, feedback, and modification, most systems in existence can be improved in their affordability, effectiveness, and stakeholder satisfaction.

  7. Isotope ratio analysis by a combination of element analyzer and mass spectrometer

    International Nuclear Information System (INIS)

    Pichlmayer, F.

    1987-06-01

    The use of stable isotope ratios of carbon, nitrogen, and sulfur as an analytical tool is of growing interest in many fields of research. A method has therefore been developed, consisting essentially of coupling an elemental analyzer with an isotope mass spectrometer, which enables the preparation of carbon dioxide, nitrogen, and sulfur dioxide gases from any solid or liquid sample in a fast and easy way. Results of carbon isotope measurements in food analysis are presented, showing that it is possible to check the origin and treatment of sugar, oils, fats, mineral waters, spirituous liquors, etc., and to detect adulterations as well. Applications in the field of environmental research are also given. (Author)

  8. Wind energy analysis system

    OpenAIRE

    2014-01-01

    M.Ing. (Electrical & Electronic Engineering) One of the most important steps to be taken before a site is selected for the extraction of wind energy is the analysis of the energy within the wind at that particular site. No wind energy analysis system exists for the measurement and analysis of wind power. This dissertation documents the design and development of a Wind Energy Analysis System (WEAS). Using a micro-controller-based design in conjunction with sensors, WEAS measures, calcu...

  9. Improvement of the reliability graph with general gates to analyze the reliability of dynamic systems that have various operation modes

    Energy Technology Data Exchange (ETDEWEB)

    Shin, Seung Ki [Div. of Research Reactor System Design, Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); No, Young Gyu; Seong, Poong Hyun [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)

    2016-04-15

    The safety of nuclear power plants is analyzed by a probabilistic risk assessment, and the fault tree analysis is the most widely used method for a risk assessment with the event tree analysis. One of the well-known disadvantages of the fault tree is that drawing a fault tree for a complex system is a very cumbersome task. Thus, several graphical modeling methods have been proposed for the convenient and intuitive modeling of complex systems. In this paper, the reliability graph with general gates (RGGG) method, one of the intuitive graphical modeling methods based on Bayesian networks, is improved for the reliability analyses of dynamic systems that have various operation modes with time. A reliability matrix is proposed and it is explained how to utilize the reliability matrix in the RGGG for various cases of operation mode changes. The proposed RGGG with a reliability matrix provides a convenient and intuitive modeling of various operation modes of complex systems, and can also be utilized with dynamic nodes that analyze the failure sequences of subcomponents. The combinatorial use of a reliability matrix with dynamic nodes is illustrated through an application to a shutdown cooling system in a nuclear power plant.

  10. Improvement of the reliability graph with general gates to analyze the reliability of dynamic systems that have various operation modes

    International Nuclear Information System (INIS)

    Shin, Seung Ki; No, Young Gyu; Seong, Poong Hyun

    2016-01-01

    The safety of nuclear power plants is analyzed by a probabilistic risk assessment, and the fault tree analysis is the most widely used method for a risk assessment with the event tree analysis. One of the well-known disadvantages of the fault tree is that drawing a fault tree for a complex system is a very cumbersome task. Thus, several graphical modeling methods have been proposed for the convenient and intuitive modeling of complex systems. In this paper, the reliability graph with general gates (RGGG) method, one of the intuitive graphical modeling methods based on Bayesian networks, is improved for the reliability analyses of dynamic systems that have various operation modes with time. A reliability matrix is proposed and it is explained how to utilize the reliability matrix in the RGGG for various cases of operation mode changes. The proposed RGGG with a reliability matrix provides a convenient and intuitive modeling of various operation modes of complex systems, and can also be utilized with dynamic nodes that analyze the failure sequences of subcomponents. The combinatorial use of a reliability matrix with dynamic nodes is illustrated through an application to a shutdown cooling system in a nuclear power plant
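One plausible reading of the reliability-matrix idea in the two records above can be sketched as follows: each component has a (possibly different) reliability in each operation mode, and the system reliability in a mode is evaluated over only the components that mode requires. Everything below (the modes, components, values, and the simple series-system structure) is invented for illustration and is not taken from the paper, which models far richer structures via Bayesian networks.

```python
# Rows: operation modes; columns: component reliabilities in that mode.
# Invented example: a cooling system with "full power" and "shutdown" modes.
reliability_matrix = {
    "full_power": {"pump_a": 0.99, "pump_b": 0.99, "valve": 0.995},
    "shutdown":   {"pump_a": 0.97, "valve": 0.995},
}

def series_system_reliability(mode):
    """Reliability of a series system in a given mode: the product of the
    reliabilities of the components that mode requires."""
    r = 1.0
    for value in reliability_matrix[mode].values():
        r *= value
    return r

r_full = series_system_reliability("full_power")
r_shut = series_system_reliability("shutdown")
```

The convenience the paper claims comes from keeping one such mode-indexed table rather than drawing a separate reliability model for every operating configuration.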

  11. Analyzing the decision making process of certifying digital control systems of nuclear power plants

    International Nuclear Information System (INIS)

    Yih, Swu; Fan, Chin-Feng

    2012-01-01

    Highlights: ► We have performed basic research in analyzing the certification process and developed a regulatory decision making model for nuclear digital control system certification. The model views certification as an evidence–confidence conversion process. ► We have applied this model to analyze previous nuclear digital I and C certification experiences and obtained valuable insights. ► Furthermore, a prototype of a computer-aided licensing support system based on the model has been developed to enhance regulatory review efficiency. - Abstract: Safety-critical computing systems need regulators’ approval before operation. Such a permit issue process is called “certification”. Digital Instrumentation and Control (I and C) certification in the nuclear domain has always been problematic and lengthy. Thus, certification efficiency has always been a crucial concern to the applicant, whose business depends on the regulatory decision. However, to our knowledge, there is little basic research on this topic. This study presents a Regulatory Decision-Making Model aimed at analyzing the characteristics and efficiency influence factors of a generic certification process. The model is developed from a dynamic operational perspective by viewing the certification process as an evidence–confidence conversion process. The proposed model is then applied to previous nuclear digital I and C certification experiences to explain why some cases were successful and some were troublesome. Lessons learned from these cases provide invaluable insights regarding the regulatory review activity. Furthermore, to utilize the insights obtained from the model, a prototype of a computer-aided licensing support system has been developed to speed up review evidence preparation and manipulation; thus, regulatory review efficiency can be further improved.

  12. How to: Using Mode Analysis to Quantify, Analyze, and Interpret the Mechanisms of High-Density Collective Motion

    Directory of Open Access Journals (Sweden)

    Arianna Bottinelli

    2017-12-01

    Full Text Available While methods from statistical mechanics were some of the earliest analytical tools used to understand collective motion, the field has substantially expanded in scope beyond phase transitions and fluctuating order parameters. In part, this expansion is driven by the increasing variety of systems being studied, which in turn, has increased the need for innovative approaches to quantify, analyze, and interpret a growing zoology of collective behaviors. For example, concepts from material science become particularly relevant when considering the collective motion that emerges at high densities. Here, we describe methods originally developed to study inert jammed granular materials that have been borrowed and adapted to study dense aggregates of active particles. This analysis is particularly useful because it projects difficult-to-analyze patterns of collective motion onto an easier-to-interpret set of eigenmodes. Carefully viewed in the context of non-equilibrium systems, mode analysis identifies hidden long-range motions and localized particle rearrangements based solely on the knowledge of particle trajectories. In this work, we take a “how to” approach and outline essential steps, diagnostics, and know-how used to apply this analysis to study densely-packed active systems.
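
The core computation, projecting particle trajectories onto eigenmodes, can be sketched under simplifying assumptions: build the covariance matrix of particle displacements and extract its dominant eigenvector (the largest collective mode). Here power iteration stands in for a full eigensolver, and the toy two-particle displacement data are invented for illustration.

```python
# Hedged sketch of mode analysis: dominant eigenmode of the
# displacement covariance, found by power iteration.

def covariance(samples):
    """Covariance matrix of a list of d-dimensional displacement samples."""
    n, d = len(samples), len(samples[0])
    mean = [sum(s[i] for s in samples) / n for i in range(d)]
    return [[sum((s[i] - mean[i]) * (s[j] - mean[j]) for s in samples) / n
             for j in range(d)] for i in range(d)]

def dominant_mode(c, iters=200):
    """Power iteration: repeatedly apply the matrix and renormalise."""
    v = [1.0] * len(c)
    for _ in range(iters):
        w = [sum(c[i][j] * v[j] for j in range(len(c))) for i in range(len(c))]
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    return v

# Displacements of two particles over five snapshots: they move
# together, so the dominant mode should point roughly along (1, 1).
disp = [(0.1, 0.12), (-0.2, -0.18), (0.3, 0.28), (-0.1, -0.11), (0.05, 0.06)]
mode = dominant_mode(covariance(disp))
print([round(x, 2) for x in mode])
```

A real analysis would use many particles, a proper eigensolver, and the diagnostics the article describes, but the projection idea is the same.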

  13. On the computations analyzing natural optic flow : Quantitative model analysis of the blowfly motion vision pathway

    NARCIS (Netherlands)

    Lindemann, J.P.; Kern, R.; Hateren, J.H. van; Ritter, H.; Egelhaaf, M.

    2005-01-01

    For many animals, including humans, the optic flow generated on the eyes during locomotion is an important source of information about self-motion and the structure of the environment. The blowfly has been used frequently as a model system for experimental analysis of optic flow processing at the

  14. Data acquisition systems for uses of multi-counter time analyzer and one-dimensional PSD pulse height analyzer to neutron scattering measurements

    International Nuclear Information System (INIS)

    Ono, Masayoshi; Tasaki, Seiji; Okamoto, Sunao

    1989-01-01

    A data acquisition system incorporating various modern electronic devices was designed and tested for practical neutron time-of-flight (TOF) measurements with multiple counters. The system is principally composed of TOF logic units (loadable up to 128 units) with a control unit and a conventional micro-computer. The TOF logic unit (main memory, 2048 ch, 24 bits/ch) demonstrates about 1.7 times higher neutron counting efficiency per channel than a conventional TOF logic unit. In addition, some data-access functions of the TOF logic unit were applied to the position-sensitive analyzer of a one-dimensional neutron PSD for small-angle scattering. The analyzer was tested with a pulse generator and showed good linearity. (author)

  15. Digital video timing analyzer for the evaluation of PC-based real-time simulation systems

    Science.gov (United States)

    Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.

    2009-05-01

    Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.

  16. Advances in the Control System for a High Precision Dissolved Organic Carbon Analyzer

    Science.gov (United States)

    Liao, M.; Stubbins, A.; Haidekker, M.

    2017-12-01

    Dissolved organic carbon (DOC) is a master variable in aquatic ecosystems. DOC in the ocean is one of the largest carbon stores on earth. Studies of the dynamics of DOC in the ocean and other low DOC systems (e.g. groundwater) are hindered by the lack of high precision (sub-micromolar) analytical techniques. Results are presented from efforts to construct and optimize a flow-through, wet chemical DOC analyzer. This study focused on the design, integration and optimization of high precision components and control systems required for such a system (mass flow controller, syringe pumps, gas extraction, reactor chamber with controlled UV and temperature). Results of the approaches developed are presented.

  17. Towards for Analyzing Alternatives of Interaction Design Based on Verbal Decision Analysis of User Experience

    Directory of Open Access Journals (Sweden)

    Marília Soares Mendes

    2010-04-01

    Full Text Available In domains (such as digital TV, smart homes, and tangible interfaces) that represent a new paradigm of interactivity, deciding on the most appropriate interaction design solution is a challenge. HCI researchers have promoted validating alternative design solutions with users before producing the final solution. User experience with technology has also gained ground in these works as a means of identifying the appropriate solution(s). Following this concept, a study was carried out with the objective of finding a better interaction solution for a mobile TV application. Three executable mobile TV prototypes were built. A Verbal Decision Analysis model was applied to investigate the preferred characteristics of each prototype based on the users’ experience and their intention of use. This model supported a qualitative analysis that guided the design of a new prototype.

  18. Using business intelligence to analyze and share health system infrastructure data in a rural health authority.

    Science.gov (United States)

    Haque, Waqar; Urquhart, Bonnie; Berg, Emery; Dhanoa, Ramandeep

    2014-08-06

    Health care organizations gather large volumes of data, which has been traditionally stored in legacy formats making it difficult to analyze or use effectively. Though recent government-funded initiatives have improved the situation, the quality of most existing data is poor, suffers from inconsistencies, and lacks integrity. Generating reports from such data is generally not considered feasible due to extensive labor, lack of reliability, and time constraints. Advanced data analytics is one way of extracting useful information from such data. The intent of this study was to propose how Business Intelligence (BI) techniques can be applied to health system infrastructure data in order to make this information more accessible and comprehensible for a broader group of people. An integration process was developed to cleanse and integrate data from disparate sources into a data warehouse. An Online Analytical Processing (OLAP) cube was then built to allow slicing along multiple dimensions determined by various key performance indicators (KPIs), representing population and patient profiles, case mix groups, and healthy community indicators. The use of mapping tools, customized shape files, and embedded objects further augment the navigation. Finally, Web forms provide a mechanism for remote uploading of data and transparent processing of the cube. For privileged information, access controls were implemented. Data visualization has eliminated tedious analysis through legacy reports and provided a mechanism for optimally aligning resources with needs. Stakeholders are able to visualize KPIs on a main dashboard, slice-and-dice data, generate ad hoc reports, and quickly find the desired information. In addition, comparison, availability, and service level reports can also be generated on demand. All reports can be drilled down for navigation at a finer granularity. We have demonstrated how BI techniques and tools can be used in the health care environment to make informed
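
The slice-and-dice operation described above amounts to aggregating a fact table along a chosen dimension. A minimal sketch with invented records and dimension names; a production BI stack would perform this in the warehouse or OLAP cube layer rather than in application code.

```python
# Hedged sketch of OLAP-style slicing: aggregate a small fact table
# along one dimension. Records and dimensions are illustrative only.
from collections import defaultdict

facts = [  # (region, year, indicator, value)
    ("North", 2013, "admissions", 120),
    ("North", 2014, "admissions", 140),
    ("South", 2013, "admissions", 200),
    ("South", 2014, "admissions", 180),
]

def slice_by(facts, dim):
    """Sum the value measure grouped by one dimension (0=region, 1=year)."""
    totals = defaultdict(int)
    for rec in facts:
        totals[rec[dim]] += rec[3]
    return dict(totals)

print(slice_by(facts, 0))  # → {'North': 260, 'South': 380}
print(slice_by(facts, 1))  # → {2013: 320, 2014: 320}
```

Drill-down corresponds to grouping by a finer dimension combination (e.g., region and year together).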

  19. Analyzing how we do Analysis and Consume Data, Results from the SciDAC-Data Project

    Science.gov (United States)

    Ding, P.; Aliaga, L.; Mubarak, M.; Tsaris, A.; Norman, A.; Lyon, A.; Ross, R.

    2017-10-01

    One of the main goals of the Dept. of Energy funded SciDAC-Data project is to analyze the more than 410,000 high energy physics datasets that have been collected, generated and defined over the past two decades by experiments using the Fermilab storage facilities. These datasets have been used as the input to over 5.6 million recorded analysis projects, for which detailed analytics have been gathered. The analytics and meta information for these datasets and analysis projects are being combined with knowledge of their part of the HEP analysis chains for major experiments to understand how modern computing and data delivery is being used. We present the first results of this project, which examine in detail how the CDF, D0, NOvA, MINERvA and MicroBooNE experiments have organized, classified and consumed petascale datasets to produce their physics results. The results include analysis of the correlations in dataset/file overlap, data usage patterns, data popularity, dataset dependency and temporary dataset consumption. The results provide critical insight into how workflows and data delivery schemes can be combined with different caching strategies to more efficiently perform the work required to mine these large HEP data volumes and to understand the physics analysis requirements for the next generation of HEP computing facilities. In particular we present a detailed analysis of the NOvA data organization and consumption model corresponding to their first and second oscillation results (2014-2016) and the first look at the analysis of the Tevatron Run II experiments. We present statistical distributions for the characterization of these data and data driven models describing their consumption.

  20. Objective evaluation of analyzer performance based on a retrospective meta-analysis of instrument validation studies: point-of-care hematology analyzers.

    Science.gov (United States)

    Cook, Andrea M; Moritz, Andreas; Freeman, Kathleen P; Bauer, Natali

    2017-06-01

    Information on quality requirements and objective evaluation of performance of veterinary point-of-care analyzers (POCAs) is scarce. The study was aimed at assessing observed total errors (TE obs) for veterinary hematology POCAs via meta-analysis and comparing TE obs to allowable total error (TE a) specifications based on experts' opinions. The TE obs for POCAs (impedance and laser-based) was calculated based on data from instrument validation studies published between 2006 and 2013 as follows: TE obs = 2 × CV [%] + bias [%]. The CV was taken from published studies; the bias was estimated from the regression equation at 2 different concentration levels of measurands. To fulfill quality requirements, TE obs should be < TE a. More than 60% of analyzers showed TE obs < TE a for the hematology variables evaluated. For the CBC, TE obs was < TE a (data from 3 analyzers). This meta-analysis is considered a pilot study. Experts' requirements (TE obs < TE a) were fulfilled for most measurands except HGB (due to instrument-related bias for the ADVIA 2120) and platelet counts. Available data on the WBC differential count suggest an analytic bias, so nonstatistical quality control is recommended. © 2017 American Society for Veterinary Clinical Pathology.
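
The quoted total-error rule, TE obs = 2 × CV [%] + bias [%] compared against an allowable total error TE a, is simple arithmetic. A hedged sketch; the CV, bias, and TE a values below are invented for illustration, not taken from the meta-analysis.

```python
# Hedged sketch of the observed-total-error calculation from the abstract.
# Bias is taken as a magnitude, since total error grows with |bias|.

def te_obs(cv_pct, bias_pct):
    """Observed total error: TEobs = 2 * CV% + |bias%|."""
    return 2.0 * cv_pct + abs(bias_pct)

def meets_requirement(cv_pct, bias_pct, te_a_pct):
    """Quality requirement: TEobs must be below the allowable TEa."""
    return te_obs(cv_pct, bias_pct) < te_a_pct

print(te_obs(2.0, 3.5))                   # → 7.5
print(meets_requirement(2.0, 3.5, 10.0))  # → True
print(meets_requirement(5.0, 8.0, 10.0))  # → False
```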

  1. Analyzer-based imaging system performance in a synchrotron clinical environment: a feasibility study

    International Nuclear Information System (INIS)

    Arfelli, F.; Khromova, A.; Rigon, L.; Menk, R.H.; Dreossi, D.; Pinamonti, M.; Zanconati, F.

    2017-01-01

    X-ray phase contrast imaging arises from changes in the propagation direction of the radiant wave field when traversing the object, and it can yield higher contrast for soft tissues than conventional x-ray radiology based on attenuation. Commonly, intermediate steps are required to transform wave front modulations into intensity modulations measurable by the detection system. One of these phase contrast techniques is analyzer-based imaging (ABI), which utilizes an analyzer crystal as an angular filter with a bandwidth in the micro-radian regime, placed between the sample and the detector. Furthermore, employing appropriate algorithms, attenuation, refraction and scattering/dark-field images can be extracted, providing complementary information. The implementation of ABI requires X-ray optics with very high stability and micro-radian resolution. In return, this method is among the most sensitive of the phase contrast techniques. At the medical beamline of the Italian synchrotron ELETTRA, a patient room has been implemented in order to perform clinical mammography with free-space propagation phase contrast. In this work we have tested the feasibility of ABI in a preclinical set-up implementing the system in the patient room. High quality images of breast tissue samples are presented and compared to images acquired at a conventional mammography unit. The system has shown excellent stability and imaging performance.

  2. Problem-solving tools for analyzing system problems. The affinity map and the relationship diagram.

    Science.gov (United States)

    Lepley, C J

    1998-12-01

    The author describes how to use two management tools, an affinity map and a relationship diagram, to define and analyze aspects of a complex problem in a system. The affinity map identifies the key influencing elements of the problem, whereas the relationship diagram helps to identify the area that is the most important element of the issue. Managers can use the tools to draw a map of problem drivers, graphically display the drivers in a diagram, and use the diagram to develop a cause-and-effect relationship.
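
The relationship diagram's key-driver heuristic, treating the element with the most outgoing cause arrows as the primary driver, can be sketched as a small graph computation. The example problem and arrows below are invented, not taken from the article.

```python
# Hedged sketch: in a relationship diagram, the element with the most
# outgoing "cause" arrows is usually treated as the root driver.

arrows = [  # (cause, effect) pairs, one per arrow in the diagram
    ("staff shortage", "overtime"),
    ("staff shortage", "delayed charting"),
    ("staff shortage", "low morale"),
    ("overtime", "low morale"),
    ("delayed charting", "billing errors"),
]

def out_degree(arrows):
    """Count outgoing arrows for each element that causes something."""
    counts = {}
    for cause, _ in arrows:
        counts[cause] = counts.get(cause, 0) + 1
    return counts

degrees = out_degree(arrows)
driver = max(degrees, key=degrees.get)
print(driver)  # → staff shortage
```

The affinity map supplies the candidate elements; this step only ranks them once the arrows are drawn.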

  3. Development of a PWR CRDM [control rod drive mechanism] data-analyzing system

    International Nuclear Information System (INIS)

    Miyaguchi, Jinichi

    1989-01-01

    Control rod drive mechanisms (CRDMs) play an important role in the nuclear power plant, and their reliability impacts plant operation and reactor safety. The CRDM performance might decline if the CRDM has been operated for a long time. The CRDM's operation time is expected to increase significantly, depending on the variations of plant operation, so it is desirable to upgrade preventive maintenance of CRDMs and drive lines through periodic inspection and condition monitoring. Furthermore, in the case of CRDM malfunction, it is necessary to cope immediately with the trouble, based on technical judgment. The CRDM data-analyzing system has been developed in order to achieve highly reliable CRDMs by predicting malfunctions

  4. Industrial applications of formal methods to model, design and analyze computer systems

    CERN Document Server

    Craigen, Dan

    1995-01-01

    Formal methods are mathematically-based techniques, often supported by reasoning tools, that can offer a rigorous and effective way to model, design and analyze computer systems. The purpose of this study is to evaluate international industrial experience in using formal methods. The cases selected are representative of industrial-grade projects and span a variety of application domains. The study had three main objectives: · To better inform deliberations within industry and government on standards and regulations; · To provide an authoritative record on the practical experience of formal m

  5. Simulation Tools and Techniques for Analyzing the Impacts of Photovoltaic System Integration

    Science.gov (United States)

    Hariri, Ali

    utility simulation software. On the other hand, EMT simulation tools provide high accuracy and visibility over a wide bandwidth of frequencies at the expense of larger processing and memory requirements, limited network size, and long simulation time. Therefore, there is a gap in simulation tools and techniques that can efficiently and effectively identify potential PV impact. New planning simulation tools are needed in order to accommodate the simulation requirements of new integrated technologies in the electric grid. The dissertation at hand starts by identifying some of the potential impacts that are caused by high PV penetration. A phasor-based quasi-static time series (QSTS) analysis tool is developed in order to study the slow dynamics that are caused by the variations in the PV generation that lead to voltage fluctuations. Moreover, some EMT simulations are performed in order to study the impacts of PV systems on the electric network harmonic levels. These studies provide insights into the type and duration of certain impacts, as well as the conditions that may lead to adverse phenomena. In addition, these studies present an idea about the type of simulation tools that are sufficient for each type of study. After identifying some of the potential impacts, certain planning tools and techniques are proposed. The potential PV impacts may cause certain utilities to refrain from integrating PV systems into their networks. However, each electric network has a certain limit beyond which the impacts become substantial and may adversely interfere with the system operation and the equipment along the feeder; this limit is referred to as the hosting limit (or hosting capacity). Therefore, it is important for utilities to identify the PV hosting limit on a specific electric network in order to safely and confidently integrate the maximum possible PV systems. In the following dissertation, two approaches have been proposed for identifying the hosting limit: 1. Analytical

  6. Fiscal system analysis - contractual systems

    International Nuclear Information System (INIS)

    Kaiser, M.J.

    2006-01-01

    Production sharing contracts are one of the most popular forms of contractual system used in petroleum agreements around the world, but the manner in which the fiscal terms and contract parameters impact system measures is complicated and not well understood. The purpose of this paper is to quantify the influence of private and market uncertainty in contractual fiscal systems. A meta-modelling approach is employed that couples the results of a simulation model with regression analysis to construct numerical functionals that quantify the fiscal regime. Relationships are derived that specify how the present value, rate of return, and take statistics vary as a function of the system parameters. The deepwater Girassol field development in Angola is taken as a case study. (author)
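
The meta-modelling step, coupling simulation results with regression so that a fiscal measure becomes a closed-form function of a system parameter, might look like the following sketch. The (oil price, contractor take) pairs are invented stand-ins for simulator outputs, not Girassol data.

```python
# Hedged sketch of a regression meta-model over simulated outcomes:
# fit a line so take% becomes a numerical functional of price.

def linear_fit(xs, ys):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

price = [40.0, 50.0, 60.0, 70.0]  # $/bbl scenarios fed to the simulator
take = [22.0, 26.5, 31.0, 35.5]   # hypothetical simulated contractor take, %
m, c = linear_fit(price, take)
print(round(m, 2), round(c, 2))  # → 0.45 4.0 (take% ≈ 0.45 * price + 4.0)
```

A real study would fit several such functionals (present value, rate of return, take) against many contract parameters at once.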

  7. Solution identification and quantitative analysis of fiber-capacitive drop analyzer based on multivariate statistical methods

    Science.gov (United States)

    Chen, Zhe; Qiu, Zurong; Huo, Xinming; Fan, Yuming; Li, Xinghua

    2017-03-01

    A fiber-capacitive drop analyzer is an instrument which monitors a growing droplet to produce a capacitive opto-tensiotrace (COT). Each COT is an integration of fiber light intensity signals and capacitance signals and can reflect the unique physicochemical properties of a liquid. In this study, we propose a method for solution identification and quantitative concentration analysis based on multivariate statistics. Eight characteristic values are extracted from each COT. A series of COT characteristic values of training solutions at different concentrations composes a data library for that kind of solution. A two-stage linear discriminant analysis is applied to the solution libraries to establish discriminant functions, by which test solutions can be discriminated. After determining the variety of a test solution, a Spearman correlation test and principal component analysis are used to filter and reduce the dimensions of the eight characteristic values, producing a new representative parameter. A cubic spline interpolation function is built between the parameters and concentrations, from which the concentration of the test solution can be calculated. Methanol, ethanol, n-propanol, and saline solutions are taken as experimental subjects in this paper. For each solution, nine or ten different concentrations are chosen as the standard library, and the other two concentrations compose the test group. By using the methods mentioned above, all eight test solutions are correctly identified and the average relative error of quantitative analysis is 1.11%. The proposed method is feasible; it enlarges the applicable scope of liquid recognition based on the COT and improves the precision of quantitative concentration analysis.
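
The two-stage pipeline described above can be sketched with two deliberate simplifications: nearest-centroid classification stands in for the linear discriminant analysis, and linear interpolation stands in for the cubic spline. All feature and calibration values are invented for illustration.

```python
# Hedged sketch: identify the solution type from COT features, then map
# a representative parameter to a concentration via a calibration table.
import math

# Stage 1: classify a feature vector (nearest centroid as a stand-in for LDA).
centroids = {"ethanol": [0.8, 1.2], "methanol": [0.3, 2.1]}

def classify(features):
    return min(centroids, key=lambda k: math.dist(features, centroids[k]))

# Stage 2: interpolate concentration from the representative parameter
# (linear interpolation as a stand-in for the cubic spline).
calib = [(0.10, 5.0), (0.20, 10.0), (0.40, 20.0)]  # (parameter, conc %)

def concentration(param):
    for (p0, c0), (p1, c1) in zip(calib, calib[1:]):
        if p0 <= param <= p1:
            return c0 + (c1 - c0) * (param - p0) / (p1 - p0)
    raise ValueError("parameter outside calibration range")

print(classify([0.75, 1.3]))  # → ethanol
print(concentration(0.30))    # → 15.0
```

The paper's full pipeline additionally filters the eight features with a Spearman test and PCA before the calibration step.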

  8. Development of a model system to analyze chondrogenic differentiation of mesenchymal stem cells

    Science.gov (United States)

    Ruedel, Anke; Hofmeister, Simone; Bosserhoff, Anja-Katrin

    2013-01-01

    High-density cell culture is widely used for the analysis of cartilage development of human mesenchymal stem cells (HMSCs) in vitro. Several cell culture systems, such as micromass, pellet culture and alginate culture, are applied by groups in the field to induce chondrogenic differentiation of HMSCs. A drawback of all these model systems is the large number of cells necessary for the experiments. Further, handling of large experimental approaches is difficult because culturing is performed e.g. in 15 ml tubes. Therefore, we aimed to develop a new model system based on “hanging drop” cultures using 10- to 100-fold fewer cells. Here, we demonstrate that differentiation of chondrogenic cells was induced as previously shown in other model systems. Real-time RT-PCR analysis demonstrated that Collagen type II and MIA/CD-RAP were upregulated during culturing, whereas induction of hypertrophic markers like Collagen type X and AP-2 epsilon required treatment with TGF beta. To further test the system, siRNA against Sox9 was used and the effects on chondrogenic gene expression were evaluated. In summary, the hanging drop culture system was determined to be a promising tool for in vitro chondrogenic studies. PMID:24294400

  9. Voice preprocessing system incorporating a real-time spectrum analyzer with programmable switched-capacitor filters

    Science.gov (United States)

    Knapp, G.

    1984-01-01

    As part of a speaker verification program for BISS (Base Installation Security System), a test system is being designed with a flexible preprocessing system for the evaluation of voice spectrum/verification algorithm related problems. The main part of this report covers the design, construction, and testing of a voice analyzer with 16 integrating real-time frequency channels ranging from 300 Hz to 3 KHz. The bandpass filter response of each channel is programmable by NMOS switched capacitor quad filter arrays. Presently, the accuracy of these units is limited to a moderate precision by the finite steps of programming. However, repeatability of characteristics between filter units and sections seems to be excellent for the implemented fourth-order Butterworth bandpass responses. We obtained a 0.1 dB linearity error of signal detection and measured a signal-to-noise ratio of approximately 70 dB. The preprocessing system discussed includes preemphasis filter design, gain normalizer design, and data acquisition system design as well as test results.
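
One channel of such a filter bank can be sketched as follows. A second-order band-pass biquad (RBJ audio-cookbook form) stands in for the report's fourth-order Butterworth sections, and the centre frequency, Q, and sample rate are illustrative choices, not values from the report.

```python
# Hedged sketch of a single programmable band-pass channel and a check
# of its magnitude response on the unit circle.
import cmath
import math

def bandpass_biquad(f0, q, fs):
    """RBJ constant-peak-gain band-pass coefficients (b, a), a[0] == 1."""
    w0 = 2 * math.pi * f0 / fs
    alpha = math.sin(w0) / (2 * q)
    b = [alpha, 0.0, -alpha]
    a = [1 + alpha, -2 * math.cos(w0), 1 - alpha]
    return [x / a[0] for x in b], [x / a[0] for x in a]

def gain_at(b, a, f, fs):
    """|H(z)| evaluated at z = e^{j*2*pi*f/fs}."""
    z = cmath.exp(-1j * 2 * math.pi * f / fs)
    num = b[0] + b[1] * z + b[2] * z * z
    den = a[0] + a[1] * z + a[2] * z * z
    return abs(num / den)

b, a = bandpass_biquad(f0=300.0, q=4.0, fs=16000.0)
print(round(gain_at(b, a, 300.0, 16000.0), 3))  # → 1.0 (unity gain at centre)
print(gain_at(b, a, 3000.0, 16000.0) < 0.1)     # → True (off-band attenuated)
```

A 16-channel bank would instantiate this with 16 centre frequencies between 300 Hz and 3 kHz, mirroring the programmable switched-capacitor arrays.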

  10. Application of grey model on analyzing the passive natural circulation residual heat removal system of HTR-10

    Institute of Scientific and Technical Information of China (English)

    ZHOU Tao; PENG Changhong; WANG Zenghui; WANG Ruosu

    2008-01-01

    Using grey correlation analysis, it can be concluded that the reactor pressure vessel wall temperature has the strongest effect on the passive residual heat removal system in the HTR (High Temperature gas-cooled Reactor), the chimney height takes second place, and the influence of the inlet air temperature of the chimney is the least. This conclusion agrees with that obtained by the traditional method. According to grey model theory, GM(1,1) and GM(1,3) models are built based on the inlet air temperature of the chimney, the pressure vessel temperature and the chimney height. The effect of these three factors on the heat removal power is then studied in this paper. The model plays an important role in data prediction and offers a new method for studying the heat removal power. The method can provide a new theoretical analysis of the passive residual heat removal system of the HTR.
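
The GM(1,1) model mentioned above has a compact closed form: accumulate the series, fit the grey differential equation by least squares, and difference the time-response function back to forecast the next point. The sketch below uses an invented input series, not HTR-10 data.

```python
# Hedged sketch of GM(1,1) grey prediction with a hand-rolled 2x2
# least-squares solve (no external libraries).
import math

def gm11_forecast(x0):
    n = len(x0)
    x1 = [sum(x0[: i + 1]) for i in range(n)]             # accumulated series
    z = [0.5 * (x1[k] + x1[k - 1]) for k in range(1, n)]  # background values
    y = x0[1:]
    # Normal equations for x0(k) + a*z(k) = b, k = 2..n.
    szz = sum(v * v for v in z)
    sz = sum(z)
    szy = sum(v * w for v, w in zip(z, y))
    sy = sum(y)
    m = n - 1
    det = szz * m - sz * sz
    a = (sz * sy - m * szy) / det   # development coefficient
    b = (szz * sy - sz * szy) / det  # grey input
    # Time-response function of the accumulated series, then difference.
    def x1_hat(k):
        return (x0[0] - b / a) * math.exp(-a * k) + b / a
    return x1_hat(n) - x1_hat(n - 1)  # one-step-ahead forecast of x0

series = [100.0, 104.0, 108.2, 112.5, 117.0]  # roughly 4% growth per step
print(round(gm11_forecast(series), 1))
```

GM(1,3) extends the same idea with two extra driving series (here, the other two influence factors).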

  11. Development of a visual control and display system for the SMART plant analyzer

    International Nuclear Information System (INIS)

    Kang, Han Ok; Yoon, Ju Hyeon; Seo, Jae Kwang; Lee, Doo Jeong

    2000-01-01

    A Visual Control and Display System (VCDS) for the SMART plant analyzer has been developed using the MMS simulation tools. The SMART plant analyzer consists of the VCDS and the MMS SMART model. The MMS SMART model is a numerical simulation model for the SMART plant and is composed of the MMS real-time modules and control blocks. It covers the whole plant, including the primary, secondary and auxiliary systems. The developed VCDS is a Graphical User Interface (GUI) that runs in synchronization with the SMART model. The VCDS consists of the MMS simulation tools and seven control and display screens. The VCDS provides easy means for the control and display of the SMART model status and allows users to display and change a specified list of model variables and transient scenarios interactively through the MMS simulation tools. The control and display screens are developed with Visual Basic 6.0 and MMI32 ActiveX controls and can be executed on several TCP/IP-networked computers simultaneously. The developed VCDS can be utilized for engineering simulation of SMART plant operation, and for control logic and operational procedure developments

  12. Analyzing Multiple-Choice Questions by Model Analysis and Item Response Curves

    Science.gov (United States)

    Wattanakasiwich, P.; Ananta, S.

    2010-07-01

    In physics education research, the main goal is to improve physics teaching so that most students understand physics conceptually and are able to apply concepts in solving problems. Therefore many multiple-choice instruments were developed to probe students' conceptual understanding of various topics. Two techniques, model analysis and item response curves, were used to analyze students' responses to the Force and Motion Conceptual Evaluation (FMCE). For this study, FMCE data from more than 1000 students at Chiang Mai University were collected over the past three years. With model analysis, we can obtain students' alternative knowledge and the probabilities that students use such knowledge in a range of equivalent contexts. The model analysis consists of two algorithms, the concentration factor and model estimation. This paper only presents results from using the model estimation algorithm to obtain a model plot. The plot helps to identify whether a class model state lies in the misconception region. An item response curve (IRC), derived from item response theory, is a plot of the percentage of students selecting a particular choice against their total score. Pros and cons of both techniques are compared and discussed.
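
Constructing an item response curve is straightforward: for one item, group students by total score and compute the fraction selecting each choice within every group. The response data below are invented for illustration.

```python
# Hedged sketch of building an item response curve from raw responses.
from collections import defaultdict

# (total score, choice selected on the item of interest)
responses = [(2, "A"), (2, "B"), (5, "A"), (5, "A"), (8, "A"), (8, "A")]

def item_response_curve(responses, choice):
    """Fraction of students at each total score who picked `choice`."""
    picked, totals = defaultdict(int), defaultdict(int)
    for score, c in responses:
        totals[score] += 1
        if c == choice:
            picked[score] += 1
    return {s: picked[s] / totals[s] for s in sorted(totals)}

print(item_response_curve(responses, "A"))  # → {2: 0.5, 5: 1.0, 8: 1.0}
```

For the correct choice the curve should rise with total score; a distractor whose curve peaks at mid scores flags a common misconception.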

  13. Analyzer-based phase-contrast imaging system using a micro focus x-ray source

    International Nuclear Information System (INIS)

    Zhou, Wei; Majidi, Keivan; Brankov, Jovan G.

    2014-01-01

    Here we describe a new in-laboratory analyzer-based phase-contrast imaging (ABI) instrument using a conventional X-ray tube source (CXS) aimed at bio-medical imaging applications. Phase-contrast imaging allows visualization of soft tissue details usually obscured in conventional X-ray imaging. The ABI system design and major features are described in detail. The key advantage of the presented system, over the few existing CXS ABI systems, is that it does not require high precision components, i.e., CXS, X-ray detector, and electro-mechanical components. To overcome the main problem introduced by these components, identified as temperature stability, the system components are kept at a constant temperature inside three enclosures, thus minimizing the electrical and mechanical thermal drifts. This is achieved by using thermoelectric (Peltier) cooling/heating modules that are easy to control precisely. For the CXS we utilized a microfocus X-ray source with a tungsten (W) anode. In addition, the proposed system eliminates tungsten's multiple spectral lines by selecting the monochromator crystal size appropriately, thereby eliminating the need for the costly mismatched two-crystal monochromator. The system imaging was fine-tuned for the tungsten Kα1 line with an energy of 59.3 keV, since it has been shown to be of great clinical significance by a number of researchers at synchrotron facilities. In this way a laboratory system that can be used for evaluating and quantifying tissue properties, initially explored at synchrotron facilities, would be of great interest to a larger research community. To demonstrate the imaging capability of our instrument we use a chicken thigh tissue sample.

  14. Analyzer-based phase-contrast imaging system using a micro focus x-ray source

    Science.gov (United States)

    Zhou, Wei; Majidi, Keivan; Brankov, Jovan G.

    2014-08-01

    Here we describe a new in-laboratory analyzer-based phase-contrast imaging (ABI) instrument using a conventional X-ray tube source (CXS) aimed at biomedical imaging applications. Phase-contrast imaging allows visualization of soft tissue details usually obscured in conventional X-ray imaging. The ABI system design and major features are described in detail. The key advantage of the presented system over the few existing CXS ABI systems is that it does not require high-precision components, i.e., CXS, X-ray detector, and electro-mechanical components. To overcome the main problem introduced by these components, identified as temperature stability, the system components are kept at a constant temperature inside three enclosures, thus minimizing electrical and mechanical thermal drifts. This is achieved by using thermoelectric (Peltier) cooling/heating modules that are easy to control precisely. For the CXS we utilized a microfocus X-ray source with a tungsten (W) anode. In addition, the proposed system eliminates tungsten's multiple spectral lines by selecting the monochromator crystal size appropriately, thereby eliminating the need for a costly mismatched two-crystal monochromator. The system imaging was fine-tuned for the tungsten Kα1 line, with an energy of 59.3 keV, since it has been shown to be of great clinical significance by a number of researchers at synchrotron facilities. In this way, a laboratory system that can be used for evaluating and quantifying tissue properties initially explored at synchrotron facilities would be of great interest to a larger research community. To demonstrate the imaging capability of our instrument we use a chicken thigh tissue sample.

  15. Analyzer-based phase-contrast imaging system using a micro focus x-ray source

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Wei [BME Department, Illinois Institute of Technology, Chicago, Illinois 60616 (United States); Majidi, Keivan; Brankov, Jovan G., E-mail: brankov@iit.edu [ECE Department, Illinois Institute of Technology, Chicago, Illinois 60616 (United States)

    2014-08-15

    Here we describe a new in-laboratory analyzer-based phase-contrast imaging (ABI) instrument using a conventional X-ray tube source (CXS) aimed at biomedical imaging applications. Phase-contrast imaging allows visualization of soft tissue details usually obscured in conventional X-ray imaging. The ABI system design and major features are described in detail. The key advantage of the presented system over the few existing CXS ABI systems is that it does not require high-precision components, i.e., CXS, X-ray detector, and electro-mechanical components. To overcome the main problem introduced by these components, identified as temperature stability, the system components are kept at a constant temperature inside three enclosures, thus minimizing electrical and mechanical thermal drifts. This is achieved by using thermoelectric (Peltier) cooling/heating modules that are easy to control precisely. For the CXS we utilized a microfocus X-ray source with a tungsten (W) anode. In addition, the proposed system eliminates tungsten's multiple spectral lines by selecting the monochromator crystal size appropriately, thereby eliminating the need for a costly mismatched two-crystal monochromator. The system imaging was fine-tuned for the tungsten Kα1 line, with an energy of 59.3 keV, since it has been shown to be of great clinical significance by a number of researchers at synchrotron facilities. In this way, a laboratory system that can be used for evaluating and quantifying tissue properties initially explored at synchrotron facilities would be of great interest to a larger research community. To demonstrate the imaging capability of our instrument we use a chicken thigh tissue sample.

  16. A novel approach to analyzing fMRI and SNP data via parallel independent component analysis

    Science.gov (United States)

    Liu, Jingyu; Pearlson, Godfrey; Calhoun, Vince; Windemuth, Andreas

    2007-03-01

    There is current interest in understanding genetic influences on brain function in both the healthy and the disordered brain. Parallel independent component analysis, a new method for analyzing multimodal data, is proposed in this paper and applied to functional magnetic resonance imaging (fMRI) and a single nucleotide polymorphism (SNP) array. The method aims to identify the independent components of each modality and the relationship between the two modalities. We analyzed 92 participants, including 29 schizophrenia (SZ) patients, 13 unaffected SZ relatives, and 50 healthy controls. We found a correlation of 0.79 between one fMRI component and one SNP component. The fMRI component consists of activations in cingulate gyrus, multiple frontal gyri, and superior temporal gyrus. The related SNP component is contributed to significantly by 9 SNPs located in sets of genes, including those coding for apolipoprotein A-I and C-III, malate dehydrogenase 1, and the gamma-aminobutyric acid alpha-2 receptor. A significant difference in the presence of this SNP component is found between the SZ group (SZ patients and their relatives) and the control group. In summary, we constructed a framework to identify the interactions between brain functional and genetic information; our findings provide new insight into understanding genetic influences on brain function in a common mental disorder.

  17. Computational method and system for modeling, analyzing, and optimizing DNA amplification and synthesis

    Science.gov (United States)

    Vandersall, Jennifer A.; Gardner, Shea N.; Clague, David S.

    2010-05-04

    A computational method and computer-based system of modeling DNA synthesis for the design and interpretation of PCR amplification, parallel DNA synthesis, and microarray chip analysis. The method and system include modules that address the bioinformatics, kinetics, and thermodynamics of DNA amplification and synthesis. Specifically, the steps of DNA selection, as well as the kinetics and thermodynamics of DNA hybridization and extensions, are addressed, which enable the optimization of the processing and the prediction of the products as a function of DNA sequence, mixing protocol, time, temperature and concentration of species.

  18. Quality requirements for veterinary hematology analyzers in small animals-a survey about veterinary experts' requirements and objective evaluation of analyzer performance based on a meta-analysis of method validation studies: bench top hematology analyzer.

    Science.gov (United States)

    Cook, Andrea M; Moritz, Andreas; Freeman, Kathleen P; Bauer, Natali

    2016-09-01

    Scarce information exists about quality requirements and the objective evaluation of performance of large veterinary bench top hematology analyzers. The study was aimed at comparing the observed total error (TEobs) derived from a meta-analysis of published method validation data to the total allowable error (TEa) for veterinary hematology variables in small animals based on experts' opinions. Ideally, TEobs should be less than TEa. The TEobs of bench top hematology analyzers (ADVIA 2120, Sysmex XT2000iV, and CellDyn 3500) was calculated based on method validation studies published between 2005 and 2013 (n = 4). The percent TEobs = 2 * CV (%) + bias (%). The CV was derived from published studies except for the ADVIA 2120 (internal data), and bias was estimated from the regression equation. A total of 41 veterinary experts (19 diplomates, 8 residents, 10 postgraduate students, 4 anonymous specialists) responded. The proposed range of TEa was wide, but generally ≤ 20%. The TEobs was < TEa for all variables and analyzers except for canine and feline HGB (high bias, low CV) and platelet counts (high bias, high CV). Overall, veterinary bench top analyzers fulfilled experts' requirements except for HGB, due to method-related bias, and platelet counts, due to known preanalytic/analytic issues. © 2016 American Society for Veterinary Clinical Pathology.
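
    The quality-goal rule in this abstract reduces to a one-line formula, TEobs(%) = 2 * CV(%) + bias(%), checked against TEa. The sketch below applies it with invented analyzer figures; the 20% default TEa is only an assumption reflecting the experts' "generally ≤ 20%" range, not a value from the cited studies.

```python
# Observed total error vs. allowable total error, per the rule above.
# All numeric inputs below are illustrative, not from the meta-analysis.

def total_error_observed(cv_pct: float, bias_pct: float) -> float:
    """Observed total error in percent: TEobs = 2 * CV + |bias|."""
    return 2.0 * cv_pct + abs(bias_pct)

def meets_requirement(cv_pct: float, bias_pct: float, tea_pct: float = 20.0) -> bool:
    """True if TEobs is within the allowable total error TEa (assumed 20%)."""
    return total_error_observed(cv_pct, bias_pct) <= tea_pct

# Hypothetical platelet-count performance: high bias, moderate CV.
print(total_error_observed(5.0, 12.0))   # 22.0
print(meets_requirement(5.0, 12.0))      # False: exceeds the 20% goal
print(meets_requirement(2.0, 4.0))       # True
```

    This mirrors why the platelet counts in the study failed the goal: a large bias term dominates TEobs even when precision (CV) is acceptable.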

  19. Imaging systems and algorithms to analyze biological samples in real-time using mobile phone microscopy.

    Science.gov (United States)

    Shanmugam, Akshaya; Usmani, Mohammad; Mayberry, Addison; Perkins, David L; Holcomb, Daniel E

    2018-01-01

    Miniaturized imaging devices have pushed the boundaries of point-of-care imaging, but existing mobile-phone-based imaging systems do not exploit the full potential of smart phones. This work demonstrates the use of simple imaging configurations to deliver superior image quality and the ability to handle a wide range of biological samples. Results presented in this work are from analysis of fluorescent beads under fluorescence imaging, as well as helminth eggs and freshwater mussel larvae under white light imaging. To demonstrate versatility of the systems, real time analysis and post-processing results of the sample count and sample size are presented in both still images and videos of flowing samples.

  20. Model Based Reasoning by Introductory Students When Analyzing Earth Systems and Societal Challenges

    Science.gov (United States)

    Holder, L. N.; Herbert, B. E.

    2014-12-01

    Understanding how students use their conceptual models to reason about societal challenges such as natural hazard risk assessment, environmental policy and management, and energy resources can improve the design of instructional activities that directly impact student motivation and literacy. To address this question, we created four laboratory exercises for an introductory physical geology course at Texas A&M University that engage students in authentic scientific practices by using real-world problems and issues that affect societies, based on the theory of situated cognition. Our case-study design allows us to investigate the various ways that students utilize model-based reasoning to identify and propose solutions to societally relevant issues. In each of the four interventions, approximately 60 students in three sections of introductory physical geology were expected to represent and evaluate scientific data, make evidence-based claims about the data trends, use those claims to express conceptual models, and use their models to analyze societal challenges. Throughout each step of the laboratory exercise, students were asked to justify their claims, models, and data representations using evidence and through argumentation with peers. Cognitive apprenticeship was the foundation for instruction used to scaffold students, so that in the first exercise they are given a partially completed model and in the last exercise they are asked to generate a conceptual model on their own. Student artifacts, including representations of earth systems, representations of scientific data, verbal and written explanations of models and scientific arguments, and written solutions to specific societal issues or environmental problems surrounding earth systems, were analyzed through the use of a rubric that modeled authentic expertise, and students were sorted into three categories. Written artifacts were examined to identify student argumentation and

  1. MACRO1: a code to test a methodology for analyzing nuclear-waste management systems

    International Nuclear Information System (INIS)

    Edwards, L.L.

    1979-01-01

    The code is primarily a manager of probabilistic data and deterministic mathematical models. The user determines the desired aggregation of the available models into a composite model of a physical system. MACRO1 then propagates the finite probability distributions over the inputs to the model to finite probability distributions over the outputs. MACRO1 has been applied to a sample analysis of a nuclear-waste repository, and its results compared satisfactorily with previously obtained Monte Carlo statistics
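
    The propagation step described above, in which finite probability distributions of the inputs are pushed through a deterministic model to yield a finite distribution over the outputs, can be sketched as follows. The release model and input distributions below are invented for illustration; MACRO1's actual repository models are not reproduced here.

```python
from itertools import product

def propagate(model, *input_dists):
    """Push finite probability distributions through a deterministic model.

    Each input distribution is a list of (value, probability) pairs; the
    result maps each distinct model output to its total probability.
    """
    out = {}
    for combo in product(*input_dists):
        values = [v for v, _ in combo]
        prob = 1.0
        for _, p in combo:
            prob *= p
        y = model(*values)
        out[y] = out.get(y, 0.0) + prob  # aggregate colliding outputs
    return out

# Hypothetical two-input release model: release = inventory * leach_fraction.
inventory = [(100.0, 0.5), (200.0, 0.5)]    # amount, with probabilities
leach_fraction = [(0.5, 0.8), (0.25, 0.2)]  # fraction, with probabilities
dist = propagate(lambda inv, lf: inv * lf, inventory, leach_fraction)
print(dist)  # output probabilities sum to 1.0
```

    Note that distinct input combinations can map to the same output (here 100 * 0.5 and 200 * 0.25 both give 50), so their probabilities are summed, exactly the kind of bookkeeping a distribution-propagation manager must do.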

  2. On analyzing problems of distributed systems and current internet in front of the future internet architectures

    Directory of Open Access Journals (Sweden)

    Antonio Marcos Alberti

    2016-11-01

    Full Text Available Nowadays, there are hundreds of worldwide projects underway to redesign both the communication protocols and the architecture of the Internet. These initiatives are collectively called “future Internet” research. Most of these initiatives rely on existing distributed systems, which often limit or even prevent the development of “clean slate” solutions. The main reason is that the great majority of distributed systems are tightly linked with the TCP/IP protocol stack. In this article, we provide a first-glance discussion of the relationships between future Internet and distributed systems research, focusing on dependencies and similar requirements among these areas. From this analysis, it becomes evident that many of the future Internet requirements (and open challenges) are repeated in the distributed systems landscape. Although there are many studies on both research fronts individually, the study of the key challenges of the future Internet when addressing distributed systems requirements is a topic not yet explored in contemporary research. This paper aims at determining the gaps and requirements the future Internet must fulfill in order to support future distributed systems. To support this objective, a set of design metrics is identified and a convergent design space is proposed.

  3. Comparison and verification of two computer programs used to analyze ventilation systems under accident conditions

    International Nuclear Information System (INIS)

    Hartig, S.H.; Wurz, D.E.; Arnitz, T.; Ruedinger, V.

    1985-01-01

    Two computer codes, TVENT and EVENT, which were developed at the Los Alamos National Laboratory (LANL) for the analysis of ventilation systems, have been modified to model air-cleaning systems that include active components with time-dependent flow-resistance characteristics. With both modified programs, fluid-dynamic transients were calculated for a test facility used to simulate accident conditions in air-cleaning systems. Experiments were performed in the test facility whereby flow and pressure transients were generated with the help of two quick-actuating air-stream control valves. The numerical calculations are compared with the test results. Although EVENT makes use of a more complex theoretical flow model than TVENT, the numerical simulations of both codes were found to be very similar for the flow conditions studied and to closely follow the experimental results

  4. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    Science.gov (United States)

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated against subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909. The new BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.
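
    The validation step reported above is essentially a Pearson correlation between objective BSI values and subjective ratings. A minimal sketch with made-up scores (the paper's actual data are not reproduced; only the r = 0.909 summary statistic is reported in the abstract):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

bsi_scores = [0.95, 0.80, 0.60, 0.40, 0.20]  # hypothetical objective BSI values
harris     = [4, 4, 3, 2, 1]                 # hypothetical subjective Harris scores
r = pearson(bsi_scores, harris)
print(round(r, 3))  # close to 1 for these monotonically related scores
```

    A high r on held-out patients (the study's second phase) is what justifies treating the objective score as a stand-in for subjective panels.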

  5. Analyzing Mathematics Beliefs of Pre-Service Teachers Using Confirmatory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Mazlini Adnan

    2011-12-01

    Full Text Available Mathematics beliefs play an important role in enhancing the quality and effectiveness of teaching and learning. This study analyzes the mathematics beliefs of 317 pre-service teachers from six Higher Education Institutions (HEIs, government public universities) who were randomly selected to participate in this study. Questionnaires consisting of twenty-three items were given to the respondents during the data collection process. The validation of the items was done using confirmatory factor analysis (CFA). In order to obtain a model fit for the measurement model of mathematics beliefs, several fit indices, such as CMIN/DF, GFI, AGFI, IFI, NFI, CFI, TLI and RMSEA, were used. Constructivist beliefs and traditional beliefs were identified as the contributing factors in the model. The analysis also revealed that mathematics beliefs consist of structures of two hidden variables. The correlation between the two variables (constructivist beliefs and traditional beliefs) is at a moderate level. Hence, pre-service teachers should be able to recognize their type of mathematics beliefs in order to become effective mathematics teachers.

  6. Spectral map-analysis: a method to analyze gene expression data

    OpenAIRE

    Bijnens, Luc J.M.; Lewi, Paul J.; Göhlmann, Hinrich W.; Molenberghs, Geert; Wouters, Luc

    2004-01-01

    bioinformatics; biplot; correspondence factor analysis; data mining; data visualization; gene expression data; microarray data; multivariate exploratory data analysis; principal component analysis; Spectral map analysis

  7. Analyzing the Drivers of Advanced Sustainable Manufacturing System Using AHP Approach

    Directory of Open Access Journals (Sweden)

    K. Madan Shankar

    2016-08-01

    Full Text Available A number of current manufacturing sectors are striving hard to introduce innovative long-term strategies into their operations. As a result, many scholarly studies have found it fruitful to investigate advanced manufacturing strategies such as agile, computer-integrated, and cellular manufacturing. Through the example of downstream cases, manufacturing sectors have learned that financial benefits garnered through automated technologies cannot be counted on as a sole measure to ensure their success in today’s competitive and fluctuating marketplaces. The objective of this study is to integrate those advanced techniques with sustainable operations, to promote advanced sustainable manufacturing so that manufacturing sectors can thrive even in uncertain markets. To establish this connection, this study analyzes the drivers of advanced sustainable manufacturing through a proposed framework validated through a case study in India. Common drivers are collected from the literature, calibrated with opinions from experts, and analyzed through the analytical hierarchy process (AHP), which is a multi-criteria decision making (MCDM) approach. This study reveals that quality is the primary driver that pressures manufacturing sectors to adopt advanced sustainable manufacturing. Manufacturers can easily note the top-ranked driver and adopt it to soundly implement advanced sustainable manufacturing. In addition, some key future scopes are explored along with possible recommendations for effective implementation of advanced sustainable manufacturing systems.

  8. A Fundamental Scale of Descriptions for Analyzing Information Content of Communication Systems

    Directory of Open Access Journals (Sweden)

    Gerardo Febres

    2015-03-01

    Full Text Available The complexity of the description of a system is a function of the entropy of its symbolic description. Prior to computing the entropy of the system’s description, an observation scale has to be assumed. In texts written in artificial and natural languages, typical scales are binary, characters, and words. However, considering languages as structures built around a certain preconceived set of symbols, like words or characters, limits the level of complexity that can be revealed analytically. This study introduces the notion of the fundamental description scale to analyze the essence of the structure of a language. The concept of the Fundamental Scale is tested for English and musical instrument digital interface (MIDI) music texts using an algorithm developed to split a text into a collection of sets of symbols that minimizes the observed entropy of the system. This Fundamental Scale reflects more details of the complexity of the language than bits, characters, or words do. Results show that the Fundamental Scale allows completely different languages, such as English and MIDI-coded music, to be compared with regard to their structural entropy. This comparative power facilitates the study of the complexity of the structure of different communication systems.
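
    The entropy-versus-scale idea above can be made concrete. The sketch below computes the Shannon entropy of one text at two conventional observation scales (characters and words); the fundamental-scale search itself, a minimization over candidate symbol sets, is not reproduced here.

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy (bits per symbol) of a sequence of symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

text = "the quick brown fox jumps over the lazy dog"
h_chars = entropy(list(text))   # character scale
h_words = entropy(text.split()) # word scale
print(h_chars, h_words)
```

    The same text yields different entropies at different scales, which is exactly why a scale must be fixed (or, in the paper's approach, optimized) before comparing the structural complexity of languages.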

  9. How Unstable Are Complex Financial Systems? Analyzing an Inter-bank Network of Credit Relations

    Science.gov (United States)

    Sinha, Sitabhra; Thess, Maximilian; Markose, Sheri

    The recent worldwide economic crisis of 2007-09 has focused attention on the need to analyze systemic risk in complex financial networks. We investigate the problem of robustness of such systems in the context of the general theory of dynamical stability in complex networks and, in particular, how the topology of connections influence the risk of the failure of a single institution triggering a cascade of successive collapses propagating through the network. We use data on bilateral liabilities (or exposure) in the derivatives market between 202 financial intermediaries based in USA and Europe in the last quarter of 2009 to empirically investigate the network structure of the over-the-counter (OTC) derivatives market. We observe that the network exhibits both heterogeneity in node properties and the existence of communities. It also has a prominent core-periphery organization and can resist large-scale collapse when subjected to individual bank defaults (however, failure of any bank in the core may result in localized collapse of the innermost core with substantial loss of capital) but is vulnerable to system-wide breakdown as a result of an accompanying liquidity crisis.

  10. Development of Data Storage System for Portable Multichannel Analyzer using SD Card

    International Nuclear Information System (INIS)

    Suksompong, Tanate; Ngernvijit, Narippawaj; Sudprasert, Wanwisa

    2009-07-01

    Full text: The development of a data storage system for a portable multichannel analyzer (MCA) focused on the application of an SD card as a storage device, instead of older devices that could not easily extend their capacity. The work consisted of two parts: the first part was the study of pulse detection, by designing the input pulse detecting circuit. The second part dealt with the accuracy testing of the data storage system for the portable MCA, consisting of the design of the connecting circuit between the microcontroller and the SD card, the transfer of input pulse data to the SD card, and the ability of the data storage system for radiation detection. It was found that the input pulse detecting circuit could detect the input pulse at its maximum voltage, after which the signal was transferred to the microcontroller for data processing. The microcontroller could connect to the SD card via SPI mode. The portable MCA could correctly verify input signals ranging from 0.2 to 5.0 volts. The SD card could store the data as an .xls file, which could easily be accessed by compatible software such as Microsoft Excel
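
    The MCA's core step, mapping a detected pulse amplitude in the 0.2-5.0 V range onto a spectrum channel, can be sketched as a linear binning. The channel count and the pulse heights below are illustrative assumptions, not the actual RPT hardware configuration.

```python
# Toy multichannel-analyzer binning: amplitude (volts) -> channel number.
V_MIN, V_MAX, N_CHANNELS = 0.2, 5.0, 1024  # assumed 1024-channel spectrum

def to_channel(volts: float) -> int:
    """Linear amplitude-to-channel conversion; out-of-range pulses rejected."""
    if not (V_MIN <= volts <= V_MAX):
        return -1  # below threshold or over-range
    frac = (volts - V_MIN) / (V_MAX - V_MIN)
    return min(int(frac * N_CHANNELS), N_CHANNELS - 1)

spectrum = [0] * N_CHANNELS
for pulse in [0.25, 1.0, 1.0, 4.9, 5.5, 0.1]:  # simulated pulse heights (V)
    ch = to_channel(pulse)
    if ch >= 0:
        spectrum[ch] += 1

print(sum(spectrum))  # 4 pulses accepted; the 5.5 V and 0.1 V pulses rejected
```

    In the real system the histogram would accumulate in the microcontroller and be flushed to the SD card as a spreadsheet file.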

  11. A Comparison of Geographic Information Systems, Complex Networks, and Other Models for Analyzing Transportation Network Topologies

    Science.gov (United States)

    Alexandrov, Natalia (Technical Monitor); Kuby, Michael; Tierney, Sean; Roberts, Tyler; Upchurch, Christopher

    2005-01-01

    This report reviews six classes of models that are used for studying transportation network topologies. The report is motivated by two main questions. First, what can the "new science" of complex networks (scale-free, small-world networks) contribute to our understanding of transport network structure, compared to more traditional methods? Second, how can geographic information systems (GIS) contribute to studying transport networks? The report defines terms that can be used to classify different kinds of models by their function, composition, mechanism, spatial and temporal dimensions, certainty, linearity, and resolution. Six broad classes of models for analyzing transport network topologies are then explored: GIS; static graph theory; complex networks; mathematical programming; simulation; and agent-based modeling. Each class of models is defined and classified according to the attributes introduced earlier. The paper identifies some typical types of research questions about network structure that have been addressed by each class of model in the literature.

  12. Omega-mode perturbation theory and reactor kinetics for analyzing accelerator-driven subcritical systems

    International Nuclear Information System (INIS)

    Ren-Tai, Chiang

    2003-01-01

    An ω-mode first-order perturbation theory is developed for analyzing the time- and space-dependent neutron behavior in Accelerator-Driven Subcritical Systems (ADSS). The generalized point-kinetics equations are systematically derived using the ω-mode first-order perturbation theory and the Fredholm Alternative Theorem. Seven sets of ω-mode eigenvalues exist when six groups of delayed neutrons are used, and all ω eigenvalues are negative in ADSS. Seven ω-mode adjoint and forward eigenfunctions are employed to form the point-kinetic parameters. The neutron flux is expressed as a linear combination of the products of seven ω-eigenvalue-mode shape functions and their corresponding time functions, up to first-order terms, and the lowest negative ω-eigenvalue mode is the dominant mode. (author)

  13. Bi-Directional Brillouin Optical Time Domain Analyzer System for Long Range Distributed Sensing.

    Science.gov (United States)

    Guo, Nan; Wang, Liang; Wang, Jie; Jin, Chao; Tam, Hwa-Yaw; Zhang, A Ping; Lu, Chao

    2016-12-16

    We propose and experimentally demonstrate a novel scheme of bi-directional Brillouin time domain analyzer (BD-BOTDA) to extend the sensing range. By deploying two pump-probe pairs at two different wavelengths, the Brillouin frequency shift (BFS) distribution over each half of the whole fiber can be obtained with the simultaneous detection of Brillouin signals in both channels. Compared to the conventional unidirectional BOTDA system of the same sensing range, the proposed BD-BOTDA scheme enables distributed sensing with a performance level comparable to the conventional one with half of the sensing range and a spatial resolution of 2 m, while maintaining the Brillouin signal-to-noise ratio (SNR) and the BFS uncertainty. Based on this technique, we have achieved distributed temperature sensing with a measurement range of 81.9 km fiber at a spatial resolution of 2 m and BFS uncertainty of ~0.44 MHz without introducing any complicated components or schemes.
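
    The final step of the BOTDA sensing chain described above converts a measured Brillouin frequency shift (BFS) change into a temperature change. The sketch below assumes a typical BFS temperature coefficient of about 1 MHz/°C for standard single-mode fiber; the paper's own calibration coefficient is not given in the abstract.

```python
# Converting a BFS change (MHz) to a temperature change (deg C).
C_T_MHZ_PER_DEG_C = 1.0  # assumed typical coefficient, not from this paper

def delta_temperature(bfs_shift_mhz: float) -> float:
    """Temperature change implied by a Brillouin frequency shift change."""
    return bfs_shift_mhz / C_T_MHZ_PER_DEG_C

# With the reported ~0.44 MHz BFS uncertainty, the implied temperature
# resolution under this assumed coefficient is roughly:
print(delta_temperature(0.44))  # 0.44
```

    This is why keeping the BFS uncertainty low over the full 81.9 km range matters: the temperature resolution scales directly with it.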

  14. Development of Labview based data acquisition and multichannel analyzer software for radioactive particle tracking system

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, Nur Aira Abd, E-mail: nur-aira@nuclearmalaysia.gov.my; Yussup, Nolida; Ibrahim, Maslina Bt. Mohd; Abdullah, Nor Arymaswati; Mokhtar, Mukhlis B. [Technical Support Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia); Abdullah, Jaafar B.; Hassan, Hearie B. [Industrial Technology Division, Malaysian Nuclear Agency, 43000, Kajang, Selangor (Malaysia)

    2015-04-29

    A DAQ (data acquisition) software package called RPTv2.0 has been developed for the Radioactive Particle Tracking System at the Malaysian Nuclear Agency. RPTv2.0 features a scanning control GUI, data acquisition from a 12-channel counter via an RS-232 interface, and a multichannel analyzer (MCA). The software is fully developed on the National Instruments LabVIEW 8.6 platform. A Ludlum Model 4612 counter is used to count the signals from the scintillation detectors, while a host computer is used to send control parameters, acquire and display data, and compute results. Each detector channel has independent high-voltage control, threshold (sensitivity) value, and window settings. The counter is configured with a host board and twelve slave boards. The host board collects the counts from each slave board and communicates with the computer via the RS-232 data interface.

  15. Bi-Directional Brillouin Optical Time Domain Analyzer System for Long Range Distributed Sensing

    Science.gov (United States)

    Guo, Nan; Wang, Liang; Wang, Jie; Jin, Chao; Tam, Hwa-Yaw; Zhang, A. Ping; Lu, Chao

    2016-01-01

    We propose and experimentally demonstrate a novel scheme of bi-directional Brillouin time domain analyzer (BD-BOTDA) to extend the sensing range. By deploying two pump-probe pairs at two different wavelengths, the Brillouin frequency shift (BFS) distribution over each half of the whole fiber can be obtained with the simultaneous detection of Brillouin signals in both channels. Compared to the conventional unidirectional BOTDA system of the same sensing range, the proposed BD-BOTDA scheme enables distributed sensing with a performance level comparable to the conventional one with half of the sensing range and a spatial resolution of 2 m, while maintaining the Brillouin signal-to-noise ratio (SNR) and the BFS uncertainty. Based on this technique, we have achieved distributed temperature sensing with a measurement range of 81.9 km fiber at a spatial resolution of 2 m and BFS uncertainty of ~0.44 MHz without introducing any complicated components or schemes. PMID:27999250

  16. Kinetic---a system code for analyzing nuclear thermal propulsion rocket engine transients

    International Nuclear Information System (INIS)

    Schmidt, E.; Lazareth, O.; Ludewig, H.

    1993-01-01

    A system code suitable for analyzing Nuclear Thermal Propulsion (NTP) rocket engines is described in this paper. The code consists of a point reactor model and nodes to describe the fluid dynamics and heat transfer mechanism. Feedback from the fuel, coolant, moderator and reflector are allowed for, and the control of the reactor is by motion of control elements (drums or rods). The worth of the control elements and feedback coefficients are predetermined. Separate models for the turbo-pump assembly (TPA) and nozzle are also included. The model to be described in this paper is specific for the Particle Bed Reactor (PBR). An illustrative problem is solved. This problem consists of a PBR operating in a blowdown mode

  17. Kinetic—a system code for analyzing nuclear thermal propulsion rocket engine transients

    Science.gov (United States)

    Schmidt, Eldon; Lazareth, Otto; Ludewig, Hans

    1993-01-01

    A system code suitable for analyzing Nuclear Thermal Propulsion (NTP) rocket engines is described in this paper. The code consists of a point reactor model and nodes to describe the fluid dynamics and heat transfer mechanism. Feedback from the fuel, coolant, moderator and reflector are allowed for, and the control of the reactor is by motion of control elements (drums or rods). The worth of the control elements and feedback coefficients are predetermined. Separate models for the turbo-pump assembly (TPA) and nozzle are also included. The model to be described in this paper is specific for the Particle Bed Reactor (PBR). An illustrative problem is solved. This problem consists of a PBR operating in a blowdown mode.

  18. KINETIC: A system code for analyzing nuclear thermal propulsion rocket engine transients

    Science.gov (United States)

    Schmidt, E.; Lazareth, O.; Ludewig, H.

    1993-07-01

    A system code suitable for analyzing Nuclear Thermal Propulsion (NTP) rocket engines is described in this paper. The code consists of a point reactor model and nodes to describe the fluid dynamics and heat transfer mechanisms. Feedback from the fuel, coolant, moderator and reflector is allowed for, and the reactor is controlled by motion of control elements (drums or rods). The worth of the control elements and the feedback coefficients are predetermined. Separate models for the turbo-pump assembly (TPA) and nozzle are also included. The model described in this paper is specific to the Particle Bed Reactor (PBR). An illustrative problem is solved, consisting of a PBR operating in a blowdown mode.

  19. A low-cost multichannel analyzer with data reduction assembly for continuous air monitoring system

    International Nuclear Information System (INIS)

    Zoghi, B.; Lee, Y.; Nelson, D.C.

    1992-01-01

    This paper reports on a microcontroller-based multichannel analyzer (MCA) with a data reduction assembly (DRA) for a plutonium continuous air monitor (CAM) system. The MCA is capable of detecting airborne alpha emitters in the presence of radon daughter products. The pulse output from the preamplifier is stretched to allow the peak detector sufficient time to capture the pulse height. The pulse amplitude conversion, data acquisition, and output functions are carried out entirely in software. The DRA consists of a data reduction unit (DRU) and its operator interface panel. The data reduction assembly can be networked to a single PC, with up to 332 different CAMs remotely connected to it.

  20. Noise and vibration analysis system

    International Nuclear Information System (INIS)

    Johnsen, J.R.; Williams, R.L.

    1985-01-01

    The analysis of noise and vibration data from an operating nuclear plant can provide valuable information that identifies and characterizes abnormal conditions. Existing plant monitoring equipment, such as loose parts monitoring systems (LPMS) and neutron flux detectors, may be capable of gathering noise data but may lack the analytical capability to extract the useful meaning hidden in the noise. By analyzing neutron noise signals, the structural motion and integrity of core components can be assessed. Computer analysis makes trending of frequency spectra, within a fuel cycle and from one cycle to another, a practical means of core internals monitoring. The Babcock & Wilcox Noise and Vibration Analysis System (NVAS) is a powerful, compact system that can automatically perform complex data analysis. The system can acquire, process, and store data, then produce report-quality plots of the important parameters. Software to perform neutron noise analysis and loose parts analysis operates on the same hardware package. Because the system is compact, inexpensive, and easy to operate, it allows utilities to perform analyses more frequently without incurring high costs, and it provides immediate results.
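
    The spectral trending described above rests on computing frequency spectra from sampled noise signals. As a minimal illustration (not the NVAS implementation; the sampling rate and test frequency are arbitrary), the following pure-Python sketch computes a single-sided magnitude spectrum with a direct DFT and picks out the dominant frequency of a synthetic vibration signal:

```python
import cmath
import math

def dft_magnitude(signal):
    """Single-sided magnitude spectrum via a direct DFT (O(N^2))."""
    N = len(signal)
    return [abs(sum(signal[t] * cmath.exp(-2j * math.pi * k * t / N)
                    for t in range(N))) * 2.0 / N
            for k in range(N // 2)]

# Synthetic "vibration" signal: an 8 Hz component sampled at 64 Hz for 1 s.
fs = 64
N = 64
signal = [math.sin(2.0 * math.pi * 8.0 * t / fs) for t in range(N)]
spectrum = dft_magnitude(signal)
peak_bin = max(range(len(spectrum)), key=spectrum.__getitem__)
print(peak_bin * fs / N)  # dominant frequency in Hz -> 8.0
```

    In a monitoring context the same spectrum would be recomputed periodically and the peak locations and amplitudes trended over time; a production system would use an FFT rather than this quadratic DFT.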

  1. Irradiated-Microsphere Gamma Analyzer (IMGA): an integrated system for HTGR coated particle fuel performance assessment

    International Nuclear Information System (INIS)

    Kania, M.J.; Valentine, K.H.

    1980-02-01

    The Irradiated-Microsphere Gamma Analyzer (IMGA) System, designed and built at ORNL, provides the capability of making statistically accurate failure-fraction measurements on irradiated HTGR coated particle fuel. The IMGA records the gamma-ray energy spectra from fuel particles and performs quantitative analyses on these spectra; then, using the chemical and physical properties of the gamma emitters, it makes a failed/nonfailed decision concerning the ability of the coatings to retain fission products. Actual retention characteristics of the coatings are determined by measuring activity ratios for certain gamma emitters, such as 137Cs/95Zr and 144Ce/95Zr for metallic fission product retention and 134Cs/137Cs as an indirect measure of gaseous fission product retention. Data from IMGA (which can be put in the form of n failures observed in N examinations) can be accurately described by the binomial probability distribution model. Using this model, a mathematical relationship between IMGA data (n, N), failure fraction, and confidence level was developed. To determine failure fractions of less than or equal to 1% at confidence levels near 95%, this model dictates that from several hundred to several thousand particles must be examined. The automated particle handler of the IMGA system provides this capability. As a demonstration of failure-fraction determination, fuel rod C-3-1 from the OF-2 irradiation capsule was analyzed and failure-fraction statistics were applied. Results showed that at the 1% failure fraction level, with a 95% confidence level, the fissile particle batch could not meet requirements; however, the fertile particle batch exceeded these requirements for the given irradiation temperature and burnup.
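
    The binomial relationship between (n, N), failure fraction, and confidence level can be sketched directly from the binomial CDF. This is a generic illustration, not the IMGA code: for zero observed failures, the smallest N that rules out a true failure fraction f at a given confidence is the smallest N with P(X = 0 | N, f) below the significance level, which lands in the "several hundred" range quoted above.

```python
from math import comb

def binom_cdf(n, N, p):
    """P(X <= n) for X ~ Binomial(N, p)."""
    return sum(comb(N, k) * p**k * (1.0 - p)**(N - k) for k in range(n + 1))

def min_sample_size(f, confidence, n_failures=0):
    """Smallest N such that observing <= n_failures failures rejects a
    true failure fraction f at the given confidence level."""
    N = n_failures + 1
    while binom_cdf(n_failures, N, f) > 1.0 - confidence:
        N += 1
    return N

# Failure fraction <= 1% at 95% confidence, zero failures observed:
print(min_sample_size(0.01, 0.95))  # 299 particles
```

    Allowing a few observed failures (n_failures > 0) pushes the required N toward the "several thousand" end of the range.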

  2. The ALICE analysis train system

    CERN Document Server

    Zimmermann, Markus

    2015-01-01

    In the ALICE experiment, hundreds of users analyze big datasets on a Grid system. High throughput and short turnaround times are achieved by a centralized system called the LEGO trains. This system combines analyses from different users into so-called analysis trains, which are then executed within the same Grid jobs, thereby reducing the number of times the data needs to be read from the storage systems. The centralized trains improve performance, usability, and bookkeeping in comparison to single-user analysis. The train system builds upon the already existing ALICE tools, i.e., the analysis framework as well as the Grid submission and monitoring infrastructure. The entry point to the train system is a web interface, which is used to configure the analysis and the desired datasets, as well as to test and submit the train. Several measures have been implemented to reduce the time a train needs to finish and to increase the CPU efficiency.

  3. PWR systems transient analysis

    International Nuclear Information System (INIS)

    Kennedy, M.F.; Peeler, G.B.; Abramson, P.B.

    1985-01-01

    Analysis of transients in pressurized water reactor (PWR) systems involves the assessment of the response of the total plant, including the primary and secondary coolant systems, steam piping and turbine (possibly including the complete feedwater train), and various control and safety systems. Transient analysis is performed as part of the plant safety analysis to ensure the adequacy of the reactor design and operating procedures and to verify the applicable plant emergency guidelines. Event sequences that must be examined are developed by considering possible failures or maloperations of plant components. These vary in severity (and calculational difficulty) from normal operational transients, such as minor load changes, reactor trips, and valve and pump malfunctions, up to the double-ended guillotine rupture of a primary reactor coolant system pipe, known as a Large Break Loss of Coolant Accident (LBLOCA). The focus of this paper is the analysis of all such transients and accidents except loss of coolant accidents.

  4. Aviation Safety: Modeling and Analyzing Complex Interactions between Humans and Automated Systems

    Science.gov (United States)

    Rungta, Neha; Brat, Guillaume; Clancey, William J.; Linde, Charlotte; Raimondi, Franco; Seah, Chin; Shafto, Michael

    2013-01-01

    The ongoing transformation from the current US Air Traffic System (ATS) to the Next Generation Air Traffic System (NextGen) will force the introduction of new automated systems and will most likely cause automation to migrate from ground to air. This will yield new function allocations between humans and automation and therefore change the roles and responsibilities in the ATS. Yet safety in NextGen is required to be at least as good as in the current system. We therefore need techniques to evaluate the safety of the interactions between humans and automation. We believe that current human-factors studies and simulation-based techniques will fall short given the complexity of the ATS, and that we need to add more automated techniques to simulations, such as model checking, which offers exhaustive coverage of the non-deterministic behaviors in nominal and off-nominal scenarios. In this work, we present a verification approach based both on simulations and on model checking for evaluating the roles and responsibilities of humans and automation. Models are created using Brahms (a multi-agent framework), and we show that the traditional Brahms simulations can be integrated with automated exploration techniques based on model checking, thus offering a complete exploration of the behavioral space of the scenario. Our formal analysis supports the notion of beliefs and probabilities to reason about human behavior. We demonstrate the technique with the Überlingen accident, since it exemplifies the authority problems that arise when conflicting advice is received from human and automated systems.
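
    The exhaustive exploration that model checking contributes can be illustrated with a toy safety check, entirely separate from Brahms or the actual NextGen models. In the sketch below, the two-agent state, the transition relation, and the "disagreement" safety property are all invented; the point is only that breadth-first search enumerates every non-deterministic behavior and returns a shortest counterexample trace when the property fails:

```python
from collections import deque

# Hypothetical toy model (not Brahms): a pilot and an automated advisory
# system each command "climb" or "descend"; any combination is reachable
# in one step, so the transition relation is non-deterministic.
ACTIONS = ("climb", "descend")

def successors(state):
    """Non-deterministic transition relation (ignores `state` in this toy)."""
    for p in ACTIONS:
        for a in ACTIONS:
            yield (p, a)

def check_safety(initial, is_safe):
    """Exhaustive breadth-first exploration of the reachable state space.
    Returns a shortest trace to an unsafe state, or None if safe."""
    frontier = deque([(initial, [initial])])
    visited = {initial}
    while frontier:
        state, trace = frontier.popleft()
        if not is_safe(state):
            return trace
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append((nxt, trace + [nxt]))
    return None

# Safety property: pilot and automation must agree.
trace = check_safety(("climb", "climb"), lambda s: s[0] == s[1])
print(trace)  # shortest counterexample ending in a disagreement state
```

    A real model checker adds temporal-logic properties, partial-order reductions, and symbolic state representations, but the exhaustive-coverage guarantee rests on the same reachability idea.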

  5. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by exploiting the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer, and fluid flow disciplines.
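
    In the linear case, the discipline equivalence exploited here reduces to the observation that discretized field problems share the same algebraic form K·x = f. As a minimal sketch (not NESSUS code, with arbitrary parameter values), a 1D heat-conduction problem can be assembled and solved exactly like a bar-stiffness problem, with conductance k/h playing the role of stiffness EA/h:

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm: solve a tridiagonal system K·x = d."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1D steady conduction, T = 0 at both ends, uniform volumetric source:
# the assembled K·T = Q has the same form as a bar-stiffness K·u = F.
k, L, q, nodes = 1.0, 1.0, 1.0, 3     # conductivity, length, source, interior nodes
h = L / (nodes + 1)
sub = [0.0] + [-k / h] * (nodes - 1)  # sub-diagonal
diag = [2.0 * k / h] * nodes          # diagonal
sup = [-k / h] * (nodes - 1) + [0.0]  # super-diagonal
rhs = [q * h] * nodes                 # lumped source term per node
T = solve_tridiagonal(sub, diag, sup, rhs)
print(T)  # matches the analytic T(x) = q*x*(L - x)/(2k) at the interior nodes
```

    A structural solver fed the same matrix and right-hand side returns the temperatures directly; this is the sense in which one discipline's routines can serve another.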

  6. Nuclear fuel cycle system analysis

    International Nuclear Information System (INIS)

    Ko, W. I.; Kwon, E. H.; Kim, S. G.; Park, B. H.; Song, K. C.; Song, D. Y.; Lee, H. H.; Chang, H. L.; Jeong, C. J.

    2012-04-01

    The nuclear fuel cycle system analysis method has been designed and established for an integrated nuclear fuel cycle system assessment by analyzing various methodologies. The economics, proliferation resistance (PR), and environmental impact of the fuel cycle system were evaluated using an improved DB, and the best fuel cycle option applicable in Korea was derived. In addition, this research helps to increase national credibility and transparency regarding PR by developing and carrying out a PR enhancement program. The detailed contents of the work are as follows: 1) establishment and improvement of the DB for nuclear fuel cycle system analysis; 2) development of the analysis model for the nuclear fuel cycle; 3) a preliminary study for nuclear fuel cycle analysis; 4) development of an overall evaluation model of the nuclear fuel cycle system; 5) overall evaluation of the nuclear fuel cycle system; 6) evaluation of the PR of the nuclear fuel cycle system and derivation of enhancement methods; 7) derivation and fulfillment of methods for enhancing nuclear transparency. The optimum fuel cycle option, which is economical and applicable to the domestic situation, was derived in this research. It will serve as a basis for establishing a long-term strategy for the nuclear fuel cycle. This work contributes to guaranteeing the technical and economic validity of the optimal fuel cycle option. Deriving and fulfilling methods for enhancing nuclear transparency will also contribute to renewing the ROK-U.S. Atomic Energy Agreement in 2014.

  7. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, Carl A.; Johnson, Wayne F.; Walker, William A.

    1988-01-01

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises (1) a whole blood sample disc, (2) a serum sample disc, (3) a sample preparation rotor, and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc in capillary tubes filled by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analytical rotor for analysis by conventional methods.

  8. Review and Extension of Suitability Assessment Indicators of Weather Model Output for Analyzing Decentralized Energy Systems

    Directory of Open Access Journals (Sweden)

    Hans Schermeyer

    2015-12-01

    Electricity from renewable energy sources (RES-E) is gaining more and more influence in traditional energy and electricity markets in Europe and around the world. When modeling RES-E feed-in at a high temporal and spatial resolution, energy systems analysts frequently use data generated by numerical weather models as input, since no spatially inclusive and comprehensive measurement data are available. However, the suitability of such model data depends on the research questions at hand and should be inspected individually. This paper focuses on new methodologies for carrying out a performance evaluation of solar irradiation data provided by a numerical weather model when investigating photovoltaic feed-in and its effects on the electricity grid. Suitable approaches to time series analysis are drawn from the literature and applied to both model and measurement data. The findings and limits of these approaches are illustrated, and a new set of validation indicators is presented. These novel indicators complement the assessment by measuring key figures relevant in energy systems analysis, e.g., gradients in energy supply, maximum values, and volatility. Thus, the results of this paper contribute to the work of energy systems analysts and researchers who aim to model RES-E feed-in at a high temporal and spatial resolution using weather model data.

  9. Analyzing organic tea certification and traceability system within the Taiwanese tea industry.

    Science.gov (United States)

    Wang, Mao-Chang; Yang, Chin-Ying

    2015-04-01

    We applied game theory to the organic tea certification process and traceability system used by the Taiwanese tea industry to elucidate the strategic choices made by tea farmers and organic tea certification agencies. Thus, this paper clarifies how the relevant variables affect the organic certification process and traceability system used within the tea industry. The findings indicate that farmers who generate high revenues experience failures regarding tea deliveries, cash outflow, damage compensation, and quasi-rent. An additional problem was the high cost incurred when tea farmers colluded with, or switched, organic tea certification agencies. Furthermore, decreasing personal interest in planting non-organic tea, together with lower costs of planting organic tea and lower managerial accounting costs of building comprehensive traceability systems, yielded strong results and a superior equilibrium in the analysis. This research is unprecedented, using an innovative model and providing a novel analysis structure for use in the tea industry. These results contribute to the literature and should serve as a valuable reference for members of the tea industry, government, and academia. © 2014 Society of Chemical Industry.

  10. First result from x-ray pulse height analyzer with radial scanning system for LHD

    Science.gov (United States)

    Muto, Sadatsugu; Morita, Shigeru

    2001-01-01

    Radial profiles of the x-ray spectrum have been successfully obtained using an x-ray pulse height analyzer assembly in the Large Helical Device. The observed profiles are obtained from plasmas heated by ICRF and neutral beam injection (NBI). Si(Li) semiconductor detectors are used with a histogramming memory and an analog-to-digital converter (ADC) capable of operating at counting rates up to 500 kcps. In routine operation, a count rate of 62 kcps is normally obtained, with an energy resolution better than 400 eV at the iron Kα line. The assembly is equipped with four detectors and a radial scanning system that sweeps the sight lines of the detectors in the major-radius direction. Profiles of the electron temperature and of the intensity of metallic impurities have been obtained with a spatial resolution of a few centimeters. The measured electron temperature is in good agreement with that from Thomson scattering. The system is applicable to steady-state discharges. The design philosophy of the assembly and recent results of performance tests are also presented.

  11. USING ALMOST IDEAL DEMAND SYSTEM TO ANALYZE DEMAND FOR SHRIMP IN US FOOD MARKET

    Directory of Open Access Journals (Sweden)

    Xia “Vivian” Zhou

    2015-07-01

    This paper analyzes the demand for shrimp, along with beef, pork, and chicken, in the US food market, which contributes much to predicting supply strategies, consumer preferences, and policy making. It focuses on the own- and cross-elasticity relationships between expenditure shares, prices, and expenditure changes. An Almost Ideal Demand System (AIDS) model and two alternative specifications (nonlinear AIDS and LA-AIDS) are used to estimate a system of expenditure share equations for ocean shrimp, penaeid shrimp, beef, pork, and chicken. Empirical results from the nonlinear AIDS model are compared with those from the LA-AIDS model. There are quite a few inconsistencies between the nonlinear and LA results; results from the nonlinear model are more plausible and more consistent with microeconomic theory than those from the LA model. Some insignificant slope coefficients and inappropriate signs also fail to comply with microeconomic theory. This could be caused by heteroscedasticity, autocorrelation, limitations in the data used, or the fact that shrimp is a quite different commodity.
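
    For reference, the expenditure-share equations being estimated take the standard Deaton-Muellbauer AIDS form (a generic statement of the model, not necessarily the paper's exact specification):

```latex
w_i = \alpha_i + \sum_j \gamma_{ij}\,\ln p_j + \beta_i \ln\!\left(\frac{x}{P}\right),
\qquad
\ln P = \alpha_0 + \sum_k \alpha_k \ln p_k
      + \tfrac{1}{2}\sum_k \sum_j \gamma_{kj}\,\ln p_k \ln p_j .
```

    The LA-AIDS specification replaces the translog price index $\ln P$ with the Stone index $\ln P^{*} = \sum_k w_k \ln p_k$, which linearizes the estimation. Theoretical consistency imposes adding-up ($\sum_i \alpha_i = 1$, $\sum_i \gamma_{ij} = 0$, $\sum_i \beta_i = 0$), homogeneity ($\sum_j \gamma_{ij} = 0$), and symmetry ($\gamma_{ij} = \gamma_{ji}$); the "inappropriate signs" mentioned above presumably refer to estimated $\gamma_{ij}$ and $\beta_i$ that violate these theoretical expectations.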

  12. Systems Engineering Analysis

    Directory of Open Access Journals (Sweden)

    Alexei Serna M.

    2013-07-01

    The challenges posed by the development of new computer systems demand new guidance for engineers' education, because these are the engineers who will solve such problems. In the XXI century, systems engineers must be able to integrate a number of topics and knowledge disciplines that complement what has traditionally been known as Computer Systems Engineering. We have enough software development engineers; today we need professional engineers for software integration, leaders, and system architects who make the most of technological development for the benefit of society, leaders who integrate the sciences into the solutions they build and propose. In this article, the current situation of Computer Systems Engineering is analyzed, and a proposal is presented on the need to modify the approach universities have given to these careers, in order to educate leading engineers according to the needs of this century.

  13. Analyze This! Thematic Analysis: Hostility, Attribution of Intent, and Interpersonal Perception Bias.

    Science.gov (United States)

    Karadenizova, Zhana; Dahle, Klaus-Peter

    2017-11-01

    Research suggests that aggressive individuals exhibit a strong tendency to attribute hostile intent to the behavior of others when confronted with an ambiguous social situation. The vignette method has become a standard procedure for assessing hostile attributions. Vignettes present incomplete, ambiguous social stories in which the subjects experience a negative outcome and are asked to attribute intent to the provocateur's action. This article explores the ways in which subjects perceive ambiguous social situations and other people's intentions, their tendency to attribute negative outcomes to themselves, and the components defining hostility in interpersonal relationships. The sample consisted of male adolescent violent offenders (N = 45) recruited from the Social Therapy Department of the German correctional facility for juvenile offenders in Berlin. All offenders were incarcerated for a violent or sexual crime and were currently undergoing individual and group psychotherapy. The five hypothetical vignettes used in this study were originally designed to assess hostile attributions in both institutional and noninstitutional social situations. Participants' responses were analyzed using thematic analysis, which revealed three key themes regarding social perception (positive, negative, and neutral) and two themes regarding the components of hostility (provocateur-related personality features and relationship type). Although the vignettes were originally developed to detect a hostility-prone perception bias, they seem able to reveal a wider set of attributions of intent, both positive and negative. Thus, vignettes are not limited to the assessment of hostility specifically; rather, they seem to be a measure sensitive to diverse attributions of intent in general. The diagnostic qualities of the vignettes, their area of application, limitations of the study, and future perspectives are discussed.

  14. Analyzing Drivers' Attitude towards HUD System Using a Stated Preference Survey

    Directory of Open Access Journals (Sweden)

    Hongwei Guo

    2014-02-01

    It is very important for drivers to obtain driving information easily and efficiently. Many advanced devices are used for driving safety assistance. Among these assistance devices, the head-up display (HUD) system can reduce a driver's reaction time and improve spatial awareness. Drivers' attitudes towards, and preferences for, the HUD system are crucial for designing the functional framework and interface of a HUD system. This study explored the relationships between drivers' attitudes and HUD presentation image designs using stated preference data from a questionnaire survey. The questionnaire covered drivers' attitudes towards the use of a HUD and their preferences for its information display zone and information display elements. Contrastive analysis was adopted to examine how attitudes and preferences vary with age and driving skill. According to the results, the participants hold varying attitudes towards the HUD system, but most show a relatively unified preference for the information display zone and information display elements. The results can also be used to customize a HUD presentation image that is in accordance with drivers' feelings and preferences.

  15. Advances in electrostatic energy analyzers for ion beam probe diagnostic systems

    International Nuclear Information System (INIS)

    Bird, L.A.; Glowienka, J.C.; Jennings, W.C.; Hickok, R.L.

    1974-01-01

    Two new concepts are discussed for feedback-controlled electrostatic energy analyzers: a dual-gain analyzer for current density measurements, and bottom-plate coupling to provide dc stability and better frequency response. An analyzer incorporating both of these concepts was built and preliminary measurements of its performance were made. These measurements are not reported here. (U.S.)

  16. USING CFD TO ANALYZE NUCLEAR SYSTEMS BEHAVIOR: DEFINING THE VALIDATION REQUIREMENTS

    Energy Technology Data Exchange (ETDEWEB)

    Richard Schultz

    2012-09-01

    A recommended protocol is described for formulating numeric tool specifications and validation needs in concert with practices accepted by regulatory agencies for advanced reactors. The protocol is based on the plant type and the perceived transient and accident envelopes, which translate to boundary conditions for a process that yields: (a) the key phenomena and figures of merit that must be analyzed to ensure that the advanced plant can be licensed; (b) the specification of the numeric tool capabilities necessary to perform the required analyses, including bounding calculational uncertainties; and (c) the specification of the validation matrices and experiments, including the desired validation data. Applying the process enables a complete program to be defined, including costs, for creating and benchmarking transient and accident analysis methods for advanced reactors. By following a process that is in concert with regulatory agency licensing requirements from start to finish, based on historical acceptance of past licensing submittals, the methods derived and validated have a high probability of regulatory agency acceptance.

  17. Resilience as a framework for analyzing the adaptation of mountain summer pasture systems to climate change

    Directory of Open Access Journals (Sweden)

    Baptiste Nettier

    2017-12-01

    Social-ecological resilience is defined by Brian Walker and colleagues as "the capacity of a social-ecological system (SES) to absorb disturbances and reorganize while undergoing change so as to continue to retain essentially the same function, structure, feedbacks, and therefore identity." It is an increasingly widespread concept whose success depends, among other things, on the promise of its rapid transfer from science into practice and its operational character for the sustainable management of SESs. However, tangible examples of management methods based on resilience remain limited in the scientific literature. Here, we test the resilience management framework proposed by Brian Walker and David Salt by applying it to the case of mountain summer pastures in the French Alps, which are complex SESs in which human and ecological dimensions are closely linked and subject to substantial perturbations due to climate change. Three steps were implemented: (1) building a conceptual model, based on expert knowledge, of the functioning of summer pastures; (2) building, from the model, a template for summer pasture resilience analysis; and (3) testing the operational character of the model and the template on two pairs of contrasting cases. This heuristic tool enables understanding the ways in which farmers and herders manage the resilience of their system but does not aim to quantify resilience. The method developed, together with the resilience concept, provides insights into the functioning of summer pastures from both biophysical and management perspectives. The modeling process constitutes a learning process, which will support the implementation of adaptive management. We identified three critical points for making the method truly operational: basing modeling on an equal consideration of social and ecological dimensions, defining the boundaries of the modeled system based on the social dimension, and selecting a scale of analysis coherent with the type

  18. Use of an Electronic Tongue System and Fuzzy Logic to Analyze Water Samples

    Science.gov (United States)

    Braga, Guilherme S.; Paterno, Leonardo G.; Fonseca, Fernando J.

    2009-05-01

    An electronic tongue (ET) system incorporating 8 chemical sensors was used in combination with two pattern recognition tools, namely principal component analysis (PCA) and fuzzy logic, for the discrimination/classification of water samples from different sources (tap, distilled, and three brands of mineral water). The fuzzy program exhibited a higher accuracy than the PCA and allowed the ET to correctly classify 4 of the 5 types of water. The exception was one brand of mineral water, which was sometimes misclassified as tap water. The PCA, on the other hand, grouped the water samples into three clusters: one with the distilled water; a second with the tap water and one brand of mineral water; and a third with the other two brands of mineral water. Samples in the second and third clusters could not be distinguished. Nevertheless, close grouping between repeated tests indicated that the ET system response is reproducible. The potential use of fuzzy logic as the data processing tool in combination with an electronic tongue system is discussed.
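
    To make the fuzzy-classification idea concrete, the sketch below assigns a water class by maximum membership over triangular membership functions of a single normalized sensor reading. The class names, membership ranges, and the use of one sensor are all hypothetical simplifications; a real ET system fuzzifies all 8 sensor channels and combines them through a rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function: 0 outside (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical membership ranges of one normalized sensor reading
# for three water classes (invented values for illustration).
CLASSES = {
    "distilled": (0.0, 0.1, 0.3),
    "mineral": (0.2, 0.5, 0.8),
    "tap": (0.6, 0.9, 1.0),
}

def classify(reading):
    """Assign the class with the highest membership degree."""
    degrees = {name: tri(reading, *abc) for name, abc in CLASSES.items()}
    return max(degrees, key=degrees.get), degrees

label, degrees = classify(0.45)
print(label)  # "mineral" has the highest membership at a reading of 0.45
```

    Overlapping membership ranges are the mechanism behind the misclassification noted above: a tap-like mineral-water reading can sit where two classes have comparable membership degrees.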

  19. Application of the Controllable Unit Approach (CUA) to analyzing safeguards measurement systems

    International Nuclear Information System (INIS)

    Seabaugh, P.W.; Rogers, D.R.; Woltermann, H.A.; Fushimi, F.C.; Ciramella, A.F.

    1978-01-01

    CUA is a material control and accountability methodology that takes into account the system logic and statistical characteristics of a plant process through the formulation of closure equations. The study evaluated the CUA methodology against performance-oriented regulations. The criterion is defined as the detection of a material loss of two kilograms of SNM with 97.5% confidence. Specifically investigated were the timeliness of detection, the ability to localize material loss, process coverage, costs/benefits, and compatibility with other safeguards techniques such as diversion path analysis and data filtering. The feasibility of performance-oriented regulations is demonstrated. To make full use of the system of closure equations, a procedure was developed to formally integrate the effect of both short-term and long-term closure equations into an overall systems criterion of performance. Both single and multiple diversion strategies are examined in order to show how the CUA method can protect against either strategy. Quantitative results show that combined closure equations improve the detection sensitivity to material loss, and that multiple diversions provide only diminishing returns.
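
    A closure equation in this sense is a material balance around a controllable unit, with a detection threshold set by the combined measurement uncertainty of its terms. The sketch below is a generic illustration with invented quantities and uncertainties, not the CUA formulation itself; the one-sided z-value of 1.96 corresponds to the 97.5% confidence figure quoted above:

```python
import math

def closure(begin_inv, inputs, outputs, end_inv):
    """Material-balance closure for one controllable unit; ideally zero."""
    return begin_inv + inputs - outputs - end_inv

def detection_limit(sigmas, z=1.96):
    """Smallest loss detectable when the closure terms carry independent
    1-sigma measurement uncertainties `sigmas` (one-sided, ~97.5%)."""
    return z * math.sqrt(sum(s * s for s in sigmas))

# Invented example: 5.0 kg on hand, 10.0 kg in, 9.2 kg out, 5.7 kg on hand.
imbalance = closure(5.0, 10.0, 9.2, 5.7)
limit = detection_limit([0.05, 0.05, 0.02, 0.02])  # kg, assumed 1-sigma values
print(round(imbalance, 3), round(limit, 3))  # 0.1 kg imbalance vs ~0.149 kg threshold
```

    Combining closure equations over units and over time shrinks the effective threshold relative to any single balance, which is the sensitivity improvement the abstract reports.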

  20. Modern, PC based, high resolution portable EDXRF analyzer offers laboratory performance for field, in-situ analysis of environmental contaminants

    International Nuclear Information System (INIS)

    Piorek, Stanislaw

    1994-01-01

    The introduction of a new, high-resolution, portable probe that has improved the sensitivity of conventional field-portable X-ray fluorescence (FPXRF) by up to an order of magnitude was reported earlier [S. Piorek and J.R. Pasmore, Proc. 2nd Int. Symp. on Field Screening Methods for Hazardous Wastes and Toxic Chemicals, Las Vegas, 1991, p. 737]. A high-resolution Si(Li) detector probe operates connected to a multichannel X-ray analyzer (2048 channels), which is housed in a portable, battery-powered industrial computer. The improved energy resolution of the detector allows the implementation of more sophisticated data treatment methods to convert the measured intensities into mass concentrations of the analytes. Backscatter with a fundamental parameters approach (BFP) is one of the best such methods, specifically for metallic contaminants in soil. A program based on the BFP method has been written for use with the new probe. The new software/probe combination enables one to quickly assess levels of contaminants on site without the need for analyzed samples for instrument calibration. The performance of the EDXRF system as applied to the analysis of metals in contaminated soil is discussed in this paper. Also discussed is the extension of this method to the analysis of other types of environmental samples, such as air particulates collected on filter paper. ((orig.))

  1. Power System Analysis

    Science.gov (United States)

    Taniguchi, Haruhito

    Electric power generation that relies on various primary energy sources is expected to bring down CO2 emission levels and support the overall strategy to curb global warming. Accordingly, utilities are moving towards integrating more renewable sources for generation, mostly dispersed, and adopting Smart Grid technologies for system control. In order to construct, operate, and maintain power systems stably and economically against this background, a thorough understanding of the characteristics of power systems and their components is essential. This paper presents modeling and simulation techniques available for the analysis of critical aspects such as thermal capacity, stability, voltage stability, and frequency dynamics, vital for the stable operation of power systems.

  2. Analyzing the politico-moral foundations of Iran's health system based on theories of justice.

    Science.gov (United States)

    Akrami, Forouzan; Abbasi, Mahmoud; Karimi, Abbas; Shahrivari, Akbar; Majdzadeh, Reza; Zali, Alireza

    2017-01-01

    Public health ethics is a field that covers both factual and ethical issues in health policy and science, and has positive obligations to improve the well-being of populations and reduce social inequalities. Various philosophies and moral theories can obviously shape the framework of public health ethics in different ways. For this reason, the present study reviewed theories of justice in order to analyze and criticize Iran's general health policies document, issued in 14 articles in 2014. Furthermore, it explored egalitarianism as the dominant theory in the political philosophy of the country's health care system. According to recent theories of justice, however, health policies must address well-being and its basic dimensions such as health, reasoning, autonomy, and the role of the involved agencies and social institutions in order to achieve social justice beyond distributive justice. Moreover, policy-making in the field of health and biomedical sciences based on Islamic culture necessitates a theory of social justice in the light of theological ethics. Educating people about their rights and duties, increasing their knowledge of individual agency, autonomy, and the role of the government, and empowering them will help achieve social justice. It is recommended to design and implement a strategic plan following each of these policies, based on the above-mentioned values and in collaboration with other sectors, to clarify the procedures in every case.

  3. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study

    Science.gov (United States)

    AMIRI, Mehran; ARDESHIR, Abdollah; FAZEL ZARANDI, Mohammad Hossein

    2014-01-01

    Abstract. Background: The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. Methods: The Iranian Social Security Organization (ISSO) accident database, containing 21,864 cases between the years 2007-2011, was used in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Results: Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to the risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. Conclusion: It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions. PMID:26005662

  4. Risk-based Analysis of Construction Accidents in Iran During 2007-2011-Meta Analyze Study.

    Science.gov (United States)

    Amiri, Mehran; Ardeshir, Abdollah; Fazel Zarandi, Mohammad Hossein

    2014-04-01

    The present study aimed to investigate the characteristics of occupational accidents and the frequency and severity of work-related accidents in the construction industry among Iranian insured workers during the years 2007-2011. The Iranian Social Security Organization (ISSO) accident database, containing 21,864 cases between the years 2007-2011, was used in this study. In the next step, the Total Accident Rate (TAR), Total Severity Index (TSI), and Risk Factor (RF) were defined. The core of this work is devoted to analyzing the data from different perspectives such as age of workers, occupation and construction phase, day of the week, time of the day, seasonal analysis, regional considerations, type of accident, and body parts affected. Workers between 15-19 years old (TAR=13.4%) are almost six times more exposed to the risk of accident than the average of all ages (TAR=2.51%). Laborers and structural workers (TAR=66.6%) and those working at heights (TAR=47.2%) experience more accidents than other groups of workers. Moreover, older workers over 65 years old (TSI=1.97% > average TSI=1.60%), work supervisors (TSI=12.20% > average TSI=9.09%), and night shift workers (TSI=1.89% > average TSI=1.47%) are more prone to severe accidents. It is recommended that laborers, young workers, and weekend and night shift workers be supervised more carefully in the workplace. Use of Personal Protective Equipment (PPE) should be compulsory in working environments, and special attention should be paid to people working outdoors and at heights. It is also suggested that policymakers pay more attention to the improvement of safety conditions in deprived and cold western regions.
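The indices above (TAR, TSI) can be illustrated with a minimal sketch. The exact ISSO formulas are not given in the abstract, so the definitions below (accidents per 100 insured workers; lost workdays per insured worker) are assumptions for illustration only:

```python
# Hedged sketch: the exact ISSO formulas for TAR and TSI are not stated
# in the abstract, so the definitions below are illustrative assumptions.

def total_accident_rate(n_accidents, n_insured):
    """Assumed definition: accidents per 100 insured workers."""
    return 100.0 * n_accidents / n_insured

def total_severity_index(lost_workdays, n_insured):
    """Assumed definition: lost workdays per insured worker."""
    return lost_workdays / n_insured

# Example: 134 accidents and 500 lost workdays among 1000 insured workers
tar = total_accident_rate(134, 1000)
tsi = total_severity_index(500, 1000)
print(tar, tsi)   # 13.4 0.5
```

Computed per age group or occupation, indices like these support exactly the comparisons quoted in the abstract (e.g. a group's TAR against the all-ages average).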

  5. The principles involved in building an optimum system of magnetic gas analyzers

    Energy Technology Data Exchange (ETDEWEB)

    Tsai, A N; Sukhanov, S

    1980-01-01

    A short survey is given of magnetomechanical and thermo-magnetic gas analyzers for oxygen. It is noted that the principle of building magnetic gas analyzers that measure secondary phenomena has disadvantages; the gas analyzer described here instead uses galvanomagnetic elements (Hall generators). The measurement range is 0-2% O/sub 2/ concentration by volume, the accuracy class is 1, and the threshold sensitivity is 0.01% by volume.

  6. A computational system for analyzing nuclear power plant structures made of panels, using superelements

    International Nuclear Information System (INIS)

    Jesus Miranda, C.A. de.

    1981-03-01

    The linear static behavior of folded-plate structures, like the turbine building of a nuclear power plant, is analyzed by the Finite Element Method. Folded-plate isoparametric plane elements with 48 degrees of freedom each and 8 nodal points, in which shear deformations are considered, and superelements, whose internal degrees of freedom are condensed, are used. Arbitrary shells can be analyzed too. A brief exposition of the method is presented and the development of the foregoing element and superelement is also shown. A computer program was developed for the CDC-CYBER 175 computer in the FORTRAN IV language. The coefficients of the equation system are stored by the technique of block partitioning with a compacted column storage scheme, and special attention was dedicated to the preparation of the problem's data; some options were developed for this purpose. (Author) [pt

  7. Energy Systems in the Era of Energy Vectors A Key to Define, Analyze and Design Energy Systems Beyond Fossil Fuels

    CERN Document Server

    Orecchini, Fabio

    2012-01-01

    What lies beyond the era of fossil fuels? While most answers focus on different primary energy resources, Energy Systems in the Era of Energy Vectors provides a completely new approach. Instead of providing a traditional consumption analysis of classical primary energy resources such as oil, coal, nuclear power and gas, Energy Systems in the Era of Energy Vectors describes and assesses energy technologies, markets and future strategies, focusing on their capacity to produce, exchange, and use energy vectors. Special attention is given to the renewable energy resources available in different areas of the world and made exploitable by the integration of energy vectors in the global energy system. Clear definitions of energy vectors and energy systems are used as the basis for a complete explanation and assessment of up-to-date, available technologies for energy resources, transport and storage systems, conversion and use. The energy vectors scheme allows the potential realisation of a worldwide sustainable ener...

  8. Recommendation Systems for Geoscience Data Portals Built by Analyzing Usage Patterns

    Science.gov (United States)

    Crosby, C.; Nandigam, V.; Baru, C.

    2009-04-01

    selections. However, this paradigm has not yet been explored for geoscience data portals. In this presentation we will present an initial analysis of user interaction and access statistics for the GEON OpenTopography LiDAR data distribution and processing system to illustrate what they reveal about user's spatial and temporal data access patterns, data processing parameter selections, and pathways through the data portal. We also demonstrate what these usage statistics can illustrate about aspects of the data sets that are of greatest interest. Finally, we explore how these usage statistics could be used to improve the user's experience in the data portal and to optimize how data access interfaces and tools are designed and implemented.
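The "users who accessed X also accessed Y" paradigm that such usage logs enable can be sketched with a toy co-occurrence recommender. The session data and dataset names below are invented for illustration and are not OpenTopography's:

```python
from collections import Counter
from itertools import combinations

# Toy sketch of a usage-pattern recommender: count which datasets/tools
# co-occur within the same user session, then recommend the most frequent
# companions of a given item. Sessions below are invented examples.

sessions = [
    {"lidar_A", "dem_tool", "hillshade"},
    {"lidar_A", "dem_tool"},
    {"lidar_B", "hillshade"},
    {"lidar_A", "hillshade"},
]

co = Counter()
for s in sessions:
    for a, b in combinations(sorted(s), 2):
        co[(a, b)] += 1
        co[(b, a)] += 1

def recommend(item, k=2):
    """Return up to k items most often co-accessed with `item`."""
    ranked = [(pair[1], n) for pair, n in co.items() if pair[0] == item]
    return [name for name, _ in sorted(ranked, key=lambda t: -t[1])[:k]]

print(recommend("lidar_A"))   # e.g. ['dem_tool', 'hillshade']
```

A production portal would add time decay and spatial filtering, but the core signal is this same co-access count.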

  9. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Science.gov (United States)

    Thomas, Philipp; Matuschek, Hannes; Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system GiNaC with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlation coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with
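The kind of result the system size expansion delivers can be seen on the simplest reaction scheme, a birth-death process. This is a generic sketch, not iNA's interface: for this linear system the Linear Noise Approximation is exact, predicting steady-state mean and variance both equal to k/g, which a Gillespie-style simulation can confirm.

```python
import random

# Generic sketch (not iNA's API): birth-death process with production
# rate k and degradation propensity g*n. The Linear Noise Approximation
# is exact here: steady-state mean = variance = k/g (a Poisson state).

def gillespie_birth_death(k, g, t_end, seed=1):
    random.seed(seed)
    t, n = 0.0, 0
    samples = []
    while t < t_end:
        a_birth, a_death = k, g * n
        a_tot = a_birth + a_death
        t += random.expovariate(a_tot)       # time to next reaction
        if random.random() < a_birth / a_tot:
            n += 1                           # birth
        else:
            n -= 1                           # death
        if t > t_end / 2:                    # discard the transient
            samples.append(n)
    return samples

k, g = 50.0, 1.0
s = gillespie_birth_death(k, g, 2000.0)
mean = sum(s) / len(s)
var = sum((x - mean) ** 2 for x in s) / len(s)
print(mean, var)   # both should be close to k/g = 50
```

For nonlinear kinetics (e.g. the enzyme systems mentioned above) the LNA is only approximate, which is where tools like iNA earn their keep.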

  10. Method and apparatus for automated processing and aliquoting of whole blood samples for analysis in a centrifugal fast analyzer

    Science.gov (United States)

    Burtis, C.A.; Johnson, W.F.; Walker, W.A.

    1985-08-05

    A rotor and disc assembly for use in a centrifugal fast analyzer. The assembly is designed to process multiple samples of whole blood followed by aliquoting of the resultant serum into precisely measured samples for subsequent chemical analysis. The assembly requires minimal operator involvement with no mechanical pipetting. The system comprises: (1) a whole blood sample disc; (2) a serum sample disc; (3) a sample preparation rotor; and (4) an analytical rotor. The blood sample disc and serum sample disc are designed with a plurality of precision bore capillary tubes arranged in a spoked array. Samples of blood are loaded into the blood sample disc by capillary action and centrifugally discharged into cavities of the sample preparation rotor where separation of serum and solids is accomplished. The serum is loaded into the capillaries of the serum sample disc by capillary action and subsequently centrifugally expelled into cuvettes of the analyticaly rotor for conventional methods. 5 figs.

  11. Advances in analysis of pre-earthquake thermal anomalies by analyzing IR satellite data

    Science.gov (United States)

    Ouzounov, D.; Bryant, N.; Filizzola, C.; Pergola, N.; Taylor, P.; Tramutoli, V.

    The presented work addresses the possible relationship between tectonic stress, electro-chemical and thermodynamic processes in the atmosphere, and increasing infrared (IR) flux as part of a larger family of electromagnetic (EM) phenomena related to earthquake activity. Thermal infra-red (TIR) surveys performed by polar orbiting (NOAA/AVHRR, MODIS) and geosynchronous weather satellites (GOES, METEOSAT) seem to indicate the appearance (from days to weeks before the event) of "anomalous" space-time TIR transients associated with the place (epicentral area, linear structures and fault systems) and the time of occurrence of a number of major earthquakes with M>5 and focal depths no deeper than 50 km. Because the Earth emits in the 8-14 micron range, the TIR signal measured from satellite varies strongly with meteorological conditions and other factors independent of seismic activity (space-time changes in atmospheric transmittance, time/season, solar and satellite zenith angles, etc.), so a preliminary definition of an "anomalous TIR signal" must be given. To provide reliable discrimination of thermally anomalous areas from natural events (seasonal changes, local morphology), a new robust approach (RAT) has recently been proposed (and successfully applied in the field of monitoring major environmental risks) that permits a statistically based definition of a thermal infra-red (TIR) anomaly and reduces false event detection. New techniques were also specifically developed to assure the precise co-registration of all satellite scenes and permit accurate time-series analysis of satellite observations. As final results, we present examples from the most recent (2000-2004) worldwide strong earthquakes, the techniques used to capture the tracks of mid-IR thermal emission anomalies, and a methodology for the practical future use of such phenomena in early warning systems.
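A statistically based anomaly definition of the kind invoked above can be sketched as a standardized deviation of a pixel's current value from its historical mean for the same place and time of year. The threshold and numbers below are illustrative assumptions, not the RAT implementation:

```python
import statistics

# Sketch of a statistically based TIR anomaly definition (in the spirit
# of RAT, not its actual implementation): standardize the current pixel
# value against the historical mean/stdev and flag large deviations.

def tir_anomaly_index(current, history):
    mu = statistics.mean(history)
    sigma = statistics.stdev(history)
    return (current - mu) / sigma

history = [290.1, 291.0, 289.7, 290.5, 290.2, 289.9]   # multi-year values (K)
index = tir_anomaly_index(293.5, history)
print(index > 2.0)   # flagged as anomalous beyond an assumed 2-sigma threshold
```

Building the per-pixel, per-time-of-year history is exactly why the precise scene co-registration mentioned above matters.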

  12. A computer program integrating a multichannel analyzer with gamma analysis for the estimation of 226 Ra concentration in soil samples

    International Nuclear Information System (INIS)

    Wilson, J. E.

    1992-08-01

    A new hardware/software system has been implemented using the existing three-regions-of-interest method for determining the concentration of 226 Ra in soil samples for the Pollutant Assessment Group of the Oak Ridge National Laboratory. The system consists of a personal computer containing a multichannel analyzer and runs a new program that combines multichannel analysis with gamma-ray spectrum analysis for 226 Ra concentrations. This program uses a menu interface to minimize and simplify the tasks of system operation
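A minimal sketch of a regions-of-interest analysis, assuming each net peak area is obtained by subtracting a flat continuum estimated from channels flanking the ROI; the channel ranges and calibration factor are illustrative, not the values used by the ORNL program:

```python
# Hypothetical sketch of a three-regions-of-interest analysis: net peak
# area = gross ROI counts minus a flat continuum estimated from channels
# flanking the ROI. Channel ranges and the calibration factor below are
# illustrative assumptions.

def net_roi_counts(spectrum, lo, hi, bg_width=3):
    """Net counts in channels [lo, hi] after subtracting a continuum
    estimated from bg_width channels on each side of the ROI."""
    gross = sum(spectrum[lo:hi + 1])
    baseline = sum(spectrum[lo - bg_width:lo]) + sum(spectrum[hi + 1:hi + 1 + bg_width])
    return gross - baseline / (2 * bg_width) * (hi - lo + 1)

def ra226_concentration(spectrum, rois, cal_factor):
    """Sum the net areas of the ROIs and scale by an assumed
    efficiency/calibration factor to get a concentration."""
    return cal_factor * sum(net_roi_counts(spectrum, lo, hi) for lo, hi in rois)

# Synthetic spectrum: flat continuum of 10 counts plus a 150-count peak
spectrum = [10.0] * 100
for ch in (20, 21, 22):
    spectrum[ch] += 50.0
result = ra226_concentration(spectrum, [(20, 22)], cal_factor=0.01)
print(result)   # -> 1.5
```

In practice the three ROIs would sit on the gamma lines used for 226 Ra, with the calibration factor folding in detector efficiency, counting time, and sample mass.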

  13. VENTILATION TECHNOLOGY SYSTEMS ANALYSIS

    Science.gov (United States)

    The report gives results of a project to develop a systems analysis of ventilation technology and provide a state-of-the-art assessment of ventilation and indoor air quality (IAQ) research needs. (NOTE: Ventilation technology is defined as the hardware necessary to bring outdoor ...

  14. The sequence coding and search system: an approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1990-01-01

    The U.S. Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data is essential for performing safety and reliability analyses, especially analyses of trends and patterns to identify undesirable changes in plant performance at the earliest opportunity to implement corrective measures to preclude the occurrence of a more serious event. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objectives of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system from the initial developmental stage to the point of a mature operating system are highlighted. Considerable experience has been gained in the areas of evolving and changing data requirements, staffing requirements, and quality control and quality assurance procedures for addressing consistency, software/hardware considerations for developing and maintaining a complex system, documentation requirements, and end-user needs. Two other approaches for constructing and evaluating event sequences are examined including the Accident Precursor Program (ASP) where sequences having the potential for core damage are identified and analyzed, and the Significant Event Compilation Tree

  15. Centrifugal analyzer development

    International Nuclear Information System (INIS)

    Burtis, C.A.; Bauer, M.L.; Bostick, W.D.

    1976-01-01

    The development of the centrifugal fast analyzer (CFA) is reviewed. The development of a miniature CFA with computer data analysis is reported, and applications for automated diagnostic chemical and hematological assays are discussed. A portable CFA system with a microprocessor was adapted for field assays of air and water samples for environmental pollutants, including ammonia, nitrates, nitrites, phosphates, sulfates, and silica. 83 references

  16. Comparison of fabric analysis of snow samples by Computer-Integrated Polarization Microscopy and Automatic Ice Texture Analyzer

    Science.gov (United States)

    Leisinger, Sabine; Montagnat, Maurine; Heilbronner, Renée; Schneebeli, Martin

    2014-05-01

    out by the AITA. This device makes a fast and precise measurement of the fabric of many small ice crystals. The high spatial resolution provided by the AITA makes it also practical for snow. The basic principle is actually the same as in CIP. But variations in the set up and differences in the data treatment process make a comparison of the two methods very valuable. However, the comparably large investment in the AITA can make the use of CIP attractive. Here, we compare identical snow and firn samples from both the AITA and CIP to find out differences in ease of operation and quality of the fabric analysis. We prepared snow and firn thin sections and analyzed them by the CIP method as well as by the fully automated AITA device. It will be interesting to directly compare the two results of c-axis orientations. The comparison developed here allows choosing either method based on objective criteria. References: Panozzo Heilbronner, R. and Pauli, C. (1993). Integrated spatial and orientation analysis of quartz c-axes by computer-aided microscopy. J. Struct. Geol., 15(3-5), 369-382. Wilson, Ch.J.L., D.S. Russell-Head and H.M. Sim (1993). The application of an automated fabric analyzer system to the textural evolution of folded ice layers in shear zones. Annals of Glaciology, 37(1), 7-17.

  17. Using a Remotely Piloted Aircraft System (RPAS) to analyze the stability of a natural rock slope

    Science.gov (United States)

    Salvini, Riccardo; Esposito, Giuseppe; Mastrorocco, Giovanni; Seddaiu, Marcello

    2016-04-01

    This paper describes the application of a rotary-wing RPAS for monitoring the stability of a natural rock slope in the municipality of Vecchiano (Pisa, Italy). The slope under investigation is approximately oriented NNW-SSE and has a length of about 320 m; elevation ranges from about 7 to 80 m a.s.l. The hill consists of stratified limestone, in places densely fractured, with dip direction predominantly oriented normal to the slope. Fracture traces of variable length, from decimeters to meters, penetrate into the rock face to a depth that is difficult to estimate, often exceeding one meter. The intersection between different fracture systems and the slope surface generates rocky blocks and wedges of variable size that may be subject to gravitational instability (with reference to variations in hydraulic and dynamic conditions). Geometrical and structural information about the rock mass, necessary to perform the slope stability analysis, was obtained in this work from geo-referenced 3D point clouds acquired using photogrammetric and laser scanning techniques. In particular, a terrestrial laser scan was carried out from two different points of view using a Leica Scanstation2. The laser survey created many shadows in the data due to the presence of vegetation in the lower parts of the slope, limiting the feasibility of the geo-structural survey. To overcome this limitation, we utilized a rotary-wing Aibotix Aibot X6 RPAS equipped with a Nikon D3200 camera. The drone flights were executed in manual mode and the images were acquired, according to the characteristics of the outcrops, under different acquisition angles. Furthermore, photos were captured very close to the slope face (a few meters), allowing a dense 3D point cloud (about 80 million points) to be produced by image processing. A topographic survey was carried out in order to guarantee the necessary spatial accuracy to the process of images exterior

  18. Analyzing Public Discourse: Using Media Content Analysis to Understand the Policy Process

    Science.gov (United States)

    Saraisky, Nancy Green

    2016-01-01

    One of the most basic and obvious sources of data for education policy analysis is text. This article discusses content analysis as an important part of the methodological toolbox for elucidating patterns and trends about education policy. Focusing specifically on media, I show how media content analysis can produce nuanced insights about the ways…

  19. Collection Analysis: Powerful Ways To Collect, Analyze, and Present Your Data.

    Science.gov (United States)

    Hart, Amy

    2003-01-01

    Discussion of collection analysis in school libraries focuses on the kinds of data used and how to use library automation software to collect the data. Describes the use of Microsoft Excel and its chart-making capabilities to enhance the presentation of the analysis and suggests ways to use collection analysis output. (LRW)

  20. Capillary Electrophoresis Analysis of Organic Amines and Amino Acids in Saline and Acidic Samples Using the Mars Organic Analyzer

    Science.gov (United States)

    Stockton, Amanda M.; Chiesl, Thomas N.; Lowenstein, Tim K.; Amashukeli, Xenia; Grunthaner, Frank; Mathies, Richard A.

    2009-11-01

    The Mars Organic Analyzer (MOA) has enabled the sensitive detection of amino acid and amine biomarkers in laboratory standards and in a variety of field sample tests. However, the MOA is challenged when samples are extremely acidic and saline or contain polyvalent cations. Here, we have optimized the MOA analysis, sample labeling, and sample dilution buffers to handle such challenging samples more robustly. Higher ionic strength buffer systems with pKa values near pH 9 were developed to provide better buffering capacity and salt tolerance. The addition of ethylenediaminetetraacetic acid (EDTA) ameliorates the negative effects of multivalent cations. The optimized protocol utilizes a 75 mM borate buffer (pH 9.5) for Pacific Blue labeling of amines and amino acids. After labeling, 50 mM (final concentration) EDTA is added to samples containing divalent cations to ameliorate their effects. This optimized protocol was used to successfully analyze amino acids in a saturated brine sample from Saline Valley, California, and a subcritical water extract of a highly acidic sample from the Río Tinto, Spain. This work expands the analytical capabilities of the MOA and increases its sensitivity and robustness for samples from extraterrestrial environments that may exhibit pH and salt extremes as well as metal ions.

  1. Composite waste analysis system

    International Nuclear Information System (INIS)

    Wachter, J.R.; Hagan, R.C.; Bonner, C.A.; Malcom, J.E.; Camp, K.L.

    1993-01-01

    Nondestructive analysis (NDA) of radioactive waste forms an integral component of nuclear materials accountability programs and waste characterization acceptance criteria. However, waste measurements are often complicated by unknown isotopic compositions and the potential for concealment of special nuclear materials in a manner that is transparent to gamma-ray measurement instruments. To overcome these complications, a new NDA measurement system has been developed to assay special nuclear material in both transuranic and low-level waste from the same measurement platform. The system incorporates a NaI detector and customized commercial software routines to measure small quantities of radioactive material in low-level waste. Transuranic waste analysis is performed with a coaxial HPGe detector and uses upgraded PC-based segmented gamma scanner software to assay containers up to 55 gal. in volume. Gamma-ray isotopic analysis of both waste forms is also performed with this detector. Finally, a small neutron counter using specialized software is attached to the measurement platform to satisfy safeguards concerns related to nuclear materials that are not sensed by the gamma-ray instruments. This report describes important features and capabilities of the system and presents a series of test measurements that are to be performed to define system parameters

  2. Network systems security analysis

    Science.gov (United States)

    Yilmaz, Ä.°smail

    2015-05-01

    Network systems security analysis has utmost importance in today's world. Many companies, like banks which give priority to data management, test their own data security systems with "penetration tests" from time to time. In this context, companies must also test their own network/server systems and take precautions, as data security draws attention. Based on this idea, in this study cyber-attacks are researched thoroughly and penetration test techniques are examined. With this information, the cyber-attacks are classified and the network systems' security is then tested systematically. After the testing period, all data are reported and filed for future reference. Consequently, it is found that human beings are the weakest link in the chain and that simple mistakes may unintentionally cause huge problems. Thus, it is clear that some precautions must be taken to avoid such threats, like updating security software.

  3. Analysis And Control System For Automated Welding

    Science.gov (United States)

    Powell, Bradley W.; Burroughs, Ivan A.; Kennedy, Larry Z.; Rodgers, Michael H.; Goode, K. Wayne

    1994-01-01

    Automated variable-polarity plasma arc (VPPA) welding apparatus operates under electronic supervision by welding analysis and control system. System performs all major monitoring and controlling functions. It acquires, analyzes, and displays weld-quality data in real time and adjusts process parameters accordingly. Also records pertinent data for use in post-weld analysis and documentation of quality. System includes optoelectronic sensors and data processors that provide feedback control of welding process.

  4. Application of factor analysis to chemically analyzed data in environmental samples after x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    El-Sayed, A.A.

    2005-01-01

    The underlying principle of factor analysis is the frequency distribution and description of relationships between and among the elements in specific environmental samples. This factor analysis was applied to interpret the variance and covariance of certain elements (Si, Al, Ca, K, Fe, Ti and Mg) in three different types of common environmental materials: sediments, soil, and rock. These evaluations were performed after X-ray fluorescence measurements. The results of the factorial statistical analysis show that three main factors account for the relationships between the above elements in a given type of environmental sample. These factors represent the main explanation of the findings and interpret all hidden relationships between the chemically analyzed data. Factor one, the effect of weathering-type alteration and oxidation processes, is the main factor in soil and rock, which are characterized by the close covariance of a group of metals, like iron and manganese, commonly derived from weathered and altered igneous rocks. Factors two and three represent other processes. In soil, the formation of alumino-silicates is revealed in factor two by the positive covariance of these elements, and the joint presence of aluminum oxide, titanium oxide and silicon dioxide is likewise explained by these positive values. The inverse relation between Ca, K, Fe and Mg indicates the presence of mineral salts, which may be due to fertilization and irrigation water. Factor three in that soil is the weakest factor that can be used to explain the relationship between the above elements
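The workflow described above can be sketched with a generic factor analysis on a samples-by-elements matrix. The data below are synthetic and the two hidden factors are labeled purely for illustration; this is not the study's dataset:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative sketch: rows = samples, columns = element concentrations
# (Si, Al, Ca, K, Fe, Ti, Mg). Two synthetic latent factors generate the
# data, so the fitted loadings should recover which elements co-vary
# (including the inverse Ca/K relation built in below).

rng = np.random.default_rng(0)
n = 60
weathering = rng.normal(size=n)        # hidden factor 1 (Fe-Ti-Mg group)
silicate = rng.normal(size=n)          # hidden factor 2 (Si-Al group)
noise = lambda: 0.1 * rng.normal(size=n)

X = np.column_stack([
    silicate + noise(),                # Si
    silicate + noise(),                # Al
    -weathering + noise(),             # Ca (inverse relation)
    -weathering + noise(),             # K
    weathering + noise(),              # Fe
    weathering + noise(),              # Ti
    weathering + noise(),              # Mg
])

fa = FactorAnalysis(n_components=2).fit(X)
loadings = fa.components_              # shape (2, 7): factor loadings
print(np.round(loadings, 2))
```

Reading the loadings column by column shows which elements load on the same factor and with which sign, which is exactly the interpretive step the abstract describes.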

  5. Analysis of the sensitivity and sample-furnace thermal-lag of a differential thermal analyzer

    International Nuclear Information System (INIS)

    Roura, P.; Farjas, J.

    2005-01-01

    The heat exchange between the horizontal furnace of a differential thermal analyzer (DTA) and the sample is analyzed with the aim of understanding the parameters governing the thermal signal. The resistance due to radiation and conduction through the gas has been calculated and compared to the experimental values of the sample-furnace thermal lag and the apparatus sensitivity. The overall evolution of these parameters with temperature and their relative values are well understood by considering the temperature differences that arise between the sample and holder. Two RC thermal models are used to describe the apparatus performance in different temperature ranges. Finally, the possibility of improving the signal quality through control of the leak resistances is stressed
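The first-order RC behavior invoked above can be sketched as follows: assuming the sample (heat capacity C) exchanges heat with the furnace through a single thermal resistance R, then C dTs/dt = (Tf - Ts)/R, and a constant ramp beta produces a steady-state lag of beta*R*C. All numerical values are made up for illustration:

```python
# Illustrative first-order RC model of the DTA thermal lag: the sample
# couples to the furnace through thermal resistance R, so
# C*dTs/dt = (Tf - Ts)/R, and a constant ramp beta yields a steady-state
# lag Tf - Ts = beta*R*C. Values below are invented.

def simulate_lag(R, C, beta, t_end, dt=0.01):
    tau = R * C
    Ts, t = 0.0, 0.0
    while t < t_end:
        Tf = beta * t                  # furnace follows the programmed ramp
        Ts += dt * (Tf - Ts) / tau     # explicit Euler step for the sample
        t += dt
    return beta * t - Ts               # lag at the end of the run

R, C = 50.0, 0.4                       # K/W and J/K (illustrative)
beta = 10.0 / 60.0                     # 10 K/min ramp, in K/s
lag = simulate_lag(R, C, beta, t_end=200.0)
print(lag, beta * R * C)               # simulated lag approaches beta*R*C
```

This is why the abstract's two RC models matter: as the radiative/conductive resistance R changes with temperature, so do both the lag and the sensitivity.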

  6. Analyzing inter-organizational systems from a power and interest perspective

    NARCIS (Netherlands)

    Boonstra, A.; de Vries, J.

    2005-01-01

    Inter-organizational systems (IOS) are Information and Communication Technology (ICT)-based systems that enable organizations to share information and to electronically conduct business across organizational boundaries. Especially since the increasing availability of the Internet, there have been

  7. Analyzing the State of Static Analysis : A Large-Scale Evaluation in Open Source Software

    NARCIS (Netherlands)

    Beller, M.; Bholanath, R.; McIntosh, S.; Zaidman, A.E.

    2016-01-01

    The use of automatic static analysis has been a software engineering best practice for decades. However, we still do not know a lot about its use in real-world software projects: How prevalent is the use of Automated Static Analysis Tools (ASATs) such as FindBugs and JSHint? How do developers use

  8. Applicability of supervised discriminant analysis models to analyze astigmatism clinical trial data.

    Science.gov (United States)

    Sedghipour, Mohammad Reza; Sadeghi-Bazargani, Homayoun

    2012-01-01

    In astigmatism clinical trials where more complex measurements are common, especially in nonrandomized small sized clinical trials, there is a demand for the development and application of newer statistical methods. The source data belonged to a project on astigmatism treatment. Data were used regarding a total of 296 eyes undergoing different astigmatism treatment modalities: wavefront-guided photorefractive keratectomy, cross-cylinder photorefractive keratectomy, and monotoric (single) photorefractive keratectomy. Astigmatism analysis was primarily done using the Alpins method. Prior to fitting partial least squares regression discriminant analysis, a preliminary principal component analysis was done for data overview. Through fitting the partial least squares regression discriminant analysis statistical method, various model validity and predictability measures were assessed. The model found the patients treated by the wavefront method to be different from the two other treatments both in baseline and outcome measures. Also, the model found that patients treated with the cross-cylinder method versus the single method didn't appear to be different from each other. This analysis provided an opportunity to compare the three methods while including a substantial number of baseline and outcome variables. Partial least squares regression discriminant analysis had applicability for the statistical analysis of astigmatism clinical trials and it may be used as an adjunct or alternative analysis method in small sized clinical trials.
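    As a rough sketch of the statistical method named in this record, a one-component PLS discriminant analysis can be written in a few lines. The two synthetic "treatment groups" and their five variables below are hypothetical; a real PLS-DA would use more components and the validity and predictability measures the authors assessed.

```python
import numpy as np

# One-component PLS-DA sketch on synthetic two-group data (assumed, not trial data).
rng = np.random.default_rng(1)
n_per_group = 40
X0 = rng.normal(0.0, 1.0, size=(n_per_group, 5))   # e.g. "cross-cylinder" group
X1 = rng.normal(1.0, 1.0, size=(n_per_group, 5))   # e.g. "wavefront" group
X = np.vstack([X0, X1])
y = np.array([0] * n_per_group + [1] * n_per_group, dtype=float)

# Center predictors and the 0/1 group response.
Xc = X - X.mean(0)
yc = y - y.mean()

# First PLS weight vector: the direction of maximal covariance with the response.
w = Xc.T @ yc
w /= np.linalg.norm(w)
t = Xc @ w                 # scores on the discriminant component

# Classify by the sign of the score (response is centered).
pred = (t > 0).astype(float)
accuracy = (pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

    Plotting the scores `t` per group is the usual PLS-DA overview; groups that overlap on the component, like the cross-cylinder versus single comparison above, do not separate.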

  9. Development and Operation of Dual-Mode Analyzers for Wireless Power Consortium/Power Matters Alliance Wireless Power Systems.

    Science.gov (United States)

    Um, Keehong

    2016-05-01

    We have designed a protocol analyzer for wireless power systems and analyzed the operation of wireless chargers defined by the Qi standard of the Wireless Power Consortium (WPC) and by the Power Matters Alliance (PMA) protocol. The integrated circuits (ICs, or microchips) developed so far for wireless power transmission are not easily adopted by chargers built for specific purposes, and currently available test equipment requires a measurement device that can convert and extend the supported protocol types. Since a protocol analyzer with these functions is required, we have developed a device that can analyze the two protocols, WPC and PMA, at the same time. As a result of our research, we present a dual-mode system that can analyze the protocols of both WPC and PMA.

  10. Stochastic Reachability Analysis of Hybrid Systems

    CERN Document Server

    Bujorianu, Luminita Manuela

    2012-01-01

    Stochastic reachability analysis (SRA) is a method of analyzing the behavior of control systems which mix discrete and continuous dynamics. For probabilistic discrete systems it has been shown to be a practical verification method but for stochastic hybrid systems it can be rather more. As a verification technique SRA can assess the safety and performance of, for example, autonomous systems, robot and aircraft path planning and multi-agent coordination but it can also be used for the adaptive control of such systems. Stochastic Reachability Analysis of Hybrid Systems is a self-contained and accessible introduction to this novel topic in the analysis and development of stochastic hybrid systems. Beginning with the relevant aspects of Markov models and introducing stochastic hybrid systems, the book then moves on to coverage of reachability analysis for stochastic hybrid systems. Following this build up, the core of the text first formally defines the concept of reachability in the stochastic framework and then...
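    The core computation behind stochastic reachability can be illustrated with a plain Monte-Carlo estimate: simulate many trajectories of a small stochastic hybrid system and count how often they enter an unsafe set within a time horizon. All dynamics, thresholds and noise levels below are toy assumptions, not an example from the book.

```python
import random

# Monte-Carlo reachability sketch for a toy stochastic hybrid system:
# one continuous state x, two discrete modes with different drifts,
# Gaussian process noise, and random mode switches (all values assumed).
def reach_probability(n_trials=5000, horizon=50, unsafe=5.0, seed=7):
    random.seed(seed)
    hits = 0
    for _ in range(n_trials):
        x, mode = 0.0, 0
        for _ in range(horizon):
            drift = 0.05 if mode == 0 else 0.15   # mode-dependent drift
            x += drift + random.gauss(0.0, 0.3)   # noisy continuous update
            if random.random() < 0.05:            # spontaneous mode switch
                mode = 1 - mode
            if x >= unsafe:                       # trajectory reached unsafe set
                hits += 1
                break
    return hits / n_trials

p = reach_probability()
print(f"estimated reachability probability: {p:.3f}")
```

    Formal SRA replaces this sampling estimate with analysis of the underlying Markov model, but the quantity being computed is the same reach probability.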

  11. Applicability of supervised discriminant analysis models to analyze astigmatism clinical trial data

    Directory of Open Access Journals (Sweden)

    Sedghipour MR

    2012-09-01

    Full Text Available Mohammad Reza Sedghipour,1 Homayoun Sadeghi-Bazargani2,3 (1Nikoukari Ophthalmology University Hospital, Tabriz, Iran; 2Department of Statistics and Epidemiology, Neuroscience Research Center, Tabriz University of Medical Sciences, Tabriz, Iran; 3Department of Public Health Sciences, Karolinska Institute, Stockholm, Sweden). Background: In astigmatism clinical trials where more complex measurements are common, especially in nonrandomized small sized clinical trials, there is a demand for the development and application of newer statistical methods. Methods: The source data belonged to a project on astigmatism treatment. Data were used regarding a total of 296 eyes undergoing different astigmatism treatment modalities: wavefront-guided photorefractive keratectomy, cross-cylinder photorefractive keratectomy, and monotoric (single) photorefractive keratectomy. Astigmatism analysis was primarily done using the Alpins method. Prior to fitting partial least squares regression discriminant analysis, a preliminary principal component analysis was done for data overview. Through fitting the partial least squares regression discriminant analysis statistical method, various model validity and predictability measures were assessed. Results: The model found the patients treated by the wavefront method to be different from the two other treatments both in baseline and outcome measures. Also, the model found that patients treated with the cross-cylinder method versus the single method didn't appear to be different from each other. This analysis provided an opportunity to compare the three methods while including a substantial number of baseline and outcome variables. Conclusion: Partial least squares regression discriminant analysis had applicability for the statistical analysis of astigmatism clinical trials and it may be used as an adjunct or alternative analysis method in small sized clinical trials. Keywords: astigmatism, regression, partial least squares regression

  12. Interactive visualization system to analyze corrugated millimeter-waveguide component of ECH in nuclear fusion with FDTD simulation

    International Nuclear Information System (INIS)

    Kashima, N; Nakamura, H; Kubo, S; Tamura, Y; Ito, A M

    2014-01-01

    We have simulated the distribution of electromagnetic waves through a system composed of miter bends using Finite-Difference Time-Domain (FDTD) simulation. To analyze the FDTD results, we have developed an interactive visualization system based on a new interactive GUI that combines a virtual reality system with an Android tablet. The effect of grooves in the waveguide system has been investigated quantitatively with this visualization system. Comparing the waveguide system with and without grooves, the grooves were confirmed to suppress the surface current at the metal surface. The surface current at complex shapes such as the miter bend has also been investigated
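    A minimal 1D analogue of the FDTD field update used in such simulations looks as follows. This is a normalized Yee scheme on a toy grid; the grid size, source and boundary choices are illustrative, not the authors' waveguide setup.

```python
import numpy as np

# Bare-bones 1D FDTD (normalized units, Courant number 1).
nz, nt = 200, 150
ez = np.zeros(nz)        # electric field on integer grid points
hy = np.zeros(nz - 1)    # magnetic field on staggered half-cells

for n in range(nt):
    hy += np.diff(ez)        # H update from the curl of E
    ez[1:-1] += np.diff(hy)  # E update; end points held at zero (PEC walls)
    # Soft Gaussian source injected at the grid center.
    ez[100] += np.exp(-((n - 30) ** 2) / 100.0)

print(f"peak |Ez| after {nt} steps: {np.abs(ez).max():.3f}")
```

    Full 3D solvers of the kind described above apply the same leapfrog E/H updates over a 3D staggered grid, with the miter-bend and groove geometry entering through the material boundaries.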

  13. Arctic Climate Systems Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ivey, Mark D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Robinson, David G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Boslough, Mark B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Backus, George A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Peterson, Kara J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); van Bloemen Waanders, Bart G. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Swiler, Laura Painton [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Desilets, Darin Maurice [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reinert, Rhonda Karen [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-03-01

    This study began with a challenge from program area managers at Sandia National Laboratories to technical staff in the energy, climate, and infrastructure security areas: apply a systems-level perspective to existing science and technology program areas in order to determine technology gaps, identify new technical capabilities at Sandia that could be applied to these areas, and identify opportunities for innovation. The Arctic was selected as one of these areas for systems level analyses, and this report documents the results. In this study, an emphasis was placed on the arctic atmosphere since Sandia has been active in atmospheric research in the Arctic since 1997. This study begins with a discussion of the challenges and benefits of analyzing the Arctic as a system. It goes on to discuss current and future needs of the defense, scientific, energy, and intelligence communities for more comprehensive data products related to the Arctic; assess the current state of atmospheric measurement resources available for the Arctic; and explain how the capabilities at Sandia National Laboratories can be used to address the identified technological, data, and modeling needs of the defense, scientific, energy, and intelligence communities for Arctic support.

  14. An integrated simulation tool for analyzing the Operation and Interdependency of Natural Gas and Electric Power Systems

    OpenAIRE

    PAMBOUR Kwabena A.; CAKIR BURCIN; BOLADO LAVIN Ricardo; DIJKEMA Gerard

    2016-01-01

    In this paper, we present an integrated simulation tool for analyzing the interdependency of natural gas and electric power systems in terms of security of energy supply. In the first part, we develop mathematical models for the individual systems. In part two, we identify the interconnections between both systems and propose a method for coupling the combined simulation model. Next, we develop the algorithm for solving the combined system and integrate this algorithm into a simulation softwa...

  15. Analyzing Data Generated Through Deliberative Dialogue: Bringing Knowledge Translation Into Qualitative Analysis.

    Science.gov (United States)

    Plamondon, Katrina M; Bottorff, Joan L; Cole, Donald C

    2015-11-01

    Deliberative dialogue (DD) is a knowledge translation strategy that can serve to generate rich data and bridge health research with action. An intriguing alternative to other modes of generating data, the purposeful and evidence-informed conversations characteristic of DD generate data inclusive of collective interpretations. These data are thus dialogic, presenting complex challenges for qualitative analysis. In this article, we discuss the nature of data generated through DD, orienting ourselves toward a theoretically grounded approach to analysis. We offer an integrated framework for analysis, balancing analytical strategies of categorizing and connecting with the use of empathetic and suspicious interpretive lenses. In this framework, data generation and analysis occur in concert, alongside engaging participants and synthesizing evidence. An example of application is provided, demonstrating nuances of the framework. We conclude with reflections on the strengths and limitations of the framework, suggesting how it may be relevant in other qualitative health approaches. © The Author(s) 2015.

  16. Intellectual capital statements on their way to the stock exchange: Analyzing new reporting systems

    DEFF Research Database (Denmark)

    Nielsen, Christian; Bukh, P.N.; Mouritsen, J.

    2006-01-01

    Purpose - The purpose of this paper is to propose and illustrate the use of a set of rules to make an analytical reading of the indicators of an intellectual capital statement possible. Design/methodology/approach - The paper proposes a model to analyze intellectual capital statements and applies...... demonstrates the use of a theoretically anchored and practical, useful model for analysing disclosure in the narrative part of a financial report....

  17. Analyzing the evolutionary mechanisms of the Air Transportation System-of-Systems using network theory and machine learning algorithms

    Science.gov (United States)

    Kotegawa, Tatsuya

    Complexity in the Air Transportation System (ATS) arises from the intermingling of many independent physical resources, operational paradigms, and stakeholder interests, as well as the dynamic variation of these interactions over time. Currently, trade-offs and cost benefit analyses of new ATS concepts are carried out on system-wide evaluation simulations driven by air traffic forecasts that assume fixed airline routes. However, this does not reflect reality well, as airlines regularly add and remove routes. An airline service route network evolution model that projects route addition and removal was created and combined with state-of-the-art air traffic forecast methods to better reflect the dynamic properties of the ATS in system-wide simulations. Guided by a system-of-systems framework, network theory metrics and machine learning algorithms were applied to develop the route network evolution models based on patterns extracted from historical data. Constructing the route addition section of the model posed the greatest challenge due to the large pool of new link candidates compared to the actual number of routes historically added to the network. Of the models explored, algorithms based on logistic regression, random forests, and support vector machines showed the best route addition and removal forecast accuracies, at approximately 20% and 40%, respectively, when validated with historical data. The combination of network evolution models and a system-wide evaluation tool quantified the impact of airline route network evolution on air traffic delay. The expected delay minutes when considering network evolution increased approximately 5% for a forecasted schedule on 3/19/2020. Performance trade-off studies between several airline route network topologies from the perspectives of passenger travel efficiency, fuel burn, and robustness were also conducted to provide bounds that could serve as targets for ATS transformation efforts. 
The series of analyses revealed that high
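    The route-addition modeling idea, scoring candidate links with a supervised classifier trained on network-theory features, can be sketched with a small logistic regression. The features, coefficients and data below are synthetic stand-ins, not the historical ATS data.

```python
import numpy as np

# Logistic-regression sketch of route-addition scoring: candidate city pairs
# described by two assumed network features, fitted by gradient ascent.
rng = np.random.default_rng(3)
n = 500
degree_product = rng.uniform(0, 1, n)    # normalized product of endpoint degrees
common_neighbors = rng.uniform(0, 1, n)  # normalized shared-neighbor count
logits = -2.0 + 2.5 * degree_product + 1.5 * common_neighbors
added = (rng.uniform(0, 1, n) < 1 / (1 + np.exp(-logits))).astype(float)

# Fit coefficients by gradient ascent on the logistic log-likelihood.
X = np.column_stack([np.ones(n), degree_product, common_neighbors])
beta = np.zeros(3)
for _ in range(2000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.1 * X.T @ (added - p) / n

p = 1 / (1 + np.exp(-X @ beta))
accuracy = ((p > 0.5) == added).mean()
print(f"in-sample accuracy: {accuracy:.2f}")
```

    Ranking the highest-probability unserved links is then the "route addition" step; the dissertation's random forests and support vector machines play the same scoring role with richer feature sets.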

  18. SSYST: A code-system for analyzing transient LWR fuel rod behaviour under off-normal conditions

    International Nuclear Information System (INIS)

    Borgwaldt, H.; Gulden, W.

    1983-01-01

    SSYST is a code-system for analyzing transient fuel rod behaviour under off-normal conditions, developed conjointly by the Institut für Kernenergetik und Energiesysteme (IKE), Stuttgart, and Kernforschungszentrum Karlsruhe (KfK) under contract of Projekt Nukleare Sicherheit (PNS) at KfK. The main differences between SSYST and similar codes are an open-ended modular code organization, and a preference for simple models, wherever possible. While the first feature makes SSYST a very flexible tool, easily adapted to changing requirements, the second feature leads to short execution times. The analysis of transient rod behaviour under LOCA boundary conditions takes 2 min cpu-time (IBM-3033), so that extensive parametric studies become possible. This paper gives an outline of the overall code organisation and a general overview of the physical models implemented. Besides explaining the routine application of SSYST in the analysis of loss-of-coolant accidents, examples are given of special applications which have led to a satisfactory understanding of the decisive influence of deviations from rotational symmetry on the fuel rod perimeter

  19. An anomaly analysis framework for database systems

    NARCIS (Netherlands)

    Vavilis, S.; Egner, A.I.; Petkovic, M.; Zannone, N.

    2015-01-01

    Anomaly detection systems are usually employed to monitor database activities in order to detect security incidents. These systems raise an alert when anomalous activities are detected. The raised alerts have to be analyzed to timely respond to the security incidents. Their analysis, however, is

  20. A novel approach for analyzing glass-transition temperature vs. composition patterns: application to pharmaceutical compound+polymer systems.

    Science.gov (United States)

    Kalogeras, Ioannis M

    2011-04-18

    In medicine, polymer-based materials are commonly used as excipients of poorly water-soluble drugs. The success of the encapsulation, as well as the physicochemical stability of the products, is often reflected on their glass transition temperature (T(g)) vs. composition (w) dependencies. The shape of the T(g)(w) patterns is critically influenced by polymer's molecular mass, drug molecule's shape and molecular volume, the type and degree of shielding of hydrogen-bonding capable functional groups, as well as aspects of the preparation process. By altering mixture's T(g) the amorphous solid form of the active ingredient may be retained at ambient or body temperatures, with concomitant improvements in handling, solubility, dissolution rate and oral bioavailability. Given the importance of the problem, the glass transitions observed in pharmaceutical mixtures have been extensively analyzed, aiming to appraise the state of mixing and intermolecular interactions. Here, accumulated experimental information on related systems is re-evaluated and comparably discussed under the light of a more effective and system-inclusive T(g)(w) equation. The present analysis indicates that free volume modifications and conformational changes of the macromolecular chains dominate, over enthalpic effects of mixing, in determining thermal characteristics and crystallization inhibition/retardation. Moreover, hydrogen-bonding and ion-dipole heterocontacts--although favorable of a higher degree of mixing--appear less significant compared to the steric hindrances and the antiplasticization proffered by the higher viscosity component. Copyright © 2011 Elsevier B.V. All rights reserved.
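    For orientation, the baseline against which such T(g)(w) patterns are usually discussed is the classical Gordon-Taylor equation. The sketch below uses illustrative component Tg and K values, not the system-inclusive equation proposed in the paper.

```python
# Gordon-Taylor sketch for the glass transition of a drug + polymer mixture.
# Component Tg values and the K parameter are illustrative assumptions.
def gordon_taylor(w_drug, tg_drug, tg_polymer, k):
    """Predict mixture Tg (in K) from the drug weight fraction w_drug."""
    w_poly = 1.0 - w_drug
    return (w_drug * tg_drug + k * w_poly * tg_polymer) / (w_drug + k * w_poly)

tg_mix = gordon_taylor(w_drug=0.3, tg_drug=310.0, tg_polymer=380.0, k=0.8)
print(f"predicted mixture Tg: {tg_mix:.1f} K")
```

    Deviations of measured Tg(w) data from this ideal curve are what signal the hydrogen-bonding, free-volume and antiplasticization effects discussed in the abstract.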

  1. Beyond utilitarianism: a method for analyzing competing ethical principles in a decision analysis of liver transplantation.

    Science.gov (United States)

    Volk, Michael L; Lok, Anna S F; Ubel, Peter A; Vijan, Sandeep

    2008-01-01

    The utilitarian foundation of decision analysis limits its usefulness for many social policy decisions. In this study, the authors examine a method to incorporate competing ethical principles in a decision analysis of liver transplantation for a patient with acute liver failure (ALF). A Markov model was constructed to compare the benefit of transplantation for a patient with ALF versus the harm caused to other patients on the waiting list and to determine the lowest acceptable 5-y posttransplant survival for the ALF patient. The weighting of the ALF patient and other patients was then adjusted using a multiattribute variable incorporating utilitarianism, urgency, and other principles such as fair chances. In the base-case analysis, the strategy of transplanting the ALF patient resulted in a 0.8% increase in the risk of death and a utility loss of 7.8 quality-adjusted days of life for each of the other patients on the waiting list. These harms cumulatively outweighed the benefit of transplantation for an ALF patient having a posttransplant survival of less than 48% at 5 y. However, the threshold for an acceptable posttransplant survival for the ALF patient ranged from 25% to 56% at 5 y, depending on the ethical principles involved. The results of the decision analysis vary depending on the ethical perspective. This study demonstrates how competing ethical principles can be numerically incorporated in a decision analysis.
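    The Markov cohort calculation underlying such a decision analysis can be reduced to a toy example: track the surviving fraction of a cohort through yearly cycles and accumulate quality-adjusted life years (QALYs). The transition probabilities and utilities below are illustrative assumptions, not the paper's calibrated inputs.

```python
# Toy Markov cohort model: alive/dead states, yearly cycles, QALY accrual.
def cohort_qalys(p_die_per_year, utility, years):
    alive = 1.0          # fraction of the cohort still alive
    qalys = 0.0
    for _ in range(years):
        alive *= (1.0 - p_die_per_year)  # annual transition to the dead state
        qalys += alive * utility         # QALYs accrued by survivors this cycle
    return qalys

with_tx = cohort_qalys(p_die_per_year=0.10, utility=0.8, years=5)
without_tx = cohort_qalys(p_die_per_year=0.30, utility=0.6, years=5)
benefit = with_tx - without_tx
print(f"incremental QALYs over 5 years: {benefit:.2f}")
```

    The paper's contribution is in how this utilitarian benefit is then weighed against the harm to waiting-list patients under different ethical weightings, which shifts the acceptable posttransplant survival threshold.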

  2. Analysis of the SIAM Infrared Acquisition System

    Energy Technology Data Exchange (ETDEWEB)

    Varnado, S.G.

    1974-02-01

    This report describes and presents the results of an analysis of the performance of the infrared acquisition system for a Self-Initiated Antiaircraft Missile (SIAM). A description of the optical system is included, and models of target radiant intensity, atmospheric transmission, and background radiance are given. Acquisition probabilities are expressed in terms of the system signal-to-noise ratio. System performance against aircraft and helicopter targets is analyzed, and background discrimination techniques are discussed. 17 refs., 22 figs., 6 tabs.

  3. Intrinsic noise analyzer: a software package for the exploration of stochastic biochemical kinetics using the system size expansion.

    Directory of Open Access Journals (Sweden)

    Philipp Thomas

    Full Text Available The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA, which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen's system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlations coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA's performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network

  4. Intrinsic Noise Analyzer: A Software Package for the Exploration of Stochastic Biochemical Kinetics Using the System Size Expansion

    Science.gov (United States)

    Grima, Ramon

    2012-01-01

    The accepted stochastic descriptions of biochemical dynamics under well-mixed conditions are given by the Chemical Master Equation and the Stochastic Simulation Algorithm, which are equivalent. The latter is a Monte-Carlo method, which, despite enjoying broad availability in a large number of existing software packages, is computationally expensive due to the huge amounts of ensemble averaging required for obtaining accurate statistical information. The former is a set of coupled differential-difference equations for the probability of the system being in any one of the possible mesoscopic states; these equations are typically computationally intractable because of the inherently large state space. Here we introduce the software package intrinsic Noise Analyzer (iNA), which allows for systematic analysis of stochastic biochemical kinetics by means of van Kampen’s system size expansion of the Chemical Master Equation. iNA is platform independent and supports the popular SBML format natively. The present implementation is the first to adopt a complementary approach that combines state-of-the-art analysis tools using the computer algebra system Ginac with traditional methods of stochastic simulation. iNA integrates two approximation methods based on the system size expansion, the Linear Noise Approximation and effective mesoscopic rate equations, which to-date have not been available to non-expert users, into an easy-to-use graphical user interface. In particular, the present methods allow for quick approximate analysis of time-dependent mean concentrations, variances, covariances and correlations coefficients, which typically outperforms stochastic simulations. These analytical tools are complemented by automated multi-core stochastic simulations with direct statistical evaluation and visualization. We showcase iNA’s performance by using it to explore the stochastic properties of cooperative and non-cooperative enzyme kinetics and a gene network associated with
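    The two descriptions iNA bridges can be illustrated on the simplest reaction system, a birth-death process, for which the Linear Noise Approximation predicts Poisson statistics (stationary mean and variance both k/gamma) and the Stochastic Simulation Algorithm gives exact sample paths. The rate constants below are arbitrary illustrative values.

```python
import random

# Exact Gillespie SSA for the birth-death process: 0 -> X at rate k,
# X -> 0 at rate gamma * n (illustrative rates).
def gillespie_birth_death(k=50.0, gamma=1.0, t_end=200.0, t_burn=50.0, seed=11):
    random.seed(seed)
    t, n = 0.0, 0
    samples = []
    while t < t_end:
        a_birth = k
        a_death = gamma * n
        a_total = a_birth + a_death
        t += random.expovariate(a_total)  # exponential waiting time
        if t > t_burn:                    # record states after the transient
            samples.append(n)
        if random.random() < a_birth / a_total:
            n += 1                        # birth event
        else:
            n -= 1                        # death event
    return samples

samples = gillespie_birth_death()
mean = sum(samples) / len(samples)
print(f"SSA stationary mean ~ {mean:.1f} (LNA prediction: k/gamma = 50)")
```

    For this linear system the LNA is exact; the value of tools like iNA is that the same expansion gives quick approximate means, variances and covariances for nonlinear networks where such simulations become expensive.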

  5. Analyzing interdependencies between policy mixes and technological innovation systems : The case of offshore wind in Germany

    NARCIS (Netherlands)

    Reichardt, Kristin; Negro, Simona O.; Rogge, Karoline S.; Hekkert, Marko P.

    2016-01-01

    One key approach for studying emerging technologies in the field of sustainability transitions is that of technological innovation systems (TIS). While most TIS studies aim at deriving policy recommendations - typically by identifying system barriers - the actual role of these proposed policies in

  6. Analyzing the Operation of Performance-Based Accountability Systems for Public Services. Technical Report

    Science.gov (United States)

    Camm, Frank; Stecher, Brian M.

    2010-01-01

    Empirical evidence of the effects of performance-based public management is scarce. This report describes a framework used to organize available empirical information on one form of performance-based management, a performance-based accountability system (PBAS). Such a system identifies individuals or organizations that must change their behavior…

  7. Eddy covariance measurements with a new fast-response, enclosed-path analyzer: Spectral characteristics and cross-system comparisons

    Science.gov (United States)

    K. Novick; J. Walker; W.S. Chan; A. Schmidt; C. Sobek; J.M. Vose

    2013-01-01

    A new class of enclosed path gas analyzers suitable for eddy covariance applications combines the advantages of traditional closed-path systems (small density corrections, good performance in poor weather) and open-path systems (good spectral response, low power requirements), and permits estimates of instantaneous gas mixing ratio. Here, the extent to which these...
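    The eddy covariance computation these analyzers feed is, at its core, a covariance of vertical-wind fluctuations and gas mixing-ratio fluctuations over an averaging period. The synthetic 10 Hz series below is an illustration, not data from the instrument comparison.

```python
import numpy as np

# Minimal eddy-covariance flux estimate on synthetic high-frequency data.
rng = np.random.default_rng(5)
n = 18000                                   # 30 min at 10 Hz
updraft = rng.normal(size=n)                # shared turbulent signal (assumed)
w = 0.3 * updraft + rng.normal(0.0, 0.1, n)          # vertical wind, m/s
c = 400.0 + 2.0 * updraft + rng.normal(0.0, 0.5, n)  # CO2 mixing ratio, ppm

# Reynolds decomposition: flux = mean product of the fluctuations w' and c'.
flux = np.mean((w - w.mean()) * (c - c.mean()))
print(f"flux ~ {flux:.3f} ppm m/s")
```

    The spectral-response differences the paper examines show up as attenuation of the high-frequency part of this covariance, which is why analyzer class matters for the final flux.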

  8. A Method for Analyzing the Dynamic Response of a Structural System with Variable Mass, Damping and Stiffness

    Directory of Open Access Journals (Sweden)

    Mike D.R. Zhang

    2001-01-01

    Full Text Available In this paper, a method for analyzing the dynamic response of a structural system with variable mass, damping and stiffness is first presented. The dynamic equations of the structural system with variable mass and stiffness are derived according to the whole working process of a bridge bucket unloader. At the end of the paper, an engineering numerical example is given.

  9. Analyzing Social Interactions: Promises and Challenges of Cross Recurrence Quantification Analysis

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Konvalinka, Ivana; Wallot, Sebastian

    2014-01-01

    The scientific investigation of social interactions presents substantial challenges: interacting agents engage each other at many different levels and timescales (motor and physiological coordination, joint attention, linguistic exchanges, etc.), often making their behaviors interdependent in non-linear ways. In this paper we review the current use of Cross Recurrence Quantification Analysis (CRQA) in the analysis of social interactions, and assess its potential and challenges. We argue that the method can sensitively grasp the dynamics of human interactions, and that it has started producing valuable...
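    The basic object of CRQA can be sketched directly: a cross-recurrence matrix marking where two signals come close, from which measures such as the recurrence rate follow. The sketch below omits the delay embedding a full analysis would use, and the signals and radius threshold are illustrative.

```python
import numpy as np

# Cross-recurrence sketch for two coordinated signals (assumed toy data).
t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t)            # e.g. behavior of interactant A
y = np.sin(t + 0.5)      # e.g. behavior of interactant B, phase-shifted

# Cross-recurrence matrix: entry (i, j) is True when the two trajectories
# are within the radius of each other.
radius = 0.2
dist = np.abs(x[:, None] - y[None, :])
crm = dist < radius

# Recurrence rate: fraction of recurrent points, the simplest CRQA measure.
recurrence_rate = crm.mean()
print(f"recurrence rate: {recurrence_rate:.3f}")
```

    Diagonal line structures in `crm` indicate stretches where one interactant's dynamics revisit the other's with a lag, which is how CRQA quantifies coordination in interaction data.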

  10. Reformation of Pretreatment System of Oxygen Analyzer (氧分仪预处理系统改造)

    Institute of Scientific and Technical Information of China (English)

    郑友军

    2016-01-01

    In an acrylonitrile unit, the oxygen content of the process stream at the reactor outlet is a key parameter. It is a measure of the ratio of propylene, ammonia and air in the reactor, an important indicator of the reaction rate, and an important index of whether the reaction section is operating normally; both process and safety considerations therefore require online analysis and monitoring of the oxygen concentration. This paper analyzes problems encountered in the application of the online oxygen analyzer in the reaction section of an acrylonitrile plant and, based on these problems, describes a reform of the analyzer's pretreatment system. After completion of the transformation, the failure rate of the oxygen analyzer was greatly reduced and the instrument's precision and operating stability were greatly improved; the transformation also sped up plant startup and reduced material waste. It plays an important role in the process operation and safety of the acrylonitrile plant, and the economic effect is very evident.

  11. Analyzing the Historical Development and Transition of the Korean Health Care System.

    Science.gov (United States)

    Lee, Sang-Yi; Kim, Chul-Woung; Seo, Nam-Kyu; Lee, Seung Eun

    2017-08-01

    Many economically advanced countries have attempted to minimize public expenditures and pursue privatization based on the principles of neo-liberalism. However, Korea has moved contrary to this global trend. This study examines why and how the Korean health care system was formed, developed, and transformed into an integrated, single-insurer, National Health Insurance (NHI) system. We describe the transition in the Korean health care system using an analytical framework that incorporates such critical variables as government economic development strategies and the relationships among social forces, state autonomy, and state power. This study focuses on how the relationships among social forces can change as a nation's economic development or governing strategy changes in response to changes in international circumstances such as globalization. The corporatist Social Health Insurance (SHI) system (multiple insurers) introduced in 1977 was transformed into the single-insurer NHI in July 2000. These changes were influenced externally by globalization and internally by political democratization, keeping Korea's private-dominant health care provision system unchanged over several decades. Major changes such as integration reform occurred, when high levels of state autonomy were ensured. The state's power (its policy capability), based on health care infrastructures, acts to limit the direction of any change in the health care system because it is very difficult to build the infrastructure for a health care system in a short timeframe.

  12. Nuclear Plant Analyzer desktop workstation: An integrated interactive simulation, visualization and analysis tool

    International Nuclear Information System (INIS)

    Beelman, R.J.

    1991-01-01

    The advanced, best-estimate reactor thermal-hydraulic codes were originally developed as mainframe computer applications because of their speed, precision, memory and mass-storage requirements. However, the productivity of numerical reactor safety analysts has historically been hampered by mainframe dependence due to limited mainframe CPU allocation, accessibility and availability, poor mainframe job throughput, and delays in obtaining and difficulty comprehending printed numerical results. The Nuclear Plant Analyzer (NPA) was originally developed as a mainframe computer-graphics aid for reactor safety analysts to address the latter consideration. Rapid advances in microcomputer technology have since enabled the installation and execution of these reactor safety codes on desktop computers, thereby eliminating mainframe dependence. The need for a complementary desktop graphics display generation and presentation capability, coupled with the need for software standardization and portability, has motivated the redesign of the NPA as a UNIX/X-Windows application suitable for both mainframe and microcomputer platforms.

  13. Seahorse Xfe24 Extracellular Flux Analyzer-based analysis of cellular respiration in Caenorhabditis elegans

    Science.gov (United States)

    Luz, Anthony L.; Smith, Latasha L.; Rooney, John P.

    2015-01-01

    Mitochondria are critical for their role in ATP production as well as multiple nonenergetic functions, and mitochondrial dysfunction is causal in myriad human diseases. Less well appreciated is the fact that mitochondria integrate environmental and inter- as well as intracellular signals to modulate function. Because mitochondria function in an organismal milieu, there is need for assays capable of rapidly assessing mitochondrial health in vivo. Here, using the Seahorse XFe24 Extracellular Flux Analyzer and the pharmacological inhibitors dicyclohexylcarbodiimide (DCCD, ATP synthase inhibitor), carbonyl cyanide-p-trifluoromethoxyphenylhydrazone (FCCP, mitochondrial uncoupler) and sodium azide (cytochrome c oxidase inhibitor), we describe how to obtain in vivo measurements of the fundamental parameters (basal oxygen consumption rate (OCR), ATP-linked respiration, maximal OCR, spare respiratory capacity and proton leak) of the mitochondrial respiratory chain in the model organism Caenorhabditis elegans. PMID:26523474
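The fundamental parameters listed above are simple differences of OCR readings taken after each inhibitor injection. A minimal sketch, assuming hypothetical OCR values in pmol O2/min (the real assay uses replicate wells and timing protocols not shown here):

```python
# Sketch of how the fundamental respiratory parameters are derived from
# oxygen consumption rates (OCR) measured after each drug injection.
# The numeric values below are hypothetical, for illustration only.

def respiratory_parameters(basal, post_dccd, post_fccp, post_azide):
    """Derive mitochondrial parameters from OCR readings (pmol O2/min).

    basal      -- OCR before any injection
    post_dccd  -- OCR after DCCD (ATP synthase inhibited)
    post_fccp  -- OCR after FCCP (maximal, uncoupled respiration)
    post_azide -- OCR after sodium azide (non-mitochondrial OCR)
    """
    return {
        "atp_linked": basal - post_dccd,        # respiration driving ATP synthesis
        "proton_leak": post_dccd - post_azide,  # O2 use not coupled to ATP
        "maximal": post_fccp - post_azide,      # maximal mitochondrial OCR
        "spare_capacity": post_fccp - basal,    # reserve above basal demand
    }

params = respiratory_parameters(basal=200.0, post_dccd=80.0,
                                post_fccp=320.0, post_azide=30.0)
```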

  14. Analysis of NSPP experiment with ART code for analyzing transport behavior of Aerosol and radionuclides

    International Nuclear Information System (INIS)

    Ishigami, Tsutomu; Kobayashi, Kensuke; Kajimoto, Mitsuhiro.

    1989-01-01

    The ART code calculates transport behavior of aerosols and radionuclides during core meltdown accidents in the light water reactors. Since aerosols play an important role in carrying fission products from the core region to the environment, the ART code includes detailed models of aerosol behavior. Aerosols including several radionuclides are classified into many groups according to the aerosol mass. The models of aerosol behavior include agglomeration processes caused by Brownian motion, aerosol settling velocity difference and turbulent flow, and natural deposition processes due to diffusion, thermophoresis, diffusiophoresis, gravitational settling and forced convection. In order to examine validity of the ART models, the NSPP aerosol experiment was analyzed. The ART calculated results showed good agreement with the experimental data. It was ascertained that aerosol growth due to agglomeration, gravitational settling, thermophoresis in an air atmosphere, and diffusiophoresis in an air-steam atmosphere were important physical phenomena in the aerosol behavior. (author)
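One of the deposition processes listed, gravitational settling, can be illustrated with the classical Stokes terminal velocity. This is a generic textbook formula with illustrative values, not the ART code's actual model:

```python
# Minimal sketch of one deposition process mentioned above: gravitational
# settling of a small aerosol particle in the Stokes regime. Values are
# illustrative, not ART model parameters.

def stokes_settling_velocity(diameter, particle_density,
                             gas_viscosity=1.8e-5, g=9.81):
    """Terminal settling velocity (m/s) of a sphere in the Stokes regime."""
    return particle_density * diameter**2 * g / (18.0 * gas_viscosity)

# A 1-micron unit-density particle settles at roughly 3e-5 m/s in air
v = stokes_settling_velocity(diameter=1e-6, particle_density=1000.0)
```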

  15. Modular Accident Analysis Program (MAAP) - MELCOR Crosswalk: Phase II Analyzing a Partially Recovered Accident Scenario

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Nathan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Faucett, Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Haskin, Troy Christopher [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Luxat, Dave [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Geiger, Garrett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Codella, Brittany [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-10-01

    Following the conclusion of the first phase of the crosswalk analysis, one of the key unanswered questions was whether the deviations found would persist during a partially recovered accident scenario, similar to the one that occurred at TMI-2. In particular, this analysis aims to compare the impact of core degradation morphology on the quenching models inherent in the two codes and on the coolability of debris during partially recovered accidents. A primary motivation for this study is the development of insights into how uncertainties in core damage progression models impact the ability to assess the potential for recovery of a degraded core. These quench and core recovery models are of the most interest when there is a significant amount of core damage, but intact and degraded fuel still remain in the core region or the lower plenum. Accordingly, this analysis presents a spectrum of partially recovered accident scenarios, varying both water injection timing and rate to highlight the impact of core degradation phenomena on recovered accident scenarios. The analysis uses the newly released MELCOR 2.2 rev. 9665 and MAAP5, Version 5.04, code versions that incorporate a significant number of modifications driven by analyses and forensic evidence obtained from the Fukushima-Daiichi reactor site.

  16. Using a Mixed Methods Content Analysis to Analyze Mission Statements from Colleges of Engineering

    Science.gov (United States)

    Creamer, Elizabeth G.; Ghoston, Michelle

    2013-01-01

    A mixed method design was used to conduct a content analysis of the mission statements of colleges of engineering, to map inductively derived codes to the EC 2000 outcomes, and to test whether any of the codes were significantly associated with institutions with reasonably strong representation of women. Most institutions' (25 of 48) mission statement…

  17. Application of Critical Classroom Discourse Analysis (CCDA) in Analyzing Classroom Interaction

    Science.gov (United States)

    Sadeghi, Sima; Ketabi, Saeed; Tavakoli, Mansoor; Sadeghi, Moslem

    2012-01-01

    As an area of classroom research, Interaction Analysis developed from the need and desire to investigate the process of classroom teaching and learning in terms of action-reaction between individuals and their socio-cultural context (Biddle, 1967). However, sole reliance on quantitative techniques could be problematic, since they conceal more than…

  18. Cite Globally, Analyze Locally: Citation Analysis from a Local Latin American Studies Perspective

    Science.gov (United States)

    Schadl, Suzanne M.; Todeschini, Marina

    2015-01-01

    This citation analysis examines the use of Spanish- and Portuguese-language books and articles in PhD dissertations on Latin America at the University of New Mexico between 2000 and 2009. Two sets of data are presented: The first identifies the use of Spanish- and Portuguese-language books and articles across 17 academic departments; and the…

  19. Analyzing the effect of gain time on soft task scheduling policies in real-time systems

    OpenAIRE

    Búrdalo Rapa, Luis Antonio; Terrasa Barrena, Andrés Martín; Espinosa Minguet, Agustín Rafael; García Fornes, Ana María

    2012-01-01

    In hard real-time systems, gain time is defined as the difference between the Worst Case Execution Time (WCET) of a hard task and its actual processor consumption at runtime. This paper presents the results of an empirical study about how the presence of a significant amount of gain time in a hard real-time system questions the advantages of using the most representative scheduling algorithms or policies for aperiodic or soft tasks in fixed-priority preemptive systems. The work presented here...

  20. Analyzing resilience with communicative systems theory an example from European fisheries

    DEFF Research Database (Denmark)

    Wilson, Douglas Clyde; Jacobsen, Rikke Becker

    2013-01-01

    The present paper argues that our understanding of the resilience of social-ecological systems can be improved by considering “communicative resilience” based on Communicative Systems Theory, which focuses on communicative action oriented to achieving mutual understandings. It further argues...... that it is possible to theorise and analyse resilience within complex social-ecological systems from this communicative perspective in a way that is very different from, but complementary to, agent-based approaches focussed on incentives. The paper presents data from multispecies mixed fisheries in Europe...

  1. The application and analyze of the publish-subscribe communication system for radiation and environmental monitoring system

    International Nuclear Information System (INIS)

    Ismet Isnaini; I Putu Susila; Istofa

    2016-01-01

    As part of RAMONA (Radiation and Meteorological Monitoring Analysis System), a publish-and-subscribe communication system has been designed and implemented to enable the Maretron WS0100 ultrasonic device, which is connected to a client computer, to communicate with the server and/or other clients. The Maretron is connected to other devices through an interface using the NMEA 2000 protocol, a communication standard set by the National Marine Electronics Association (NMEA) that is commonly used for communication between shipboard sensors and their displays. The Maretron device embeds several sensors, measuring humidity, wind direction and speed, and temperature. Communication with the Maretron uses MQTT (Message Queuing Telemetry Transport), a publish/subscribe protocol in which a client publishes its data to a data bus under a certain topic, while the server or another client that subscribes to that topic through a broker receives and processes the data. The data sent by the Maretron are in JSON (JavaScript Object Notation) format, which is parsed by the subscriber and then saved to a database or displayed on a website as required. (author)
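The publish/subscribe flow described above can be sketched with a minimal in-memory broker and a JSON payload. The topic and field names are hypothetical, and a real deployment would use an MQTT client library and broker rather than this stand-in:

```python
# Minimal in-memory sketch of the publish/subscribe pattern, with a JSON
# payload in the style a weather station might emit. Topic and field names
# are hypothetical; a real system would use an MQTT client and broker.
import json

class Broker:
    def __init__(self):
        self.subscribers = {}   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers.setdefault(topic, []).append(callback)

    def publish(self, topic, payload):
        for cb in self.subscribers.get(topic, []):
            cb(payload)

broker = Broker()
received = []

# Server side: subscribe to the weather topic and parse the JSON payload.
broker.subscribe("ramona/weather", lambda msg: received.append(json.loads(msg)))

# Client side: the station publishes its sensor readings as JSON.
broker.publish("ramona/weather",
               json.dumps({"humidity": 62.5, "wind_speed": 4.2,
                           "wind_direction": 180, "temperature": 27.1}))
```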

  2. A Hybrid DGTD-MNA Scheme for Analyzing Complex Electromagnetic Systems

    KAUST Repository

    Li, Peng; Jiang, Li-Jun; Bagci, Hakan

    2015-01-01

    lumped circuit elements, the standard Newton-Raphson method is applied at every time step. Additionally, a local time-stepping scheme is developed to improve the efficiency of the hybrid solver. Numerical examples consisting of EM systems loaded

  3. Procedures for analyzing the effectiveness of siren systems for alerting the public

    International Nuclear Information System (INIS)

    Keast, D.N.; Towers, D.A.; Anderson, G.S.; Kenoyer, J.L.; Desrosiers, A.E.

    1982-09-01

    NUREG-0654, Revision 1 (Criteria for Preparation and Evaluation of Radiological Emergency Response Plans and Preparedness in Support of Nuclear Power Plants), Appendix 3, discusses requirements of the licensees to implement a prompt notification system within the 10-mile emergency planning zone (EPZ) surrounding a nuclear facility. Sirens are being installed for use as part of or as the entire notification system by many licensees. This report describes a procedure for predicting siren system effectiveness under defined conditions within the EPZ's. The procedure requires a good topographical map and knowledge of the meteorology, demographics, and human activity patterns within the EPZ. The procedure is intended to be applied to systems of sirens and to obtain average results for a large number (30 or more) of listener locations

  4. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    National Research Council Canada - National Science Library

    Edge, Kenneth S

    2007-01-01

    .... In addition to developing protection trees, this research improves the existing concept of attack trees and develops rule sets for the manipulation of metrics used in the security of complex systems...

  5. Analyzing a single nucleotide polymorphism in schizophrenia: a meta-analysis approach

    Directory of Open Access Journals (Sweden)

    Falola O

    2017-08-01

    Oluwadamilare Falola,1 Victor Chukwudi Osamor,1,2 Marion Adebiyi,1,2 Ezekiel Adebiyi1,2 1Covenant University Bioinformatics Research (CUBRe), 2Department of Computer and Information Sciences, College of Science and Technology, Covenant University, Ota, Ogun State, Nigeria Background: Schizophrenia is a severe mental disorder affecting >21 million people worldwide. Some genetic studies have reported that the single nucleotide polymorphism (SNP) variant rs1344706 of the human ZNF804A gene is associated with the risk of schizophrenia in several populations. Such results conflict with other reports in the literature indicating that no true significant association exists between rs1344706 and schizophrenia. We seek to determine the level of association of this SNP with schizophrenia in the Asian population using more recent genome-wide association study (GWAS) datasets. Methods: Applying a computational approach that includes more recent GWAS datasets, we conducted a meta-analysis to examine the level of association between SNP rs1344706 and the risk of schizophrenia among Asian populations comprising Chinese, Indonesians, Japanese, Kazakhs and Singaporeans. For a total of 21 genetic studies, including 28,842 cases and 35,630 controls, regression analysis, publication bias, Cochran's Q and I2 tests were performed. The DerSimonian and Laird random-effects model was used to assess the association of the genetic variant with schizophrenia. Leave-one-out sensitivity analysis was also conducted to determine the influence of each study on the final outcome of the association study. Results: Our summarized analysis for the Asian population revealed a pooled odds ratio of 1.06, a 95% confidence interval of 1.01–1.11 and a two-tailed P-value of 0.0228. Our test for heterogeneity showed the presence of large heterogeneity (I2=53.44%, P=0.00207), Egger's regression test (P=0.8763) and Begg's test (P=0
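The DerSimonian and Laird random-effects pooling used in the Methods can be sketched in a few lines. The per-study log odds ratios and variances below are invented for illustration; they are not the data of the 21 studies:

```python
# Sketch of DerSimonian-Laird random-effects pooling on log odds ratios.
# The per-study numbers are hypothetical, for illustration only.
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effects (e.g. log odds ratios) under a random-effects model."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))  # Cochran's Q
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)   # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from five studies
log_or = [0.10, 0.02, 0.15, -0.05, 0.08]
var = [0.004, 0.006, 0.010, 0.008, 0.005]
pooled, se, tau2 = dersimonian_laird(log_or, var)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))  # OR scale
```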

  6. Logistic Regression and Path Analysis Method to Analyze Factors influencing Students’ Achievement

    Science.gov (United States)

    Noeryanti, N.; Suryowati, K.; Setyawan, Y.; Aulia, R. R.

    2018-04-01

    Students' academic achievement cannot be separated from the influence of two sets of factors, internal and external. The internal factors of the student consist of intelligence (X1), health (X2), interest (X3), and motivation (X4). The external factors consist of the family environment (X5), the school environment (X6), and the society environment (X7). The objects of this research are eighth-grade students of the 2016/2017 school year at SMPN 1 Jiwan Madiun, sampled using simple random sampling. Primary data were obtained by distributing questionnaires. The method used in this study is binary logistic regression analysis, which aims to identify the internal and external factors that affect students' achievement and their trends. Path analysis was used to determine the factors that influence achievement directly, indirectly or in total. Based on the results of the binary logistic regression, the variables that affect students' achievement are interest and motivation. Based on the path analysis, the factors that have a direct impact on students' achievement are students' interest (59%) and students' motivation (27%), while the factors that have indirect influences are the family environment (97%) and the school environment (37%).

  7. Variable threshold algorithm for division of labor analyzed as a dynamical system.

    Science.gov (United States)

    Castillo-Cagigal, Manuel; Matallanas, Eduardo; Navarro, Iñaki; Caamaño-Martín, Estefanía; Monasterio-Huelin, Félix; Gutiérrez, Álvaro

    2014-12-01

    Division of labor is a widely studied aspect of colony behavior of social insects. Division of labor models indicate how individuals distribute themselves in order to perform different tasks simultaneously. However, models that study division of labor from a dynamical system point of view cannot be found in the literature. In this paper, we define a division of labor model as a discrete-time dynamical system, in order to study the equilibrium points and their properties related to convergence and stability. By making use of this analytical model, an adaptive algorithm based on division of labor can be designed to satisfy dynamic criteria. In this way, we have designed and tested an algorithm that varies the response thresholds in order to modify the dynamic behavior of the system. This behavior modification allows the system to adapt to specific environmental and collective situations, making the algorithm a good candidate for distributed control applications. The variable threshold algorithm is based on specialization mechanisms. It is able to achieve an asymptotically stable behavior of the system in different environments and independently of the number of individuals. The algorithm has been successfully tested under several initial conditions and number of individuals.
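The classical response-threshold mechanism that such models build on can be sketched as follows. The threshold update rule and parameter values here are illustrative assumptions, not the paper's exact algorithm:

```python
# Sketch of a variable response-threshold mechanism: an individual engages
# a task with probability s^2/(s^2 + theta^2), and its threshold theta
# decreases while it works (specialization) and increases while idle.
# The update rule and parameters are illustrative assumptions.
import random

def response_probability(stimulus, theta):
    """Probability of engaging the task (steepness-2 response function)."""
    return stimulus**2 / (stimulus**2 + theta**2)

def step(stimulus, theta, working, xi=0.1, phi=0.05, rng=random):
    """One discrete-time update for a single individual."""
    if not working and rng.random() < response_probability(stimulus, theta):
        working = True                       # start performing the task
    elif working and rng.random() < phi:
        working = False                      # spontaneously stop
    # variable-threshold update: learn while working, forget while idle
    theta = max(0.01, theta - xi) if working else theta + xi
    return theta, working

rng = random.Random(42)
theta, working = 5.0, False
for _ in range(200):
    theta, working = step(stimulus=3.0, theta=theta, working=working, rng=rng)
```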

  8. SSD for R: A Comprehensive Statistical Package to Analyze Single-System Data

    Science.gov (United States)

    Auerbach, Charles; Schudrich, Wendy Zeitlin

    2013-01-01

    The need for statistical analysis in single-subject designs presents a challenge, as analytical methods that are applied to group comparison studies are often not appropriate in single-subject research. "SSD for R" is a robust set of statistical functions with wide applicability to single-subject research. It is a comprehensive package…

  9. Analysis of retarding field energy analyzer transmission by simulation of ion trajectories

    Science.gov (United States)

    van de Ven, T. H. M.; de Meijere, C. A.; van der Horst, R. M.; van Kampen, M.; Banine, V. Y.; Beckers, J.

    2018-04-01

    Retarding field energy analyzers (RFEAs) are used routinely for the measurement of ion energy distribution functions. By contrast, their ability to measure ion flux densities has been considered unreliable because of lack of knowledge about the effective transmission of the RFEA grids. In this work, we simulate the ion trajectories through a three-gridded RFEA using the simulation software SIMION. Using idealized test cases, it is shown that at high ion energy (i.e., >100 eV) the transmission is equal to the optical transmission rather than the product of the individual grid transparencies. Below 20 eV, ion trajectories are strongly influenced by the electric fields in between the grids. In this region, grid alignment and ion focusing effects contribute to fluctuations in transmission with ion energy. Subsequently the model has been used to simulate the transmission and energy resolution of an experimental RFEA probe. Grid misalignments reduce the transmission fluctuations at low energy. The model predicts the minimum energy resolution, which has been confirmed experimentally by irradiating the probe with a beam of ions with a small energy bandwidth.
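The distinction above between the optical transmission and the product of grid transparencies can be illustrated with a toy Monte Carlo. The one-dimensional grid geometry below is a deliberate simplification of the SIMION trajectory simulations described in the abstract:

```python
# Monte Carlo sketch: for ions travelling straight through perfectly
# aligned grids, the transmission of three grids equals the single-grid
# (optical) transparency; decorrelated grids give the product instead.
# Geometry and transparency values are illustrative.
import random

def transmission(n_ions, transparency, n_grids=3, aligned=True, seed=1):
    rng = random.Random(seed)
    passed = 0
    for _ in range(n_ions):
        x = rng.random()                      # ion position within one grid period
        if aligned:
            ok = x < transparency             # same open region on every grid
        else:
            # independent random offsets decorrelate the grids
            ok = all(rng.random() < transparency for _ in range(n_grids))
        passed += ok
    return passed / n_ions

t_aligned = transmission(100_000, transparency=0.8, aligned=True)
t_random = transmission(100_000, transparency=0.8, aligned=False)
# t_aligned is near 0.8 (optical); t_random is near 0.8**3 = 0.512
```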

  10. The combination of system dynamics and game theory in analyzing oligopoly markets

    Directory of Open Access Journals (Sweden)

    Ali Mohammadi

    2016-04-01

    In this paper, we present a hybrid method of game theory and dynamic systems to study the behavior of firms in an oligopoly market. The aim of this study is to build a model for an oligopoly game on the basis of feedback loops and the system dynamics approach, and to solve the resulting problems under some special conditions that traditional game theory methods are unable to handle. The method combines qualitative techniques, including interviews with industry experts to prepare the model, with the quantitative methods of system dynamics, simulation methodologies and game theory. The results indicate that competitive behavior and important parameters such as volume of demand, interest rates and price fluctuation stabilize after a transition period.
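A toy version of coupling a game with a dynamic adjustment process: two Cournot duopolists repeatedly play best responses, and their outputs converge to the static Nash equilibrium. The linear demand and cost parameters are illustrative, not the paper's model:

```python
# Cournot duopoly under best-response dynamics: a minimal example of a
# game embedded in a discrete-time adjustment process. Parameters are
# illustrative (inverse demand P = a - b*Q, constant marginal cost c).

def best_response(q_other, a=100.0, b=1.0, c=10.0):
    """Profit-maximising output against the rival's output."""
    return max(0.0, (a - c - b * q_other) / (2.0 * b))

q1, q2 = 0.0, 50.0                     # arbitrary starting outputs
for _ in range(100):                   # simultaneous best-response updates
    q1, q2 = best_response(q2), best_response(q1)

nash = (100.0 - 10.0) / 3.0            # analytic Cournot equilibrium: 30 each
```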

  11. Analyzing Systemic Risk in CEE Markets in 2007–2008 Financial Crisis

    Directory of Open Access Journals (Sweden)

    Renata Karkowska

    2013-01-01

    The purpose of the article is to attempt to answer the question of how the crisis affected the banking systems of CEE countries, with special emphasis on liquidity risk. It seems that this problem has particularly affected emerging economies. First, liquidity risk began to exert considerable influence on the functioning of the banking system and, indirectly, the whole economy. In this paper the author wanted to answer the following questions: What are the channels of transmission of systemic risk in CEE markets? What is the role of big world banking groups in these financial systems? This concept is applied to ten Central Eastern European countries that experienced a financial crisis. In the research the author hypothesized about the interconnectedness of liquidity in financial systems and the solvency problems of big banking groups operating in CEE.

  12. Analyze the optimal solutions of optimization problems by means of fractional gradient based system using VIM

    Directory of Open Access Journals (Sweden)

    Firat Evirgen

    2016-04-01

    In this paper, a class of nonlinear programming (NLP) problems is modeled with a gradient-based system of fractional-order differential equations in Caputo's sense. To see the overlap between the equilibrium point of the fractional-order dynamic system and the optimal solution of the NLP problem over a longer timespan, the Multistage Variational Iteration Method is applied. Comparisons among the multistage variational iteration method, the variational iteration method and the fourth-order Runge-Kutta method in fractional and integer order show that the fractional-order model and techniques can be seen as an effective and reliable tool for finding optimal solutions of nonlinear programming problems.

  13. A Regional Earth System Model of the Northeast Corridor: Analyzing 21st Century Climate and Environment

    Science.gov (United States)

    Vorosmarty, C. J.; Duchin, F.; Melillo, J. M.; Wollheim, W. M.; Gonzalez, J.; Kicklighter, D. W.; Rosenzweig, B.; Yang, P.; Lengyel, F.; Fekete, B. M.

    2012-12-01

    The Northeast region (NE) exhibits many of the changes taking place across the Nation's landscapes and watersheds, yet also provides a unique lens through which to assess options for managing large-scale natural resource systems. We report here on a regional NSF-funded Earth System Modeling (EaSM) project, which has assembled an interdisciplinary research team from academia and government with expertise in physics, biogeochemistry, engineering, energy, economics, and policy engagement. The team is simultaneously studying the evolution of regional human-environment systems and seeking to improve the translation of research findings to the planning community. We hypothesize that there are regionally-significant consequences of human decisions on environmental systems of the NE, expressed through the action of both natural and engineered human systems that dictate the region's biogeophysical state, ecosystem services, energy and economic output. Our central goal is: To build a Northeast Regional Earth System Model (NE-RESM) that improves understanding and capacity to forecast the implications of planning decisions on the region's environment, ecosystem services, energy systems and economy through the 21st century. We are using scenario experiments to test our hypothesis and to make forecasts about the future. We see the proposed research as a major step forward in developing a capacity to diagnose and understand the state of large, interacting human-natural systems. Major foci include: the application of meso-scale atmospheric physics models to drive terrestrial-aquatic ecosystem models; a linked ecosystem services accounting tool; geospatial modeling of anthropogenic GHG emissions and biotic source/sinks at improved space/time resolutions; and meso-economic input-output model to evaluate the impacts of ecosystem services constraints on subregional economies. 
The presentation will report on recent progress across three strategic planning fronts, which are important to

  14. Analyzing the Applicability of Airline Booking Systems for Cloud Computing Offerings

    Science.gov (United States)

    Watzl, Johannes; Felde, Nils Gentschen; Kranzlmuller, Dieter

    This paper introduces revenue management systems for Cloud computing offerings on the Infrastructure as a Service level. One of the main fields revenue management systems are deployed in is the airline industry. At the moment, the predominant part of the Cloud providers use static pricing models. In this work, a mapping of Cloud resources to flights in different categories and classes is presented together with a possible strategy to make use of these models in the emerging area of Cloud computing. The latter part of this work then describes a first step towards an inter-cloud brokering and trading platform by deriving requirements for a potential architectural design.
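The simplest revenue-management result of the kind being mapped from airline booking to Cloud resources is Littlewood's rule for two booking classes. The prices, capacity and normal demand model below are hypothetical:

```python
# Littlewood's rule: reserve (protect) capacity for the high-fare class
# until P(high-class demand exceeds the protection level) equals the
# price ratio. Prices, capacity and the demand model are hypothetical,
# e.g. "on-demand" vs. "spot-like" Cloud instances.
from statistics import NormalDist

def protection_level(price_high, price_low, mean_high, sd_high):
    """Units reserved for the high class: P(D_high > y) = price_low/price_high."""
    critical_ratio = 1.0 - price_low / price_high
    return NormalDist(mean_high, sd_high).inv_cdf(critical_ratio)

# Protect capacity for the expensive class; sell the rest cheaply.
y = protection_level(price_high=1.0, price_low=0.4,
                     mean_high=60.0, sd_high=15.0)
booking_limit_low = 100.0 - y          # capacity offered to the cheap class
```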

  15. Social network analysis as a method for analyzing interaction in collaborative online learning environments

    Directory of Open Access Journals (Sweden)

    Patricia Rice Doran

    2011-12-01

    Social network analysis software such as NodeXL has been used to describe participation and interaction in numerous social networks, but it has not yet been widely used to examine dynamics in online classes, where participation is frequently required rather than optional and participation patterns may be affected by the requirements of the class, the instructor's activities, or participants' intrinsic engagement with the subject matter. Such social network analysis, which examines the dynamics and interactions among groups of participants in a social network or learning group, can be valuable in programs focused on teaching collaborative and communicative skills, including teacher preparation programs. Applied to these programs, social network analysis can provide information about instructional practices likely to facilitate student interaction and collaboration across diverse student populations. This exploratory study used NodeXL to visualize students' participation in an online course, with the goals of (1) identifying ways in which NodeXL could be used to describe patterns in participant interaction within an instructional setting and (2) identifying specific patterns in participant interaction among students in this particular course. In this sample, general education teachers demonstrated higher measures of connection and interaction with other participants than did those from specialist (ESOL or special education) backgrounds, and tended to interact more frequently with all participants than the majority of participants from specialist backgrounds. We recommend further research to delineate specific applications of NodeXL within an instructional context, particularly to identify potential patterns in student participation based on variables such as gender, background, cultural and linguistic heritage, prior training and education, and prior experience, so that instructors can ensure their practice helps to facilitate student interaction.

  16. VGC analyzer: a software for statistical analysis of fully crossed multiple-reader multiple-case visual grading characteristics studies

    International Nuclear Information System (INIS)

    Baath, Magnus; Hansson, Jonny

    2016-01-01

    Visual grading characteristics (VGC) analysis is a non-parametric, rank-invariant method for the analysis of visual grading data. In VGC analysis, image quality ratings for two different conditions are compared by producing a VGC curve, similar to how the ratings for normal and abnormal cases in receiver operating characteristic (ROC) analysis are used to create an ROC curve. The use of established ROC software for the analysis of VGC data has therefore previously been proposed. However, ROC analysis is based on the assumption of independence between normal and abnormal cases. In VGC analysis, this independence cannot always be assumed, e.g. if the ratings are based on the same patients imaged under both conditions. Dedicated software intended for the analysis of VGC studies, which takes possible dependencies between ratings into account in the statistical analysis, has therefore been developed. The software, VGC Analyzer, determines the area under the VGC curve and its uncertainty using non-parametric re-sampling techniques. This article gives an introduction to VGC Analyzer, describes the types of analyses that can be performed and instructs the user about the input and output data. (authors)
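The area under a VGC curve is, like the ROC AUC, a rank statistic: the probability that a randomly chosen rating from one condition exceeds one from the other, with ties counting one half. The sketch below computes only this point estimate; the paired-data handling and resampling uncertainty that VGC Analyzer provides are beyond it, and the ratings are invented:

```python
# Nonparametric point estimate of the area under a VGC curve from ordinal
# image-quality ratings under two conditions. Ratings are invented; this
# ignores the dependence and uncertainty analysis the software performs.

def vgc_auc(ratings_a, ratings_b):
    """P(rating under B > rating under A), ties counting one half."""
    pairs = [(a, b) for a in ratings_a for b in ratings_b]
    wins = sum(1.0 if b > a else 0.5 if b == a else 0.0 for a, b in pairs)
    return wins / len(pairs)

# Five-point image-quality scores for the same cases under two conditions
cond_a = [2, 3, 3, 1, 2, 4]
cond_b = [3, 4, 3, 2, 4, 5]
auc = vgc_auc(cond_a, cond_b)
```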

  17. COINTOF mass spectrometry: design of a time-of-flight analyzer and development of the analysis method

    International Nuclear Information System (INIS)

    Teyssier, C.

    2012-01-01

    DIAM (Device for the irradiation of molecular clusters) is a newly designed experimental setup to investigate processes resulting from the irradiation of molecular nano-systems by 20-150 keV protons. One of its specificities relies on an original mass spectrometry technique named COINTOF (Correlated Ion and Neutral Time Of Flight), consisting of correlated measurements of the time of flight of the charged and neutral fragments produced by the dissociation of a single molecular parent ion. A strategy for treating and analyzing the detection signals was developed to distinguish two fragments close in time (e.g., an H3O+ ion and two water molecules). The distribution of the time-of-flight difference between the two neutral fragments is measured, providing an estimate of the kinetic energy release of a few eV. In parallel, a second time-of-flight mass spectrometer was designed. It associates a linear time-of-flight with an orthogonal time-of-flight and integrates position detectors (delay-line anode). Simulations demonstrate the potential of the new analyzer. Finally, research was carried out at the laboratory R.-J. A. Levesque (Université de Montréal) on the imaging capabilities of the multi-pixel detectors of the MPX-ATLAS collaboration. (author)

  18. A Framework For Analyzing And Mitigating The Vulnerabilities Of Complex Systems Via Attack And Protection Trees

    Science.gov (United States)

    2007-07-01


  19. Analyzing the Implementation of an ERP System by Self-Assessment in Higher Education

    Science.gov (United States)

    Máté, Domicián; Bács, Zoltán; Takács, Viktor László

    2017-01-01

    Over the last few decades, not only organizations but also higher education institutions have had to become more responsive to the demands of the changed global business environment and improve their effectiveness. Our motivation in writing this paper is to assess the implementation of an Enterprise Resource Planning (ERP) system in higher education and…

  20. Design of the expert system to analyze disease in Plant Teak using Forward Chaining

    Directory of Open Access Journals (Sweden)

    Poningsih Poningsih

    2017-06-01

    Full Text Available Teak is a plant that is already widely known and cultivated by the wider community in the form of plantations and community forests, because teak wood remains a luxury commodity of high quality, high price, and high economic value. Expert systems are a branch of artificial intelligence used to build a computerized disease-diagnosis application that seeks to replace and mimic the reasoning process of an expert in solving a specific problem; such a system can be regarded as a duplicate of an expert, because the expert's knowledge is stored in a database. The expert system for diagnosing teak diseases uses the forward chaining method to explore the observed characteristics, presented as questions, in order to diagnose teak diseases with web-based software. The expert system can recognize a disease after a consultation in which the user answers the questions presented by the application, and it can infer several kinds of diseases in teak plants. The disease data are matched against rules constructed to fit the characteristics of teak diseases, and the system provides treatment solutions.
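
The forward-chaining inference described above can be sketched as a small rule engine: facts (symptoms) are repeatedly matched against rules until no new conclusions fire. The symptom, disease, and treatment names below are hypothetical placeholders, not the rules from the paper's knowledge base.

```python
# Each rule maps a set of required facts to a conclusion.
RULES = [
    ({"leaf_spots", "yellowing"}, "leaf_spot_disease"),
    ({"wilting", "root_discoloration"}, "root_rot"),
    ({"leaf_spot_disease"}, "apply_fungicide"),   # rules can chain
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose conditions are satisfied until
    no new fact can be derived, then return all derived facts."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

# A consultation: the user reports two symptoms.
result = forward_chain({"leaf_spots", "yellowing"}, RULES)
print(result)
```

Because the diagnosis ("leaf_spot_disease") itself appears in a rule condition, the engine also derives the treatment recommendation in the same pass.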

  1. Study to Analyze the Acquisition of Automatic Test Equipment (ATE) Systems. Data Sequence Number A003

    Science.gov (United States)

    1973-12-27

    Systems Test Equipment Comparator (ASTEC) at NAEC can provide a very accurate (on a pin-by-pin basis) match between the UUT and ATE in their data bank. In addition, abbreviated summary data on the ATE is also available to users. ASTEC will also file the UUT data as part of its data bank so that

  2. Design and construction of a system to analyze Radon 222 by means of alpha spectroscopy

    International Nuclear Information System (INIS)

    Martinez, J.B.

    1991-01-01

    The design and construction of a system to measure gaseous Radon 222 arising from a source of Radium 226 electrodeposited on a stainless steel disc is described. The system makes it possible to distinguish the energies of the radium from which the emissions originate, as well as the energies of the daughter products, allowing a more precise measurement of the alpha activity of this isotope. The system was built as a hermetic stainless steel container consisting of a chamber, a cap and a valve; the sample used was a Radium 226 standard obtained from carnotite ore. The alpha particles of Radon 222, as well as those of its decay products Polonium 210, Polonium 218 and Polonium 214, were identified with a surface barrier detector. The results show clearly defined peaks of Radon 222 and of its daughter products, with energies of 5.43, 5.31, 6.0 and 7.69 MeV, respectively. The system makes it possible to separate and identify the energies of Radon and its daughter products coming directly from a solid Radium 226 standard sample (Author)

  3. Analyzing Fourier Transforms for NASA DFRC's Fiber Optic Strain Sensing System

    Science.gov (United States)

    Fiechtner, Kaitlyn Leann

    2010-01-01

    This document provides a basic overview of the fiber optic technology used for sensing stress, strain, and temperature. Also, the document summarizes the research concerning speed and accuracy of the possible mathematical algorithms that can be used for NASA DFRC's Fiber Optic Strain Sensing (FOSS) system.
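
As a hedged sketch of the class of algorithm such a speed-and-accuracy comparison covers (the sampling rate and test signal here are invented, not FOSS parameters), a fast Fourier transform can recover the dominant frequency of a noisy sampled sensor signal:

```python
import numpy as np

fs = 1000.0                        # sampling rate in Hz (hypothetical)
t = np.arange(1000) / fs           # one second of samples
rng = np.random.default_rng(0)
# A 50 Hz tone buried in mild noise stands in for a sensor trace.
signal = np.sin(2 * np.pi * 50.0 * t) + 0.1 * rng.normal(size=t.size)

spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(t.size, d=1 / fs)
peak_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin
print(peak_hz)
```

With one second of data the frequency resolution is 1 Hz, so the peak bin falls exactly on the tone.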

  4. Analyzing the Impacts of Alternated Number of Iterations in Multiple Imputation Method on Explanatory Factor Analysis

    Directory of Open Access Journals (Sweden)

    Duygu KOÇAK

    2017-11-01

    Full Text Available The study aims to identify the effects of the number of iterations used in the multiple imputation method, one of the methods used to cope with missing values, on the results of factor analysis. With this aim, artificial datasets of different sample sizes were created. Missing values at random and missing values completely at random were created in various ratios by deleting data. For the data with values missing at random, a second variable was generated at the ordinal scale level and datasets with different ratios of missing values were obtained based on the levels of this variable. The data were generated using the “psych” package in R, while the “dplyr” package was used to write code deleting values according to the predetermined conditions of the missing-value mechanism. Different datasets were generated by applying different iteration numbers. Exploratory factor analysis was conducted on the completed datasets, and the factors and total explained variances are presented. These values were first evaluated against the number of factors and total explained variance of the complete datasets. The results indicate that the multiple imputation method performs better in cases of values missing at random than in datasets with values missing completely at random. It was also found that increasing the number of iterations for both missing-value mechanisms decreases the difference from the results obtained on the complete datasets.
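
A minimal sketch of regression-based iterative imputation of the kind discussed above, using NumPy and a toy linear dataset; this is an illustration of the general technique, not the R procedure or settings used in the study.

```python
import numpy as np

def iterative_impute(X, n_iter=10):
    """Fill NaNs by repeatedly regressing each incomplete column
    on the remaining columns with ordinary least squares."""
    X = np.array(X, dtype=float)
    missing = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[missing] = col_means[np.where(missing)[1]]      # warm start with means
    for _ in range(n_iter):
        for j in range(X.shape[1]):
            rows = missing[:, j]
            if not rows.any():
                continue
            predictors = np.delete(X, j, axis=1)
            A = np.column_stack([np.ones(len(X)), predictors])
            # Fit on the observed rows, then refresh the imputed entries.
            coef, *_ = np.linalg.lstsq(A[~rows], X[~rows, j], rcond=None)
            X[rows, j] = A[rows] @ coef
    return X

# Toy data with an exact linear relation; three values deleted from column 2.
x1 = np.arange(20.0)
truth = np.column_stack([x1, 2 * x1 + 1])
damaged = truth.copy()
damaged[[3, 7, 12], 1] = np.nan

filled = iterative_impute(damaged)
print(filled[[3, 7, 12], 1])   # imputed values for the deleted entries
```

Because the toy relation is exactly linear, the regression recovers the deleted values; on real data the number of iterations controls how far the imputations converge, which is the quantity the study varies.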

  5. Content analysis of medical students' seminars: a unique method of analyzing clinical thinking.

    Science.gov (United States)

    Takata, Yukari; Stein, Gerald H; Endo, Kuniyuki; Arai, Akiko; Kohsaka, Shun; Kitano, Yuka; Honda, Hitoshi; Kitazono, Hidetaka; Tokunaga, Hironobu; Tokuda, Yasuharu; Obika, Mikako; Miyoshi, Tomoko; Kataoka, Hitomi; Terasawa, Hidekazu

    2013-12-01

    The study of communication skills of Asian medical students during structured Problem-based Learning (PBL) seminars represented a unique opportunity to assess their critical thinking development. This study reports the first application of the health education technology, content analysis (CA), to a Japanese web-based seminar (webinar). The authors assigned twelve randomly selected medical students from two universities and two clinical instructors to two virtual classrooms for four PBL structured tutoring sessions that were audio-video captured for CA. Both of the instructors were US-trained physicians. This analysis consisted of coding the students' verbal comments into seven types, ranging from trivial to advanced knowledge integration comments that served as a proxy for clinical thinking. The most basic level of verbal simple responses accounted for a majority (85%) of the total students' verbal comments. Only 15% of the students' comments represented more advanced types of critical thinking. The male students responded more than the female students; male students attending University 2 responded more than male students from University 1. The total mean students' verbal response time for the four sessions with the male instructor was 6.9%; total mean students' verbal response time for the four sessions with the female instructor was 19% (p thinking for medical students. This report may stimulate improvements for implementation.

  6. Systems analysis of the CANDU 3 Reactor

    Energy Technology Data Exchange (ETDEWEB)

    Wolfgong, J.R.; Linn, M.A.; Wright, A.L.; Olszewski, M.; Fontana, M.H. [Oak Ridge National Lab., TN (United States)

    1993-07-01

    This report presents the results of a systems failure analysis study of the CANDU 3 reactor design; the study was performed for the US Nuclear Regulatory Commission. As part of the study a review of the CANDU 3 design documentation was performed, a plant assessment methodology was developed, representative plant initiating events were identified for detailed analysis, and a plant assessment was performed. The results of the plant assessment included classification of the CANDU 3 event sequences that were analyzed, determination of CANDU 3 systems that are "significant to safety," and identification of key operator actions for the analyzed events.

  7. Secondary emission ion analyzer provided with an electron gun for insulating material analysis

    International Nuclear Information System (INIS)

    Blanchard, Bruno; Carrier, Patrick; Marguerite, J.-L.; Rocco, J.-C.

    1976-01-01

    This invention relates to a secondary emission ion analyser fitted with an electron gun, used in the mass spectrometry analysis of electrically insulating bodies. It has already been suggested to bombard the target with an electron beam in conjunction with the beam of primary particles, in order to reduce the space charge near the target. The object of this invention is the application of this known process to appliances of the ion analyser type with a high electric field near the target. Its main characteristic is the use of an electron gun emitting an electron beam through the extraction lens placed opposite the target. The extraction electric field influences the path of the electrons, but the electrical and mechanical specifications of the electron gun in the invention are such that the target is properly covered by the electron beam [fr

  8. A Framework to Analyze the Robustness of Social-ecological Systems from an Institutional Perspective

    Directory of Open Access Journals (Sweden)

    John M. Anderies

    2004-06-01

    Full Text Available What makes social-ecological systems (SESs robust? In this paper, we look at the institutional configurations that affect the interactions among resources, resource users, public infrastructure providers, and public infrastructures. We propose a framework that helps identify potential vulnerabilities of SESs to disturbances. All the links between components of this framework can fail and thereby reduce the robustness of the system. We posit that the link between resource users and public infrastructure providers is a key variable affecting the robustness of SESs that has frequently been ignored in the past. We illustrate the problems caused by a disruption in this link. We then briefly describe the design principles originally developed for robust common-pool resource institutions, because they appear to be a good starting point for the development of design principles for more general SESs and do include the link between resource users and public infrastructure providers.

  9. Content analysis of medical students’ seminars: a unique method of analyzing clinical thinking

    Science.gov (United States)

    2013-01-01

    Background The study of communication skills of Asian medical students during structured Problem-based Learning (PBL) seminars represented a unique opportunity to assess their critical thinking development. This study reports the first application of the health education technology, content analysis (CA), to a Japanese web-based seminar (webinar). Methods The authors assigned twelve randomly selected medical students from two universities and two clinical instructors to two virtual classrooms for four PBL structured tutoring sessions that were audio-video captured for CA. Both of the instructors were US-trained physicians. This analysis consisted of coding the students’ verbal comments into seven types, ranging from trivial to advanced knowledge integration comments that served as a proxy for clinical thinking. Results The most basic level of verbal simple responses accounted for a majority (85%) of the total students’ verbal comments. Only 15% of the students’ comments represented more advanced types of critical thinking. The male students responded more than the female students; male students attending University 2 responded more than male students from University 1. The total mean students’ verbal response time for the four sessions with the male instructor was 6.9%; total mean students’ verbal response time for the four sessions with the female instructor was 19% (p student clinical training webinar in two Japanese medical schools. These results are preliminary, mostly limited by a small sample size (n = 12) and limited time frame (four sessions). CA technology has the potential to improve clinical thinking for medical students. This report may stimulate improvements for implementation. PMID:24289320

  10. Analyzing Risks and Vulnerabilities of Various Computer Systems and Undergoing Exploitation using Embedded Devices

    Science.gov (United States)

    Branch, Drew Alexander

    2014-01-01

    Security is one of the most important areas today, if not the most important. After the several attacks on the United States, security everywhere has been heightened, from airports to communications among the military branches. With advanced persistent threats (APTs) on the rise following Stuxnet, government branches and agencies are required, more than ever, to follow several standards, policies and procedures to reduce the likelihood of a breach. Attack vectors today are very advanced and will continue to become more advanced as security controls advance. This creates a need for the networks and systems in a launch control system environment to be kept in an updated, patched and secured state. Attacks on critical systems are becoming more frequent and more relevant. Nation states are hacking into critical networks that might control electrical power grids or water dams, as well as carrying out advanced persistent threat (APT) attacks on government entities. NASA, as an organization, must protect itself from attacks by all different types of attackers with different motives. Although the International Space Station was created jointly, there is still competition between the different space programs. With that in mind, NASA might be attacked and breached for various reasons, such as espionage or sabotage. My project will provide a way for NASA to complete an in-house penetration test which includes asset discovery, vulnerability scanning, exploiting vulnerabilities, and providing forensic information to harden systems. Completing penetration testing is part of the compliance requirements of the Federal Information Security Management Act (FISMA) and NASA NPR 2810.1 and related NASA Handbooks. This project demonstrates how in-house penetration testing can be conducted so as to satisfy all of the compliance requirements of the National Institute of Standards and Technology (NIST), as outlined in FISMA.
By the end of this project, I hope to have carried out the tasks stated
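
The asset-discovery step of such an in-house test can be sketched as a minimal TCP port scan using only the standard library. This is an illustration under simple assumptions, not the tooling used in the project; the demo scans a throwaway local listener rather than any real asset.

```python
import socket

def scan_ports(host, ports, timeout=0.2):
    """Return the subset of `ports` that accept a TCP connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:   # 0 means connected
                open_ports.append(port)
    return open_ports

# Demo: start a throwaway listener so the scan has something to find.
listener = socket.socket()
listener.bind(("127.0.0.1", 0))          # the OS picks a free port
listener.listen()
target_port = listener.getsockname()[1]

found = scan_ports("127.0.0.1", range(target_port - 2, target_port + 3))
listener.close()
print(found)
```

Real engagements layer service fingerprinting and vulnerability scanning on top of this discovery step, but the connect-scan logic is the same.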

  11. Application of Method of Variation to Analyze and Predict Human Induced Modifications of Water Resource Systems

    Science.gov (United States)

    Dessu, S. B.; Melesse, A. M.; Mahadev, B.; McClain, M.

    2010-12-01

    Water resource systems have often used gravitational surface and subsurface flows because of their practicality in hydrological modeling and prediction. Activities such as inter/intra-basin water transfer, the use of small pumps and the construction of micro-ponds challenge the tradition of natural rivers as water resource management unit. On the contrary, precipitation is barely affected by topography and plot harvesting in wet regions can be more manageable than diverting from rivers. Therefore, it is indicative to attend to systems where precipitation drives the dynamics while the internal mechanics constitutes spectrum of human activity and decision in a network of plots. The trade-in volume and path of harvested precipitation depends on water balance, energy balance and the kinematics of supply and demand. Method of variation can be used to understand and predict the implication of local excess precipitation harvest and exchange on the natural water system. A system model was developed using the variational form of Euler-Bernoulli’s equation for the Kenyan Mara River basin. Satellite derived digital elevation models, precipitation estimates, and surface properties such as fractional impervious surface area, are used to estimate the available water resource. Four management conditions are imposed in the model: gravitational flow, open water extraction and high water use investment at upstream and downstream respectively. According to the model, the first management maintains the basin status quo while the open source management could induce externality. The high water market at the upstream in the third management offers more than 50% of the basin-wide total revenue to the upper third section of the basin thus may promote more harvesting. The open source and upstream exploitation suggest potential drop of water availability to downstream. The model exposed the latent potential of economic gradient to reconfigure the flow network along the direction where the

  12. ICECO-CEL: a coupled Eulerian-Lagrangian code for analyzing primary system response in fast reactors

    International Nuclear Information System (INIS)

    Wang, C.Y.

    1981-02-01

    This report describes a coupled Eulerian-Lagrangian code, ICECO-CEL, for analyzing the response of the primary system during hypothetical core disruptive accidents. The implicit Eulerian method is used to calculate the fluid motion so that large fluid distortion, two-dimensional sliding interfaces, flow around corners, flow through coolant passageways, and out-flow boundary conditions can be treated. The explicit Lagrangian formulation is employed to compute the response of the containment vessel and other elastic-plastic solids inside the reactor containment. Large displacements, as well as geometrical and material nonlinearities, are considered in the analysis. Marker particles are utilized to define the free surface or the material interface and to visualize the fluid motion. The basic equations and numerical techniques used in the Eulerian hydrodynamics and Lagrangian structural dynamics are described. Treatment of the above-core hydrodynamics, sodium spillage, fluid cavitation, free-surface boundary conditions and heat transfer are also presented. Examples are given to illustrate the capabilities of the computer code. Comparisons of the code predictions with available experimental data are also made.

  13. Analyzing the Impacts of Implementation of Human Resource Information System in Mergers and Acquisitions

    Institute of Scientific and Technical Information of China (English)

    QIN; YIWEN

    2016-01-01

    Mergers & Acquisitions (M&A) have become very popular throughout the world in recent years, due to globalization, liberalization, technological developments and an intensely competitive business environment. Early research suggested that strategic or financial factors are the main issues that result in the failure of M&A. This paper analyses the challenges of mergers and acquisitions and the implementation of HRIS from the perspectives of timing, data entry, staff training, and security. Through the analysis, it becomes clear how to ensure that post-merger organization development succeeds.

  14. Logical analysis of biological systems

    DEFF Research Database (Denmark)

    Mardare, Radu Iulian

    2005-01-01

    R. Mardare, Logical analysis of biological systems. Fundamenta Informaticae, N 64:271-285, 2005.

  15. Natural hazard impacts on transport systems: analyzing the data base of transport accidents in Russia

    Science.gov (United States)

    Petrova, Elena

    2015-04-01

    We consider a transport accident as any accident that occurs during transportation of people and goods. It comprises accidents involving air, road, rail, water, and pipeline transport. With over 1.2 million people killed each year, road accidents are one of the world's leading causes of death; another 20-50 million people are injured each year on the world's roads while walking, cycling, or driving. Transport accidents of other types, including air, rail, and water transport accidents, are not as numerous as road crashes, but the relative risk of each accident is much higher because of the higher number of people killed and injured per accident. Pipeline ruptures cause large damage to the environment. That is why safety and security are of primary concern for any transport system. The transport system of the Russian Federation (RF) is one of the most extensive in the world. It includes 1,283,000 km of public roads, more than 600,000 km of airlines, more than 200,000 km of gas, oil, and product pipelines, 115,000 km of inland waterways, and 87,000 km of railways. The transport system, especially the transport infrastructure of the country, is exposed to impacts of various natural hazards and weather extremes such as heavy rains, snowfalls, snowdrifts, floods, earthquakes, volcanic eruptions, landslides, snow avalanches, debris flows, rock falls, fog or road icing, and other natural factors that additionally trigger many accidents. In June 2014, the Ministry of Transport of the RF compiled a new version of the Transport Strategy of the RF up to 2030. Among the key pillars of the Strategy are increasing the safety of the transport system and reducing negative environmental impacts.
Using the data base of technological accidents that was created by the author, the study investigates temporal variations and regional differences of the transport accidents' risk within the Russian federal regions and a contribution of natural factors to occurrences of different

  16. Light-pollution measurement with the Wide-field all-sky image analyzing monitoring system

    Science.gov (United States)

    Vítek, S.

    2017-07-01

    The purpose of this experiment was to measure light pollution in Prague, the capital of the Czech Republic. The measuring instrument is a calibrated consumer-level digital single-lens reflex camera with an IR cut filter; the paper therefore reports results of measuring and monitoring light pollution in the wavelength range of 390-700 nm, which most affects visual-range astronomy. Combining frames of different exposure times made with a digital camera coupled with a fish-eye lens allows the creation of high-dynamic-range images containing meaningful values, so such a system can provide absolute values of the sky brightness.
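
The exposure-combination step can be sketched as a per-pixel weighted average over frames, assuming a simple linear 8-bit sensor model where clipped pixels are excluded; this is an illustration, not the calibration pipeline used in the paper.

```python
import numpy as np

SATURATION = 255  # clip level of the (hypothetical) 8-bit sensor

def combine_exposures(frames, exposure_times):
    """Per-pixel radiance estimate (counts per second) from several
    exposures, ignoring pixels that reached the clip level."""
    frames = np.asarray(frames, dtype=float)
    times = np.asarray(exposure_times, dtype=float)
    valid = frames < SATURATION                 # mask out clipped pixels
    rates = frames / times[:, None]             # counts/s seen by each frame
    return (rates * valid).sum(axis=0) / valid.sum(axis=0)

# Synthetic scene: two pixels with true radiances 10 and 200 counts/s.
true_radiance = np.array([10.0, 200.0])
times = np.array([0.5, 2.0])
frames = np.clip(true_radiance[None, :] * times[:, None], 0, SATURATION)

estimate = combine_exposures(frames, times)
print(estimate)
```

The bright pixel saturates in the long exposure and is recovered from the short one, while the faint pixel averages both frames; this is the essence of an HDR sky-brightness measurement.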

  17. Corrosion potential analysis system

    Science.gov (United States)

    Kiefer, Karl F.

    1998-03-01

    Many cities in the northeastern U.S. transport electrical power from place to place via underground cables, which utilize voltages from 68 kV to 348 kV. These cables are placed in seamless steel pipe to protect the conductors. These buried pipe-type cables (PTCs) are carefully designed and constantly pressurized with transformer oil to prevent any possible contamination. A protective coating placed on the outside diameter of the pipe during manufacture protects the steel pipe from the soil environment. Notwithstanding the protection mechanisms available, the pipes remain vulnerable to electrochemical corrosion processes. If undetected, corrosion can cause the pipes to leak transformer oil into the environment. These leaks can assume serious proportions due to the constant pressure on the inside of the pipe. A need exists for a detection system that can dynamically monitor the corrosive potential along the length of the pipe and dynamically adjust cathodic protection to counter local and global changes in the cathodic environment surrounding the pipes. The northeastern United States contains approximately 1000 miles of this pipe. This mileage is critical to the transportation and distribution of power. So critical that each of the pipe runs has a redundant double running parallel to it. Invocon, Inc. proposed and tested a technically unique and cost-effective solution to detect critical corrosion potential and to communicate that information to a central data collection and analysis location. Invocon's solution utilizes the steel of the casing pipe as a communication medium. Each data gathering station on the pipe can act as a relay for information gathered elsewhere on the pipe. These stations must have 'smart' network configuration algorithms that constantly test various communication paths and determine the best and most power-efficient route through which information should flow.
Each network station also performs data acquisition and analysis tasks that ultimately

  18. TimeLapseAnalyzer: Multi-target analysis for live-cell imaging and time-lapse microscopy

    DEFF Research Database (Denmark)

    Huth, Johannes; Buchholz, Malte; Kraus, Johann M.

    2011-01-01

    The direct observation of cells over time using time-lapse microscopy can provide deep insights into many important biological processes. Reliable analyses of motility, proliferation, invasive potential or mortality of cells are essential to many studies involving live cell imaging. To this end, we developed TimeLapseAnalyzer. Apart from general purpose image enhancements and segmentation procedures, this extensible, self-contained, modular cross-platform package provides dedicated modalities for fast and reliable analysis of multi-target cell tracking, scratch wound healing analysis, cell counting and tube formation analysis in high throughput screening of live-cell experiments. TimeLapseAnalyzer is freely available (MATLAB, Open Source) at http://www.informatik.uni-ulm.de/ni/mitarbeiter/HKestler/tla.

  19. Integrated minicomputer alpha analysis system

    International Nuclear Information System (INIS)

    Vasilik, D.G.; Coy, D.E.; Seamons, M.; Henderson, R.W.; Romero, L.L.; Thomson, D.A.

    1978-01-01

    Approximately 1,000 stack and occupational air samples from plutonium and uranium facilities at LASL are analyzed daily. The concentrations of radionuclides in air are determined by measuring absolute alpha activities of particulates collected on air sample filter media. The Integrated Minicomputer Pulse system (IMPULSE) is an interface between many detectors of extremely simple design and a Digital Equipment Corporation (DEC) PDP-11/04 minicomputer. The detectors are photomultiplier tubes faced with zinc sulfide (ZnS). The average detector background is approximately 0.07 cpm. The IMPULSE system includes two mainframes, each of which can hold up to 64 detectors. The current hardware configuration includes 64 detectors in one mainframe and 40 detectors in the other. Each mainframe contains a minicomputer with 28K words of Random Access Memory. One minicomputer controls the detectors in both mainframes. A second computer was added for fail-safe redundancy and to support other laboratory computer requirements. The main minicomputer includes a dual floppy disk system and a dual DEC RK05 disk system for mass storage. The RK05 facilitates report generation and trend analysis. The IMPULSE hardware provides for passage of data from the detectors to the computer, and for passage of status and control information from the computer to the detector stations.
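
The conversion from filter counts to an air concentration implied by such a setup can be sketched as simple arithmetic: subtract background, correct for detector efficiency, and divide by the sampled air volume. All numerical values below (efficiency, flow rate, counting time) are hypothetical, not LASL parameters.

```python
def alpha_air_concentration(gross_counts, background_cpm, count_minutes,
                            efficiency, flow_lpm, sample_minutes):
    """Convert gross alpha counts on a filter into an air concentration
    in disintegrations per minute per litre of sampled air."""
    net_cpm = gross_counts / count_minutes - background_cpm
    activity_dpm = net_cpm / efficiency        # correct for counting efficiency
    volume_l = flow_lpm * sample_minutes       # litres of air drawn through the filter
    return activity_dpm / volume_l

# Hypothetical sample: 100 counts in a 10-minute count, 0.07 cpm background,
# 25% efficiency, 50 L/min flow for an 8-hour sampling period.
conc = alpha_air_concentration(gross_counts=100, background_cpm=0.07,
                               count_minutes=10, efficiency=0.25,
                               flow_lpm=50, sample_minutes=480)
print(f"{conc:.6f} dpm per litre of air")
```

A production system would add decay corrections and convert to regulatory units, but the core bookkeeping is this chain of divisions.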

  20. A system to analyze the complex physiological states of coal solubilizing fungi

    Energy Technology Data Exchange (ETDEWEB)

    Hoelker, U.; Moenkemann, H.; Hoefer, M. [Universitaet Bonn, Bonn (Germany). Botanisches Institut

    1997-11-01

    The mechanism by which some microorganisms solubilize brown coal is still unknown. The paper discusses the deuteromycetes Fusarium oxysporum and Trichoderma atroviride as a suitable test system for analysing the complex fungal physiology related to coal solubilization. The two fungi can occur in two different, growth-substrate-controlled physiological states: a coal-solubilizing one, when the cells are grown on glutamate or gluconate as substrate, and a non-solubilizing one, when grown on carbohydrates. When grown on carbohydrates, F. oxysporum produces the pigment bikaverein. Purified bikaverein also inhibits coal solubilization by T. atroviride. The ability to solubilize coal is constitutive in F. oxysporum, while in T. atroviride it has to be induced. 10 refs., 3 figs., 3 tabs.

  1. Hepatitis B infection reported with cancer chemotherapy: analyzing the US FDA Adverse Event Reporting System.

    Science.gov (United States)

    Sanagawa, Akimasa; Hotta, Yuji; Kataoka, Tomoya; Maeda, Yasuhiro; Kondo, Masahiro; Kawade, Yoshihiro; Ogawa, Yoshihiro; Nishikawa, Ryohei; Tohkin, Masahiro; Kimura, Kazunori

    2018-04-16

    We conducted data mining using the US Food and Drug Administration (FDA) Adverse Event Reporting System (FAERS) database on spontaneously reported adverse events to evaluate the association between anticancer drug therapy and hepatitis B infection. Reports of hepatitis B infection were retrieved from the FAERS database. The reporting odds ratio (ROR) was used to estimate the association between hepatitis B infection and various anticancer agents and drug combinations. We detected statistically significant risk signals of hepatitis B for 33 of 64 anticancer agents by ROR (26 cytotoxicity drugs and seven molecular-targeted drugs). We focused on molecular-targeted drugs and assessed the risk of hepatitis B from specific anticancer drug combinations. The frequency of hepatitis B infection was significantly high for drugs such as rituximab, bortezomib, imatinib, and everolimus. The addition of cyclophosphamide, doxorubicin, and fludarabine to drug combinations additively enhanced the frequency of hepatitis B infection. There were no reports on hepatitis B infection associated with trastuzumab or azacitidine monotherapy. However, trastuzumab-containing regimens (e.g., combinations with docetaxel or paclitaxel) were correlated with the incidence of hepatitis B infection, similar to azacitidine monotherapy. Our findings suggest that the concomitant use of anticancer drugs, such as trastuzumab, taxane, and azacitidine, may contribute to the risk of hepatitis B infection. The unique signals detected from the public database might provide clues to eliminate the threat of HBV in oncology. © 2018 The Authors. Cancer Medicine published by John Wiley & Sons Ltd.
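
The reporting odds ratio used in the study has a standard closed form over a 2x2 contingency table of spontaneous reports. The sketch below, with invented counts rather than FAERS data, shows the computation and the usual 95% confidence interval.

```python
import math

def reporting_odds_ratio(a, b, c, d):
    """ROR and 95% CI for a 2x2 report table:
    a = target event reported with the drug, b = other events with the drug,
    c = target event with all other drugs,  d = other events with other drugs."""
    ror = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log(ROR)
    half = 1.96 * se
    lower = math.exp(math.log(ror) - half)
    upper = math.exp(math.log(ror) + half)
    return ror, lower, upper

# Hypothetical counts, not FAERS data:
ror, lower, upper = reporting_odds_ratio(20, 980, 200, 98800)
print(f"ROR = {ror:.2f}, 95% CI = ({lower:.2f}, {upper:.2f})")
```

A disproportionality signal is conventionally flagged when the lower bound of the confidence interval exceeds 1 (often with a minimum report count), which is the sense in which the study reports "statistically significant risk signals."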

  2. Analyze for the Quality Control of General X-ray Systems in Capital region

    International Nuclear Information System (INIS)

    Kang, Byung Sam; Lee, Kang Min; Shin, Woo Yong; Park, Soon Chul; Choi, Hak Dong; Cho, Yong Kwon

    2012-01-01

    Thanks to rapidly increasing interest in the quality control of general X-ray systems, this study proposes a direction for quality control by comparing and inspecting the actual state of quality control in clinics, educational institutions and hospitals. The subjects of the investigation were diagnostic radiation equipment in clinics, educational institutions and hospitals around the capital region. The tests performed were a kVp test, an mR/mAs output test, a reproducibility-of-exposure-dose test, a half value layer (HVL) test, a light field/beam alignment accordance test and, lastly, a reproducibility-of-exposure-time test. The mean percentage difference, the coefficient of variation (CV) and the attenuation curve resulting from these tests were then computed, and the values were evaluated according to the Diagnostic Radiation Equipment Safety Administration regulations. In the clinics and educational institutions there were 22 general X-ray devices, of which 18.2% failed the kVp test, 13.6% the reproducibility-of-exposure-dose test, 9.1% the mR/mAs output test and 13.6% the HVL test. In the hospitals there were 28 devices, of which 7.1% failed the reproducibility of exposure dose, 7.1% the light field/beam alignment accordance and 7.1% the reproducibility of exposure time. According to the investigation, the quality control in hospitals is in better condition than in clinics and educational institutions; the quality control of general X-ray devices in clinics is unsatisfactory compared to hospitals. It is therefore considered necessary to recognize the importance of quality control.
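
The coefficient of variation used to judge output reproducibility is simply the sample standard deviation divided by the mean of repeated exposures. The readings and the 0.05 acceptance limit in the sketch below are assumptions for illustration, not values from the study.

```python
import statistics

def coefficient_of_variation(readings):
    """CV = sample standard deviation / mean of repeated measurements."""
    return statistics.stdev(readings) / statistics.mean(readings)

# Hypothetical repeated mR/mAs output readings from one X-ray unit.
doses = [8.02, 7.98, 8.05, 7.95, 8.00]
cv = coefficient_of_variation(doses)
print(f"CV = {cv:.4f}")

CV_LIMIT = 0.05   # assumed regulatory acceptance limit for reproducibility
print("pass" if cv <= CV_LIMIT else "fail")
```

A device whose repeated outputs drift more than the limit would be flagged as improper on the reproducibility test, which is how the percentages above were obtained.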

  3. Architectural Analysis of Dynamically Reconfigurable Systems

    Science.gov (United States)

    Lindvall, Mikael; Godfrey, Sally; Ackermann, Chris; Ray, Arnab; Yonkwa, Lyly

    2010-01-01

Topics include: the problem (increased flexibility of architectural styles decreases analyzability, behavior emerges and varies depending on the configuration, does the resulting system run according to the intended design, and architectural decisions can impede or facilitate testing); a top-down approach to architecture analysis; detection of defects and deviations; architecture and its testability; currently targeted projects GMSEC and CFS; analyzing software architectures; analyzing runtime events; actual architecture recognition; GMPUB in Dynamic SAVE; sample output from the new approach; taking message timing delays into account; CFS examples of architecture and testability; some recommendations for improved testability; CFS examples of abstract interfaces and testability; and a CFS example of opening some internal details.

  4. Computer-automated neutron activation analysis system

    International Nuclear Information System (INIS)

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references

  5. Systems analysis of a security alarm system

    International Nuclear Information System (INIS)

    Schiff, A.

    1975-01-01

    When the Lawrence Livermore Laboratory found that its security alarm system was causing more false alarms and maintenance costs than LLL felt was tolerable, a systems analysis was undertaken to determine what should be done about the situation. This report contains an analysis of security alarm systems in general and ends with a review of the existing Security Alarm Control Console (SACC) and recommendations for its improvement, growth and change. (U.S.)

  6. Performance analysis of switching systems

    NARCIS (Netherlands)

    Berg, van den R.A.

    2008-01-01

    Performance analysis is an important aspect in the design of dynamic (control) systems. Without a proper analysis of the behavior of a system, it is impossible to guarantee that a certain design satisfies the system’s requirements. For linear time-invariant systems, accurate performance analyses are

  7. The extreme condition analyzing for NEMPI shielding of electronic system in high-intensity pulsed radiation diagnosing

    International Nuclear Information System (INIS)

    Cheng Xiaolei; Liu Fang; Ouyang Xiaoping

    2012-01-01

The difficulty of estimating the NEMPI (electromagnetic pulse interference caused by the nuclear reaction) on the electronic system in high-intensity pulsed radiation diagnosing is analyzed in this article. To address this difficulty, a method called 'Extreme Condition Analyzing' is presented for estimating the NEMPI conservatively and reliably. Through an extreme condition hypothesis which could be described as 'Entire Coupling of Electric Field Energy', E_max (the maximum electric field intensity which could be endured by the electronic system in the high-intensity pulsed radiation) can be figured out without any other information about the EMP caused by the nuclear reaction. Then a feasibility inspection is introduced to confirm that the EMPI shielding request according to E_max is not too extreme to be achieved. (authors)

  8. Power system health analysis

    International Nuclear Information System (INIS)

    Billinton, Roy; Fotuhi-Firuzabad, Mahmud; Aboreshaid, Saleh

    1997-01-01

This paper presents a technique which combines both probabilistic indices and deterministic criteria to reflect the well-being of a power system. This technique permits power system planners, engineers and operators to maximize the probability of healthy operation as well as to minimize the probability of risky operation. The concept of system well-being is illustrated in this paper by application to the areas of operating reserve assessment and composite power system security evaluation

  9. Formal Modeling and Analysis of Timed Systems

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Niebert, Peter

This book constitutes the thoroughly refereed post-proceedings of the First International Workshop on Formal Modeling and Analysis of Timed Systems, FORMATS 2003, held in Marseille, France in September 2003. The 19 revised full papers presented together with an invited paper and the abstracts of two invited talks were carefully selected from 36 submissions during two rounds of reviewing and improvement. All current aspects of formal methods for modeling and analyzing timed systems are addressed; among the timed systems dealt with are timed automata, timed Petri nets, max-plus algebras, real-time systems, discrete time systems, timed languages, and real-time operating systems.

  10. Rewriting Modulo SMT and Open System Analysis

    Science.gov (United States)

    Rocha, Camilo; Meseguer, Jose; Munoz, Cesar

    2014-01-01

    This paper proposes rewriting modulo SMT, a new technique that combines the power of SMT solving, rewriting modulo theories, and model checking. Rewriting modulo SMT is ideally suited to model and analyze infinite-state open systems, i.e., systems that interact with a non-deterministic environment. Such systems exhibit both internal non-determinism, which is proper to the system, and external non-determinism, which is due to the environment. In a reflective formalism, such as rewriting logic, rewriting modulo SMT can be reduced to standard rewriting. Hence, rewriting modulo SMT naturally extends rewriting-based reachability analysis techniques, which are available for closed systems, to open systems. The proposed technique is illustrated with the formal analysis of: (i) a real-time system that is beyond the scope of timed-automata methods and (ii) automatic detection of reachability violations in a synchronous language developed to support autonomous spacecraft operations.

  11. A New Approach for Analyzing the Reliability of the Repair Facility in a Series System with Vacations

    Directory of Open Access Journals (Sweden)

    Renbin Liu

    2012-01-01

Based on renewal process theory, we develop a decomposition method to analyze the reliability of the repair facility in an n-unit series system with vacations. Using this approach, we study the unavailability and the mean replacement number during (0,t] of the repair facility. The method proposed in this work is novel and concise, and it reveals clearly the structure of the facility indices of a series system with an unreliable repair facility as two convolution relations. Special cases and numerical examples are given to show the validity of our method.
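For orientation, the textbook steady-state availability of an n-unit series system with independently repaired units can be sketched as follows; this is not the paper's decomposition method (which additionally models repair-facility vacations and an unreliable facility), and the rates are hypothetical:

```python
def series_availability(units):
    """Steady-state availability of a series system: each unit with
    failure rate lam and repair rate mu is up a fraction mu / (lam + mu)
    of the time, and the series system is up only when all units are up."""
    a = 1.0
    for lam, mu in units:
        a *= mu / (lam + mu)
    return a

# Hypothetical 3-unit series system: (failure rate, repair rate) pairs
a_sys = series_availability([(0.01, 1.0), (0.02, 0.5), (0.005, 2.0)])
print(f"system availability = {a_sys:.4f}")
```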

  12. Fault tree analysis for reactor systems

    International Nuclear Information System (INIS)

    Crosetti, P.A.

    1971-01-01

Reliability analysis is playing an increasingly important role in the quantitative assessment of system performance for assuring nuclear safety, improving plant performance and plant life, and reducing plant operating costs. The complexity of today's nuclear plants warrants the use of techniques which provide a comprehensive evaluation of systems in their total context. In particular, fault tree analysis with probability evaluation can play a key role in assuring nuclear safety, in improving plant performance and plant life, and in reducing plant operating costs. The technique provides an all-inclusive, versatile mathematical tool for analyzing complex systems. Its application can include a complete plant as well as any of the systems and subsystems. Fault tree analysis provides an objective basis for analyzing system design, performing trade-off studies, analyzing common mode failures, demonstrating compliance with AEC requirements, and justifying system changes or additions. The logic of the approach makes it readily understandable and, therefore, it serves as an effective visibility tool for both engineering and management. (U.S.)
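The probability evaluation mentioned above reduces, for independent basic events, to simple AND/OR gate algebra. A minimal sketch (the events and their probabilities are hypothetical):

```python
from functools import reduce

def and_gate(probs):
    """AND gate: the output event occurs only if all independent inputs occur."""
    return reduce(lambda acc, p: acc * p, probs, 1.0)

def or_gate(probs):
    """OR gate: the output event occurs if any independent input occurs."""
    return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

# Hypothetical top event: pump failure OR (valve A AND valve B both fail)
p_top = or_gate([0.01, and_gate([0.05, 0.05])])
print(f"top event probability = {p_top:.6f}")
```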

  13. Radiometric analyzer

    International Nuclear Information System (INIS)

    Arima, S.; Oda, M.; Miyashita, K.; Takada, M.

    1977-01-01

A radiometric analyzer for measuring the characteristic values of a sample by radiation includes a number of radiation measuring subsystems having different ratios of sensitivities to the elements of the sample, and linearizing circuits having inverse-function characteristics of the calibration functions which correspond to the radiation measuring subsystems. A weighting adder computes a desired linear combination of the outputs of the linearizing circuits. Operators for switching between two or more different linear combinations are included

  14. Systems analysis made simple computerbooks

    CERN Document Server

    Antill, Lyn

    1980-01-01

Systems Analysis: Made Simple Computerbooks introduces the essential elements of information systems analysis and design and teaches the basic technical skills required for the tasks involved. The book covers the aspects of the design of an information system; information systems and the organization, including the types of information processing activity and computer-based information systems; the role of the systems analyst; and the human activity system. The text also discusses information modeling, socio-technical design, the man-machine interface, and database design. Software specification

  15. On-stream analysis systems

    International Nuclear Information System (INIS)

    Howarth, W.J.; Watt, J.S.

    1982-01-01

An outline of some commercially available on-stream analysis systems is given. Systems based on x-ray tube/crystal spectrometers, scintillation detectors, proportional detectors and solid-state detectors are discussed

  16. Development of a Simulation Framework for Analyzing Security of Supply in Integrated Gas and Electric Power Systems

    Directory of Open Access Journals (Sweden)

    Kwabena Addo Pambour

    2017-01-01

Gas and power networks are tightly coupled and interact with each other due to physically interconnected facilities. In an integrated gas and power network, a contingency observed in one system may cause iterative cascading failures, resulting in network-wide disruptions. Therefore, understanding the impacts of the interactions in both systems is crucial for governments, system operators, regulators and operational planners, particularly to ensure security of supply for the overall energy system. Although simulation has been widely used in the assessment of gas systems as well as power systems, there is a significant gap in simulation models that are able to address the coupling of both systems. In this paper, a simulation framework that models and simulates the gas and power network in an integrated manner is proposed. The framework consists of a transient model for the gas system and a steady state model for the power system based on AC-Optimal Power Flow. The gas and power system models are coupled through an interface which uses the coupling equations to establish the data exchange and coordination between the individual models. The bidirectional interlinks between both systems considered in this study are the fuel gas offtake of gas fired power plants for power generation and the power supply to liquefied natural gas (LNG) terminals and electric drivers installed in gas compressor stations and underground gas storage facilities. The simulation framework is implemented into an innovative simulation tool named SAInt (Scenario Analysis Interface for Energy Systems) and the capabilities of the tool are demonstrated by performing a contingency analysis for a real-world example. Results indicate how a disruption triggered in one system propagates to the other system and affects the operation of critical facilities. In addition, the studies show the importance of using transient gas models for security of supply studies instead of successions of

  17. An adopter-centric approach to analyze the diffusion patterns of innovative residential heating systems in Sweden

    International Nuclear Information System (INIS)

    Mahapatra, Krushna; Gustavsson, Leif

    2008-01-01

Innovation and diffusion of renewable energy technologies play a major role in mitigation of climate change. In Sweden replacing electric and oil heating systems with innovative heating systems such as district heating, heat pumps and wood pellet boilers in detached homes is a significant mitigation option. Using an adopter-centric approach, we analyzed the influence of investment subsidy on conversion of resistance heaters and oil boilers, and the variation in diffusion pattern of district heating, heat pumps and pellet boilers in Swedish detached homes. Results from questionnaire surveys of 1500 randomly selected homeowners in September 2004 and January 2007 showed that more than 80% of the respondents did not intend to install a new heating system. Hence, about 37% of the homeowners still have electric and oil heating systems. The government investment subsidy was important for conversion from a resistance heater, but not from an oil boiler. This is because homeowners currently replacing their oil boilers are the laggards, while those replacing resistance heaters are the 'early adopters'. Economic aspects and functional reliability were the most important factors for the homeowners when considering a new heating system. There is a variation in the perceived advantages associated with each of the innovative heating systems and therefore, the diffusion patterns of such systems vary. Installers and interpersonal sources were the most important communication channels for information on heating systems.

  18. An adopter-centric approach to analyze the diffusion patterns of innovative residential heating systems in Sweden

    Energy Technology Data Exchange (ETDEWEB)

    Mahapatra, Krushna; Gustavsson, Leif [Ecotechnology, Mid Sweden University, 831 25 Oestersund (Sweden)

    2008-02-15

    Innovation and diffusion of renewable energy technologies play a major role in mitigation of climate change. In Sweden replacing electric and oil heating systems with innovative heating systems such as district heating, heat pumps and wood pellet boilers in detached homes is a significant mitigation option. Using an adopter-centric approach, we analyzed the influence of investment subsidy on conversion of resistance heaters and oil boilers, and the variation in diffusion pattern of district heating, heat pumps and pellet boilers in Swedish detached homes. Results from questionnaire surveys of 1500 randomly selected homeowners in September 2004 and January 2007 showed that more than 80% of the respondents did not intend to install a new heating system. Hence, about 37% of the homeowners still have electric and oil heating systems. The government investment subsidy was important for conversion from a resistance heater, but not from an oil boiler. This is because homeowners currently replacing their oil boilers are the laggards, while those replacing resistance heaters are the 'early adopters'. Economic aspects and functional reliability were the most important factors for the homeowners when considering a new heating system. There is a variation in the perceived advantages associated with each of the innovative heating systems and therefore, the diffusion patterns of such systems vary. Installers and interpersonal sources were the most important communication channels for information on heating systems. (author)

  19. Contamination Analyzer

    Science.gov (United States)

    1994-01-01

    Measurement of the total organic carbon content in water is important in assessing contamination levels in high purity water for power generation, pharmaceutical production and electronics manufacture. Even trace levels of organic compounds can cause defects in manufactured products. The Sievers Model 800 Total Organic Carbon (TOC) Analyzer, based on technology developed for the Space Station, uses a strong chemical oxidizing agent and ultraviolet light to convert organic compounds in water to carbon dioxide. After ionizing the carbon dioxide, the amount of ions is determined by measuring the conductivity of the deionized water. The new technique is highly sensitive, does not require compressed gas, and maintenance is minimal.

  20. On the nitrogen isotope dilution analysis by means of the automated NA-5A type 15N-analyzer

    International Nuclear Information System (INIS)

    Faust, H.; Mueller, G.; Stoerl, H.J.

    1976-01-01

The analytical conditions are investigated under which quantitative nitrogen determination through isotope dilution analysis is possible using the NA-5A type 15N-analyzer. Calculation of the nitrogen quantity, estimation of the maximum error and the evaluation technique are considered in detail. Test analyses performed on ammonia, urea, and amino acid nitrogen model solutions labelled in different ways yielded good correspondence with preset values. This technique was applied to determine the nitrogen content of biomedical materials. A working scheme for direct quantitative determination of ammonia-N, urea-N, and total-N in urine is presented. (author)
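The nitrogen quantity in isotope dilution analysis follows from the standard 15N mass balance. A sketch of that calculation (the balance formula is the textbook one, not necessarily the exact expression used with the NA-5A; the tracer amount and enrichments below are illustrative):

```python
NATURAL_15N = 0.366  # atom% 15N at natural abundance

def nitrogen_by_isotope_dilution(n_tracer, a_tracer, a_mixture,
                                 a_sample=NATURAL_15N):
    """Unknown nitrogen amount N_x from the isotope-dilution balance
    N_x = N_t * (a_t - a_m) / (a_m - a_x), enrichments in atom% 15N."""
    return n_tracer * (a_tracer - a_mixture) / (a_mixture - a_sample)

# 10 mg of 95 atom% 15N tracer mixed into the sample; mixture measures 5 atom%
n_x = nitrogen_by_isotope_dilution(10.0, 95.0, 5.0)
print(f"sample nitrogen = {n_x:.1f} mg")
```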

  1. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

The increasing size and complexity of space systems and space missions pose severe challenges to space systems engineers. When complex systems and Systems-of-Systems are involved, the behavior of the whole entity is not only due to that of the individual systems involved but also to the interactions and dependencies between the systems. Dependencies can be varied and complex, and designers usually do not perform analysis of the impact of dependencies at the level of complex systems, or this analysis involves excessive computational cost, or occurs at a later stage of the design process, after designers have already set detailed requirements, following a bottom-up approach. While classical systems engineering attempts to integrate the perspectives involved across the variety of engineering disciplines and the objectives of multiple stakeholders, there is still a need for more effective tools and methods capable of identifying, analyzing and quantifying properties of the complex system as a whole and of modeling explicitly the effect of some of the features that characterize complex systems. This research describes the development and usage of Systems Operational Dependency Analysis and Systems Developmental Dependency Analysis, two methods based on parametric models of the behavior of complex systems, one in the operational domain and one in the developmental domain. The parameters of the developed models have intuitive meaning, are usable with subjective and quantitative data alike, and give direct insight into the causes of observed, and possibly emergent, behavior. The approach proposed in this dissertation combines models of one-to-one dependencies among systems and between systems and capabilities, to analyze and evaluate the impact of failures or delays on the outcome of the whole complex system. The analysis accounts for cascading effects, partial operational failures, multiple failures or delays, and partial developmental dependencies. The user of these methods can

  2. Loviisa nuclear power plant analyzer

    International Nuclear Information System (INIS)

    Porkholm, K.; Nurmilaukas, P.; Tiihonen, O.; Haenninen, M.; Puska, E.

    1992-12-01

    The APROS Simulation Environment has been developed since 1986 by Imatran Voima Oy (IVO) and the Technical Research Centre of Finland (VTT). It provides tools, solution algorithms and process components for use in different simulation systems for design, analysis and training purposes. One of its main nuclear applications is the Loviisa Nuclear Power Plant Analyzer (LPA). The Loviisa Plant Analyzer includes all the important plant components both in the primary and in the secondary circuits. In addition, all the main control systems, the protection system and the high voltage electrical systems are included. (orig.)

  3. Energy Usage Analysis System

    Data.gov (United States)

    General Services Administration — The EUAS application is a web based system which serves Energy Center of Expertise, under the Office of Facilitates Management and Service Programs. EUAS is used for...

  4. Temporal and Spatial Independent Component Analysis for fMRI Data Sets Embedded in the AnalyzeFMRI R Package

    Directory of Open Access Journals (Sweden)

    Pierre Lafaye de Micheaux

    2011-10-01

For statistical analysis of functional magnetic resonance imaging (fMRI) data sets, we propose a data-driven approach based on independent component analysis (ICA), implemented in a new version of the AnalyzeFMRI R package. For fMRI data sets, the spatial dimension being much greater than the temporal dimension, spatial ICA is the computationally tractable approach generally proposed. However, for some neuroscientific applications, temporal independence of source signals can be assumed and temporal ICA then becomes an attractive exploratory technique. In this work, we use a classical linear algebra result ensuring the tractability of temporal ICA. We report several experiments on synthetic data and real MRI data sets that demonstrate the potential interest of our R package.
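The "classical linear algebra result" invoked here is plausibly the identity between the nonzero eigenvalues of the small T x T matrix XX^T and the huge V x V matrix X^T X, which makes whitening for temporal ICA tractable when voxels far outnumber scans. A NumPy sketch with hypothetical dimensions (not the AnalyzeFMRI R code):

```python
import numpy as np

rng = np.random.default_rng(0)
T, V = 60, 5000  # far more voxels than time points, as in fMRI

X = rng.normal(size=(T, V))  # stand-in for a centered data matrix

# Eigendecomposing the V x V matrix X^T X directly would be intractable;
# the T x T matrix X X^T has exactly the same nonzero eigenvalues.
eig_small = np.sort(np.linalg.eigvalsh(X @ X.T))[::-1]
sv2 = np.sort(np.linalg.svd(X, compute_uv=False) ** 2)[::-1]

assert np.allclose(eig_small, sv2)
```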

  5. Model systems to analyze the role of miRNAs and commensal microflora in bovine mucosal immune system development.

    Science.gov (United States)

    Liang, Guanxiang; Malmuthuge, Nilusha; Guan, Le Luo; Griebel, Philip

    2015-07-01

Information is rapidly accumulating regarding the role of miRNAs as key regulators of immune system development and function. It is also increasingly evident that miRNAs play an important role in host-pathogen interactions through regulation of both innate and acquired immune responses. Little is known, however, about the specific role of miRNAs in regulating normal development of the mucosal immune system, especially during the neonatal period. Furthermore, there is limited knowledge regarding the possible role the commensal microbiome may play in regulating mucosal miRNA expression, although evidence is emerging that a variety of enteric pathogens influence miRNA expression. The current review focuses on recent information that miRNAs play an important role in regulating early development of the bovine mucosal immune system. A possible role for the commensal microbiome in regulating mucosal development by altering miRNA expression is also discussed. Finally, we explore the potential advantages of using the newborn calf as a model to determine how interactions between developmental programming, maternal factors in colostrum, and colonization of the gastrointestinal tract by commensal bacteria may alter mucosal miRNA expression and immune development. Identifying the key factors that regulate mucosal miRNA expression is critical for understanding how the balance between protective immunity and inflammation is maintained to ensure optimal gastrointestinal tract function and health of the whole organism. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Development of an energy analyzer as diagnostic of beam-generated plasma in negative ion beam systems

    Science.gov (United States)

    Sartori, E.; Carozzi, G.; Veltri, P.; Spolaore, M.; Cavazzana, R.; Antoni, V.; Serianni, G.

    2017-08-01

The measurement of the plasma potential and the energy spectrum of secondary particles in the drift region of a negative ion beam offers an insight into beam-induced plasma formation and beam transport in low pressure gases. Plasma formation in negative-ion beam systems, and the characteristics of such a plasma, are of interest especially for space charge compensation, plasma formation in neutralizers, and the development of improved schemes of beam-induced plasma neutralisers for future fusion devices. All these aspects have direct implications for the ITER Heating Neutral Beam and the operation of the prototypes, SPIDER and MITICA, and also play an important role in the conceptual studies for NBI systems of DEMO, while at present experimental data are lacking. In this paper we present the design and development of an ion energy analyzer to measure beam plasma formation and space charge compensation in negative ion beams. The diagnostic is a retarding field energy analyzer (RFEA), and will measure the transverse energy spectra of plasma molecular ions. The calculations that supported the design are reported, and a method to interpret the measurements in negative ion beam systems is also proposed. Finally, the experimental results of the first test in a magnetron plasma are presented.

  7. RADHEAT-V4: a code system to generate multigroup constants and analyze radiation transport for shielding safety evaluation

    International Nuclear Information System (INIS)

    Yamano, Naoki; Minami, Kazuyoshi; Koyama, Kinji; Naito, Yoshitaka.

    1989-03-01

A modular code system RADHEAT-V4 has been developed for performing precise neutron and photon transport analyses and shielding safety evaluations. The system consists of functional modules for producing coupled multi-group neutron and photon cross section sets, for analyzing the neutron and photon transport, and for calculating the atom displacement and the energy deposition due to radiations in nuclear reactor or shielding material. A precise method named Direct Angular Representation (DAR) has been developed for eliminating an error associated with the method of the finite Legendre expansion in evaluating angular distributions of cross sections and radiation fluxes. The DAR method implemented in the code system is described in detail. To evaluate the accuracy and applicability of the code system, some test calculations on strong anisotropy problems have been performed. From the results, it has been concluded that RADHEAT-V4 is successfully applicable to evaluating shielding problems accurately for fission and fusion reactors and radiation sources. The method employed in the code system is very effective in eliminating negative values and oscillations of angular fluxes in a medium having an anisotropic source or strong streaming. Definitions of the input data required in various options of the code system and the sample problems are also presented. (author)

  8. Using the T-scan III system to analyze occlusal function in mandibular reconstruction patients: A pilot study

    Directory of Open Access Journals (Sweden)

    Chao-Wei Liu

    2015-02-01

Background: This study was designed to analyze the post-rehabilitation occlusal function of subjects treated with complex mandibular resection and subsequently rehabilitated with fibula osteoseptocutaneous flaps, dental implants, and fixed prostheses, utilizing the T-scan system. Methods: Ten mandibular complex resection cases that adopted fibula osteoseptocutaneous flaps, dental implants, and fixed prostheses to reconstruct occlusal function were analyzed. The mandibular reconstructions were divided into three groups based on size: full mandibular reconstructions, mandibular reconstructions larger than half of the arch, and mandibular reconstructions smaller than half of the arch. The T-scan III system was used to measure maximum occlusal force, occlusal time, anterior-posterior as well as left-right occlusal force asymmetries, and anterior-posterior as well as left-right asymmetrical locations of occlusal centers. Results: Subjects with larger mandibular reconstructions and dental implants with fixed partial dentures demonstrated decreased average occlusal force; however, the difference did not reach the statistically significant level (p > 0.05). The most significant asymmetry of occlusal center location occurred among subjects with mandibular reconstructed areas larger than half of the mandibular arch. Conclusions: Comparison of the parameters of the T-scan system used to analyze occlusal function showed that the occlusal force was not an objective reference. Measurements of the location of the occlusal center appeared more repeatable, and were less affected by additional factors. The research results of this study showed that the size of a reconstruction did not affect the occlusal force after reconstruction and larger reconstructed areas did not decrease the average occlusal force.
The most significant parameter was left and right asymmetry of the occlusion center (LROC) and was measured in subjects with reconstruction areas larger than half
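The left-right asymmetry indices reported by T-scan style analyses can be illustrated with a simple percentage formula (this is a generic asymmetry index, not necessarily the exact definition used by the T-scan III software, and the force values are hypothetical):

```python
def lr_asymmetry_pct(left, right):
    """Percentage left-right asymmetry: 100 * |L - R| / (L + R)."""
    return 100.0 * abs(left - right) / (left + right)

# Hypothetical relative occlusal forces (% of total) on each side
asym = lr_asymmetry_pct(60.0, 40.0)
print(f"left-right asymmetry = {asym:.1f}%")
```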

  9. Rietveld analysis system RIETAN (translation)

    International Nuclear Information System (INIS)

    Izumi, Fujio

    1991-09-01

This is the manual of the RIETAN system (a Rietveld analysis program), originally written in Japanese by Fujio Izumi. The manual consists of two parts. Part I is a general description of the fundamental concepts and methods of the RIETAN system. Part II is the user's manual of RIETAN, which mainly describes in detail how to create the user's data sets, the procedures of Rietveld analysis, and how to read the results of an analysis. (author)

  10. Utilizing Geographic Information Systems (GIS) to analyze geographic and demographic patterns related to forensic case recovery locations in Florida.

    Science.gov (United States)

    Kolpan, Katharine E; Warren, Michael

    2017-12-01

    This paper highlights how Geographic Information Systems (GIS) can be utilized to analyze biases and patterns related to physical and cultural geography in Florida. Using case recovery locations from the C. Addison Pound Human Identification Laboratory (CAPHIL), results indicate that the majority of CAPHIL cases are recovered from urban areas with medium to low population density and low rates of crime. The results also suggest that more accurate record keeping methods would enhance the data. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. Integrated system reliability analysis

    DEFF Research Database (Denmark)

    Gintautas, Tomas; Sørensen, John Dalsgaard

    Specific targets: 1) The report shall describe the state of the art of reliability and risk-based assessment of wind turbine components. 2) Development of methodology for reliability and risk-based assessment of the wind turbine at system level. 3) Describe quantitative and qualitative measures...

  12. Using GAIDA (Guide to AI Data Analysis) to analyze data collected from artificial insemination programmes for cattle in developing countries

    International Nuclear Information System (INIS)

    Goodger, W.J.; Clayton, M.; Bennett, T.; Eisele, C.; Garcia, M.; Perera, B.M.A.O.

    2001-01-01

    The objectives of AIDA (Artificial Insemination Database Application) and its companion GAIDA (Guide to AI Data Analysis) are to address two major problems in on-farm research on livestock production. The first is the quality of the data collected and the second is the intellectual rigor of the analyses and their associated results when statistically testing causal hypotheses. The solution is to develop a data management system such as AIDA and an analysis system such as GAIDA to estimate parameters that explain biological mechanisms for on-farm application. The system uses epidemiological study designs in the uncontrolled research environment of the farm, uses a database manager (Microsoft Access) to handle data management issues encountered in preparing data for analysis, and then uses a statistical program (SYSTAT) to do preliminary analyses. These analyses enable the researcher to have better understanding of the biological mechanisms involved in the data contained within the AIDA database. Using GAIDA as a guide, this preliminary analysis helps to determine the strategy for further in-depth analyses. (author)

  13. Evaluation of a lower-powered analyzer and sampling system for eddy-covariance measurements of nitrous oxide fluxes

    Science.gov (United States)

    Brown, Shannon E.; Sargent, Steve; Wagner-Riddle, Claudia

    2018-03-01

    Nitrous oxide (N2O) fluxes measured using the eddy-covariance method capture the spatial and temporal heterogeneity of N2O emissions. Most closed-path trace-gas analyzers for eddy-covariance measurements have large-volume, multi-pass absorption cells that necessitate high flow rates for ample frequency response, thus requiring high-power sample pumps. Other sampling system components, including rain caps, filters, dryers, and tubing, can also degrade system frequency response. This field trial tested the performance of a closed-path eddy-covariance system for N2O flux measurements with improvements to use less power while maintaining the frequency response. The new system consists of a thermoelectrically cooled tunable diode laser absorption spectrometer configured to measure both N2O and carbon dioxide (CO2). The system features a relatively small, single-pass sample cell (200 mL) that provides good frequency response with a lower-powered pump ( ˜ 250 W). A new filterless intake removes particulates from the sample air stream with no additional mixing volume that could degrade frequency response. A single-tube dryer removes water vapour from the sample to avoid the need for density or spectroscopic corrections, while maintaining frequency response. This eddy-covariance system was collocated with a previous tunable diode laser absorption spectrometer model to compare N2O and CO2 flux measurements for two full growing seasons (May 2015 to October 2016) in a fertilized cornfield in Southern Ontario, Canada. Both spectrometers were placed outdoors at the base of the sampling tower, demonstrating ruggedness for a range of environmental conditions (minimum to maximum daily temperature range: -26.1 to 31.6 °C). The new system rarely required maintenance. An in situ frequency-response test demonstrated that the cutoff frequency of the new system was better than that of the old system (3.5 Hz compared to 2.30 Hz) and similar to that of a closed-path CO2 eddy-covariance system (4

  14. Risk and safety analysis of nuclear systems

    CERN Document Server

    Lee, John C

    2011-01-01

    The book has been developed in conjunction with NERS 462, a course offered every year to seniors and graduate students in the University of Michigan NERS program. The first half of the book covers the principles of risk analysis, the techniques used to develop and update a reliability data base, the reliability of multi-component systems, Markov methods used to analyze the unavailability of systems with repairs, fault trees and event trees used in probabilistic risk assessments (PRAs), and failure modes of systems. All of this material is general enough that it could be used in non-nuclear a

  15. Support system for Neutron Activation Analysis

    International Nuclear Information System (INIS)

    Sasajima, Fumio; Ohtomo, Akitoshi; Sakurai, Fumio; Onizawa, Koji

    1999-01-01

    In the research reactors of JAERI, Neutron Activation Analysis (NAA) has been a major component of irradiation use. To use NAA, research participants have always been required to learn the necessary techniques. We therefore began to examine a support system that would enable even beginners to carry out INAA easily. The system is composed of an irradiation device, a gamma-ray spectrometer, and data analyzing instruments. The element concentration is calculated using the KAYZERO/SOLCOI software with the k0 standardization method. In this paper, we review the construction of this INAA support system at JRR-3M of JAERI. (author)

  16. Economic analysis model for total energy and economic systems

    International Nuclear Information System (INIS)

    Shoji, Katsuhiko; Yasukawa, Shigeru; Sato, Osamu

    1980-09-01

    This report describes the framing of an economic analysis model developed as a tool for total energy systems. To forecast and analyze future energy systems, it is important to analyze the relation between the energy system and the economic structure, and we prepared an economic analysis model suited for this purpose. Distinctive features of our model are that energy-related matters can be analyzed in more detail than other economic matters, and that it forecasts long-term economic progress rather than short-term economic fluctuation. From the viewpoint of economics, our model is a long-term, multi-sectoral economic analysis model of the open Leontief type. The model gave appropriate results in fitting tests and forecasting estimations. (author)
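The core computation of an open Leontief-type model can be illustrated with a short sketch. The coefficient matrix, sector names, and demand vector below are hypothetical, not taken from the report:

```python
import numpy as np

# Hypothetical 3-sector technical coefficient matrix A, where A[i, j] is
# the input from sector i required per unit of output of sector j.
A = np.array([
    [0.10, 0.30, 0.05],   # energy
    [0.20, 0.10, 0.20],   # manufacturing
    [0.10, 0.25, 0.05],   # services
])

# Final demand for each sector (illustrative units).
d = np.array([100.0, 200.0, 150.0])

# Open Leontief model: gross output x satisfies x = A x + d,
# hence x = (I - A)^(-1) d.
x = np.linalg.solve(np.eye(3) - A, d)
print(np.round(x, 1))
```

Because every column sum of A is below one, the economy is productive and the solved gross outputs are positive; a long-term multi-sectoral model repeats this balance for each forecast period.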

  17. Development of spectral analysis math models and software program and spectral analyzer, digital converter interface equipment design

    Science.gov (United States)

    Hayden, W. L.; Robinson, L. H.

    1972-01-01

    Spectral analysis of angle-modulated communication systems is studied by: (1) performing a literature survey of candidate power spectrum computational techniques, determining the computational requirements, and formulating a mathematical model satisfying these requirements; (2) implementing the model on the UNIVAC 1230 digital computer as the Spectral Analysis Program (SAP); and (3) developing the hardware specifications for a data acquisition system which will acquire an input modulating signal for SAP. The SAP computational technique uses an extended fast Fourier transform and represents a generalized approach for simple and complex modulating signals.
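The FFT-based power-spectrum computation that SAP generalizes can be sketched for the simplest case, a single-tone angle-modulated carrier. The sample rate, carrier frequency, and modulation index below are illustrative assumptions, not values from the report:

```python
import numpy as np

fs = 8000.0                          # sample rate, Hz (assumed)
t = np.arange(0, 1.0, 1.0 / fs)      # one second of samples
fc, fm, beta = 1000.0, 50.0, 0.5     # carrier, modulating tone, mod. index

# Angle-modulated carrier: s(t) = cos(2*pi*fc*t + beta*sin(2*pi*fm*t)).
s = np.cos(2 * np.pi * fc * t + beta * np.sin(2 * np.pi * fm * t))

# One-sided power spectrum via the FFT. Spectral lines appear at
# fc +/- k*fm with amplitudes governed by Bessel functions J_k(beta).
coeffs = np.fft.rfft(s) / len(s)
freqs = np.fft.rfftfreq(len(s), 1.0 / fs)
power = np.abs(coeffs) ** 2

peak = freqs[np.argmax(power)]
print(peak)   # → 1000.0 (the carrier line dominates for small beta)
```

For a small modulation index the carrier line dominates; as beta grows, power shifts into the sidebands at fc ± k·fm, which is exactly the structure a power-spectrum program for angle modulation must resolve.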

  18. Impedance analysis of subwoofer systems

    NARCIS (Netherlands)

    Berkhoff, Arthur P.

    The electrical impedance of four low-frequency loudspeaker systems is analyzed. The expression for this impedance is obtained directly from the acoustical analogous circuit. Formulas are derived for calculating the small-signal parameters from the frequencies of impedance minima and maxima of two
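The idea of deriving small-signal parameters from the frequencies of impedance extrema can be sketched numerically. The simplified impedance-magnitude model and driver parameters below are assumptions for illustration, not the paper's formulas:

```python
import numpy as np

def local_extrema(freqs, z_mag):
    """Return (maxima_freqs, minima_freqs) of a sampled |Z(f)| curve."""
    d = np.diff(z_mag)
    # A sign change in the first difference marks a local extremum.
    maxima = [freqs[i + 1] for i in range(len(d) - 1) if d[i] > 0 > d[i + 1]]
    minima = [freqs[i + 1] for i in range(len(d) - 1) if d[i] < 0 < d[i + 1]]
    return maxima, minima

# Illustrative closed-box driver: a single impedance peak at resonance f_s.
f = np.linspace(10, 200, 1901)          # frequency grid, Hz
fs_true, Re, Res, Qms = 50.0, 6.0, 40.0, 5.0
x = f / fs_true - fs_true / f           # normalized detuning
z = Re + Res / (1 + (Qms * x) ** 2)     # simplified |Z(f)| model, ohms

maxima, minima = local_extrema(f, z)
print(maxima)   # impedance maximum at the resonance, near 50 Hz
```

In a vented (bass-reflex) system the same search would find two maxima separated by a minimum near the port tuning frequency, and the small-signal parameters follow from those extremum frequencies.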

  19. System Analysis and Risk Assessment (SARA) system

    International Nuclear Information System (INIS)

    Krantz, E.A.; Russell, K.D.; Stewart, H.D.; Van Siclen, V.S.

    1986-01-01

    Utilization of Probabilistic Risk Assessment (PRA) related information in the day-to-day operation of plant systems has, in the past, been impracticable due to the size of the computers needed to run PRA codes. This paper discusses a microcomputer-based database system which can greatly enhance the capability of operators or regulators to incorporate PRA methodologies into their routine decision making. This system is called the System Analysis and Risk Assessment (SARA) system. SARA was developed by EG&G Idaho, Inc. at the Idaho National Engineering Laboratory to facilitate the study of frequency and consequence analyses of accident sequences from a large number of light water reactors (LWRs) in this country. This information is being amassed by several studies sponsored by the United States Nuclear Regulatory Commission (USNRC). To meet the needs of portability and accessibility, and to perform the variety of calculations necessary, a microcomputer-based system was felt to be most suitable

  20. Analyzing Clickstreams

    DEFF Research Database (Denmark)

    Andersen, Jesper; Giversen, Anders; Jensen, Allan H.

    On-Line Analytical Processing (OLAP) enables analysts to gain insight into data through fast and interactive access to a variety of possible views on information, organized in a dimensional model. The demand for data integration is rapidly becoming larger as more and more information sources appear in modern enterprises. In the data warehousing approach, selected information is extracted in advance and stored in a repository. This approach is used because of its high performance. However, in many situations a logical (rather than physical) integration of data is preferable. Previous web-based data... Extensible Markup Language (XML) is fast becoming the new standard for data representation and exchange on the World Wide Web. The rapid emergence of XML data on the web, e.g., business-to-business (B2B) e-commerce, is making it necessary for OLAP and other data analysis tools to handle XML data as well...

  1. Reliability analysis in interdependent smart grid systems

    Science.gov (United States)

    Peng, Hao; Kan, Zhe; Zhao, Dandan; Han, Jianmin; Lu, Jianfeng; Hu, Zhaolong

    2018-06-01

    Complex network theory is a useful way to study many real complex systems. In this paper, a reliability analysis model based on complex network theory is introduced for interdependent smart grid systems. We focus on understanding the structure of smart grid systems, studying the underlying network model, the interactions and relationships between its parts, and how cascading failures occur in interdependent smart grid systems. We propose a practical model for interdependent smart grid systems using complex network theory. Based on percolation theory, we also study the cascading failure effect and present a detailed mathematical analysis of failure propagation in such systems. We analyze the reliability of our proposed model under random attacks or failures by calculating the size of the giant functioning component in interdependent smart grid systems. Our simulation results show that there exists a threshold for the proportion of faulty nodes, beyond which the smart grid systems collapse, and we determine the critical values for different system parameters. In this way, the reliability analysis model based on complex network theory can be effectively utilized for anti-attack and protection purposes in interdependent smart grid systems.
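The percolation calculation described above, the size of the giant functioning component as a fraction of nodes fails at random, can be sketched on a single network layer. The graph size, mean degree, and union-find implementation are illustrative assumptions; the paper's model couples two interdependent layers:

```python
import random

def largest_component(alive, edges):
    """Largest connected component size among 'alive' nodes (union-find)."""
    parent = {v: v for v in alive}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    for a, b in edges:
        if a in parent and b in parent:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb
    sizes = {}
    for v in alive:
        r = find(v)
        sizes[r] = sizes.get(r, 0) + 1
    return max(sizes.values()) if sizes else 0

def giant_fraction(n, edges, q, seed=0):
    """Fraction of all n nodes left in the giant component after a
    random fraction q of the nodes fails."""
    rng = random.Random(seed)
    alive = [v for v in range(n) if rng.random() > q]
    return largest_component(alive, edges) / n

rng = random.Random(1)
n = 2000
# Random graph with mean degree ~6 as a stand-in for one grid layer.
edges = [(rng.randrange(n), rng.randrange(n)) for _ in range(3 * n)]

for q in (0.1, 0.5, 0.9):
    print(q, round(giant_fraction(n, edges, q), 3))
```

Sweeping the failure fraction q reveals the threshold behavior the abstract describes: below the critical point most surviving nodes remain connected, while above it the giant component collapses to a negligible fraction.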

  2. Systems analysis-independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)

    1995-09-01

    The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. It can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed systems and identify system development needs. Systems investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.

  3. Biomechanics of the cornea evaluated by spectral analysis of waveforms from ocular response analyzer and Corvis-ST.

    Directory of Open Access Journals (Sweden)

    Sushma Tejwani

    In this study, spectral analysis of the deformation signal from Corvis-ST (CoST) and of the reflected light intensity from the ocular response analyzer (ORA) was performed to evaluate biomechanical concordance between the two devices. The study was non-interventional, observational and cross-sectional, and involved 188 eyes from 94 normal subjects. Three measurements were made on each eye with ORA and with CoST, and then averaged for each device. The deformation signal from CoST and the reflected light intensity (applanation) signal from ORA were compiled for all the eyes. The ORA signal was inverted about a line joining the two applanation peaks. All the signals were analyzed with Fourier series. The analyzed variables were the area under the signal curve (AUC), the root mean square (RMS) of all the harmonics, lower-order (LO; 1st and 2nd harmonics) and higher-order (HO; up to the 6th harmonic) RMS, CoST deformation amplitude (DA), corneal hysteresis (CH) and corneal resistance factor (CRF). The device variables and those calculated by Fourier transform differed statistically significantly between CoST and ORA. These variables also differed between the eyes of the same subject, and there was a statistically significant influence of eye (left vs. right) on the differences in a subset of RMS variables only. CH and CRF differed statistically significantly between the eyes of a subject (p<0.001) but not DA (p = 0.65). CoST was statistically significantly different from ORA, and may be useful in delineating true biomechanical differences between the eyes of a subject as it reports deformation directly.
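The harmonic-RMS quantities described above can be sketched for a synthetic single-period waveform. The test signal is invented for illustration; the harmonic groupings follow the abstract, with LO taken as the 1st-2nd harmonics and HO as the 3rd-6th:

```python
import numpy as np

def harmonic_rms(signal, n_harmonics=6):
    """RMS contribution of each of the first n Fourier harmonics of a
    single-period signal (harmonic 1 = fundamental)."""
    c = np.fft.rfft(signal) / len(signal)
    # One-sided amplitudes A_k = 2|c_k|; the RMS of a sinusoid is A/sqrt(2).
    amps = 2.0 * np.abs(c[1:n_harmonics + 1])
    return amps / np.sqrt(2.0)

# Synthetic deformation-like waveform: fundamental plus a 3rd harmonic.
t = np.linspace(0.0, 1.0, 1000, endpoint=False)
y = 1.0 * np.sin(2 * np.pi * t) + 0.2 * np.sin(2 * np.pi * 3 * t)

rms = harmonic_rms(y)
lower_order = np.sqrt(np.sum(rms[:2] ** 2))    # harmonics 1-2 (LO)
higher_order = np.sqrt(np.sum(rms[2:6] ** 2))  # harmonics 3-6 (HO)
print(round(lower_order, 3), round(higher_order, 3))   # → 0.707 0.141
```

Applied to a measured deformation or applanation signal, the same decomposition separates the slow overall corneal response (LO) from the finer waveform detail (HO) that the study compares between devices.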

  4. Analyzing and modeling interdisciplinary product development a framework for the analysis of knowledge characteristics and design support

    CERN Document Server

    Neumann, Frank

    2015-01-01

    Frank Neumann focuses on establishing a theoretical basis that allows a description of the interplay between individual and collective processes in product development. For this purpose, he introduces the integrated descriptive model of knowledge creation as the first constituent of his research framework. As the second part of the research framework, an analysis and modeling method is proposed that captures the various knowledge conversion activities described by the integrated descriptive model of knowledge creation. Subsequently, this research framework is applied to the analysis of knowledge characteristics of mechatronic product development (MPD). Finally, the results gained from the previous steps are used within a design support system that aims at federating the information and knowledge resources contained in the models published in the various development activities of MPD. Contents: Descriptive Model of Knowledge Creation in Interdisciplinary Product Development; Research Framework for the Analysis of ...

  5. Tank waste remediation system mission analysis report

    International Nuclear Information System (INIS)

    Acree, C.D.

    1998-01-01

    This document describes and analyzes the technical requirements that the Tank Waste Remediation System (TWRS) must satisfy for the mission. It further defines the technical requirements that TWRS must satisfy to supply feed to the private contractors' facilities and to store or dispose of the immobilized waste following processing in these facilities. The analysis uses a two-phase approach to reflect the two-phase nature of the mission

  6. A SWOT Analysis for Organizing a Summer School: Case Study for Advanced Summer School in Analyzing Market Data 2013

    Directory of Open Access Journals (Sweden)

    Radu Herman

    2013-05-01

    Economics scholars agree that investment in education is a competitive advantage. After participating in and graduating from the “Advanced Summer School in Analyzing Market Data 2013”, students gain formal competences in applied statistics with the IBM SPSS Statistics software. Studies show that employers seek practical competences in undergraduate students, along with theoretical knowledge. The article presents a SWOT analysis for organizing a summer school, composing lists of strengths, weaknesses, opportunities and threats. The purpose of the “Advanced Summer School in Analyzing Market Data 2013“ is to train undergraduate students from the social and human sciences to gain competences that are valued in the market, together with a certificate of attendance; to develop an appropriate training program that combines applied knowledge, statistics and the IBM SPSS software; and to create a “Summer School quality brand” of high-quality training programs for the Faculty of Administration and Business.

  7. Analyzing patients' values by applying cluster analysis and LRFM model in a pediatric dental clinic in Taiwan.

    Science.gov (United States)

    Wu, Hsin-Hung; Lin, Shih-Yen; Liu, Chih-Wei

    2014-01-01

    This study combines cluster analysis and the LRFM (length, recency, frequency, and monetary) model in a pediatric dental clinic in Taiwan to analyze patients' values. A two-stage approach using self-organizing maps and the K-means method is applied to segment 1,462 patients into twelve clusters. The average values of L, R, and F are computed for each cluster; monetary is excluded because costs are covered by the national health insurance program. In addition, a customer value matrix is used to analyze the customer values of the twelve clusters in terms of frequency and monetary value, and a customer relationship matrix considering length and recency is applied to classify different types of customers from these twelve clusters. The results show that three clusters can be classified as loyal patients, with L, R, and F values greater than the respective average L, R, and F values, while three clusters can be viewed as lost patients, without any variable above the average values of L, R, and F. When different types of patients are identified, marketing strategies can be designed to meet different patients' needs.
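The K-means stage of the two-stage segmentation can be sketched on standardized L, R, F features. The data, cluster count, and deterministic seeding below are illustrative assumptions; the study first trains a self-organizing map and segments into twelve clusters:

```python
import numpy as np

def kmeans(X, k, init_idx, n_iter=50):
    """Minimal K-means: returns (centroids, labels)."""
    centroids = X[init_idx].astype(float).copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(n_iter):
        # Assign each point to its nearest centroid.
        dist = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # Move each centroid to the mean of its assigned points.
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids, labels

rng = np.random.default_rng(1)
# Hypothetical standardized (L, R, F) features; monetary is excluded,
# as in the study, because costs are covered by national health insurance.
loyal = rng.normal([1.0, -1.0, 1.0], 0.2, size=(20, 3))   # long, recent, frequent
lost = rng.normal([-1.0, 1.0, -1.0], 0.2, size=(20, 3))   # short, stale, rare
X = np.vstack([loyal, lost])

# Deterministic seeding (one point from each group) for illustration.
centroids, labels = kmeans(X, k=2, init_idx=[0, 20])
print(len(set(labels.tolist())))   # → 2
```

Comparing each cluster's mean L, R, and F against the overall averages then reproduces the loyal-versus-lost classification described in the abstract.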

  8. Analyzing Patients' Values by Applying Cluster Analysis and LRFM Model in a Pediatric Dental Clinic in Taiwan

    Science.gov (United States)

    Lin, Shih-Yen; Liu, Chih-Wei

    2014-01-01

    This study combines cluster analysis and the LRFM (length, recency, frequency, and monetary) model in a pediatric dental clinic in Taiwan to analyze patients' values. A two-stage approach using self-organizing maps and the K-means method is applied to segment 1,462 patients into twelve clusters. The average values of L, R, and F are computed for each cluster; monetary is excluded because costs are covered by the national health insurance program. In addition, a customer value matrix is used to analyze the customer values of the twelve clusters in terms of frequency and monetary value, and a customer relationship matrix considering length and recency is applied to classify different types of customers from these twelve clusters. The results show that three clusters can be classified as loyal patients, with L, R, and F values greater than the respective average L, R, and F values, while three clusters can be viewed as lost patients, without any variable above the average values of L, R, and F. When different types of patients are identified, marketing strategies can be designed to meet different patients' needs. PMID:25045741

  9. Analyzing Patients’ Values by Applying Cluster Analysis and LRFM Model in a Pediatric Dental Clinic in Taiwan

    Directory of Open Access Journals (Sweden)

    Hsin-Hung Wu

    2014-01-01

    This study combines cluster analysis and the LRFM (length, recency, frequency, and monetary) model in a pediatric dental clinic in Taiwan to analyze patients’ values. A two-stage approach using self-organizing maps and the K-means method is applied to segment 1,462 patients into twelve clusters. The average values of L, R, and F are computed for each cluster; monetary is excluded because costs are covered by the national health insurance program. In addition, a customer value matrix is used to analyze the customer values of the twelve clusters in terms of frequency and monetary value, and a customer relationship matrix considering length and recency is applied to classify different types of customers from these twelve clusters. The results show that three clusters can be classified as loyal patients, with L, R, and F values greater than the respective average L, R, and F values, while three clusters can be viewed as lost patients, without any variable above the average values of L, R, and F. When different types of patients are identified, marketing strategies can be designed to meet different patients’ needs.

  10. Integrated piping structural analysis system

    International Nuclear Information System (INIS)

    Motoi, Toshio; Yamadera, Masao; Horino, Satoshi; Idehata, Takamasa

    1979-01-01

    Structural analysis of piping systems for nuclear power plants has grown in scale and in quantity. In addition, higher-quality analysis is nowadays regarded as of major importance from the point of view of nuclear plant safety. To fulfill the above requirements, an integrated piping structural analysis system (ISAP-II) has been developed. The basic philosophy of this system is as follows: 1. To apply a database system in which all information is concentrated. 2. To minimize manual processes in analysis, evaluation and documentation, applying the graphic system as much as possible. On the basis of this philosophy, four subsystems were made: 1. Data control subsystem. 2. Analysis subsystem. 3. Plotting subsystem. 4. Report subsystem. The function of the data control subsystem is to control all information in the database. Piping structural analysis can be performed using the analysis subsystem. Isometric piping drawings, mode shapes, etc. can be plotted using the plotting subsystem. A complete analysis report can be produced without manual processing through the report subsystem. (author)

  11. A Portable, Low-Power Analyzer and Automated Soil Flux Chamber System for Measuring Wetland GHG Emissions

    Science.gov (United States)

    Nickerson, Nick; Kim-Hak, David; McArthur, Gordon

    2017-04-01

    Preservation and restoration of wetlands has the potential to help sequester large amounts of carbon due to the naturally high primary productivity and slow turnover of stored soil carbon. However, the anoxic conditions in wetland soils also make them the largest natural contributor to global methane emissions. While it is well known that wetlands are net carbon sinks over long time scales, given the high global warming potential of methane, the short-term balance between C uptake and storage and loss as CO2 and CH4 needs to be carefully considered when evaluating the climate effects of land-use change. It is relatively difficult to measure methane emissions from wetlands with currently available techniques, given the temporally and spatially sporadic nature of the processes involved (methanogenesis, methane oxidation, ebullition, etc.). For example, manual soil flux chambers can often capture only a portion of either the spatial or the temporal variability, and they have other disadvantages associated with soil-atmosphere disturbance during deployment in these relatively compressible wetland soils. Automated chamber systems offer the advantage of collecting high-resolution time series of gaseous fluxes while reducing some human- and method-induced biases. Additionally, new laser-based analyzers that can be used in situ alongside automated chambers offer a better minimum detectable flux than can be achieved using alternative methods such as gas chromatography. Until recently these automated measurements were limited to areas with good power coverage, as laser-based systems were power-intensive and could not easily be supplemented with power from field-available sources such as solar. Recent advances in laser technology have reduced the power needed, making these systems more field-portable. Here we present data using an automated chamber system coupled to a portable laser based greenhouse gas

  12. Web server attack analyzer

    OpenAIRE

    Mižišin, Michal

    2013-01-01

    Web server attack analyzer - Abstract The goal of this work was to create a prototype analyzer of injection-flaw attacks on a web server. The proposed solution combines the capabilities of a web application firewall and a web server log analyzer. Analysis is based on configurable signatures defined by regular expressions. The paper begins with a summary of web attacks, followed by an analysis of detection techniques on web servers and a description and justification of the selected implementation. In the end are charact...
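Signature-based detection with configurable regular expressions, as described, can be sketched as follows. The signature set, rule names, and log format are illustrative assumptions, not those of the prototype:

```python
import re
from urllib.parse import unquote

# Illustrative signatures for common injection attacks; a production
# rule set would be far larger and more carefully curated.
SIGNATURES = {
    "sql_injection": re.compile(r"(?i)(union\s+select|or\s+1=1|';--)"),
    "xss": re.compile(r"(?i)(<script\b|javascript:)"),
    "path_traversal": re.compile(r"(?i)(\.\./|%2e%2e%2f)"),
}

def analyze_log_line(line):
    """Return the names of all signatures matching one access-log line.
    The line is URL-decoded first, since attacks are often percent-encoded."""
    decoded = unquote(line)
    return [name for name, rx in SIGNATURES.items() if rx.search(decoded)]

log = "GET /search?q=test%20UNION%20SELECT%20password HTTP/1.1"
print(analyze_log_line(log))   # → ['sql_injection']
```

Running each request through the signature table in this way combines the firewall view (inspecting live requests) with the log-analyzer view (scanning recorded access logs) using one shared rule set.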

  13. Analyzing the Potential for High-speed Rail as Part of the Multimodal Transportation System in the United States' Midwest Corridor

    Directory of Open Access Journals (Sweden)

    Jeffrey C. Peters

    2014-06-01

    With increasing demand and rising fuel costs, both the travel time and the cost of current intercity passenger transportation modes are becoming increasingly relevant. Around the world, high-speed rail (HSR) is seen as a way to alleviate demand on highways and at airports. Ridership is the critical element in determining the viability of a large-capital, long-term transportation investment. This paper provides a systematic, consistent methodology for analyzing system-wide modal ridership with and without a proposed HSR network, and analyzes the potential for high-speed rail as part of the existing multimodal transportation system in a region in terms of ridership. Considerations of capital investment (e.g., network design and HSR speed), along with exogenous demographic, technological, economic, and policy trends in the long term, are used to project ridership over time. This study represents an important step toward a consistent, comprehensive economic analysis of HSR in the United States.

  14. MARS: Microarray analysis, retrieval, and storage system

    Directory of Open Access Journals (Sweden)

    Scheideler Marcel

    2005-04-01

    Background: Microarray analysis has become a widely used technique for the study of gene-expression patterns on a genomic scale. As more and more laboratories are adopting microarray technology, there is a need for powerful and easy-to-use microarray databases facilitating array fabrication, labeling, hybridization, and data analysis. The wealth of data generated by this high-throughput approach renders adequate database and analysis tools crucial for the pursuit of insights into the transcriptomic behavior of cells. Results: MARS (Microarray Analysis and Retrieval System) provides a comprehensive MIAME-supportive suite for storing, retrieving, and analyzing multi-color microarray data. The system comprises a laboratory information management system (LIMS), quality control management, and a sophisticated user management system. MARS is fully integrated into an analytical pipeline of microarray image analysis, normalization, gene expression clustering, and mapping of gene expression data onto biological pathways. The incorporation of ontologies and the use of MAGE-ML enable an export of studies stored in MARS to public repositories and other databases accepting these documents. Conclusion: We have developed an integrated system tailored to serve the specific needs of microarray-based research projects using a unique fusion of web-based and standalone applications connected to the latest J2EE application server technology. The presented system is freely available for academic and non-profit institutions. More information can be found at http://genome.tugraz.at.

  15. SUBSURFACE VISUAL ALARM SYSTEM ANALYSIS

    International Nuclear Information System (INIS)

    D.W. Markman

    2001-01-01

    The "Subsurface Fire Hazard Analysis" (CRWMS M&O 1998, page 61) and the "Title III Evaluation Report for the Surface and Subsurface Communication System" (CRWMS M&O 1999a, pages 21 and 23) both indicate that the installed communication system is adequate to support Exploratory Studies Facility (ESF) activities, with the exception of the mine phone system for emergency notification purposes. They recommend the installation of a visual alarm system to supplement the page/party phone system. The purpose of this analysis is to identify data communication highway design approaches and to provide justification for the selected or recommended alternatives for data communication of the subsurface visual alarm system. This analysis is being prepared to document a basis for the design selection of the data communication method. It briefly describes existing data and voice communication and monitoring systems within the ESF, and examines how these may be revised or adapted to support the needed data highway of the subsurface visual alarm system. The existing PLC communication system installed in the subsurface provides data communication for the alcove No. 5 ventilation fans, south portal ventilation fans, bulkhead doors, and the generator monitoring system. It is given that the data communication of the subsurface visual alarm system will be a digital system, and that it is most feasible to take advantage of existing systems and equipment rather than consider an entirely new data communication system design and installation. The scope and primary objectives of this analysis are to: (1) Briefly review and describe the existing available data communication highways and systems within the ESF. (2) Examine the technical characteristics of existing systems; disqualifying a design alternative on technical grounds early is paramount in minimizing the number and depth of system reviews. (3) Apply general engineering design practices or criteria such as relative cost, and degree

  16. PLACE OF PRODUCTION COSTS SYSTEM ANALYSIS IN SYSTEM ANALYSIS

    Directory of Open Access Journals (Sweden)

    Mariia CHEREDNYCHENKO

    2016-12-01

    Current economic conditions require the development and implementation of an adequate production cost management system, which would ensure steady growth of profit and production volumes under high competition and constantly increasing input prices and tariffs. This management system must be based on an integrated production costs system analysis (PCSA), which would provide all operating cost management subsystems with the information necessary to design and make better management decisions. A systems approach to analysis provides more opportunities for knowledge, creating the conditions for an integral understanding of the object as a mechanism consisting of elements with intersystem connections, each of which has its own defined and limited objectives and its own relationship with the environment.

  17. Provision of a draft version for standard classification structure for information of radiation technologies through analyzing their information and derivation of its applicable requirements to the information system

    International Nuclear Information System (INIS)

    Jang, Sol Ah; Kim, Joo Yeon; Yoo, Ji Yup; Shin, Woo Ho; Park, Tai Jin; Song, Myung Jae

    2015-01-01

    Radiation technology is used to develop new products or processes by applying radiation, or to create new functions in industrial, research and medical fields, and its application is increasing steadily. To secure competitiveness in this advanced technology, new added value must be created for information consumers by providing an efficient information support system which, as the infrastructure for research and development, contributes to the rapid and systematic collection, analysis and use of information, in addition to direct research and development. Provision of a management structure for information resources is especially crucial for efficient operation of the information support system in radiation technology, so a standard classification structure for information must be developed first as the system is constructed. The standard classification structure was analyzed by reviewing the definition of information resources in radiation technology and the classification structures of similar systems operated by institutes in radiation and other scientific fields. A draft version of the standard classification structure was then provided, with 7 large, 25 medium and 71 small classifications, respectively. The standard classification structure in radiation technology will be developed in 2015 through review of this draft version and experts' opinions. Finally, the developed classification structure will be applied to the information support system, considering the plan for constructing the system and its database and the requirements for designing the system. Furthermore, this structure will be designed into the information search system by adapting it to the individual needs of information consumers

  18. Provision of a draft version for standard classification structure for information of radiation technologies through analyzing their information and derivation of its applicable requirements to the information system

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Sol Ah; Kim, Joo Yeon; Yoo, Ji Yup; Shin, Woo Ho; Park, Tai Jin; Song, Myung Jae [Korean Association for Radiation Application, Seoul (Korea, Republic of)

    2015-02-15

Radiation technology applies radiation to develop new products and processes, or to create new functions, in industry, research and medicine, and its applications are growing steadily. Securing competitiveness in this advanced technology requires not only direct research and development but also an efficient information support system, the infrastructure for research and development, that enables information consumers to create new added value through rapid, structured collection, analysis and use of information. A management structure for information resources is especially crucial for efficient operation of such a system in radiation technology, so a standard classification structure for information must be developed before the information support system is constructed. A draft standard classification structure was derived by reviewing the definition of information resources in radiation technology and the classification structures of similar systems operated by institutes in radiation and other scientific fields; it comprises 7 large, 25 medium and 71 small classifications. The standard classification structure for radiation technology will be finalized in 2015 after review of this draft and of expert opinion. The final classification structure will then be applied to the information support system, taking into account the plan for constructing the system and its database and the requirements for system design, and will underpin an information search function tailored to the individual needs of information consumers.

  19. The sequence coding and search system: An approach for constructing and analyzing event sequences at commercial nuclear power plants

    International Nuclear Information System (INIS)

    Mays, G.T.

    1989-04-01

The US Nuclear Regulatory Commission (NRC) has recognized the importance of the collection, assessment, and feedback of operating experience data from commercial nuclear power plants and has centralized these activities in the Office for Analysis and Evaluation of Operational Data (AEOD). Such data are essential for performing safety and reliability analyses, especially analyses of trends and patterns to identify undesirable changes in plant performance at the earliest opportunity, so that corrective measures can be implemented to preclude the occurrence of more serious events. One of NRC's principal tools for collecting and evaluating operating experience data is the Sequence Coding and Search System (SCSS). The SCSS consists of a methodology for structuring event sequences and the requisite computer system to store and search the data. The source information for SCSS is the Licensee Event Report (LER), which is a legally required document. This paper describes the objective of SCSS, the information it contains, and the format and approach for constructing SCSS event sequences. Examples are presented demonstrating the use of SCSS to support the analysis of LER data. The SCSS contains over 30,000 LERs describing events from 1980 through the present. Insights gained from working with a complex data system from the initial developmental stage to the point of a mature operating system are highlighted
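The coding-and-search idea can be illustrated with a toy sketch: each event is coded as an ordered sequence of (component, failure mode) steps, and a query matches any event whose sequence contains the query steps in order. The coding scheme, identifiers, and matching rule here are hypothetical, not the actual SCSS schema.

```python
def contains_subsequence(sequence, query):
    """Return True if all query steps appear in the sequence, in order."""
    it = iter(sequence)
    return all(step in it for step in query)

# Coded event sequences keyed by a (made-up) report identifier.
events = {
    "LER-1980-001": [("pump", "fails-to-start"), ("valve", "fails-to-open"),
                     ("reactor", "trip")],
    "LER-1981-017": [("valve", "fails-to-open"), ("reactor", "trip")],
    "LER-1982-042": [("pump", "fails-to-start"), ("diesel", "fails-to-load")],
}

def search(events, query):
    """Return identifiers of events whose coded sequence matches the query."""
    return sorted(k for k, seq in events.items()
                  if contains_subsequence(seq, query))

matches = search(events, [("pump", "fails-to-start"), ("reactor", "trip")])
```

The ordered-subsequence match is what lets an analyst ask for a failure pattern (e.g. pump failure eventually followed by a trip) rather than an exact event description.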

  20. ADVANCED POWER SYSTEMS ANALYSIS TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    Robert R. Jensen; Steven A. Benson; Jason D. Laumb

    2001-08-31

The use of Energy and Environmental Research Center (EERC) modeling tools and improved analytical methods has provided key information in optimizing advanced power system design and operating conditions for efficiency, producing minimal air pollutant emissions and utilizing a wide range of fossil fuel properties. This project was divided into four tasks: demonstration of the ash transformation model, upgrading of spreadsheet tools, enhancement of analytical capabilities using scanning electron microscopy (SEM), and improvement of the slag viscosity model. The ash transformation model, Atran, was used to predict the size and composition of ash particles, which have a major impact on the fate of ash in the combustion system. To optimize Atran, key factors such as mineral fragmentation and coalescence and the heterogeneous and homogeneous interaction of the organically associated elements must be considered as they apply to the operating conditions. The resulting model's ash composition compares favorably to measured results. Enhancements to existing EERC spreadsheet applications included upgrading interactive spreadsheets that calculate the thermodynamic properties of fuels, reactants, products, and steam, with Newton-Raphson algorithms to perform calculations of mass, energy, and elemental balances, isentropic expansion of steam, and gasifier equilibrium conditions. Derivative calculations can be performed to estimate fuel heating values, adiabatic flame temperatures, emission factors, comparative fuel costs, and per-unit carbon taxes from fuel analyses. Using state-of-the-art computer-controlled scanning electron microscopes and associated microanalysis systems, a method was developed to determine viscosity through grey-scale binning of the SEM image. The image analysis of a backscattered electron image can be subdivided into various grey-scale ranges that can be analyzed separately. Since the grey scale's intensity
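The Newton-Raphson energy-balance calculations mentioned above can be sketched with a deliberately simplified example: find the adiabatic flame temperature T at which the sensible heat absorbed by the products balances the heat released, with a temperature-dependent heat capacity cp(T) = a + b·T. All numerical values are made up for illustration and are not from the EERC spreadsheets.

```python
A, B = 1.0, 2.0e-4   # kJ/(kg*K) and kJ/(kg*K^2), assumed product heat capacity
Q = 2000.0           # kJ/kg, assumed heat released per kg of products
T0 = 298.0           # K, initial temperature

def f(T):
    # Integral of cp from T0 to T, minus heat released; root is T_adiabatic.
    return A * (T - T0) + 0.5 * B * (T**2 - T0**2) - Q

def fprime(T):
    return A + B * T     # derivative of the balance is just cp(T)

def newton(f, fprime, x, tol=1e-9, max_iter=50):
    """Generic Newton-Raphson root finder."""
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("did not converge")

T_ad = newton(f, fprime, x=1000.0)
```

The same solver pattern applies to the other balances named in the abstract (mass, elemental, isentropic expansion): each reduces to finding the root of a residual function.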

  1. Thermal Analysis of Pure Uranium Metal, UMo and UMoSi Alloys Using a Differential Thermal Analyzer

    International Nuclear Information System (INIS)

    Yanlinastuti; Sutri Indaryati; Rahmiati

    2010-01-01

Thermal analyses of pure uranium metal and of U-7%Mo and U-7%Mo-1%Si alloys have been performed using a Differential Thermal Analyzer (DTA). The experiments were conducted to measure the thermal stability, thermochemical properties at elevated temperature, and enthalpy of the specimens. The results show that uranium metal transforms from the α to the β phase at 667.16 °C with an enthalpy of 2.3034 cal/g, transforms from the β to the γ phase at 773.05 °C with an enthalpy of 2.8725 cal/g, and begins melting at 1125.26 °C with an enthalpy of 2.1316 cal/g. The U-7%Mo alloy is thermally stable up to 650 °C and shows a thermal change at 673.75 °C, indicated by the formation of an endothermic peak with an enthalpy of 0.0257 cal/g. The U-7%Mo-1%Si alloy is thermally stable up to 550 °C and shows a thermal change at 574.18 °C, indicated by the formation of an endothermic peak with an enthalpy of 0.613 cal/g. All three specimens thus show good thermal stability at temperatures up to 550 °C. (author)
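The transition enthalpies reported above are obtained, in essence, by integrating the baseline-subtracted heat-flow peak of the DTA trace over time and normalizing by sample mass. A minimal sketch, using synthetic data and taking the instrument calibration factor as 1 for simplicity:

```python
import math

def peak_enthalpy(times, heat_flow, baseline, mass):
    """Trapezoidal integral of (signal - baseline) over time, per unit mass."""
    area = 0.0
    for i in range(1, len(times)):
        dh0 = heat_flow[i - 1] - baseline[i - 1]
        dh1 = heat_flow[i] - baseline[i]
        area += 0.5 * (dh0 + dh1) * (times[i] - times[i - 1])
    return area / mass

# Synthetic Gaussian endotherm over a 60 s window sampled once per second.
times = list(range(61))
baseline = [0.1] * 61
heat_flow = [0.1 + 0.5 * math.exp(-((t - 30) / 8.0) ** 2) for t in times]

dH = peak_enthalpy(times, heat_flow, baseline, mass=10.0)
```

In a real measurement the baseline is fitted to the trace on either side of the peak, and a calibration constant converts the signal area into cal/g or J/g.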

  2. Control programs of multichannel pulse height analyzer with CAMAC system using FACOM U-200 mini-computer

    International Nuclear Information System (INIS)

    Yamagishi, Kojiro

    1978-02-01

The 4096-channel Pulse Height Analyzer (PHA) assembled with CAMAC plug-in units has been developed at JAERI. The PHA consists of an ADC unit, a CRT-display unit, and CAMAC plug-in units: a memory controller, an MCA timer, a 4K-word RAM memory and a CRT driver. The system is connected on-line to a FACOM U-200 Mini-Computer through the CAMAC interface unit (crate controller). The software for on-line data acquisition has been developed: four utility programs written in FORTRAN and two program packages written in the assembler language FASP, namely the CAMAC Program Package and the Basic Input/Output Program Package. The CAMAC Program Package has 18 subroutine programs for controlling the CAMAC plug-in units from the FACOM U-200 Mini-Computer, and the Basic Input/Output Program Package has 26 subroutine programs for input/output of data to/from a typewriter, keyboard, cassette magnetic tape and open-reel magnetic tape. These subroutines are all FORTRAN-callable. The PHA with CAMAC system is first outlined, and then the usage of the four utility programs, the CAMAC Program Package and the Basic Input/Output Program Package is described in detail. (auth.)

  3. ANALYZING AND MODELING THE ROLE OF HUMAN RESOURCE INFORMATION SYSTEM ON HUMAN RESOURCE PLANNING AT HIGHER EDUCATION INSTITUTION IN INDONESIA

    Directory of Open Access Journals (Sweden)

    Susilo H.

    2017-08-01

The challenge of Human Resource Management in Higher Education Institutions is how to plan, organize, and assess the performance of human resources so that they contribute as much as possible to the achievement of high-quality education objectives. To answer this challenge, a Human Resource Information System (HRIS) is needed to help leadership, at both the university and faculty level, plan for staffing needs and make the best use of human resources. This study aims to analyze the role of the HRIS in human resource planning, especially in the stages of needs planning, recruitment and selection, human resource development, promotion, and assessment of work and remuneration. The output of the research is a concept for HRIS-based human resource planning for Higher Education Institutions. The research method was designed using a qualitative descriptive approach. Data collection was done through observation and interviews, with the research located at the University of Brawijaya. The results show that the existing HRIS has not played an optimal role because the function of the system is still limited to gathering data and submitting employment reports; it is not yet able to serve as a decision support system for leaders in HR planning.

  4. Performance of the Zeeman analyzer system of the McDonald Observatory 2.7 meter telescope

    Science.gov (United States)

    Vogt, S. S.; Tull, R. G.; Kelton, P. W.

    1980-01-01

The paper describes a multichannel photoelectric Zeeman analyzer at the coude spectrograph of the McDonald 2.7 m reflector. A comparison of Lick and McDonald observations of HD 153882 reveals no significant difference in the slopes or zero points of the two magnetic field curves, indicating that the systematic scale difference of 30-40% is probably instrumental in origin. Observations of the magnetic variable Beta Coronae Borealis revealed a more nearly sinusoidal magnetic curve with less internal scatter than the photographically determined field measures of the Lick and Mauna Kea Zeeman systems. Investigation of periodicity in the secularly varying magnetic minima of Beta Coronae Borealis did not yield evidence of previously noted periodicities other than that expected from the time structure of the data sampling.

  5. Urinary amino acid analysis: a comparison of iTRAQ-LC-MS/MS, GC-MS, and amino acid analyzer.

    Science.gov (United States)

    Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J

    2009-07-01

Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27±5.22, 21.18±10.94, and 18.34±14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39±5.35, 6.23±3.84, and 35.37±29.42. Both GC-MS and iTRAQ-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines.
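The percent technical error (%TE) statistic used above can be sketched with one common definition (Dahlberg's formula): the technical error over N duplicate pairs with differences d_i is sqrt(Σd_i² / 2N), expressed as a percent of the overall mean. The abstract does not state which variant it uses, so this is an illustrative assumption, with made-up concentrations.

```python
def percent_te(pairs):
    """%TE for duplicate pairs: 100 * sqrt(sum(d^2) / 2N) / overall mean."""
    n = len(pairs)
    ss = sum((a - b) ** 2 for a, b in pairs)
    te = (ss / (2 * n)) ** 0.5
    mean = sum(a + b for a, b in pairs) / (2 * n)
    return 100.0 * te / mean

# Duplicate measurements of one amino acid (invented numbers).
pairs = [(10.0, 10.4), (8.0, 7.8), (12.1, 12.5), (9.9, 10.1)]
te_pct = percent_te(pairs)
```

A lower %TE means better agreement between replicate measurements, which is how the three platforms are ranked in the abstract.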

  6. Urinary Amino Acid Analysis: A Comparison of iTRAQ®-LC-MS/MS, GC-MS, and Amino Acid Analyzer

    Science.gov (United States)

    Kaspar, Hannelore; Dettmer, Katja; Chan, Queenie; Daniels, Scott; Nimkar, Subodh; Daviglus, Martha L.; Stamler, Jeremiah; Elliott, Paul; Oefner, Peter J.

    2009-01-01

    Urinary amino acid analysis is typically done by cation-exchange chromatography followed by post-column derivatization with ninhydrin and UV detection. This method lacks throughput and specificity. Two recently introduced stable isotope ratio mass spectrometric methods promise to overcome those shortcomings. Using two blinded sets of urine replicates and a certified amino acid standard, we compared the precision and accuracy of gas chromatography/mass spectrometry (GC-MS) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) of propyl chloroformate and iTRAQ® derivatized amino acids, respectively, to conventional amino acid analysis. The GC-MS method builds on the direct derivatization of amino acids in diluted urine with propyl chloroformate, GC separation and mass spectrometric quantification of derivatives using stable isotope labeled standards. The LC-MS/MS method requires prior urinary protein precipitation followed by labeling of urinary and standard amino acids with iTRAQ® tags containing different cleavable reporter ions distinguishable by MS/MS fragmentation. Means and standard deviations of percent technical error (%TE) computed for 20 amino acids determined by amino acid analyzer, GC-MS, and iTRAQ®-LC-MS/MS analyses of 33 duplicate and triplicate urine specimens were 7.27±5.22, 21.18±10.94, and 18.34±14.67, respectively. Corresponding values for 13 amino acids determined in a second batch of 144 urine specimens measured in duplicate or triplicate were 8.39±5.35, 6.23±3.84, and 35.37±29.42. Both GC-MS and iTRAQ®-LC-MS/MS are suited for high-throughput amino acid analysis, with the former offering at present higher reproducibility and completely automated sample pretreatment, while the latter covers more amino acids and related amines. PMID:19481989

  7. System safety engineering analysis handbook

    Science.gov (United States)

    Ijams, T. E.

    1972-01-01

The basic requirements and guidelines for the preparation of System Safety Engineering Analysis are presented. The philosophy of System Safety and the various analytic methods available to the engineering profession are discussed. A textbook description of each of the methods is included.

  8. Static Analysis for Systems Biology

    DEFF Research Database (Denmark)

    Nielson, Flemming; Nielson, Hanne Riis; Rosa, D. Schuch da

    2004-01-01

This paper shows how static analysis techniques can help understanding biological systems. Based on a simple example we illustrate the outcome of performing three different analyses extracting information of increasing precision. We conclude by reporting on the potential impact and exploitation of these techniques in systems biology.

  9. Reachability Analysis of Probabilistic Systems

    DEFF Research Database (Denmark)

    D'Argenio, P. R.; Jeanett, B.; Jensen, Henrik Ejersbo

    2001-01-01

than the original model, and may safely refute or accept the required property. Otherwise, the abstraction is refined and the process repeated. As the numerical analysis involved in settling the validity of the property is more costly than the refinement process, the method profits from applying such numerical analysis on smaller state spaces. The method is significantly enhanced by a number of novel strategies: a strategy for reducing the size of the numerical problems to be analyzed by identification of so-called essential states, and heuristic strategies for guiding the refinement process.
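The numerical analysis the abstract refers to can be illustrated on a toy Markov chain: the probability of eventually reaching a goal state satisfies p(goal) = 1 and p(s) = Σ P(s, s')·p(s') elsewhere, solved here by fixed-point iteration. The states, transition probabilities, and solver are all illustrative, not the paper's method.

```python
def reach_probability(transitions, goal, states, iters=1000):
    """Fixed-point iteration for the probability of eventually reaching goal."""
    p = {s: (1.0 if s == goal else 0.0) for s in states}
    for _ in range(iters):
        for s in states:
            if s != goal:
                p[s] = sum(prob * p[t] for t, prob in transitions[s].items())
    return p

states = ["s0", "s1", "goal", "sink"]
transitions = {
    "s0": {"s1": 0.5, "sink": 0.5},
    "s1": {"goal": 0.8, "s0": 0.2},
    "sink": {"sink": 1.0},   # absorbing failure state
}

p = reach_probability(transitions, "goal", states)
```

Solving the two linear equations by hand gives p(s0) = 4/9 and p(s1) = 8/9; the cost of this numerical step grows with the number of states, which is why the paper's abstraction-refinement loop tries to keep the state space small.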

  10. Artificial Neural Network Analysis System

    Science.gov (United States)

    2001-02-27

[Only report-documentation metadata survives for this record.] Title: Artificial Neural Network Analysis System. Author: Powell, Bruce C. Company: Atlantic... Contract No. DASG60-00-M-0201; purchase request: Foot in the Door-01. Report date: 27-02-2001; period covered: 28-10-2000 to 27-02-2001.

  11. Evaluation of a lower-powered analyzer and sampling system for eddy-covariance measurements of nitrous oxide fluxes

    Directory of Open Access Journals (Sweden)

    S. E. Brown

    2018-03-01

Nitrous oxide (N2O) fluxes measured using the eddy-covariance method capture the spatial and temporal heterogeneity of N2O emissions. Most closed-path trace-gas analyzers for eddy-covariance measurements have large-volume, multi-pass absorption cells that necessitate high flow rates for ample frequency response, thus requiring high-power sample pumps. Other sampling system components, including rain caps, filters, dryers, and tubing, can also degrade system frequency response. This field trial tested the performance of a closed-path eddy-covariance system for N2O flux measurements with improvements to use less power while maintaining the frequency response. The new system consists of a thermoelectrically cooled tunable diode laser absorption spectrometer configured to measure both N2O and carbon dioxide (CO2). The system features a relatively small, single-pass sample cell (200 mL) that provides good frequency response with a lower-powered pump (∼250 W). A new filterless intake removes particulates from the sample air stream with no additional mixing volume that could degrade frequency response. A single-tube dryer removes water vapour from the sample to avoid the need for density or spectroscopic corrections, while maintaining frequency response. This eddy-covariance system was collocated with a previous tunable diode laser absorption spectrometer model to compare N2O and CO2 flux measurements for two full growing seasons (May 2015 to October 2016) in a fertilized cornfield in Southern Ontario, Canada. Both spectrometers were placed outdoors at the base of the sampling tower, demonstrating ruggedness for a range of environmental conditions (minimum to maximum daily temperature range: −26.1 to 31.6 °C). The new system rarely required maintenance. An in situ frequency-response test demonstrated that the cutoff frequency of the new system was better than the old system (3.5 Hz compared to 2.30 Hz) and similar to that of a closed
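The eddy-covariance flux itself is the covariance of the fluctuations of vertical wind speed w and gas concentration c about their averaging-period means. A minimal sketch with synthetic numbers; real processing adds despiking, coordinate rotation, and the frequency-response corrections the abstract discusses.

```python
def covariance_flux(w, c):
    """Covariance of w and c fluctuations: mean of (w - w_bar)(c - c_bar)."""
    n = len(w)
    w_mean = sum(w) / n
    c_mean = sum(c) / n
    return sum((wi - w_mean) * (ci - c_mean) for wi, ci in zip(w, c)) / n

# Synthetic high-frequency samples (invented values).
w = [0.1, -0.2, 0.3, 0.0, -0.1, 0.2, -0.3, 0.0]               # m/s
c = [402.0, 398.5, 403.1, 400.0, 399.2, 401.8, 397.9, 400.3]  # e.g. nmol/mol

flux = covariance_flux(w, c)   # positive = net upward transport (emission)
```

Here updrafts coincide with above-average concentrations, so the covariance is positive, i.e. the surface is a net source of the gas.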

  12. Marketing analysis support system; Marketing bunseki shien system

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-01-10

Fuji Electric Co., Ltd., in collaboration with Shitashion Japan and Arthur Andersen Ltd., developed a 'marketing analysis support system' which integrally analyzes the evaluation factors, across various dimensions, that explain consumers' purchasing behavior, and which supports efficient product development and demand prediction. The system breaks products down into individual evaluation factors from psychological and physical viewpoints and carries out various kinds of multivariate analysis, making it easy to understand visually, for example, the relative positions of evaluation factors or of products, and the position of a product within the whole. More precise marketing analysis and prediction also become possible by visually grasping blank areas of the product space, the extent of competition, the distribution of products, the composition of product series, etc. (translated by NEDO)

  13. Space elevator systems level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Laubscher, B. E. (Bryan E.)

    2004-01-01

The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development; this in turn implies a high level of risk for the SE. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  14. Guidelines for system modeling: fault tree analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyst with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines and related content to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions due to mechanical failures of components, common cause failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising fault trees. Accordingly, the guidelines will be capable of guiding FTA to the level of capability category II of the ASME PRA standard.
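The AND/OR gate logic described above can be sketched with a toy top-event probability evaluation, assuming independent basic events. This is a simplification: real PSA tools work with minimal cut sets and handle common-cause failures and repeated events, which naive gate-by-gate multiplication does not.

```python
def evaluate(node, probs):
    """Recursively evaluate a fault tree node given basic-event probabilities."""
    kind = node[0]
    if kind == "basic":
        return probs[node[1]]
    children = [evaluate(c, probs) for c in node[2]]
    if kind == "and":                      # all inputs must fail
        p = 1.0
        for c in children:
            p *= c
        return p
    if kind == "or":                       # any single input failing suffices
        q = 1.0
        for c in children:
            q *= (1.0 - c)
        return 1.0 - q
    raise ValueError(f"unknown gate kind: {kind}")

# TOP = (pump fails) OR (both redundant valves fail); numbers are invented.
tree = ("or", "TOP", [
    ("basic", "pump"),
    ("and", "valves", [("basic", "valve_a"), ("basic", "valve_b")]),
])
probs = {"pump": 1e-3, "valve_a": 1e-2, "valve_b": 1e-2}
p_top = evaluate(tree, probs)
```

The redundant valve pair contributes only 1e-4 to the top event, so the single pump dominates system unavailability, exactly the kind of insight FTA is meant to surface.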

  15. Guidelines for system modeling: fault tree analysis

    International Nuclear Information System (INIS)

    Lee, Yoon Hwan; Yang, Joon Eon; Kang, Dae Il; Hwang, Mee Jeong

    2004-07-01

This document, the guidelines for system modeling related to Fault Tree Analysis (FTA), is intended to provide the analyst with guidelines for constructing fault trees at the level of capability category II of the ASME PRA standard. In particular, it provides the essential and basic guidelines and related content to be used in support of revising the Ulchin 3 and 4 PSA model for the risk monitor within capability category II of the ASME PRA standard. Normally the main objective of system analysis is to assess the reliability of the systems modeled by Event Tree Analysis (ETA). A variety of analytical techniques can be used for system analysis; however, the FTA method is used in this procedures guide. FTA is the method used for representing the failure logic of plant systems deductively using AND, OR or NOT gates. The fault tree should reflect all possible failure modes that may contribute to system unavailability. This should include contributions due to mechanical failures of components, common cause failures (CCFs), human errors, and outages for testing and maintenance. This document identifies and describes the definitions and general procedures of FTA and the essential and basic guidelines for revising fault trees. Accordingly, the guidelines will be capable of guiding FTA to the level of capability category II of the ASME PRA standard

  16. Licensing Support System: Preliminary data scope analysis

    International Nuclear Information System (INIS)

    1989-01-01

    The purpose of this analysis is to determine the content and scope of the Licensing Support System (LSS) data base. Both user needs and currently available data bases that, at least in part, address those needs have been analyzed. This analysis, together with the Preliminary Needs Analysis (DOE, 1988d) is a first effort under the LSS Design and Implementation Contract toward developing a sound requirements foundation for subsequent design work. These reports are preliminary. Further refinements must be made before requirements can be specified in sufficient detail to provide a basis for suitably specific system specifications. This document provides a baseline for what is known at this time. Additional analyses, currently being conducted, will provide more precise information on the content and scope of the LSS data base. 23 refs., 4 figs., 8 tabs

  17. Systems Biology Approach and Mathematical Modeling for Analyzing Phase-Space Switch During Epithelial-Mesenchymal Transition.

    Science.gov (United States)

    Simeoni, Chiara; Dinicola, Simona; Cucina, Alessandra; Mascia, Corrado; Bizzarri, Mariano

    2018-01-01

    In this report, we aim at presenting a viable strategy for the study of Epithelial-Mesenchymal Transition (EMT) and its opposite Mesenchymal-Epithelial Transition (MET) by means of a Systems Biology approach combined with a suitable Mathematical Modeling analysis. Precisely, it is shown how the presence of a metastable state, that is identified at a mesoscopic level of description, is crucial for making possible the appearance of a phase transition mechanism in the framework of fast-slow dynamics for Ordinary Differential Equations (ODEs).

  18. Using the Terrestrial Observation and Prediction System (TOPS) to Analyze Impacts of Climate Change on Ecosystems within Northern California Climate Regions

    Science.gov (United States)

    Pitts, K.; Little, M.; Loewenstein, M.; Iraci, L. T.; Milesi, C.; Schmidt, C.; Skiles, J. W.

    2011-12-01

    The projected impacts of climate change on Northern California ecosystems using model outputs from the Terrestrial Observation and Prediction System (TOPS) for the period 1950-2099 based on 1km downscaled climate data from the Geophysical Fluid Dynamics Laboratory (GFDL) model are analyzed in this study. The impacts are analyzed for the Special Report Emissions Scenarios (SRES) A1B and A2, both maintaining present levels of urbanization constant and under projected urban expansion. The analysis is in support of the Climate Adaptation Science Investigation at NASA Ames Research Center. A statistical analysis is completed for time series of temperature, precipitation, gross primary productivity (GPP), evapotranspiration, soil runoff, and vapor pressure deficit. Trends produced from this analysis show that increases in maximum and minimum temperatures lead to declines in peak GPP, length of growing seasons, and overall declines in runoff within the watershed. For Northern California, GPP is projected under the A2 scenario to decrease by 18-25% by the 2090 decade as compared to the 2000 decade. These trends indicate a higher risk to crop production and other ecosystem services, as conditions would be less hospitable to vegetation growth. The increase in dried out vegetation would then lead to a higher risk of wildfire and mudslides in the mountainous regions.

  19. Sparing analysis for FGD systems

    International Nuclear Information System (INIS)

    Dene, C.E.; Weiss, J.; Twombly, M.A.; Witt, J.

    1992-01-01

With the passage of federal clean air legislation, utilities will be evaluating the capability of various flue gas desulfurization (FGD) system design configurations and operating scenarios to meet sulfur dioxide (SO2) removal goals. The primary goal in reviewing these alternatives will be to optimize SO2 removal capability in relation to power production costs. The Electric Power Research Institute (EPRI) and its contractor, ARINC Research Corporation, have developed an automated FGD Analysis System that can evaluate competing FGD design alternatives in terms of their SO2 removal capability and operating costs. The FGD Analysis System can be used to evaluate different design configurations for new systems or to calculate the effect of changes in component reliability for existing FGD systems. The system is based on the EPRI UNIRAM methodology and evaluates the impact of alternative FGD component configurations on the expected unit emission rates. The user interactively enters FGD design data, unit SO2 generation-level data, and FGD chemical additive-level data for the design configuration to be evaluated. The system then calculates expected SO2 removal capability and operating cost data for operation of the design configuration over a user-specified time period. This paper provides a brief description of the FGD Analysis System and presents sample results for three typical design configurations with different redundancy levels
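The kind of redundancy evaluation a sparing analysis performs can be sketched as follows: expected SO2 removal capability for a unit with N scrubber modules, each independently available with some probability and each handling a fixed fraction of full load. The numbers and the binomial model are illustrative only, not the UNIRAM methodology itself.

```python
import math

def expected_capability(n_modules, availability, per_module_fraction):
    """Expected fraction of full SO2 removal capability, capped at 1.0."""
    expected = 0.0
    for k in range(n_modules + 1):
        # Binomial probability that exactly k of the n modules are available.
        p_k = (math.comb(n_modules, k)
               * availability**k * (1 - availability)**(n_modules - k))
        expected += p_k * min(1.0, k * per_module_fraction)
    return expected

# Four modules, each 90% available, each handling one third of full load:
# three modules suffice, the fourth is an installed spare.
cap = expected_capability(4, 0.90, 1.0 / 3.0)
```

Comparing `cap` for three modules (no spare) against four shows the expected-removal gain that the extra redundancy buys, which can then be weighed against its capital and operating cost.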

  20. Method for optical 15N analysis of small amounts of nitrogen gas released from an automatic nitrogen analyzer

    International Nuclear Information System (INIS)

    Arima, Yasuhiro

    1981-01-01

A method of optical 15N analysis is proposed for application to small amounts of nitrogen gas released from an automatic nitrogen analyzer (model ANA-1300, Carlo Erba, Milano) subjected to certain modifications. The ANA-1300 was combined with a vacuum line fitted with a molecular sieve 13X column. The nitrogen gas released from the ANA-1300 was introduced, with a helium carrier gas, into the molecular sieve column, which was pre-evacuated at 10^-4 Torr and cooled with liquid nitrogen. After removal of the helium by evacuation, the nitrogen gas fixed on the molecular sieve was released by warming the column and then sealed into pre-evacuated Pyrex glass tubes at 4.5-5.0 Torr. In the preparation of discharge tubes, contamination with unlabelled nitrogen occurred from the standard-grade helium carrier gas; the relative lowering of the 15N value caused by it was estimated to be less than 1% when over 700 μg of nitrogen was charged on the ANA-1300, and about 3.5% when 200 μg of nitrogen was charged. However, the effect of the contamination could be corrected for by knowing the amount of contaminant nitrogen. In the analysis of plant materials by the proposed method, the coefficient of variation was less than 2%, and no significant difference was observed between results given by the present method and by the ordinary method in which samples are pyrolyzed directly in the discharge tubes by the Dumas method. The present method revealed about 1.5 μg of cross-contaminated nitrogen and is applicable to more than 200 μg of sample nitrogen. (author)
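The contamination correction described above amounts to a simple isotope mass balance: the measured 15N abundance is a mixture of sample nitrogen and a known amount of unlabelled contaminant, so the sample's true abundance can be back-calculated. The following sketch uses illustrative numbers; the abstract does not give the actual correction equation.

```python
NATURAL_15N = 0.366   # atom% 15N of unlabelled (natural-abundance) nitrogen

def correct_for_contamination(measured_atom_pct, n_total_ug, n_contaminant_ug):
    """Solve n_total*x_meas = n_sample*x_sample + n_cont*x_nat for x_sample."""
    n_sample = n_total_ug - n_contaminant_ug
    return (n_total_ug * measured_atom_pct
            - n_contaminant_ug * NATURAL_15N) / n_sample

# 700 ug total N measured at 5.00 atom% 15N, of which 10 ug is contaminant:
x_true = correct_for_contamination(5.00, 700.0, 10.0)
```

For an enriched sample the contaminant dilutes the signal, so the corrected abundance comes out slightly above the measured value, and the bias grows as the sample amount shrinks, consistent with the 1% vs 3.5% figures quoted above.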

  1. SASSYS LMFBR systems analysis code

    International Nuclear Information System (INIS)

    Dunn, F.E.; Prohammer, F.G.

    1982-01-01

    The SASSYS code provides detailed steady-state and transient thermal-hydraulic analyses of the reactor core, inlet and outlet coolant plenums, primary and intermediate heat-removal systems, steam generators, and emergency shut-down heat removal systems in liquid-metal-cooled fast-breeder reactors (LMFBRs). The main purpose of the code is to analyze the consequences of failures in the shut-down heat-removal system and to determine whether this system can perform its mission adequately even with some of its components inoperable. The code is not plant-specific. It is intended for use with any LMFBR, using either a loop or a pool design, a once-through steam generator or an evaporator-superheater combination, and either a homogeneous core or a heterogeneous core with internal-blanket assemblies.

  2. Weld analysis and control system

    Science.gov (United States)

    Kennedy, Larry Z. (Inventor); Rodgers, Michael H. (Inventor); Powell, Bradley W. (Inventor); Burroughs, Ivan A. (Inventor); Goode, K. Wayne (Inventor)

    1994-01-01

    The invention is a Weld Analysis and Control System developed for active weld system control through real time weld data acquisition. Closed-loop control is based on analysis of weld system parameters and weld geometry. The system is adapted for use with automated welding apparatus having a weld controller which is capable of active electronic control of all aspects of a welding operation. Enhanced graphics and data displays are provided for post-weld analysis. The system provides parameter acquisition, including seam location which is acquired for active torch cross-seam positioning. Torch stand-off is also monitored for control. Weld bead and parent surface geometrical parameters are acquired as an indication of weld quality. These parameters include mismatch, peaking, undercut, underfill, crown height, weld width, puddle diameter, and other measurable information about the weld puddle regions, such as puddle symmetry, etc. These parameters provide a basis for active control as well as post-weld quality analysis and verification. Weld system parameters, such as voltage, current and wire feed rate, are also monitored and archived for correlation with quality parameters.

  3. The Orbitrap mass analyzer as a space instrument for the understanding of prebiotic chemistry in the Solar System

    Science.gov (United States)

    Vuitton, Véronique; Briois, Christelle; Makarov, Alexander

    Over the past decade, it has become apparent that organic molecules are widespread in our Solar System and beyond. A better understanding of the prebiotic chemistry leading to their formation is a primary objective of many ongoing space missions. Cassini-Huygens revealed the existence of very large molecular structures in Titan's atmosphere as well as on its surface, in the form of dune deposits, but their exact nature remains elusive. One key science goal of the Mars Science Laboratory Curiosity rover is to assess the presence of organics on the red planet. Rosetta will characterize the elemental and isotopic composition of the gas and dust ejected from comet Churyumov-Gerasimenko, and amino acids have been detected in meteorites. This search for complex organics relies heavily on mass spectrometry, which has the remarkable ability to analyze and quantify species from almost any type of sample (provided that the appropriate sampling and ionizing method is used). Because of the harsh constraints of the space environment, the mass resolution of the spectrometers onboard current space probes is quite limited compared to laboratory instruments, leading to significant limitations in the scientific return of the data collected. Therefore, future in situ Solar System exploration missions would benefit significantly from instruments relying on High Resolution Mass Spectrometry (HRMS). Since 2009, five French laboratories (LPC2E, IPAG, LATMOS, LISA, CSNSM) involved in the chemical investigation of Solar System bodies have formed a consortium to develop HRMS for future space exploration, based on the use of Orbitrap technology (C. Briois et al., 2014, to be submitted). The work is undertaken in close collaboration with the Thermo Fisher Scientific company, which commercializes Orbitrap-based laboratory instruments. The Orbitrap is an electrostatic mass analyzer; it is compact and lightweight, and can reach good sensitivity and dynamic range. 
A prototype is under development at

  4. Simulation of Safety and Transient Analysis of a Pressurized Water Reactor using the Personal Computer Transient Analyzer

    Directory of Open Access Journals (Sweden)

    Sunday J. IBRAHIM

    2013-06-01

    Full Text Available Safety and transient analyses of a pressurised water reactor (PWR) using the Personal Computer Transient Analyzer (PCTRAN) simulator were carried out. The analyses presented a synergistic integration of a numerical model: a full-scope, high-fidelity simulation system that adopts a point-reactor neutron kinetics model and movable-boundary two-phase fluid models to simplify the calculation, so that it can achieve real-time simulation on a personal computer. Various scenarios of transients and accidents likely to occur at any nuclear power plant were simulated. The simulations investigated the change of signals and parameters vis-à-vis loss of coolant accident, scram, turbine trip, inadvertent control rod insertion and withdrawal, containment failure, fuel handling accident in the auxiliary building and containment, and moderator dilution, as well as combinations of these events. Furthermore, statistical analyses of the PCTRAN results were carried out. The PCTRAN results for the loss of coolant accident (LOCA) showed a rapid drop in coolant pressure at the rate of 21.8 kN/m2/s, triggering a shutdown by the reactor protection system (RPS), while the turbine trip accident showed a rapid drop in total plant power at the rate of 14.3 MWe/s, causing downtime in the plant. Fuel handling accident simulations showed release of radioactive materials in unacceptable doses. This work shows the potential classes of nuclear accidents likely to occur during operation in proposed reactor sites. The simulations are very appropriate in the light of Nigeria’s plan to generate nuclear energy in the region of 1000 MWe from reactors by 2017.

  5. LHCb Online Log Analysis and Maintenance System

    CERN Document Server

    Garnier, J-C

    2011-01-01

    History has shown, many times, that computer logs are the only information an administrator has about an incident, which could be caused either by a malfunction or by an attack. Due to the huge amount of logs produced by large-scale IT infrastructures such as LHCb Online, critical information may be overlooked or simply drowned in a sea of other messages. This clearly demonstrates the need for an automatic system for long-term maintenance and real-time analysis of the logs. We have constructed a low-cost, fault-tolerant centralized logging system which is able to do in-depth analysis and cross-correlation of every log. This system is capable of handling O(10000) different log sources and numerous formats, while keeping the overhead as low as possible. It provides log gathering and management, offline analysis, and online analysis. We call offline analysis the procedure of analyzing old logs for critical information, while online analysis refers to the procedure of early alerting and reacting. ...

  6. Harm reduction as a complex adaptive system: A dynamic framework for analyzing Tanzanian policies concerning heroin use.

    Science.gov (United States)

    Ratliff, Eric A; Kaduri, Pamela; Masao, Frank; Mbwambo, Jessie K K; McCurdy, Sheryl A

    2016-04-01

    Contrary to popular belief, policies on drug use are not always based on scientific evidence or composed in a rational manner. Rather, decisions concerning drug policies reflect the negotiation of actors' ambitions, values, and facts as they organize in different ways around the perceived problems associated with illicit drug use. Drug policy is thus best represented as a complex adaptive system (CAS) that is dynamic, self-organizing, and coevolving. In this analysis, we use a CAS framework to examine how harm reduction emerged around heroin trafficking and use in Tanzania over the past thirty years (1985-present). This account is an organizational ethnography based on the observant participation of the authors as actors within this system. We review the dynamic history and self-organizing nature of harm reduction, noting how interactions among system actors and components have coevolved with patterns of heroin use, policing, and treatment activities over time. Using a CAS framework, we describe harm reduction as a complex process where ambitions, values, facts, and technologies interact in the Tanzanian sociopolitical environment. We review the dynamic history and self-organizing nature of heroin policies, noting how the interactions within and between competing prohibitionist and harm reduction policies have changed with patterns of heroin use, policing, and treatment activities over time. Actors learn from their experiences to organize with other actors, align their values and facts, and implement new policies. Using a CAS approach provides researchers and policy actors a better understanding of patterns and intricacies in drug policy. This knowledge of how the system works can help improve the policy process through adaptive action to introduce new actors, different ideas, and avenues for communication into the system. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Reliability analysis of shutdown system

    International Nuclear Information System (INIS)

    Kumar, C. Senthil; John Arul, A.; Pal Singh, Om; Suryaprakasa Rao, K.

    2005-01-01

    This paper presents the results of a reliability analysis of the Shutdown System (SDS) of the Indian Prototype Fast Breeder Reactor. The reliability analysis, carried out using fault tree analysis, predicts a value of 3.5 x 10^-8/demand for failure of the shutdown function in the case of global faults and 4.4 x 10^-8/demand for local faults. Based on 20 demands/year, the frequency of shutdown function failure is 0.7 x 10^-6/reactor-year, which meets the reliability target set by the Indian Atomic Energy Regulatory Board. The reliability is limited by Common Cause Failure (CCF) of the actuation part of the SDS and, to a lesser extent, by CCF of electronic components. The failure frequency of the individual systems is of the order of 10^-3/reactor-year, which also meets the safety criteria. Uncertainty analysis indicates a maximum error factor of 5 for the top event unavailability.
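    The per-reactor-year figure quoted above follows directly from the per-demand failure probability and the assumed demand rate, as this short check shows:

```python
# Convert a per-demand failure probability into a per-reactor-year frequency
failure_per_demand = 3.5e-8   # global faults, from the fault-tree analysis
demands_per_year = 20         # assumed shutdown demands per reactor-year

failure_per_reactor_year = failure_per_demand * demands_per_year
print(f"{failure_per_reactor_year:.1e} /ry")   # 7.0e-07 /ry, i.e. 0.7e-6
```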

  8. Primary system boron dilution analysis

    International Nuclear Information System (INIS)

    Crump, R.J.; Naretto, C.J.; Borgen, R.A.; Rockhold, H.C.

    1978-01-01

    The results are presented for an analysis conducted to determine the potential paths through which nonborated water or water with insufficient boron concentration might enter the LOFT primary coolant piping system or reactor vessel to cause dilution of the borated primary coolant water. No attempt was made in the course of this analysis to identify possible design modifications nor to suggest changes in administrative procedures or controls

  9. Methods for Analyzing the Benefits and Costs of Distributed Photovoltaic Generation to the U.S. Electric Utility System

    Energy Technology Data Exchange (ETDEWEB)

    Denholm, P.; Margolis, R.; Palmintier, B.; Barrows, C.; Ibanez, E.; Bird, L.; Zuboy, J.

    2014-09-01

    This report outlines the methods, data, and tools that could be used, at different levels of sophistication and effort, to estimate the benefits and costs of distributed photovoltaic generation (DGPV). In so doing, we identify the gaps in current benefit-cost-analysis methods, which we hope will inform the ongoing research agenda in this area. The focus of this report is primarily on benefits and costs from the utility or electricity-generation-system perspective. It is intended to provide useful background information to utility and regulatory decision makers and their staff, who are often asked to use or evaluate estimates of the benefits and costs of DGPV in regulatory proceedings. Understanding the technical rigor of the range of methods, and how they might need to evolve as DGPV becomes a more significant contributor of energy to the electricity system, will help them be better consumers of this type of information. This report is also intended to provide information to utilities, policy makers, PV technology developers, and other stakeholders, which might help them maximize the benefits and minimize the costs of integrating DGPV into a changing electricity system.

  10. Analyzing Peace Pedagogies

    Science.gov (United States)

    Haavelsrud, Magnus; Stenberg, Oddbjorn

    2012-01-01

    Eleven articles on peace education published in the first volume of the Journal of Peace Education are analyzed. This selection comprises peace education programs that have been planned or carried out in different contexts. In analyzing peace pedagogies as proposed in the 11 contributions, we have chosen network analysis as our method--enabling…

  11. Simplified demultiplexing scheme for two PDM-IM/DD systems utilizing a single Stokes analyzer over 25-km SMF.

    Science.gov (United States)

    Pan, Yan; Yan, Lianshan; Yi, Anlin; Jiang, Lin; Pan, Wei; Luo, Bin; Zou, Xihua

    2017-10-15

    We propose a four-linear-state-of-polarization multiplexed intensity modulation and direct detection (IM/DD) scheme based on two orthogonal polarization division multiplexing (PDM) on-off keying systems. We also experimentally demonstrate a simple demultiplexing algorithm for this scheme utilizing only a single Stokes analyzer. At a rate of 4 × 10 Gbit/s, the experimental results show that the power penalty of the proposed scheme is about 1.5 dB compared to the single PDM-IM/DD for back-to-back (B2B) transmission. Compared to B2B, only about 1.7 dB of power penalty is incurred after 25 km of Corning LEAF optical fiber transmission. Meanwhile, the performance of the polarization tracking is evaluated, and the results show that the BER fluctuation is less than 0.5 dB with a polarization scrambling rate of up to 708.75 deg/s.

  12. Analyzing the politico-moral foundations of the Iran’s health system based on theories of justice

    Science.gov (United States)

    Akrami, Forouzan; Abbasi, Mahmoud; Karimi, Abbas; Shahrivari, Akbar; Majdzadeh, Reza; Zali, Alireza

    2017-01-01

    Public health ethics is a field that covers both factual and ethical issues in health policy and science, and has positive obligations to improve the well-being of populations and reduce social inequalities. It is obvious that various philosophies and moral theories can shape the framework of public health ethics differently. For this reason, the present study reviewed theories of justice in order to analyze and criticize Iran’s general health policies document, issued in 14 articles in 2014. Furthermore, it explored egalitarianism as the dominant theory in the political philosophy of the country’s health care system. According to recent theories of justice, however, health policies must address well-being and its basic dimensions such as health, reasoning, autonomy, and the role of the involved agencies and social institutions in order to achieve social justice beyond distributive justice. Moreover, policy-making in the field of health and biomedical sciences based on Islamic culture necessitates a theory of social justice in the light of theological ethics. Educating people about their rights and duties, increasing their knowledge of individual agency, autonomy, and the role of the government, and empowering them will help achieve social justice. It is recommended to design and implement a strategic plan following each of these policies, based on the above-mentioned values and in collaboration with other sectors, to clarify the procedures in every case. PMID:29291037

  13. Automated Loads Analysis System (ATLAS)

    Science.gov (United States)

    Gardner, Stephen; Frere, Scot; O’Reilly, Patrick

    2013-01-01

    ATLAS is a generalized solution that can be used for launch vehicles. ATLAS is used to produce modal transient analysis and quasi-static analysis results (i.e., accelerations, displacements, and forces) for the payload math models on a specific Shuttle Transport System (STS) flight using the shuttle math model and associated forcing functions. This innovation solves the problem of coupling of payload math models into a shuttle math model. It performs a transient loads analysis simulating liftoff, landing, and all flight events between liftoff and landing. ATLAS utilizes efficient and numerically stable algorithms available in MSC/NASTRAN.

  14. Lens system for SIMS analysis

    International Nuclear Information System (INIS)

    Martinez, G.; Sancho, M.; Garcia-Galan, J.C.

    1987-01-01

    A powerful version of the charge-density method is applied to the study of a combined objective and emission lens, suitable for highly localized analysis of a flat sample surface. This lens can extract secondary ions of equal or opposite polarity to that of the primary particles. A computer simulation of the ion trajectories for both modes is made. The behaviour for different values of the geometric parameters and polarizations is analyzed and useful data for design such as primary beam demagnification and secondary image position are given. (author) 4 refs

  15. Systems analysis - independent analysis and verification

    Energy Technology Data Exchange (ETDEWEB)

    DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)

    1996-10-01

    The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.

  16. Solutions to Improve Person Transport System in the Pitesti City by Analyzing Public Transport vs. Private Transport

    Science.gov (United States)

    Mihaela, Istrate; Alexandru, Boroiu; Viorel, Nicolae; Ionel, Vieru

    2017-10-01

    One of the major problems facing the Pitesti city is the road congestion that occurs in the central area of the city during peak hours. Despite the measures taken in recent years - the widening of road arteries, increasing the number of parking spaces, the creation of overground road passages - it is obvious that the problem can only be solved by a new philosophy of urban mobility: it is no longer possible to continue with solutions that increase the accessibility of the central area of the city; on the contrary, it is necessary to promote a policy of discouraging the penetration of vehicles into the city center, coupled with a policy of improving the connection between urban public transport and county public transport. This new approach is also proposed in the new Urban Mobility Plan of Pitesti city, under development. The most convincing argument for the necessity of this new orientation in the Pitesti city mobility plan is based on the analysis of the current situation of passenger transport on the territory of Pitesti city: the analysis of “public transport versus private transport” reveals a very low occupancy rate for cars and the fact that the road surface required per passenger (the dynamic area) is much higher for private transport than for public transport. Measurements of passenger flows and vehicle flows on the 6 penetration ways into the city have been made, and the calculations clearly demonstrate the benefits of an urban public transport system connected, by “transshipment buses” at the edge of the city, to the county public transport system. Inter-county transport will continue to be connected to the urban public transport system by the existing bus stations within the city: the South Bus Station and the North Bus Station. The usefulness of the paper is that it identifies solutions for sustainable mobility in Pitesti city and proposes concrete solutions for the development of the

  17. System analysis for radwaste management

    International Nuclear Information System (INIS)

    Lennemann, W.L.

    1987-01-01

    The most logical approach to evaluating radioactive waste management processes and their options is to consider radioactive waste management, handling, and disposal as a complete and complex system, from the waste arisings to their disposition. The principal elements that should be taken into account when making a decision involving one or more components of a radwaste management system essentially concern radiation doses or detriments (both radiological and industrial safety) and costs (both capital investments and operating costs). This paper discusses the system analysis of low- and medium-level radioactive waste management.

  18. STAT, GAPS, STRAIN, DRWDIM: a system of computer codes for analyzing HTGR fuel test element metrology data. User's manual

    Energy Technology Data Exchange (ETDEWEB)

    Saurwein, J.J.

    1977-08-01

    A system of computer codes has been developed to statistically reduce Peach Bottom fuel test element metrology data and to compare the material strains and fuel rod-fuel hole gaps computed from these data with HTGR design code predictions. The codes included in this system are STAT, STRAIN, GAPS, and DRWDIM. STAT statistically evaluates test element metrology data yielding fuel rod, fuel body, and sleeve irradiation-induced strains; fuel rod anisotropy; and additional data characterizing each analyzed fuel element. STRAIN compares test element fuel rod and fuel body irradiation-induced strains computed from metrology data with the corresponding design code predictions. GAPS compares test element fuel rod, fuel hole heat transfer gaps computed from metrology data with the corresponding design code predictions. DRWDIM plots the measured and predicted gaps and strains. Although specifically developed to expedite the analysis of Peach Bottom fuel test elements, this system can be applied, without extensive modification, to the analysis of Fort St. Vrain or other HTGR-type fuel test elements.

  19. Radiographic enhancement and analysis system

    International Nuclear Information System (INIS)

    Schlosser, M.S.

    1981-01-01

    Radiographic image enhancement and analysis techniques are discussed as they apply to nondestructive inspection. A system is described which has been developed to enhance and quantitatively evaluate radiographic images using digital computer techniques. Some examples of typical applications are also presented as an introduction to this new inspection technique. (author)

  20. Systems analysis and the computer

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, A S

    1983-08-01

    The words 'systems analysis' are used in at least two senses. Whilst the general nature of the topic is well understood in the OR community, the nature of the term as used by computer scientists is less familiar. In this paper, the nature of systems analysis as it relates to computer-based systems is examined from the point of view that the computer system is an automaton embedded in a human system, and some facets of this are explored. It is concluded that OR analysts and computer analysts have things to learn from each other and that this ought to be reflected in their education. The important role played by change in the design of systems is also highlighted, and it is concluded that, whilst the application of techniques developed in the artificial intelligence field has considerable relevance to constructing automata able to adapt to change in the environment, the study of the human factors affecting the overall systems within which the automata are embedded has an even more important role. 19 references.

  1. Acoustic analysis of a piping system

    International Nuclear Information System (INIS)

    Misra, A.S.; Vijay, D.K.

    1996-01-01

    Acoustic pulsations in the primary heat transport piping system of the Darlington Nuclear Generating Station, an 881 MW CANDU, caused fuel bundle failures under short-term operations. The problem was successfully analyzed using the steady-state acoustic analysis capability of the ABAQUS program. This paper describes, in general, the modelling of low-amplitude acoustic pulsations in a liquid-filled piping system using ABAQUS. The paper gives techniques for estimating the acoustic medium properties (bulk modulus, fluid density, and acoustic damping) and for modelling fluid-structure interactions at orifices and elbows. The formulations and techniques developed are benchmarked against the experiments given in 3 cited references. The benchmark analysis shows that the ABAQUS results are in excellent agreement with the experiments
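    Estimating the acoustic medium properties mentioned above rests on the standard relation between sound speed, bulk modulus, and density, c = sqrt(K/rho). A minimal sketch (the values are typical of cold water, not the Darlington operating conditions):

```python
import math

def sound_speed(bulk_modulus_pa, density_kg_m3):
    """Acoustic wave speed in a fluid: c = sqrt(K / rho)."""
    return math.sqrt(bulk_modulus_pa / density_kg_m3)

# cold water: K ~ 2.2 GPa, rho ~ 1000 kg/m^3, giving c ~ 1483 m/s
c = sound_speed(2.2e9, 1000.0)
print(f"{c:.0f} m/s")
```

    In a model of this kind, the pulsation wavelengths and standing-wave frequencies of the piping follow directly from this sound speed and the pipe run lengths.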

  2. Integrated Reliability and Risk Analysis System (IRRAS)

    International Nuclear Information System (INIS)

    Russell, K.D.; McKay, M.K.; Sattison, M.B.; Skinner, N.L.; Wood, S.T.; Rasmuson, D.M.

    1992-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is a state-of-the-art, microcomputer-based probabilistic risk assessment (PRA) model development and analysis tool to address key nuclear plant safety issues. IRRAS is an integrated software tool that gives the user the ability to create and analyze fault trees and accident sequences using a microcomputer. This program provides functions that range from graphical fault tree construction to cut set generation and quantification. Version 1.0 of the IRRAS program was released in February of 1987. Since that time, many user comments and enhancements have been incorporated into the program providing a much more powerful and user-friendly system. This version has been designated IRRAS 4.0 and is the subject of this Reference Manual. Version 4.0 of IRRAS provides the same capabilities as Version 1.0 and adds a relational data base facility for managing the data, improved functionality, and improved algorithm performance
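    Cut set quantification of the kind IRRAS performs can be sketched very simply: under the rare-event approximation, the top-event probability is the sum, over minimal cut sets, of the product of their basic-event probabilities. The event names and probabilities below are invented for illustration:

```python
# Hypothetical basic events with their failure probabilities
basic = {"pump_A": 1e-3, "pump_B": 1e-3, "valve": 5e-4}

# Minimal cut sets: each set of events is jointly sufficient for the top event
minimal_cut_sets = [{"pump_A", "pump_B"}, {"valve"}]

def top_event_probability(cut_sets, probs):
    """Rare-event approximation: sum the products of each cut set's events."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for event in cs:
            p *= probs[event]
        total += p
    return total

print(top_event_probability(minimal_cut_sets, basic))  # ~5.01e-4
```

    A full PRA tool also handles the generation of the minimal cut sets themselves and higher-order corrections, but this sum-of-products step is the core of the quantification.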

  3. Method of determining external defects of a structure by analyzing a series of its images in the monitoring system

    Directory of Open Access Journals (Sweden)

    Loktev Aleksey Alekseevich

    2015-03-01

    geometrical parameters by analyzing a series of images. This is the issue and the subject of this work, in which computational algorithms to detect external defects were developed. At the stage of preliminary image processing, characteristic points in the image are delineated and the optical flow in the area of these points is calculated. When determining the defect position, the characteristic points of the image are found using the Harris-Laplace detector; those located in the central part of the image are kept, while characteristic points outside the frame are considered background. Changes in the characteristic points in the frame relative to the background are identified using a pyramidal iterative scheme. In the second stage, the servo frame focuses on the specific point with the greatest change relative to the background at the current time. The algorithm for object detection and determination of its parameters includes three procedures: the detection start procedure; the procedure for processing the next image; and the stop procedure for determining the parameters of the object. The method described here can be used to create an information-measuring monitoring system based on high-definition photodetectors and recognition of defects (color differences and differences in form compared to the background). Since almost every examination of a building or structure begins with a visual examination and determination of the most probable places of occurrence and presence of defects, the proposed method can be combined with this stage, simplifying the process of diagnosis and screening for the development of reconstruction projects and the placement of additional equipment on the existing infrastructure.

  4. Plutonium solution analyzer

    International Nuclear Information System (INIS)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded)
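    The waste-minimization claim can be checked against the throughput figures quoted in the abstract with some back-of-envelope arithmetic (the 35 samples/hour midpoint is an assumption, not a figure from the paper):

```python
samples_per_hour = 35          # midpoint of the quoted 30-40 samples/hour
sample_volume_ml = 0.074       # consumed per sample (and per standard)
waste_rate_ml_per_min = 1.5    # quoted waste generation rate

sample_used_per_hour = samples_per_hour * sample_volume_ml   # ~2.6 mL/h
waste_per_hour = waste_rate_ml_per_min * 60                  # 90 mL/h
print(f"sample consumed: {sample_used_per_hour:.2f} mL/h, "
      f"waste: {waste_per_hour:.0f} mL/h")
```

    In other words, almost all of the liquid waste is reagent and carrier stream rather than radioactive sample, which is exactly the point of minimizing the per-sample draw.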

  5. Plutonium solution analyzer

    Energy Technology Data Exchange (ETDEWEB)

    Burns, D.A.

    1994-09-01

    A fully automated analyzer has been developed for plutonium solutions. It was assembled from several commercially available modules, is based upon segmented flow analysis, and exhibits precision about an order of magnitude better than commercial units (0.5%-0.05% RSD). The system was designed to accept unmeasured, untreated liquid samples in the concentration range 40-240 g/L and produce a report with sample identification, sample concentrations, and an abundance of statistics. Optional hydraulics can accommodate samples in the concentration range 0.4-4.0 g/L. Operating at a typical rate of 30 to 40 samples per hour, it consumes only 0.074 mL of each sample and standard, and generates waste at the rate of about 1.5 mL per minute. No radioactive material passes through its multichannel peristaltic pump (which remains outside the glovebox, uncontaminated) but rather is handled by a 6-port, 2-position chromatography-type loop valve. An accompanying computer is programmed in QuickBASIC 4.5 to provide both instrument control and data reduction. The program is truly user-friendly and communication between operator and instrument is via computer screen displays and keyboard. Two important issues which have been addressed are waste minimization and operator safety (the analyzer can run in the absence of an operator, once its autosampler has been loaded).

  6. Development of cost-benefit analysis system

    International Nuclear Information System (INIS)

    Shiba, Tsuyoshi; Mishima, Tetsuya; Yuyama, Tomonori; Suzuki, Atsushi

    2001-01-01

    In order to promote FBR development, it is necessary to view the various benefits of FBR introduction from multiple perspectives and to grasp quantitatively both those benefits and an R and D investment scale commensurate with them. In this study, the prototype structured in the previous study was improved to perform cost-benefit analysis. One improvement is the addition of a subroutine for comparing new energy sources with the benefits of FBR introduction, with special emphasis on logic for analyzing the externalities of new energy. Other improvements include replacing the Conventional Year Expense Ratio method for power generation cost with the Average Durable Year Cost method, adding a database function and converting input data into a database, and reviewing assumptions on cost by type of waste material and on the price of uranium. The cost-benefit analysis system was also restructured using Microsoft ACCESS to provide the database function. As a result of these improvements, the system is expected to be more general than before, and considerable benefit is expected from its future application. (author)

  7. A neutron activation analyzer

    International Nuclear Information System (INIS)

    Westphal, G.P.; Lemmel, H.; Grass, F.; De Regge, P.P.; Burns, K.; Markowicz, A.

    2005-01-01

    Dubbed 'Analyzer' because of its simplicity, a neutron activation analysis facility for short-lived isomeric transitions is based on a low-cost rabbit system and an adaptive digital filter, controlled by software performing irradiation control, loss-free gamma-spectrometry, spectrum evaluation, nuclide identification and calculation of concentrations in a fully automatic flow of operations. Designed for TRIGA reactors and constructed from inexpensive plastic tubing and an aluminum in-core part, the rabbit system features samples of 5 ml and 10 ml with sample separation at 150 ms and 200 ms transport time, or 25 ml samples without separation at a transport time of 300 ms. By automatically adapting shaping times to pulse intervals, the preloaded digital filter gives best throughput at best resolution up to input counting rates of 10^6 cps. Loss-free counting enables quantitative correction of counting losses of up to 99%. As a test of system reproducibility in sample separation geometry, K, Cl, Mn, Mg, Ca, Sc, and V have been determined in various reference materials in excellent agreement with consensus values. (author)
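The facility's loss-free counting uses virtual-pulse weighting, which is not reproduced here; as a simpler illustration of why 99% counting losses imply a 100x correction, the textbook non-paralyzable dead-time model can be sketched as follows (all numbers are hypothetical):

```python
def true_rate_nonparalyzable(measured_cps, dead_time_s):
    """Non-paralyzable dead-time model: n = m / (1 - m * tau),
    where m is the measured rate and tau the dead time per event."""
    loss_fraction = measured_cps * dead_time_s
    if loss_fraction >= 1.0:
        raise ValueError("measured rate inconsistent with dead time")
    return measured_cps / (1.0 - loss_fraction)

# At 99% losses (m * tau = 0.99) the correction factor is 100x.
m = 9.9e3                      # measured counts per second (hypothetical)
tau = 0.99 / m                 # dead time chosen so that m * tau = 0.99
print(true_rate_nonparalyzable(m, tau))
```

In this regime the estimated true rate is about 9.9e5 cps, which is why naive scaling breaks down and dedicated loss-free counting hardware is needed at such input rates.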

  8. Miniature mass analyzer

    CERN Document Server

    Cuna, C; Lupsa, N; Cuna, S; Tuzson, B

    2003-01-01

    The paper presents concepts for different mass analyzers specifically designed as small-dimension instruments able to detect the main environmental pollutants with great sensitivity and accuracy. Mass spectrometers are well-suited instruments for the chemical and isotopic analysis needed in environmental surveillance. Usually this is done by sampling the soil, air or water, followed by laboratory analysis. To avoid drawbacks caused by sample alteration during sampling and transport, 'in situ' analysis is preferred. Theoretically, any type of mass analyzer can be miniaturized, but some are more appropriate than others. Quadrupole mass filters and traps, magnetic sector, time-of-flight and ion cyclotron mass analyzers can be successfully shrunk; for each of them some performance is sacrificed, and one must know which parameters must be kept unchanged. To satisfy the miniaturization criteria of the analyzer, it is necessary to use asymmetrical geometries, with ion beam obl...

  9. Bifurcation analysis of a three dimensional system

    Directory of Open Access Journals (Sweden)

    Yongwen WANG

    2018-04-01

    Full Text Available In order to enrich the stability and bifurcation theory of three-dimensional chaotic systems, taking a quadratic truncated unfolding system with a triple-singularity equilibrium as the research subject, the existence of the equilibrium and the stability and bifurcation of the system near the equilibrium under different parametric conditions are studied. Using the method of mathematical analysis, the existence of real roots of the corresponding characteristic equation under different parametric conditions is analyzed, the local manifolds of the equilibrium are obtained, and the possible bifurcations are conjectured. The parametric conditions under which the equilibrium is a saddle-focus are analyzed carefully via the Cardan formula. Moreover, the conditions of codimension-one Hopf bifurcation and the prerequisites of supercritical and subcritical Hopf bifurcation are found by computation. The results show that the system has abundant stability and bifurcation behavior, and can also supply theoretical support for proving the existence of homoclinic or heteroclinic loops connecting saddle-foci and of Shilnikov chaos. This method can be extended to study other higher-order nonlinear systems.
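The saddle-focus test described above hinges on the characteristic cubic having one real eigenvalue and a complex-conjugate pair with real parts of opposite sign. A minimal sketch of that test via the Cardan formula, for a generic cubic lambda^3 + a*lambda^2 + b*lambda + c = 0 (the coefficients in the example are hypothetical, not taken from the paper's unfolding system):

```python
import math
import cmath

def classify_equilibrium(a, b, c):
    """Roots of lambda^3 + a*lambda^2 + b*lambda + c = 0 via Cardano's
    formula, then a saddle-focus test: one real root and a complex pair
    whose real parts have opposite signs."""
    # Depressed cubic t^3 + p*t + q = 0 with lambda = t - a/3.
    p = b - a * a / 3.0
    q = 2.0 * a**3 / 27.0 - a * b / 3.0 + c
    disc = (q / 2.0) ** 2 + (p / 3.0) ** 3   # > 0 => one real, two complex
    if disc <= 0:
        return None, "three real roots (not a saddle-focus candidate)"
    s = math.sqrt(disc)
    # Real cube roots, sign-safe for negative arguments.
    t_real = (math.copysign(abs(-q / 2.0 + s) ** (1 / 3), -q / 2.0 + s)
              + math.copysign(abs(-q / 2.0 - s) ** (1 / 3), -q / 2.0 - s))
    r = t_real - a / 3.0                      # the single real eigenvalue
    # Factor out (lambda - r): remaining quadratic lambda^2 + B*lambda + C.
    B = a + r
    C = b + r * B
    pair = (-B + cmath.sqrt(B * B - 4.0 * C)) / 2.0
    saddle_focus = pair.imag != 0 and r * pair.real < 0
    return (r, pair), ("saddle-focus" if saddle_focus else "not a saddle-focus")

# Hypothetical coefficients with roots -1 and 1 +/- 2i: a saddle-focus.
print(classify_equilibrium(-1.0, 3.0, 5.0)[1])
```

The example cubic factors as (lambda + 1)(lambda^2 - 2*lambda + 5), so the stable direction (r = -1) coexists with an unstable focus, the configuration required by Shilnikov-type arguments.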

  10. Multichannel Mars Organic Analyzer (McMOA): Microfluidic Networks for the Automated In Situ Microchip Electrophoretic Analysis of Organic Biomarkers on Mars

    Science.gov (United States)

    Chiesl, T. N.; Benhabib, M.; Stockton, A. M.; Mathies, R. A.

    2010-04-01

    We present the Multichannel Mars Organic Analyzer (McMOA) for the analysis of amino acids, PAHs, and oxidized carbon. Microfluidic architectures integrating automated metering, mixing, on-chip reactions, and serial dilutions are also discussed.

  11. Safety analysis of tritium processing system based on PHA

    International Nuclear Information System (INIS)

    Fu Wanfa; Luo Deli; Tang Tao

    2012-01-01

    A safety analysis of the primary confinement of the tritium processing system for the TBM was carried out using Preliminary Hazard Analysis (PHA). First, the basic PHA process is given. Then the functions and safety measures of the multiple confinements of the tritium system are described and analyzed briefly, dividing the boundaries through which tritium transfers into two kinds: multiple confinement systems and fluid loops. Analysis of tritium release is the key part of the PHA. A PHA table for tritium release is presented, its causes and harmful consequences are analyzed, and safety measures are proposed. On the basis of the PHA, several kinds of typical accidents are further analyzed, and 8 factors influencing tritium safety are examined, laying the foundation for quantitatively evaluating the safety grade of various nuclear facilities. (authors)

  12. RootAnalyzer: A Cross-Section Image Analysis Tool for Automated Characterization of Root Cells and Tissues.

    Directory of Open Access Journals (Sweden)

    Joshua Chopin

    Full Text Available The morphology of plant root anatomical features is a key factor in effective water and nutrient uptake. Existing techniques for phenotyping root anatomical traits are often based on manual or semi-automatic segmentation and annotation of microscopic images of root cross sections. In this article, we propose a fully automated tool, hereinafter referred to as RootAnalyzer, for efficiently extracting and analyzing anatomical traits from root cross-section images. Using a range of image processing techniques such as local thresholding and nearest-neighbor identification, RootAnalyzer segments the plant root from the image's background, classifies and characterizes the cortex, stele, endodermis and epidermis, and subsequently produces statistics about the morphological properties of the root cells and tissues. We use RootAnalyzer to analyze 15 images of wheat plants and one maize plant image and evaluate its performance against manually obtained ground-truth data. The comparison shows that RootAnalyzer can fully characterize most root tissue regions with over 90% accuracy.
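The local thresholding mentioned above compares each pixel against a statistic of its neighborhood rather than a single global cutoff. A pure-Python sketch of the idea (RootAnalyzer's actual implementation is not reproduced here; window size, offset, and the toy image are assumptions for illustration):

```python
def local_threshold(image, window=3, offset=0):
    """Binarize a 2D grayscale image: a pixel is foreground (1) when it is
    darker than the mean of its (window x window) neighborhood minus offset.
    Neighborhoods are clipped at the image borders."""
    h, w = len(image), len(image[0])
    r = window // 2
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [image[ny][nx]
                    for ny in range(max(0, y - r), min(h, y + r + 1))
                    for nx in range(max(0, x - r), min(w, x + r + 1))]
            mean = sum(vals) / len(vals)
            out[y][x] = 1 if image[y][x] < mean - offset else 0
    return out

# Toy image: one dark pixel (e.g. a stained cell wall) on a bright field.
img = [[200, 200, 200],
       [200,  50, 200],
       [200, 200, 200]]
print(local_threshold(img))
```

Real pipelines would use an image library for speed, but the per-pixel logic is the same; the local mean makes the cutoff robust to uneven illumination across the cross section.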

  13. Parametric systems analysis for tandem mirror hybrids

    International Nuclear Information System (INIS)

    Lee, J.D.; Chapin, D.L.; Chi, J.W.H.

    1980-09-01

    Fusion-fission systems, consisting of fissile-producing fusion hybrids that combine a tandem mirror fusion driver with various blanket types, together with net fissile-consuming LWRs, have been modeled and analyzed parametrically. Analysis to date indicates that hybrids can be competitive with mined uranium when U3O8 costs about $100/lb, adding less than 25% to the present-day cost of power from LWRs. Of the three blanket types considered, uranium fast fission (UFF), thorium fast fission (ThFF), and thorium fission-suppressed (ThFS), the ThFS blanket has a modest economic advantage under most conditions but has higher support ratios and potential safety advantages under all conditions.

  14. System reliability analysis with natural language and expert's subjectivity

    International Nuclear Information System (INIS)

    Onisawa, T.

    1996-01-01

    This paper introduces natural language expressions and expert subjectivity into system reliability analysis. To this end, it defines a subjective measure of reliability and presents a method of system reliability analysis using that measure. The subjective measure of reliability corresponds to natural language expressions of reliability estimation and is represented by a fuzzy set defined on [0,1]. The presented method deals with dependence among subsystems and employs parametrized operations on subjective measures of reliability that can reflect the expert's subjectivity towards the analyzed system. The analysis results are also expressed in linguistic terms. Finally, this paper gives an example of system reliability analysis by the presented method.
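A fuzzy set on [0,1] with linguistic labels, as described above, can be sketched in a few lines. The term names, triangular shapes, and the Yager t-norm family used for the parametrized combination are illustrative assumptions, not the paper's actual definitions:

```python
def triangular(a, b, c):
    """Membership function of a triangular fuzzy set on [0, 1], peak at b."""
    return lambda x: max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

# Hypothetical linguistic terms for the subjective reliability measure.
TERMS = {
    "rather unreliable":   triangular(0.0, 0.2, 0.4),
    "moderately reliable": triangular(0.3, 0.5, 0.7),
    "highly reliable":     triangular(0.6, 0.8, 1.0),
}

def linguistic_label(x):
    """Map a crisp reliability estimate to the best-fitting linguistic term."""
    return max(TERMS, key=lambda term: TERMS[term](x))

def yager_tnorm(x, y, p):
    """One parametrized family of combination operators (Yager t-norms):
    p = 1 gives the Lukasiewicz t-norm; p -> infinity approaches min,
    so the parameter can encode assumed dependence between subsystems."""
    return max(0.0, 1.0 - ((1.0 - x) ** p + (1.0 - y) ** p) ** (1.0 / p))

print(linguistic_label(0.75))
print(yager_tnorm(0.9, 0.8, 1.0))
```

Combining two subsystem reliabilities with different p values shows how a single series-system formula can be tuned to reflect an expert's view of how strongly the subsystems depend on each other.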

  15. A gas conditioning and analysis system

    International Nuclear Information System (INIS)

    Busch, F.R.

    1974-01-01

    A system for carrying out rapid analysis of explosive gas mixtures is described. It comprises a conduit connecting a sampling point to a detection chamber; the chamber contains a mass of liquid into which the gas sample is discharged and is provided with a detecting unit for analyzing gases and with separate gas and liquid exits. The liquid is sent to a level-regulating chamber, whereas the gas exit sends the gas to a gas-stopping chamber which is, in turn, connected to the conduit leading to a discharge point and to a vacuum pump for drawing the gas sample into the system. This can be applied to nuclear power stations [fr

  16. Structured analysis and modeling of complex systems

    Science.gov (United States)

    Strome, David R.; Dalrymple, Mathieu A.

    1992-01-01

    The Aircrew Evaluation Sustained Operations Performance (AESOP) facility at Brooks AFB, Texas, combines the realism of an operational environment with the control of a research laboratory. In recent studies we collected extensive data from Airborne Warning and Control System (AWACS) Weapons Directors subjected to high- and low-workload Defensive Counter Air scenarios. A critical and complex task in this environment involves committing a friendly fighter against a hostile fighter. Structured Analysis and Design techniques and computer modeling systems were applied to this task as tools for analyzing subject performance and workload. This technology is being transferred to the Man-Systems Division of NASA Johnson Space Center for application to complex mission-related tasks, such as manipulating the Shuttle grappler arm.

  17. Analyzing Katana referral hospital as a complex adaptive system: agents, interactions and adaptation to a changing environment.

    Science.gov (United States)

    Karemere, Hermès; Ribesse, Nathalie; Marchal, Bruno; Macq, Jean

    2015-01-01

    This study deals with the adaptation of Katana referral hospital in the Eastern Democratic Republic of Congo to a changing environment that has been affected for more than a decade by intermittent armed conflicts. Its objective is to generate theoretical propositions for approaching the analysis of hospital governance differently, with the aim of assessing hospital performance and how to improve it. The methodology is a case study using mixed methods (qualitative and quantitative) for data collection. It uses (1) hospital data to measure the output of the hospital, (2) a literature review to identify, among other things, events and interventions recorded in the hospital's history during the study period, and (3) information from individual interviews to validate the interpretation of the results of the previous two data sources and to understand the responsiveness of the referral hospital's management team during times of change. The study offers four theoretical propositions: (1) interaction between key agents is a positive force driving adaptation if the actors share the same vision; (2) the strength of the interaction between agents is largely based on the nature of institutional arrangements, which in turn are shaped by the actors themselves; (3) the owner and the management team play a decisive role in the implementation of effective institutional arrangements and the establishment of positive interactions between agents; (4) analyzing the recipient population's perception of the health services provided allows the health services offer to be better tailored and adapted to the population's needs and expectations. The research shows that providing financial and technical support is not enough to make a hospital operate and adapt to a changing environment; one must also animate it, considering that it is a complex adaptive system and that this animation is nothing other than the induction of positive interaction between agents.

  18. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    Full Text Available The article deals with the probability analysis of a vibration isolation system for high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. External sources of low-frequency vibration include the natural city background and internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criterion. The problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. From this probability distribution, the chance of exceeding the vibration criterion for a vibration isolation system is evaluated. Optimal system parameters - damping and natural frequency - are derived such that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
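For a zero-mean Gaussian displacement response, the exceedance probability described above follows directly from the error function. A minimal sketch (sigma and the criterion limit are in the same, arbitrary units; the specific values are illustrative):

```python
import math

def prob_within_criterion(sigma, limit):
    """P(|displacement| < limit) for a zero-mean Gaussian response with
    standard deviation sigma: erf(limit / (sigma * sqrt(2)))."""
    return math.erf(limit / (sigma * math.sqrt(2.0)))

def prob_exceed(sigma, limit):
    """Probability of exceeding the vibration criterion."""
    return 1.0 - prob_within_criterion(sigma, limit)

# A criterion at about 2.05 standard deviations gives ~4% exceedance,
# matching the 0.04 target quoted in the abstract.
print(round(prob_exceed(1.0, 2.054), 3))
```

Inverting this relation is what lets damping and natural frequency be tuned: both parameters shrink sigma, and the criterion is met once sigma falls below roughly limit/2.05.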

  19. A Human Body Analysis System

    Directory of Open Access Journals (Sweden)

    Girondel Vincent

    2006-01-01

    Full Text Available This paper describes a system for human body analysis (segmentation, tracking, face/hands localisation, posture recognition) from a single view that is fast and completely automatic. The system first extracts low-level data and uses part of the data for high-level interpretation. It can detect and track several persons even if they merge or are completely occluded by another person from the camera's point of view. For the high-level interpretation step, static posture recognition is performed using a belief theory-based classifier. Belief theory is considered here as a new approach for performing posture recognition and classification using imprecise and/or conflicting data. Four different static postures are considered: standing, sitting, squatting, and lying. The aim of this paper is to give a global view and an evaluation of the performance of the entire system and to describe each of its processing steps in detail, whereas our previous publications focused on a single part of the system. The efficiency and the limits of the system have been highlighted on a database of more than fifty video sequences in which a dozen different individuals appear. This system allows real-time processing and aims at monitoring elderly people in video surveillance applications or at the mixing of real and virtual worlds in ambient intelligence systems.
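The belief theory-based classifier mentioned above combines imprecise or conflicting evidence over sets of posture hypotheses. A minimal sketch of the core operation, Dempster's rule of combination (the two mass functions below are hypothetical, not the paper's actual feature-derived masses):

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule: combine two mass functions whose focal elements are
    frozensets of hypotheses; mass assigned to the empty intersection is
    treated as conflict and renormalized away."""
    combined = {}
    conflict = 0.0
    for (a, x), (b, y) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + x * y
        else:
            conflict += x * y
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

# Hypothetical evidence from two cues about two of the four postures.
m1 = {frozenset({"standing"}): 0.6,
      frozenset({"standing", "sitting"}): 0.4}        # partly ignorant cue
m2 = {frozenset({"standing"}): 0.5,
      frozenset({"sitting"}): 0.3,
      frozenset({"standing", "sitting"}): 0.2}
fused = dempster_combine(m1, m2)
print(fused)
```

Note how mass on the non-singleton set {standing, sitting} lets a cue express ignorance rather than forcing a choice, which is exactly what makes belief theory attractive for conflicting data.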

  20. Analysis of chaos in high-dimensional wind power system.

    Science.gov (United States)

    Wang, Cong; Zhang, Hongli; Fan, Wenhui; Ma, Ping

    2018-01-01

    A comprehensive analysis of chaos in a high-dimensional wind power system is performed in this study. A high-dimensional wind power system is more complex than most power systems. An 11-dimensional wind power system proposed by Huang, which has not been analyzed in previous studies, is investigated. When the system is affected by external disturbances, including single-parameter and periodic disturbances, or when its parameters change, the chaotic dynamics of the wind power system are analyzed and the parameter ranges producing chaos are obtained. The existence of chaos is confirmed by calculation and analysis of the Lyapunov exponents of all state variables and of the state-variable sequence diagrams. Theoretical analysis and numerical simulations show that chaos will occur in the wind power system when parameter variations or external disturbances exceed a certain degree.