WorldWideScience

Sample records for description analysis simulation

  1. Geometry Description Markup Language for Physics Simulation And Analysis Applications.

    Energy Technology Data Exchange (ETDEWEB)

    Chytracek, R.; /CERN; McCormick, J.; /SLAC; Pokorski, W.; /CERN; Santin, G.; /European Space Agency

    2007-01-23

The Geometry Description Markup Language (GDML) is a specialized XML-based language designed as an application-independent persistent format for describing the geometries of detectors associated with physics measurements. It implements "geometry trees" corresponding to the hierarchy of volumes a detector geometry is composed of, identifies the position of individual solids, and describes the materials they are made of. Being pure XML, GDML can be used universally, and in particular it can serve as a format for interchanging geometries among different applications. In this paper we present the current status of the development of GDML. After discussing the contents of the latest GDML schema, which is the basic definition of the format, we concentrate on the GDML processors. We present the latest implementations of the GDML "writers" as well as "readers" for both Geant4 [2], [3] and ROOT [4], [10].
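To make the "geometry tree" idea concrete, here is a minimal sketch that builds and re-reads such a tree with the Python standard library. Element names (materials, solids, structure, physvol, volumeref) loosely imitate the GDML schema, but the detector, dimensions and material are invented; this is an illustration, not the official format.

```python
# Minimal sketch of a GDML-style geometry tree, built and re-read with the
# Python standard library. Element/attribute names loosely imitate the GDML
# schema (materials/solids/structure); the detector itself is invented.
import xml.etree.ElementTree as ET

def build_geometry():
    gdml = ET.Element("gdml")
    materials = ET.SubElement(gdml, "materials")
    ET.SubElement(materials, "material", name="Silicon", Z="14")
    solids = ET.SubElement(gdml, "solids")
    ET.SubElement(solids, "box", name="worldBox", x="1000", y="1000", z="1000")
    ET.SubElement(solids, "box", name="sensorBox", x="50", y="50", z="1")
    structure = ET.SubElement(gdml, "structure")
    sensor = ET.SubElement(structure, "volume", name="Sensor")
    ET.SubElement(sensor, "solidref", ref="sensorBox")
    ET.SubElement(sensor, "materialref", ref="Silicon")
    world = ET.SubElement(structure, "volume", name="World")
    ET.SubElement(world, "solidref", ref="worldBox")
    # Hierarchy: the world volume positions a daughter physical volume.
    phys = ET.SubElement(world, "physvol")
    ET.SubElement(phys, "volumeref", ref="Sensor")
    ET.SubElement(phys, "position", x="0", y="0", z="100")
    return gdml

tree = build_geometry()
xml_text = ET.tostring(tree, encoding="unicode")

# A "reader" walks the same tree to recover the volume hierarchy.
root = ET.fromstring(xml_text)
daughters = [pv.find("volumeref").get("ref") for pv in root.iter("physvol")]
print(daughters)  # ['Sensor']
```

Because the format is plain XML, any application with an XML parser can act as a reader, which is exactly the interchange property the abstract emphasises.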

  2. Atmospheric mercury simulation using the CMAQ model: formulation description and analysis of wet deposition results

    Science.gov (United States)

    Bullock, O. Russell; Brehme, Katherine A.

    The community multiscale air quality (CMAQ) modeling system has been adapted to simulate the emission, transport, transformation and deposition of atmospheric mercury (Hg) in three distinct forms: elemental Hg gas, reactive gaseous Hg, and particulate Hg. Emissions of Hg are currently defined from information published in the Environmental Protection Agency's Mercury Study Report to Congress. The atmospheric transport of these three forms of Hg is simulated in the same manner as for all other substances simulated by the CMAQ model to date. Transformations of Hg are simulated with four new chemical reactions within the standard CMAQ gaseous chemistry framework and a highly modified cloud chemistry mechanism which includes a compound-specific speciation for oxidized forms of Hg, seven new aqueous-phase Hg reactions, six aqueous Hg chemical equilibria, and a two-way mechanism for the sorption of dissolved oxidized Hg to elemental carbon particles. The CMAQ Hg model simulates the partitioning of reactive gaseous Hg between air and cloud water based on the Henry's constant for mercuric chloride. Henry's equilibrium is assumed for elemental Hg also. Particulate Hg is assumed to be incorporated into the aqueous medium during cloud nucleation. Wet and dry deposition is simulated for each of the three forms of Hg. Wet deposition rate is calculated based on precipitation information from the CMAQ meteorological processor and the physicochemical Hg speciation in the cloud chemistry mechanism. Dry deposition rate is calculated based on dry deposition velocity and air concentration information for each of the three forms of Hg. The horizontal modeling domain covers the central and eastern United States and adjacent southern Canada. An analysis of simulated Hg wet deposition versus weekly observations is performed. The results are described for two evaluation periods: 4 April-2 May 1995, and 20 June-18 July 1995.
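The gas/cloud-water partitioning step described above reduces to a Henry's-law equilibrium. The sketch below uses illustrative literature-style Henry's constants and a typical liquid water content, not CMAQ's internal data, to show why reactive gaseous Hg (mercuric chloride) dissolves almost completely while elemental Hg stays in the gas phase.

```python
# Sketch of Henry's-law gas/cloud-water partitioning for Hg species.
# K_H values and liquid water content are illustrative assumptions only.
R = 0.08206        # L atm / (mol K)
T = 298.0          # K
LWC = 5.0e-7       # cloud liquid water content, volume fraction (~0.5 g/m3)

K_H = {            # Henry's constants, M/atm (assumed values)
    "elemental Hg": 0.11,
    "mercuric chloride": 1.4e6,
}

def aqueous_fraction(kh, lwc=LWC, temp=T):
    """Fraction of total mass dissolved in cloud water at equilibrium."""
    h_dimensionless = kh * R * temp   # C_aq / C_gas
    x = h_dimensionless * lwc
    return x / (1.0 + x)

for species, kh in K_H.items():
    print(f"{species}: {aqueous_fraction(kh):.4%} in the aqueous phase")
```

With these numbers, mercuric chloride ends up overwhelmingly in the cloud water (hence its dominant role in wet deposition), while elemental Hg remains almost entirely in air.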

  3. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  4. Simulation framework and XML detector description for the CMS experiment

    CERN Document Server

    Arce, P; Boccali, T; Case, M; de Roeck, A; Lara, V; Liendl, M; Nikitenko, A N; Schröder, M; Strässner, A; Wellisch, H P; Wenzel, H

    2003-01-01

Currently CMS event simulation is based on GEANT3, while the detector description is built from different sources for simulation and reconstruction. A new simulation framework based on GEANT4 is under development. A full description of the detector is available, and the tuning of the GEANT4 performance and the checking of the ability of the physics processes to describe the detector response are ongoing. Its integration into the CMS mass production system and GRID is also currently under development. The Detector Description Database project aims at providing a common source of information for Simulation, Reconstruction, Analysis, and Visualisation, while allowing for different representations as well as specific information for each application. A functional prototype, based on XML, has already been released. Examples of the integration of DDD in the GEANT4 simulation and in the reconstruction applications are also provided.

  5. Production Logistics Simulation Supported by Process Description Languages

    Directory of Open Access Journals (Sweden)

    Bohács Gábor

    2016-03-01

Full Text Available Process description languages used in business may also be useful in the optimization of logistics processes. They are an obvious candidate for process control, for handling the main sources of faults, and for providing a correct list of what to do during the logistics process. Related to this, the paper first presents the main features of the common process description languages. The following section describes the process modelling languages currently most used in the areas of production and construction logistics. In addition, the paper gives some examples of logistics simulation, another very important field of logistics system modelling. The main contribution of the paper is logistics simulation supported by process description languages: it compares a Petri net formal representation with a Simul8 model by means of a construction logistics model.
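As a minimal illustration of the Petri-net formalism that the paper compares against Simul8, the sketch below plays the "token game" for a toy construction-logistics process. Places, transitions and markings are invented for illustration; real models would be far richer.

```python
# A minimal Petri-net sketch of a toy logistics process: pallets in a
# warehouse are loaded onto a single truck and delivered. A transition
# fires when every input place holds enough tokens.
marking = {"warehouse": 3, "on_truck": 0, "delivered": 0, "truck_free": 1}

transitions = {
    # name: (input places -> tokens consumed, output places -> tokens produced)
    "load":    ({"warehouse": 1, "truck_free": 1}, {"on_truck": 1}),
    "deliver": ({"on_truck": 1}, {"delivered": 1, "truck_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

# Run the token game until no transition is enabled.
while any(enabled(t) for t in transitions):
    for t in transitions:
        if enabled(t):
            fire(t)

print(marking)  # {'warehouse': 0, 'on_truck': 0, 'delivered': 3, 'truck_free': 1}
```

The formal marking semantics is what lets a Petri net be checked analytically (e.g. for deadlocks), whereas a Simul8 model of the same process is validated by running it.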

  6. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research
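The SED-ML document structure summarised above (which models to use, which simulations to run, which tasks bind them together) can be sketched with the Python standard library. Element and attribute names loosely follow SED-ML Level 1 Version 1, but the model file and identifiers are hypothetical; consult the specification for the real schema.

```python
# Sketch of a SED-ML-style experiment description: a model, a uniform
# time-course simulation, and a task binding the two. Names follow SED-ML
# Level 1 Version 1 loosely; "oscillator.xml" is a hypothetical model file.
import xml.etree.ElementTree as ET

sed = ET.Element("sedML")

models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", id="model1",
              language="urn:sedml:language:sbml",
              source="oscillator.xml")

sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", id="task1",
              modelReference="model1", simulationReference="sim1")

doc = ET.tostring(sed, encoding="unicode")
print(doc)
```

Note that the description says nothing about which solver executes `sim1`; that software-independence is what lets different tools exchange and reproduce the same experiment.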

  7. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  8. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Directory of Open Access Journals (Sweden)

    Waltemath Dagmar

    2011-12-01

Full Text Available Abstract Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. Because SED-ML is agnostic about exact modeling language(s) used

  9. Fast Detector Simulation Using Lelaps, Detector Descriptions in GODL

    Energy Technology Data Exchange (ETDEWEB)

    Langeveld, Willy; /SLAC

    2005-07-06

    Lelaps is a fast detector simulation program which reads StdHep generator files and produces SIO or LCIO output files. It swims particles through detectors taking into account magnetic fields, multiple scattering and dE/dx energy loss. It simulates parameterized showers in EM and hadronic calorimeters and supports gamma conversions and decays. In addition to three built-in detector configurations, detector descriptions can also be read from files in the new GODL file format.

  10. Eportfolios: From description to analysis

    Directory of Open Access Journals (Sweden)

    Gabriella Minnes Brandes

    2008-06-01

Full Text Available In recent years, different professional and academic settings have increasingly utilized ePortfolios to serve multiple purposes, from recruitment to evaluation. This paper analyzes ePortfolios created by graduate students at a Canadian university. It demonstrates how students’ constructions can, and should, be more than a simple compilation of artifacts. It examines an online learning environment in which we shared knowledge, supported one another in knowledge construction, developed collective expertise, and engaged in progressive discourse. In our analysis of the portfolios, we focused on reflection and deepening understanding of learning. We discussed students’ use of metaphors and hypertexts as means of making cognitive connections. We found that when students understood technological tools and how to use them to substantiate their thinking processes and to engage the readers/viewers, their ePortfolios were richer and more complex in their illustrations of learning. With more experience and further analysis of exemplars of existing portfolios, students became more nuanced in the organization of their ePortfolios, reflecting the messages they conveyed. Metaphors and hypertexts became useful vehicles to move away from linearity and chronology to new organizational modes that better illustrated students’ cognitive processes. In such a community of inquiry, developed within an online learning space, the instructor and peers had an important role in enhancing reflection through scaffolding. We conclude the paper with a call to explore the interactions between viewer/reader and the materials presented in portfolios as part of learning occasions.

  11. Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study.

    Science.gov (United States)

    McKenna, Kim D; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann

    2015-01-01

The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders' efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and whether simulation resources are uniform for patients of all ages. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced manikins (71%), live simulated patients (66%), computer-based games and scenarios (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs, with lack of training cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%), and 19% of faculty had no training specific to those manikins. 
Many (78%) respondents felt

  12. Tailored work hardening descriptions in simulation of sheet metal forming

    Science.gov (United States)

    Vegter, Henk; Mulder, Hans.; van Liempt, Peter; Heijne, Jan

    2013-12-01

In previous decades much attention has been given to accurate material description, especially for simulations at the design stage of new models in the automotive industry. Improvements have led to shorter design times and better tailored use of material, and have also contributed to the design and optimization of new materials. The current description of plastic material behaviour in simulation models of sheet metal forming is covered by a hardening curve and a yield surface. In this paper the focus is on modelling of work hardening for advanced high strength steels, considering the requirements of present applications. Nowadays work hardening models need to include the effect of hard phases in a soft matrix and the effect of strain rate and temperature on work hardening. Most material tests used to characterize work hardening are only applicable at low strains, whereas many practical applications require hardening data at relatively high strains. Therefore, physically based hardening descriptions are needed, allowing reliable extensions to high strain values.
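The extrapolation problem can be illustrated with two classical hardening laws, Swift (power law, keeps hardening) and Voce (saturating), plus a simple weighted blend, which is one common pragmatic extension; these are not the paper's own models, and all parameter values below are invented.

```python
# Sketch of the high-strain extrapolation problem: two hardening laws fitted
# to similar low-strain data diverge strongly at forming-level strains.
# All parameter values are invented for illustration (stresses in MPa).
import math

def swift(eps, K=520.0, eps0=0.005, n=0.18):
    """Swift power law: continues to harden at large strain."""
    return K * (eps0 + eps) ** n

def voce(eps, sig_s=420.0, sig_0=180.0, m=12.0):
    """Voce saturation law: stress levels off at sig_s."""
    return sig_s - (sig_s - sig_0) * math.exp(-m * eps)

def blended(eps, alpha=0.5):
    """A simple weighted combination, one common pragmatic extension."""
    return alpha * swift(eps) + (1 - alpha) * voce(eps)

for eps in (0.05, 0.2, 1.0):
    print(f"eps={eps:4.2f}  Swift={swift(eps):6.1f}  "
          f"Voce={voce(eps):6.1f}  blend={blended(eps):6.1f} MPa")
```

The two curves sit close together in the strain range covered by a tensile test but drift far apart near unit strain, which is why physically based descriptions are needed to decide which extrapolation to trust.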

  13. Detector Description Software for Simulation, Reconstruction and Visualisation

    Institute of Scientific and Technical Information of China (English)

Pedro Arce

    2001-01-01

This paper describes software that reads the detector description from tag-based ASCII files and builds an independent detector representation (its only dependency being the CLHEP library) in transient mode. A second package uses this transient representation to build automatically a GEANT4 representation of the detector geometry and materials. The software supports any kind of element and material, including material mixtures built by giving the weight fractions, the volume fractions or the number of atoms of each component. The common solid shapes (box, cube, cone, sphere, polycone, polyhedra, ...) can be used, as well as solids made from a boolean operation of other solids (addition, subtraction and intersection). The geometry volumes can be placed through simple positioning or by positioning several copies following a given formula, so that each copy can have a different position and rotation. Division of a volume along one of its axes is also supported for the basic solid shapes. The Simulation, Reconstruction and Visualisation (ROOT based) software contains no detector data. Instead, this data is always accessed through a unique interface to the GEANT4 detector representation package, which manages the complicated parameterised positionings and divisions and returns GEANT4-independent objects. Both packages have been built following strict Software Engineering practices, which we also describe in the paper. The software has been used for the detector description of the PS214 (HARP) experiment at CERN, which has been taking data since April 2001. With minor modifications it has also been used for the GEANT4 simulation of the CMS experiment.

  14. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.

  15. Theoretical Description and Numerical Simulation of the Hydrodynamic Coupling

    Directory of Open Access Journals (Sweden)

    V. O. Lomakin

    2016-01-01

Full Text Available The article studies and describes the processes in a hydrodynamic coupling during its operation. The hydrodynamic coupling is a type of hydrodynamic transmission that provides a flexible connection between the input and output shafts, in contrast to a mechanical coupling. Fluid couplings are currently in wide use and the theoretical description of their operation was given long ago. However, in Russia these units are not produced, the theoretical model is very simple, and the experimental data are scattered and unsystematized, so the problem is relevant and requires consideration. The research objective is to extend the existing theoretical model to describe fluid coupling operation better, and to compare its results with numerical simulation. The main part of the article contains these sections. The mathematical model presents the equations used for the theoretical description of fluid coupling operation, the basic hydrodynamic equations converted to solve the problem in a stationary setting, and the applied turbulence model (k-ω). The author departs from the standard jet theory, in which the calculation is performed on an average flow filament, in order to take into account the non-uniformity of the velocity distribution in the fluid coupling. The article also raises the issue of the applicability of the stationary formulation of the problem for numerical simulation. The study revealed that the solutions obtained from stationary and non-stationary calculations practically match; the verification was conducted at three points of the characteristic of the hydraulic coupling. The article gives the fluid coupling dimensions, presents an image of its three-dimensional model and of the computational grid, and shows figures illustrating the processes in a fluid coupling obtained by numerical modelling. During the study it was found out that the proposed
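The classical one-dimensional "jet theory" that such work refines can be sketched as Euler's turbomachinery equation evaluated on a single mean flow filament: the torque comes from the change of angular momentum of the circulating fluid between pump and turbine wheels. All geometry and flow numbers below are invented for illustration.

```python
# Sketch of jet-theory torque transfer in a fluid coupling via Euler's
# turbomachinery equation on a mean flow filament. All numbers are
# invented illustrative values, not data from the article.
RHO = 860.0     # working oil density, kg/m^3
Q = 0.02        # circulating volume flow rate, m^3/s   (assumed)
R_OUT = 0.15    # outer radius of the working circuit, m (assumed)
R_IN = 0.08     # inner radius, m                        (assumed)

def coupling_torque(omega_pump, omega_turbine):
    """Torque (N*m) from angular-momentum exchange: fluid leaves the pump
    wheel at R_OUT with the pump's swirl and returns at R_IN with the
    turbine's swirl."""
    return RHO * Q * (omega_pump * R_OUT ** 2 - omega_turbine * R_IN ** 2)

omega_p = 150.0  # pump speed, rad/s
for slip in (0.02, 0.1, 0.5):
    omega_t = omega_p * (1 - slip)
    print(f"slip={slip:4.2f}  torque={coupling_torque(omega_p, omega_t):6.1f} N*m")
```

At equal shaft speeds no torque is transmitted; torque grows with slip, which is the basic shape of a fluid coupling characteristic. Averaging over one filament is precisely the simplification the article replaces with a resolved velocity distribution.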

  16. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  17. Water Quality Analysis Simulation

    Science.gov (United States)

The Water Quality analysis simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution for various pollution management decisions.

  18. LARYNGEAL MALIGNANCY: A RETROSPECTIVE DESCRIPTIVE ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vinod Kumar

    2016-06-01

Full Text Available BACKGROUND Laryngeal cancer is the second most common head and neck cancer in India. The onset, rate of progression and duration of symptoms are variable for supraglottic, glottic and subglottic cancer. Smoking and alcohol are also the most important risk factors for laryngeal cancer. Data on cases of laryngeal cancer in relation to age, sex, symptoms and signs, aetiological factors with special reference to smoking and alcohol, histopathological types, tumour staging, treatment and outcomes are important to assess changing trends in laryngeal cancer treatment. MATERIALS AND METHODS This study is a retrospective descriptive analysis of diagnosed and treated cases of laryngeal cancer in the Department of ENT from 2005 to 2008. A total of fifty patients with laryngeal malignancy were seen from May 2005 to May 2008, with an average of 1 year of follow-up. Data regarding these cases in relation to age, sex, symptoms and signs, aetiological factors with special reference to smoking and alcohol, histopathological types, tumour staging, treatment and outcomes were analysed using SPSS software. All patients who were diagnosed with laryngeal cancer and treated were included in the study. RESULTS In this descriptive analysis, 62% of patients were between 51-70 years. Most of the patients had been symptomatic for 3-5 months; 58% of patients presented with voice change, followed by other complaints such as throat pain, foreign body sensation, otalgia and breathing difficulty. Voice change was distinctly the most common symptom regardless of tumour site. It was more prevalent in glottic cases, but it was also the leading symptom in supraglottic tumours. Glottic tumours were more often found at an early stage, and patients with a supraglottic tumour presented more often with neck node metastasis. CONCLUSION Laryngeal malignancy is one of the head and neck malignancies, which are more common in males. Tobacco is an important risk factor in causation of the

  19. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language

    National Research Council Canada - National Science Library

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-01-01

    .... In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments...

  20. Multifractal Description of Simulated Flow Velocity in Idealised Porous Media by Using the Sandbox Method

    Science.gov (United States)

    Jiménez-Hornero, Francisco J.; Ariza-Villaverde, Ana B.; de Ravé, Eduardo Gutiérrez

    2013-03-01

The spatial description of flows in porous media is a main issue due to their influence on the processes that take place inside. In addition to descriptive statistics, multifractal analysis based on the Box-Counting fixed-size method has been used during the last decade to study some porous media features. However, this method gives emphasis to domain regions containing few data points, which leads to a biased assessment of the generalized fractal dimensions for negative moment orders. This circumstance is relevant when describing the flow velocity field in idealised three-dimensional porous media. The application of the Sandbox method is explored in this work as an alternative to the Box-Counting procedure for analyzing flow velocity magnitudes simulated with the lattice model approach for six media with different porosities. According to the results, simulated flows have multiscaling behaviour. The multifractal spectra obtained with the Sandbox method reveal more heterogeneity, as well as the presence of some extreme values in the distribution of high flow velocity magnitudes, as porosity decreases. This situation is not so evident in the multifractal spectra estimated with the Box-Counting method. As a consequence, the description of the influence of porous media structure on flow velocity distribution provided by the Sandbox method improves on the results obtained with the Box-Counting procedure.
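A minimal sketch of the Sandbox method on a synthetic point set shows why it avoids the empty-region bias: centres are sampled from the set itself (rather than gridding the whole domain as box counting does), and the generalized dimension follows from how the average neighbourhood mass scales with radius. The point set, radii and counts below are illustrative choices; a uniform 2D cloud should give a dimension close to 2.

```python
# Sandbox-method sketch: sample centres FROM the point set, measure the
# average mass M(r) inside radius r, and fit D_q from the scaling
# log<M^(q-1)> ~ D_q * (q-1) * log r. Parameters are illustrative.
import math
import random

random.seed(1)
pts = [(random.random(), random.random()) for _ in range(3000)]  # uniform cloud

def mass(center, r):
    """Fraction of all points within radius r of the centre."""
    cx, cy = center
    n = sum((x - cx) ** 2 + (y - cy) ** 2 <= r * r for x, y in pts)
    return n / len(pts)

def sandbox_dimension(q=2.0, radii=(0.05, 0.1, 0.2), n_centers=200):
    xs, ys = [], []
    for r in radii:
        avg = sum(mass(random.choice(pts), r) ** (q - 1)
                  for _ in range(n_centers)) / n_centers
        xs.append((q - 1) * math.log(r))
        ys.append(math.log(avg))
    # Least-squares slope of log<M^(q-1)> against (q-1) * log r.
    mx = sum(xs) / len(xs)
    my = sum(ys) / len(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

print(f"sandbox D_2 of a uniform 2D cloud: {sandbox_dimension():.2f}")
```

Because every sandbox is centred on a point of the set, empty regions never enter the average, which is what stabilises the negative-moment-order dimensions that box counting estimates poorly.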

  1. Generating a 3D Simulation of a Car Accident from a Formal Description: The CarSim System

    NARCIS (Netherlands)

    Egges, A.; Nijholt, A.; Nugues, P.

    2001-01-01

    The problem of generating a 3D simulation of a car accident from a written description can be divided into two subtasks: the linguistic analysis and the virtual scene generation. As a means of communication between these two system parts, we designed a template formalism to represent a written

  2. Exploiting spatial descriptions in visual scene analysis.

    Science.gov (United States)

    Ziegler, Leon; Johannsen, Katrin; Swadzba, Agnes; De Ruiter, Jan P; Wachsmuth, Sven

    2012-08-01

    The reliable automatic visual recognition of indoor scenes with complex object constellations using only sensor data is a nontrivial problem. In order to improve the construction of an accurate semantic 3D model of an indoor scene, we exploit human-produced verbal descriptions of the relative location of pairs of objects. This requires the ability to deal with different spatial reference frames (RF) that humans use interchangeably. In German, both the intrinsic and relative RF are used frequently, which often leads to ambiguities in referential communication. We assume that there are certain regularities that help in specific contexts. In a first experiment, we investigated how speakers of German describe spatial relationships between different pieces of furniture. This gave us important information about the distribution of the RFs used for furniture-predicate combinations, and by implication also about the preferred spatial predicate. The results of this experiment are compiled into a computational model that extracts partial orderings of spatial arrangements between furniture items from verbal descriptions. In the implemented system, the visual scene is initially scanned by a 3D camera system. From the 3D point cloud, we extract point clusters that suggest the presence of certain furniture objects. We then integrate the partial orderings extracted from the verbal utterances incrementally and cumulatively with the estimated probabilities about the identity and location of objects in the scene, and also estimate the probable orientation of the objects. This allows the system to significantly improve both the accuracy and richness of its visual scene representation.

  3. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We claim that both sections, Sects. 2–3, contribute to a methodology of software engineering.

  4. Thorough Behavioral Analysis and Description: Its Relationship to Social Planning

    Science.gov (United States)

    Whol, Theodore H.

    1974-01-01

    A case is made for the utility and necessity of thorough behavioral analysis and description of developmentally disabled children using Wolfenberger's "Five Embarrassments in the Diagnostic Process" as a point of departure. (DB)

  5. Description and simulation of physics of Resistive Plate Chambers

    Science.gov (United States)

    Français, V.

    2016-05-01

    Monte-Carlo simulation of physical processes is an important tool for detector development, as it allows prediction of signal pulse amplitude and timing, time resolution, efficiency, and other properties. Yet, although RPCs are very common, full simulations of RPC-like detectors are not widespread and are often incomplete: they tend to rely on mathematical distributions that are not suited to this particular modelling task, and to over-simplify or neglect important physical processes. We describe the main physical processes occurring inside an RPC when a charged particle passes through it (ionisation, electron drift and multiplication, signal induction, etc.) using the Riegler-Lippmann-Veenhof model, together with a simulation still under development. This is a full, fast, multi-threaded Monte-Carlo model of the main physical processes, built on existing and well-tested libraries and frameworks (such as the Garfield++ framework and the GNU Scientific Library). It is developed in the hope of providing a foundation for future RPC simulation developments.
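
    The electron-multiplication step mentioned in the abstract can be illustrated with a toy Monte-Carlo sketch. This is not the Riegler-Lippmann-Veenhof model itself; the step count and multiplication probability below are purely illustrative. At each drift step, every electron in the avalanche duplicates with some probability, so the avalanche grows exponentially on average while individual avalanches fluctuate strongly:

```python
import random

random.seed(1)

def avalanche_size(n_steps=20, p_mult=0.3):
    """Toy avalanche: each electron duplicates with probability p_mult
    at every drift step (attachment and space charge neglected)."""
    electrons = 1
    for _ in range(n_steps):
        electrons += sum(1 for _ in range(electrons) if random.random() < p_mult)
    return electrons

sizes = [avalanche_size() for _ in range(200)]
mean_size = sum(sizes) / len(sizes)
# the mean grows like (1 + p_mult) ** n_steps, roughly 190 in expectation
# here, while individual avalanches fluctuate by an order of magnitude
```

    The large avalanche-to-avalanche fluctuation is exactly why a deterministic gain formula is insufficient and a stochastic, event-by-event treatment is used in full RPC simulations.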

  6. Screening Analysis : Volume 1, Description and Conclusions.

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville Power Administration; Corps of Engineers; Bureau of Reclamation

    1992-08-01

    The SOR consists of three analytical phases leading to a Draft EIS. The first phase, Pilot Analysis, was performed for the purpose of testing the decision analysis methodology being used in the SOR. The Pilot Analysis is described later in this chapter. The second phase, Screening Analysis, examines all possible operating alternatives using a simplified analytical approach. It is described in detail in this and the next chapter. This document also presents the results of screening. The final phase, Full-Scale Analysis, will be documented in the Draft EIS and is intended to evaluate comprehensively the few best alternatives arising from the screening analysis. The purpose of screening is to analyze a wide variety of different ways of operating the Columbia River system to test the reaction of the system to change. The many alternatives considered reflect the range of needs and requirements of the various river users and interests in the Columbia River Basin. While some of the alternatives might be viewed as extreme, the information gained from the analysis is useful in highlighting issues and conflicts in meeting operating objectives. Screening is also intended to develop a broad technical basis for evaluation, involving regional experts, and to begin developing an evaluation capability for each river use that will support full-scale analysis. Finally, screening provides a logical method for examining all possible options and reaching a decision on a few alternatives worthy of full-scale analysis. An organizational structure was developed and staffed to manage and execute the SOR, specifically during the screening phase and the upcoming full-scale analysis phase. The organization involves ten technical work groups, each representing a particular river use. Several other groups exist to oversee or support the efforts of the work groups.

  7. Description and simulation of physics of Resistive Plate Chambers

    CERN Document Server

    Français, Vincent

    2016-01-01

    Monte-Carlo simulation of physical processes is an important tool for detector development, as it allows prediction of signal pulse amplitude and timing, time resolution, efficiency, and other properties. Yet, although RPCs are very common, full simulations of RPC-like detectors are not widespread and are often incomplete: they tend to rely on mathematical distributions that are not suited to this particular modelling task, and to over-simplify or neglect important physical processes. We describe the main physical processes occurring inside an RPC when a charged particle passes through it (ionisation, electron drift and multiplication, signal induction, etc.) using the Riegler-Lippmann-Veenhof model, together with a simulation still under development. This is a full, fast, multi-threaded Monte-Carlo model of the main physical processes, built on existing and well-tested libraries and frameworks (such as the Garfield++ framework and the GNU Scientific Library). It is developed in the hope of providing a foundation for future RPC simulat...

  8. Health leaders of tomorrow: a descriptive analysis.

    Science.gov (United States)

    Levey, L M; Lane, M S; Baretich, M; Levey, S

    1989-01-01

    The dramatically changing environment of the health care executive prompted the current survey of graduate students in health administration. The survey examined the types of attitudes and values that may influence students' future leadership style and ethical decision making. A self-administered questionnaire was designed to provide descriptive information as well as to allow comparisons with recent surveys of practitioners and peers and with other research. Respondents to the survey, conducted in the fall of 1986, included nearly half of the full-time students in 56 participating AUPHA graduate programs and one-quarter of the part-time students (N = 1,764). Students' characteristics such as age, sex, religious preference, work experience, and career aspirations were assessed in relation to their social philosophy on health issues, instrumental and terminal values, attitudes toward achievement, and degree of idealism versus relativism in moral reasoning. The typical student was a twenty-seven-year-old white female with a stated religious preference who expected in ten years to be associated with a multi-institutional system or consulting firm. Her social philosophy showed concern about the impact of rising health care costs on the consumer and an emphasis on self-help. Self-respect and honesty are her highest values; her ethical ideology had components of high idealism and relativism, which is indicative of a situational decision-making style. Work orientation and mastery were both above average sources of achievement motivation for her, whereas competitiveness was about average. When group differences in attitudes and values were evaluated, however, sex was the strongest predictor, followed by age, expectation of becoming a CEO, and self-assessed potential for success. Although there was substantial agreement between students' and practitioners' attitudes and values, some differences were found. Implications for future research are discussed, as well as issues relating to

  9. Descriptive Topology in Selected Topics of Functional Analysis

    CERN Document Server

    Kakol, J; Pellicer, Manuel Lopez

    2011-01-01

    "Descriptive Topology in Selected Topics of Functional Analysis" is a collection of recent developments in the field of descriptive topology, specifically focused on the classes of infinite-dimensional topological vector spaces that appear in functional analysis. Such spaces include Frechet spaces, (LF)-spaces and their duals, and the space of continuous real-valued functions C(X) on a completely regular Hausdorff space X, to name a few. These vector spaces appear in functional analysis in distribution theory, differential equations, complex analysis, and various other analytical set

  10. Description of waste pretreatment and interfacing systems dynamic simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

    The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high level and low level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low level and high level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately 6 new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and retrieval schedule appear to have a significant impact on overall tank usage.
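
    The kind of rate-balance question such a model answers can be sketched with a toy inventory simulation. The rates and tank capacity below are hypothetical illustrations, not values from the report: waste retrieved faster than pretreatment can process it accumulates in buffer tanks, and the peak inventory determines how many tanks are needed.

```python
import math

def simulate_inventory(retrieval_rate, pretreatment_rate, tank_capacity, years, dt=0.1):
    """Track waste accumulating between retrieval and pretreatment;
    return peak inventory and the tank count needed to hold it."""
    inventory = peak = 0.0
    for _ in range(int(years / dt)):
        inventory = max(inventory + (retrieval_rate - pretreatment_rate) * dt, 0.0)
        peak = max(peak, inventory)
    return peak, (math.ceil(peak / tank_capacity) if peak > 0 else 0)

# If pretreatment keeps pace with retrieval, no new tanks are required;
# a 20 unit/year shortfall over 10 years must be buffered instead.
peak, tanks = simulate_inventory(100.0, 80.0, tank_capacity=75.0, years=10)
```

    The actual model adds the downstream vitrification step, recycle streams, and schedule constraints, but the conclusion that "reasonable facility processing rates can be selected so that no new DSTs are required" is exactly this kind of balance condition.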

  11. The stellar atmosphere simulation code Bifrost. Code description and validation

    NARCIS (Netherlands)

    Gudiksen, B.V.; Carlsson, M.; Hansteen, V.H.; Hayek, W.; Leenaarts, J.|info:eu-repo/dai/nl/304837946; Martínez-Sykora, J.

    2011-01-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere,

  12. COPD phenotype description using principal components analysis

    DEFF Research Database (Denmark)

    Roy, Kay; Smith, Jacky; Kolsum, Umme

    2009-01-01

    BACKGROUND: Airway inflammation in COPD can be measured using biomarkers such as induced sputum and Fe(NO). This study set out to explore the heterogeneity of COPD using biomarkers of airway and systemic inflammation and pulmonary function by principal components analysis (PCA). SUBJECTS AND METHODS: In 127 COPD patients (mean FEV1 61%), pulmonary function, Fe(NO), plasma CRP and TNF-alpha, sputum differential cell counts and sputum IL8 (pg/ml) were measured. Principal components analysis as well as multivariate analysis was performed. RESULTS: PCA identified four main components (% variance ... associations between the variables within components 1 and 2. CONCLUSION: COPD is a multi-dimensional disease. Unrelated components of disease were identified, including neutrophilic airway inflammation, which was associated with systemic inflammation, and sputum eosinophils, which were related to increased Fe...
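
    The PCA step used in the study can be sketched as follows. The data here are entirely synthetic, with an invented correlation structure loosely mimicking the reported neutrophil/systemic-inflammation and eosinophil/Fe(NO) associations; only the PCA mechanics (eigen-decomposition of the correlation matrix of standardized biomarkers) are generic.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 127  # same cohort size as the study; the values are synthetic

neutrophils = rng.normal(size=n)
crp = 0.8 * neutrophils + 0.6 * rng.normal(size=n)    # systemic marker tracking neutrophils
eosinophils = rng.normal(size=n)
feno = 0.8 * eosinophils + 0.6 * rng.normal(size=n)   # Fe(NO) tracking eosinophils
fev1 = rng.normal(size=n)                             # unrelated lung-function axis

X = np.column_stack([neutrophils, crp, eosinophils, feno, fev1])

# PCA = eigen-decomposition of the correlation matrix of the variables
corr = np.corrcoef(X, rowvar=False)
eigvals = np.linalg.eigvalsh(corr)[::-1]   # eigenvalues, sorted descending
explained = eigvals / eigvals.sum()        # fraction of variance per component
print(np.round(explained, 2))
```

    Each correlated biomarker pair concentrates variance into its own leading component, which is how PCA separates "unrelated components of disease" as the study describes.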

  13. A Descriptive Guide to Trade Space Analysis

    Science.gov (United States)

    2015-09-01

    with analysts who had participated in some of the trades work reviewed for this study. The intent of this section is to describe the nature of the ... Finding the balance between all three of these components is what drives decision outcomes. The purpose of trade space analysis is to understand the

  14. The stellar atmosphere simulation code Bifrost. Code description and validation

    Science.gov (United States)

    Gudiksen, B. V.; Carlsson, M.; Hansteen, V. H.; Hayek, W.; Leenaarts, J.; Martínez-Sykora, J.

    2011-07-01

    Context. Numerical simulations of stellar convection and photospheres have been developed to the point where detailed shapes of observed spectral lines can be explained. Stellar atmospheres are very complex, and very different physical regimes are present in the convection zone, photosphere, chromosphere, transition region and corona. To understand the details of the atmosphere it is necessary to simulate the whole atmosphere since the different layers interact strongly. These physical regimes are very diverse and it takes a highly efficient massively parallel numerical code to solve the associated equations. Aims: The design, implementation and validation of the massively parallel numerical code Bifrost for simulating stellar atmospheres from the convection zone to the corona. Methods: The code is subjected to a number of validation tests, among them the Sod shock tube test, the Orszag-Tang colliding shock test, boundary condition tests and tests of how the code treats magnetic field advection, chromospheric radiation, radiative transfer in an isothermal scattering atmosphere, hydrogen ionization and thermal conduction. Results: Bifrost completes the tests with good results and shows near linear efficiency scaling to thousands of computing cores.

  15. Description and Analysis Pattern for Theses and Dissertations

    Directory of Open Access Journals (Sweden)

    Sirous Alidousti

    2009-07-01

    Full Text Available Dissertations and theses generated in the course of research at the PhD and Master's levels are considered important scientific documents in every country. Description and analysis of the data in such documents, taken together, could automatically provide valuable new information, especially when compared with data from other resources. Nevertheless, no comprehensive, integrated pattern exists for such description and analysis. The present paper offers the findings of research conducted to devise an information analysis pattern for dissertations and theses. It also puts forward information categories, derived from such documents, that could be described and analyzed.

  16. Thermal conductance of carbon nanotube contacts: Molecular dynamics simulations and general description of the contact conductance

    Science.gov (United States)

    Salaway, Richard N.; Zhigilei, Leonid V.

    2016-07-01

    The contact conductance of carbon nanotube (CNT) junctions is the key factor that controls the collective heat transfer through CNT networks or CNT-based materials. An improved understanding of the dependence of the intertube conductance on the contact structure and local environment is needed for predictive computational modeling or theoretical description of the effective thermal conductivity of CNT materials. To investigate the effect of local structure on the thermal conductance across CNT-CNT contact regions, nonequilibrium molecular dynamics (MD) simulations are performed for different intertube contact configurations (parallel fully or partially overlapping CNTs and CNTs crossing each other at different angles) and local structural environments characteristic of CNT network materials. The results of MD simulations predict a stronger CNT length dependence present over a broader range of lengths than has been previously reported and suggest that the effect of neighboring junctions on the conductance of CNT-CNT junctions is weak and only present when the CNTs that make up the junctions are within the range of direct van der Waals interaction with each other. A detailed analysis of the results obtained for a diverse range of intertube contact configurations reveals a nonlinear dependence of the conductance on the contact area (or number of interatomic intertube interactions) and suggests larger contributions to the conductance from areas of the contact where the density of interatomic intertube interactions is smaller. An empirical relation accounting for these observations and expressing the conductance of an arbitrary contact configuration through the total number of interatomic intertube interactions and the average number of interatomic intertube interactions per atom in the contact region is proposed. The empirical relation is found to provide a good quantitative description of the contact conductance for various CNT configurations investigated in the MD

  17. Physics Detector Simulation Facility Phase II system software description

    Energy Technology Data Exchange (ETDEWEB)

    Scipioni, B.; Allen, J.; Chang, C.; Huang, J.; Liu, J.; Mestad, S.; Pan, J.; Marquez, M.; Estep, P.

    1993-05-01

    This paper presents the Physics Detector Simulation Facility (PDSF) Phase II system software. A key element in the design of a distributed computing environment for the PDSF has been the separation and distribution of the major functions. The facility has been designed to support batch and interactive processing, and to incorporate the file and tape storage systems. By distributing these functions, it is often possible to provide higher throughput and resource availability. Similarly, the design is intended to exploit event-level parallelism in an open distributed environment.

  18. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    Directory of Open Access Journals (Sweden)

    Mikael eDjurfeldt

    2014-04-01

    Full Text Available Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeller's needs. We have used the connection generator interface to connect C++ and Python implementations of the connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modelling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface.
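
    The idea of such an interface can be sketched in a few lines. This is a hypothetical stand-in, not the actual NEST or CSA API: the library side exposes an iterable of (source, target) pairs, and the simulator consumes it without knowing which connectivity library produced it.

```python
import random
from itertools import product

class RandomConnections:
    """Library side: a connection generator yielding (source, target)
    pairs for a fixed-probability (Bernoulli) connectivity pattern."""
    def __init__(self, n_sources, n_targets, p, seed=0):
        self.n_sources, self.n_targets, self.p, self.seed = n_sources, n_targets, p, seed

    def __iter__(self):
        rng = random.Random(self.seed)
        for s, t in product(range(self.n_sources), range(self.n_targets)):
            if rng.random() < self.p:
                yield (s, t)

def build_network(generator):
    """Simulator side: instantiate whatever connections the generator yields,
    with no knowledge of the generating library."""
    return list(generator)

connections = build_network(RandomConnections(10, 10, p=0.1))
```

    Swapping in a different connectivity library then amounts to passing a different iterable; the simulator-side loop is untouched, which is the decoupling the paper's interface provides.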

  19. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions.

    Science.gov (United States)

    Djurfeldt, Mikael; Davison, Andrew P; Eppler, Jochen M

    2014-01-01

    Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another, according to the modeler's needs. We have used the connection generator interface to connect C++ and Python implementations of the previously described connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modeling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface.

  20. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased interest in the accuracy of their thermomechanical description. This work presents an uncertainty analysis of experimental tensile tests conducted on shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  1. The electricity portfolio simulation model (EPSim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Drennen, Thomas E.; Klotz, Richard (Hobart and William Smith Colleges, Geneva, NY)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first-cut, dynamic methodology and approach to this problem, one that can subsequently be refined and validated, and that may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs, and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic the Department of Energy's (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
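
    The AHP weighting step can be sketched as follows. The pairwise judgments below are invented for illustration; AHP itself prescribes taking the normalized principal eigenvector of a reciprocal pairwise-comparison matrix as the criterion weights.

```python
import numpy as np

# Pairwise comparison matrix for three of EPSim's criteria on Saaty's
# 1-9 scale (illustrative judgments): entry [i, j] says how strongly
# criterion i is preferred over criterion j; reciprocal (A[j, i] = 1/A[i, j]).
criteria = ["cost", "environmental impacts", "energy dependence"]
A = np.array([
    [1.0, 3.0, 5.0],
    [1 / 3, 1.0, 2.0],
    [1 / 5, 1 / 2, 1.0],
])

# AHP weights: normalized principal eigenvector of A
eigvals, eigvecs = np.linalg.eig(A)
principal = eigvecs[:, np.argmax(eigvals.real)].real
weights = principal / principal.sum()
print(dict(zip(criteria, np.round(weights, 3))))
```

    A portfolio's overall score is then the weight-sum of its per-criterion scores, which is how differently weighted stakeholder priorities produce different preferred portfolios.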

  2. The Hydrogen Futures Simulation Model (H2Sim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Scott A.; Kamery, William; Baker, Arnold Barry; Drennen, Thomas E.; Lutz, Andrew E.; Rosthal, Jennifer Elizabeth

    2004-10-01

    Hydrogen has the potential to become an integral part of our energy, transportation, and heat and power sectors in the coming decades and offers a possible solution to many of the problems associated with a heavy reliance on oil and other fossil fuels. The Hydrogen Futures Simulation Model (H2Sim) was developed to provide a high level, internally consistent, strategic tool for evaluating the economic and environmental trade-offs of alternative hydrogen production, storage, transport and end use options in the year 2020. Based on the model's default assumptions, estimated hydrogen production costs range from 0.68 $/kg for coal gasification to as high as 5.64 $/kg for centralized electrolysis using solar PV. Coal gasification remains the least cost option if carbon capture and sequestration costs ($0.16/kg) are added. This result is fairly robust; for example, assumed coal prices would have to more than triple or the assumed capital cost would have to increase by more than 2.5 times for natural gas reformation to become the cheaper option. Alternatively, assumed natural gas prices would have to fall below $2/MBtu to compete with coal gasification. The electrolysis results are highly sensitive to electricity costs, but electrolysis only becomes cost competitive with other options when electricity drops below 1 cent/kWh. Delivered 2020 hydrogen costs are likely to be double the estimated production costs due to the inherent difficulties associated with storing, transporting, and dispensing hydrogen due to its low volumetric density. H2Sim estimates distribution costs ranging from 1.37 $/kg (low distance, low production) to 3.23 $/kg (long distance, high production volumes, carbon sequestration). Distributed hydrogen production options, such as on site natural gas, would avoid some of these costs. H2Sim compares the expected 2020 per mile driving costs (fuel, capital, maintenance, license, and registration) of current technology internal combustion engine (ICE
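
    The abstract's figures make the delivered-cost claim easy to check; all $/kg values below are quoted from the abstract, and only the arithmetic is ours.

```python
# Production and distribution costs in $/kg, as quoted in the abstract
coal_production = 0.68
sequestration = 0.16
distribution_low, distribution_high = 1.37, 3.23

low = coal_production + sequestration + distribution_low
high = coal_production + sequestration + distribution_high
print(f"coal-based H2 delivered: {low:.2f} to {high:.2f} $/kg")
# roughly 2.6x to 4.8x the bare production-plus-sequestration cost,
# consistent with distribution dominating the delivered price
```

    This is why the report emphasizes that delivered costs are likely to be at least double production costs, and why distributed (on-site) production, which avoids part of the distribution chain, is attractive.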

  3. Simulating microbial denitrification with EPIC: Model description and initial testing

    Energy Technology Data Exchange (ETDEWEB)

    Izaurralde, Roberto C.; Mcgill, William B.; Williams, Jimmy R.; Jones, Curtis D.; Link, Robert P.; Manowitz, D.; Schwab, D. E.; Zhang, Xuesong; Robertson, G. P.; Milar, Neville

    2017-09-01

    fertilization. Although similar in magnitude, daily and cumulative simulated N2O fluxes followed a linear trend instead of the observed exponential trend. Further model testing of EPIC+IMWJ, alone or in ensembles with other models, using data from comprehensive experiments will be essential to discover areas of model improvement and increase the accuracy of N2O predictions under a wide range of environmental conditions.

  4. Visual unit analysis: a descriptive approach to landscape assessment

    Science.gov (United States)

    R. J. Tetlow; S. R. J. Sheppard

    1979-01-01

    Analysis of the visible attributes of landscapes is an important component of the planning process. When landscapes are at regional scale, economical and effective methodologies are critical. The Visual Unit concept appears to offer a logical and useful framework for description and evaluation. The concept subdivides landscape into coherent, spatially-defined units....

  5. Performance Analysis Based on Timing Simulation

    DEFF Research Database (Denmark)

    Nielsen, Christian Dalsgaard; Kishinevsky, Michael

    1994-01-01

    Determining the cycle time and a critical cycle is a fundamental problem in the analysis of concurrent systems. We solve this problem using timing simulation of an underlying Signal Graph (an extension of Marked Graphs). For a Signal Graph with n vertices and m arcs our algorithm has the polynomial time complexity O(b²m), where b is the number of vertices with initially marked in-arcs (typically b≪n). The algorithm has a clear semantics and a low descriptive complexity. We illustrate the use of the algorithm by applying it to performance analysis of asynchronous circuits.
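
    The underlying idea, estimating the cycle time by timing simulation of a marked graph until the firing period settles, can be sketched generically. This is not the authors' algorithm, just a naive event-driven version for intuition: arcs carry delays and initial tokens, and a vertex fires when every in-arc holds a token.

```python
from collections import defaultdict, deque

def cycle_time(arcs, watch, rounds=50):
    """Estimate the steady-state firing period of vertex `watch` in a
    marked graph given as arcs (src, dst, delay, initial_tokens)."""
    in_arcs, out_arcs, tokens = defaultdict(list), defaultdict(list), {}
    for i, (src, dst, delay, m0) in enumerate(arcs):
        out_arcs[src].append(i)
        in_arcs[dst].append(i)
        tokens[i] = deque([0.0] * m0)  # token-ready times on this arc
    vertices = sorted(set(in_arcs) | set(out_arcs))
    fire_times = []
    for _ in range(rounds * len(vertices)):
        for v in vertices:  # fire the first enabled vertex found
            if in_arcs[v] and all(tokens[i] for i in in_arcs[v]):
                t = max(tokens[i][0] for i in in_arcs[v])
                for i in in_arcs[v]:
                    tokens[i].popleft()
                for i in out_arcs[v]:
                    tokens[i].append(t + arcs[i][2])
                if v == watch:
                    fire_times.append(t)
                break
    # steady-state period between successive firings of the watched vertex
    return fire_times[-1] - fire_times[-2]

# two-vertex loop: the critical cycle has total delay 3.0 and one token,
# so the simulated firing period converges to a cycle time of 3.0
period = cycle_time([("a", "b", 2.0, 0), ("b", "a", 1.0, 1)], watch="a")
```

    The cycle time of a marked graph is the maximum over all cycles of (total delay on the cycle) / (tokens on the cycle); simulation recovers it as the asymptotic firing period without enumerating cycles.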

  6. Work hardening descriptions in simulation of sheet metal forming tailored to material type and processing

    NARCIS (Netherlands)

    Vegter, Henk; Mulder, Hans; Liempt, van Peter; Heijne, Jan

    2016-01-01

    In the previous decades much attention has been given to an accurate material description, especially for simulations at the design stage of new models in the automotive industry. Improvements lead to shorter design times and a better-tailored use of material. It also contributes to the design and o

  7. Information Extraction to Generate Visual Simulations of Car Accidents from Written Descriptions

    NARCIS (Netherlands)

    Nugues, P.; Dupuy, S.; Egges, A.

    2003-01-01

    This paper describes a system to create animated 3D scenes of car accidents from written reports. The text-to-scene conversion process consists of two stages. An information extraction module creates a tabular description of the accident and a visual simulator generates and animates the scene. We

  8. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se

  9. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.
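
    The Monte Carlo scatter study described above can be sketched in miniature. A hypothetical closed-form response stands in for the finite element model, and the nominal values and scatter magnitudes are invented: scatter the design inputs, run the model on each sample, and examine the spread of the response.

```python
import random
import statistics

random.seed(42)

def response(thickness, load):
    """Stand-in for a finite element run: stress rises with load and
    falls with the square of panel thickness."""
    return load / thickness ** 2

# Scatter the design inputs around their nominal values and collect the
# response, as a Monte Carlo tolerance study does with the baseline model.
samples = [
    response(random.gauss(2.0, 0.1), random.gauss(100.0, 10.0))
    for _ in range(10_000)
]

mean = statistics.mean(samples)
spread = statistics.stdev(samples)
print(f"response scatter: {mean:.1f} +/- {spread:.1f}")
```

    Correlating (or regressing) the samples against each input then ranks which input's scatter most influences the response, which is the cause-and-effect insight the tool is designed to give.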

  10. Feasibility study for a numerical aerodynamic simulation facility. Volume 2: Hardware specifications/descriptions

    Science.gov (United States)

    Green, F. M.; Resnick, D. R.

    1979-01-01

    An FMP (Flow Model Processor) was designed for use in the Numerical Aerodynamic Simulation Facility (NASF). The NASF was developed to simulate fluid flow over three-dimensional bodies in wind tunnel environments and in free space. The facility is applicable to studying aerodynamic and aircraft body designs. The following general topics are discussed in this volume: (1) FMP functional computer specifications; (2) FMP instruction specification; (3) standard product system components; (4) loosely coupled network (LCN) specifications/description; and (5) three appendices: performance of trunk allocation contention elimination (trace) method, LCN channel protocol and proposed LCN unified second level protocol.

  11. Description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory test model

    Science.gov (United States)

    Woolley, C. T.; Groom, N. J.

    1981-01-01

    A description of a digital computer simulation of an Annular Momentum Control Device (AMCD) laboratory model is presented. The AMCD is a momentum exchange device which is under development as an advanced control effector for spacecraft attitude control systems. The digital computer simulation of this device incorporates the following models: six degree of freedom rigid body dynamics; rim warp; controller dynamics; nonlinear distributed element axial bearings; as well as power driver and power supply current limits. An annotated FORTRAN IV source code listing of the computer program is included.

  12. Analysis of laparoscopic port site complications: A descriptive study

    Directory of Open Access Journals (Sweden)

    Somu Karthik

    2013-01-01

    Context: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. Aims: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Settings and Design: Prospective descriptive study. Materials and Methods: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively, and the complications were reviewed. Statistical Analysis Used: Descriptive statistical analysis was carried out in the present study. The statistical software SPSS 15.0 was used for the analysis of the data. Results: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2, 0.35%), and port site metastasis (n = 1, 0.175%). Conclusions: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  13. Predicate Argument Structure Analysis for Use Case Description Modeling

    Science.gov (United States)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  14. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    Science.gov (United States)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context. Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order of 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines, one that makes it possible, for the first time, to investigate sophisticated geometries. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
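
The mean-free-path interpolation scheme mentioned above (precomputed tables feeding the MC propagation step) can be illustrated with a small sketch. The table values, energies, and the single-energy test are invented, not the paper's actual tables:

```python
import bisect
import math
import random

# Hypothetical precomputed table: photon energy (keV) -> mean free path (cm).
energies = [10.0, 20.0, 30.0, 40.0, 50.0]
mfp_table = [0.5, 1.2, 2.0, 3.1, 4.5]

def mean_free_path(e):
    # Linear interpolation between tabulated values, standing in for the
    # fast table lookup the paper describes (Paper I of the series).
    i = bisect.bisect_left(energies, e)
    i = min(max(i, 1), len(energies) - 1)
    e0, e1 = energies[i - 1], energies[i]
    m0, m1 = mfp_table[i - 1], mfp_table[i]
    return m0 + (m1 - m0) * (e - e0) / (e1 - e0)

def sample_path_length(e, rng):
    # Propagation length between scatterings is exponentially distributed
    # with the interpolated mean free path as its mean.
    return -mean_free_path(e) * math.log(rng.random())

rng = random.Random(1)
lengths = [sample_path_length(25.0, rng) for _ in range(100_000)]
print(sum(lengths) / len(lengths))  # ≈ 1.6, the interpolated mean at 25 keV
```

The point of tabulating and interpolating is that the expensive cross-section integrals are paid once, while each of the millions of MC propagation steps costs only a lookup.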

  15. Descriptive analysis of flavor characteristics for black walnut cultivars.

    Science.gov (United States)

    Miller, Ashley E; Chambers, Delores H

    2013-06-01

    Seven black walnut cultivars, Brown Nugget, Davidson, Emma K, Football, Sparks 127, Sparrow, and Tomboy, were evaluated by descriptive sensory analysis. Seven trained panelists developed a lexicon for the black walnuts and scored the intensities of the samples for 22 flavor attributes. Results showed that the 7 samples differed significantly (P ≤ 0.05) on 13 of the attributes. For the majority of the attributes, only Emma K differed from the rest of the cultivars by being characterized with lower scores for black walnut ID, overall nutty, nutty-grain-like, nutty-buttery, floral/fruity, oily, and overall sweet. That sample also was higher in acrid, burnt, fruity-dark, musty/earthy, rancid, and bitter attributes. The remaining 6 cultivars showed few differences in individual attribute ratings, but did show some differences when mapped using multivariate techniques. Future studies should include descriptive analysis of other black walnut varieties, both wild and commercial, that could be grown and harvested for production. © 2013 Institute of Food Technologists®

  16. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
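
Two of the descriptive statistics mentioned above, namespace interaction and connected components, can be sketched over an in-memory triple list. The prefixes and triples are invented toy data, not the Billion Triple Challenge dataset, and the real system runs on the Cray XMT rather than a single Python process:

```python
from collections import Counter

# Toy RDF triples in CURIE form: (subject, predicate, object).
triples = [
    ("ex:alice", "foaf:knows", "ex:bob"),
    ("ex:bob", "foaf:knows", "ex:carol"),
    ("ex:dave", "rdf:type", "ex:Person"),
]

def namespace(term):
    # Namespace prefix of a CURIE-style term, e.g. "ex:alice" -> "ex".
    return term.split(":", 1)[0]

# Namespace interaction: how often a subject namespace links to an object namespace.
interactions = Counter((namespace(s), namespace(o)) for s, _, o in triples)

# Connected components of the subject-object graph via union-find.
parent = {}

def find(x):
    parent.setdefault(x, x)
    while parent[x] != x:
        parent[x] = parent[parent[x]]  # path halving
        x = parent[x]
    return x

def union(a, b):
    parent[find(a)] = find(b)

for s, _, o in triples:
    union(s, o)

components = {find(node) for node in parent}
print(dict(interactions))  # {('ex', 'ex'): 3}
print(len(components))     # 2
```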

  17. Descriptive quantitative analysis of hallux abductovalgus transverse plane radiographic parameters.

    Science.gov (United States)

    Meyr, Andrew J; Myers, Adam; Pontious, Jane

    2014-01-01

    Although the transverse plane radiographic parameters of the first intermetatarsal angle (IMA), hallux abductus angle (HAA), and the metatarsal-sesamoid position (MSP) form the basis of preoperative procedure selection and postoperative surgical evaluation of the hallux abductovalgus deformity, the so-called normal values of these measurements have not been well established. The objectives of the present study were to (1) evaluate the descriptive statistics of the first IMA, HAA, and MSP from a large patient population and (2) to determine an objective basis for defining "normal" versus "abnormal" measurements. Anteroposterior foot radiographs from 373 consecutive patients without a history of previous foot and ankle surgery and/or trauma were evaluated for the measurements of the first IMA, HAA, and MSP. The results revealed a mean measurement of 9.93°, 17.59°, and position 3.63 for the first IMA, HAA, and MSP, respectively. An advanced descriptive analysis demonstrated data characteristics of both parametric and nonparametric distributions. Furthermore, clear differentiations in deformity progression were appreciated when the variables were graphically depicted against each other. This could represent a quantitative basis for defining "normal" versus "abnormal" values. From the results of the present study, we have concluded that these radiographic parameters can be more conservatively reported and analyzed using nonparametric descriptive and comparative statistics within medical studies and that the combination of a first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively, appears to be an objective "tipping point" in terms of deformity progression and might represent an upper limit of acceptable in terms of surgical deformity correction.
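
The "tipping point" suggested by the study (first IMA, HAA, and MSP at or greater than approximately 10°, 18°, and position 4, respectively) can be expressed as a simple screening rule, alongside the nonparametric description the authors recommend. The measurements below are invented for illustration:

```python
import statistics

# Hypothetical radiographic measurements (degrees, degrees, sesamoid position).
feet = [
    {"ima": 8.5,  "haa": 12.0, "msp": 2},
    {"ima": 11.2, "haa": 19.5, "msp": 4},
    {"ima": 9.9,  "haa": 17.6, "msp": 3},
]

# Nonparametric description: report the median rather than the mean.
ima_values = sorted(f["ima"] for f in feet)
print("median IMA:", statistics.median(ima_values))

def beyond_tipping_point(f):
    # Approximate thresholds reported in the study: 10 deg, 18 deg, position 4.
    return f["ima"] >= 10 and f["haa"] >= 18 and f["msp"] >= 4

flags = [beyond_tipping_point(f) for f in feet]
print(flags)  # [False, True, False]
```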

  18. Simulations of Magnetic Reconnection - Kinetic Mechanisms Underlying the Fluid Description of Ions

    Science.gov (United States)

    Aunai, Nicolas; Belmont, Gerard; Smets, Roch

    2012-01-01

    Because of its ability to transfer the energy stored in the magnetic field together with the breaking of the flux-freezing constraint, magnetic reconnection is considered one of the most important phenomena in plasma physics. When it happens in a collisionless environment such as the terrestrial magnetosphere, it should a priori be modelled within the framework of kinetic physics. Evidence of kinetic features has indeed long been shown by researchers with the help of both numerical simulations and satellite observations. However, most of our understanding of the process comes from the more intuitive fluid interpretation with simple closure hypotheses which do not include kinetic effects. To what extent are these two separate descriptions of the same phenomenon related? What is the role of kinetic effects in the averaged/fluid dynamics of reconnection? This thesis addresses these questions for the proton population in the particular case of antiparallel merging with the help of 2D hybrid simulations. We show that one cannot assume, as is usually done, that the acceleration of the proton flow is due only to the Laplace force. Our results show, for symmetric and asymmetric reconnection, the importance of the pressure force, opposed to the electric one on the separatrices, in the decoupling region. In the symmetric case, we emphasize the kinetic origin of this force by analyzing the proton distribution functions and explain their structure by studying the underlying particle dynamics. Protons, as individual particles, are shown to bounce in the electric potential well created by the Hall effect. The spatial divergence of this well results in a mixing in phase space responsible for the observed structure of the pressure tensor.
A detailed energy budget analysis confirms the role of the pressure force for the acceleration; but, contrary to what is sometimes assumed, it also reveals that the major part of the incoming Poynting flux is transferred to

  19. Descriptive analysis of YouTube music therapy videos.

    Science.gov (United States)

    Gooding, Lori F; Gregory, Dianne

    2011-01-01

    The purpose of this study was to conduct a descriptive analysis of music therapy-related videos on YouTube. Preliminary searches using the keywords music therapy, music therapy session, and "music therapy session" resulted in listings of 5000, 767, and 59 videos respectively. The narrowed down listing of 59 videos was divided between two investigators and reviewed in order to determine their relationship to actual music therapy practice. A total of 32 videos were determined to be depictions of music therapy sessions. These videos were analyzed using a 16-item investigator-created rubric that examined both video-specific information and therapy-specific information. Results of the analysis indicated that audio and visual quality was adequate, while narrative descriptions and identification information were ineffective in the majority of the videos. The top 5 videos (based on the highest number of viewings in the sample) were selected for further analysis in order to investigate demonstration of the Professional Level of Practice Competencies set forth in the American Music Therapy Association (AMTA) Professional Competencies (AMTA, 2008). Four of the five videos met basic competency criteria, with the quality of the fifth video precluding evaluation of content. Of particular interest is the fact that none of the videos included credentialing information. Results of this study suggest the need to consider ways to ensure accurate dissemination of music therapy-related information in the YouTube environment, ethical standards when posting music therapy session videos, and the possibility of creating AMTA standards for posting music therapy-related videos.

  20. Power converter simulation and analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ghazy, M.A.

    1989-01-01

    There has been a great deal of progress made in computer aided design and analysis in the power electronics field. However, many simulation packages are inefficient and time consuming when simulating switching converters. This thesis proposes an efficient, simple, general approach to simulate any power converter with reduced computation time and memory requirements. In this approach the equations of power converters are formulated using network topology. Several procedures are explained for the steady-state computation of power electronic circuits, and the steady-state analyses are also accomplished by a new technique called the Fourier series method. For a complete system consisting of converters, filters, and electric machines, the simulation is complicated if a frequency domain technique is used. This thesis introduces a better technique which decouples the system into subsystems and simulates them in the time domain. The design of power converters using optimization techniques is presented. Finally, the theory of Variable Structured Systems is applied to power converters: sliding mode control for DC-DC and DC-AC power converters is introduced as a tool to achieve desired characteristics.
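
As a toy illustration of the sliding-mode-style switching mentioned for DC-DC converters (not the thesis's own formulation), here is a minimal forward-Euler simulation of a buck converter with a bang-bang law on the voltage error. All component values and the control law are assumptions chosen for illustration:

```python
# Minimal Euler simulation of a buck converter with a bang-bang switching law
# on the voltage error -- a simplest-case, sliding-mode-style controller.
Vin, Vref = 12.0, 5.0
L, C, R = 1e-3, 100e-6, 10.0     # inductance (H), capacitance (F), load (ohm)
dt, steps = 1e-6, 50_000          # 1 us step, 50 ms of simulated time

i, v = 0.0, 0.0                   # inductor current, capacitor voltage
for _ in range(steps):
    # Switch on when below the surface s = Vref - v, off otherwise.
    u = 1.0 if (Vref - v) > 0 else 0.0
    di = (u * Vin - v) / L        # L di/dt = u*Vin - v
    dv = (i - v / R) / C          # C dv/dt = i - v/R
    i += di * dt
    v += dv * dt

print(round(v, 2))  # after the transient, v chatters close to Vref
```

After the initial transient decays, the relay chatters at the simulation time step and the output voltage hovers near the reference, which is the qualitative behavior sliding mode control is meant to enforce.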

  1. DESCRIPTIVE ANALYSIS OF CORPORATE CULTURE FOLLOWING THE CHANGES

    Directory of Open Access Journals (Sweden)

    Elenko Zahariev

    2016-09-01

    Corporate culture meaningfully complements economic knowledge and accompanies the strategy and tactics of management. It is felt in the manners and overall activity of the organization: in empathy and tolerance, respect and responsibility. A new corporate culture transforms each participant, changing his or her mindset in shared collaboration and working habits, and it requires an improved management style. It is no longer sufficient for the leader merely to rule, administer, and control; the leader must lead and inspire: set challenging targets, optimize the performance of teams, foster an optimistic mood and faith, build agreement among workers, and monitor and evaluate work fairly. The current study raises the problem of interpreting cultural profiles in modern organizations and analyzes corporate culture after the changes of the transition period in Bulgaria. Descriptive analysis of corporate culture allows the relatively precise identification of its various types based on the accepted classification criteria.

  2. Finite element analysis applied to dentoalveolar trauma: methodology description.

    Science.gov (United States)

    da Silva, B R; Moreira Neto, J J S; da Silva, F I; de Aguiar, A S W

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar trauma. For didactic purposes, the methodological process was divided into steps that go from the creation of a geometric model to the evaluation of final results, always with a focus on methodological characteristics, advantages, and disadvantages, so as to allow the reader to customize the methodology according to specific needs. Our description shows that the finite element method can faithfully reproduce dentoalveolar trauma, provided the methodology is closely followed and thoroughly evaluated.

  3. Description and Simulation of a Fast Packet Switch Architecture for Communication Satellites

    Science.gov (United States)

    Quintana, Jorge A.; Lizanich, Paul J.

    1995-01-01

    The NASA Lewis Research Center has been developing the architecture for a multichannel communications signal processing satellite (MCSPS) as part of a flexible, low-cost meshed-VSAT (very small aperture terminal) network. The MCSPS architecture is based on a multifrequency, time-division-multiple-access (MF-TDMA) uplink and a time-division multiplex (TDM) downlink. There are eight uplink MF-TDMA beams, and eight downlink TDM beams, with eight downlink dwells per beam. The information-switching processor, which decodes, stores, and transmits each packet of user data to the appropriate downlink dwell onboard the satellite, has been fully described by using VHSIC (Very High Speed Integrated-Circuit) Hardware Description Language (VHDL). This VHDL code, which was developed in-house to simulate the information switching processor, showed that the architecture is both feasible and viable. This paper describes a shared-memory-per-beam architecture, its VHDL implementation, and the simulation efforts.

  4. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  5. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative… … method for detection of early events in lipid oxidation in milk to predict shelf-life…

  6. A Computer-Based Content Analysis of Interview Texts: Numeric Description and Multivariate Analysis.

    Science.gov (United States)

    Bierschenk, B.

    1977-01-01

    A method is described by which cognitive structures in verbal data can be identified and categorized through numerical analysis and quantitative description. Transcriptions of interviews (in this case, the verbal statements of 40 researchers) are manually coded and subjected to analysis following the AaO (Agent action Object) paradigm. The texts…

  7. APPLICATION OF NEOTAME IN CATCHUP: QUANTITATIVE DESCRIPTIVE AND PHYSICOCHEMICAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    G. C. M. C. BANNWART

    2008-11-01

    In this study, five prototypes of catchup were developed by replacing partially or totally the sucrose in the formulation by the sweetener Neotame (NTM). These prototypes were evaluated for their physicochemical characteristics and sensory profile (Quantitative Descriptive Analysis). The main sensory differences observed among the prototypes were regarding color, consistency, mouthfeel, sweet taste and tomato taste, for which lower means were obtained as the sugar level was decreased, and also in terms of salty taste, which had higher means with the decrease of sugar. In terms of bitter and sweetener aftertastes, the prototype 100% sweetened with NTM presented the higher mean score, but with no significant difference when compared to other prototypes containing sucrose; for bitter taste, however, it had the highest mean score, statistically different from all the other prototypes. In terms of physicochemical characteristics, the differences were mainly in terms of consistency, solids and color. Despite the differences observed among the prototypes as the sugar level was reduced, it was concluded that NTM is a suitable sweetener for catchup, both for use in reduced calorie and no sugar versions.

  8. Clinical significance in nursing research: A discussion and descriptive analysis.

    Science.gov (United States)

    Polit, Denise F

    2017-05-10

    It is widely understood that statistical significance should not be equated with clinical significance, but the topic of clinical significance has not received much attention in the nursing literature. By contrast, interest in conceptualizing and operationalizing clinical significance has been a "hot topic" in other health care fields for several decades. The major purpose of this paper is to briefly describe recent advances in defining and quantifying clinical significance. The overview covers both group-level indicators of clinical significance (e.g., effect size indexes), and individual-level benchmarks (e.g., the minimal important change index). A secondary purpose is to describe the extent to which developments in clinical significance have penetrated the nursing literature. A descriptive analysis of a sample of primary research articles published in three high-impact nursing research journals in 2016 was undertaken. A total of 362 articles were electronically searched for terms relating to statistical and clinical significance. Of the 362 articles, 261 were reports of quantitative studies, the vast majority of which (93%) included a formal evaluation of the statistical significance of the results. By contrast, the term "clinical significance" or related surrogate terms were found in only 33 papers, and most often the term was used informally, without explicit definition or assessment. Raising consciousness about clinical significance should be an important priority among nurse researchers. Several recommendations are offered to improve the visibility and salience of clinical significance in nursing science. Copyright © 2017 Elsevier Ltd. All rights reserved.
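
The group-level and individual-level indicators described above (an effect size index and a minimal important change benchmark) can be illustrated with a small sketch. The pre/post scores and the 5-point MIC threshold are invented for illustration:

```python
import statistics

# Illustrative pre/post symptom scores for one treatment group (hypothetical).
pre  = [52, 48, 55, 60, 50, 47, 58, 53]
post = [45, 40, 50, 49, 44, 45, 50, 46]

# Group-level indicator: standardized effect size (Cohen's d) of change scores.
change = [b - a for a, b in zip(pre, post)]
d = statistics.mean(change) / statistics.stdev(change)
print("effect size d:", round(d, 2))

# Individual-level benchmark: count patients whose improvement meets an
# assumed minimal important change (MIC) of 5 points (lower = improvement).
MIC = 5
responders = sum(1 for c in change if -c >= MIC)
print("responders:", responders, "of", len(change))
```

The contrast the paper draws is visible here: a statistically detectable mean change says nothing by itself about how many individual patients improved by a clinically meaningful amount.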

  9. Quantification and Standardized Description of Color Vision Deficiency Caused by Anomalous Trichromats—Part I: Simulation and Measurement

    Directory of Open Access Journals (Sweden)

    Wong, Edward K.

    2008-01-01

    The MPEG-21 Multimedia Framework allows visually impaired users to have improved access to visual content by enabling content adaptation techniques such as color compensation. However, one important issue is the method to create and interpret the standardized CVD descriptions when making use of generic color vision tests. In Part I of our study to tackle the issue, we present a novel computerized hue test (CHT) to examine and quantify CVD, which allows reproducing and manipulating test colors for the purposes of computer simulation and analysis of CVD. Both objective evaluation via color difference measurement and subjective evaluation via clinical experiment showed that the CHT works well as a color vision test: it is highly correlated with the Farnsworth-Munsell 100 Hue (FM100H) test and allows for a more elaborate and correct color reproduction than the FM100H test.

  10. Speeding up intrinsically slow collective processes in particle simulations by concurrent coupling to a continuum description.

    Science.gov (United States)

    Müller, Marcus; Daoulas, Kostas Ch

    2011-11-25

    The difficulty of studying intrinsically slow collective processes by computer simulation of particle models stems from multiple disparate time scales (e.g., stiff bonded interactions versus soft nonbonded interactions). Continuum models, which describe the system by collective variables rather than the coordinates of the individual molecular constituents, often do not suffer from this time-scale problem because the stiff microscopic degrees of freedom have been integrated out. We propose to concurrently couple these two descriptions by a heterogeneous multiscale method. We illustrate the technique by studying the Lifshitz-Slyozov coarsening mechanism in a binary polymer blend using a soft coarse-grained particle model and a Landau-Ginzburg-de Gennes free energy functional, respectively. A speedup of up to two orders of magnitude is achieved.

  11. Physics-Based Simulator for NEO Exploration Analysis & Simulation

    Science.gov (United States)

    Balaram, J.; Cameron, J.; Jain, A.; Kline, H.; Lim, C.; Mazhar, H.; Myint, S.; Nayar, H.; Patton, R.; Pomerantz, M.; Quadrelli, M.; Shakkotai, P.; Tso, K.

    2011-01-01

    As part of the Space Exploration Analysis and Simulation (SEAS) task, the National Aeronautics and Space Administration (NASA) is using physics-based simulations at NASA's Jet Propulsion Laboratory (JPL) to explore potential surface and near-surface mission operations at Near Earth Objects (NEOs). The simulator is under development at JPL and can be used to provide detailed analysis of various surface and near-surface NEO robotic and human exploration concepts. In this paper we describe the SEAS simulator and provide examples of recent mission systems and operations concepts investigated using the simulation. We also present related analysis work and tools developed for both the SEAS task as well as general modeling, analysis and simulation capabilities for asteroid/small-body objects.

  13. Solar Pilot Plant, Phase I. Preliminary design report. Volume II. System description and system analysis. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Honeywell conducted a parametric analysis of the 10-MW(e) solar pilot plant requirements and expected performance and established an optimum system design. The main analytical simulation tools were the optical (ray trace) and the dynamic simulation models. These are described in detail in Books 2 and 3 of this volume under separate cover. In making design decisions, available performance and cost data were used to provide a design reflecting the overall requirements and economics of a commercial-scale plant. This volume contains a description of this analysis/design process and resultant system/subsystem design and performance.

  14. Less Developed Countries Energy System Network Simulator, LDC-ESNS: a brief description

    Energy Technology Data Exchange (ETDEWEB)

    Reisman, A; Malone, R

    1978-04-01

    Prepared for the Brookhaven National Laboratory Developing Countries Energy Program, this report describes the Less Developed Countries Energy System Network Simulator (LDC-ESNS), a tool which provides a quantitative representation of the energy system of an LDC. The network structure of the energy supply and demand system, the model inputs and outputs, and the possible uses of the model for analysis are described.

  15. Modeling and simulation of equivalent circuits in description of biological systems - a fractional calculus approach

    Directory of Open Access Journals (Sweden)

    José Francisco Gómez Aguilar

    2012-07-01

Full Text Available Using the fractional calculus approach, we present the Laplace analysis of an equivalent electrical circuit for a multilayered system, which includes distributed elements of the Cole model type. The Bode graphs are obtained from the numerical simulation of the corresponding transfer functions using arbitrary electrical parameters in order to illustrate the methodology. A numerical Laplace transform is used for the simulation of the fractional differential equations. From the results shown in the analysis, we obtain the formula for the equivalent electrical circuit of a simple spectrum, such as that generated by a real sample of blood tissue, and the corresponding Nyquist diagrams. In addition to maintaining consistency in the adjusted electrical parameters, the advantage of using fractional differential equations in the study of impedance spectra is made clear in the analysis used to determine a compact formula for the equivalent electrical circuit, which includes the Cole model and a simple RC model as special cases.
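The Cole-type element underlying such circuits has the compact impedance Z(ω) = R∞ + (R0 − R∞)/(1 + (jωτ)^α), which reduces to the ordinary RC case at α = 1. A minimal numerical sketch of the Bode curves follows; the parameter values are arbitrary illustrations, as in the paper's own methodology, and the function names are not taken from the paper:

```python
import numpy as np

def cole_impedance(omega, r_inf, r0, tau, alpha):
    """Cole-type impedance; alpha = 1 recovers the ordinary RC (Debye) case."""
    return r_inf + (r0 - r_inf) / (1.0 + (1j * omega * tau) ** alpha)

# Arbitrary illustrative parameters
omega = np.logspace(-2, 6, 400)             # angular frequency, rad/s
z = cole_impedance(omega, r_inf=100.0, r0=1000.0, tau=1e-3, alpha=0.8)

bode_mag_db = 20 * np.log10(np.abs(z))      # Bode magnitude, dB
bode_phase_deg = np.degrees(np.angle(z))    # Bode phase, degrees
# A Nyquist diagram is the plot of -z.imag versus z.real
# (a depressed semicircle when alpha < 1).
```

At low frequency the magnitude approaches R0 and at high frequency it approaches R∞, which is a quick sanity check on the implementation.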

  16. Website Sharing in Online Health Communities: A Descriptive Analysis.

    Science.gov (United States)

    Nath, Chinmoy; Huh, Jina; Adupa, Abhishek Kalyan; Jonnalagadda, Siddhartha R

    2016-01-13

    An increasing number of people visit online health communities to seek health information. In these communities, people share experiences and information with others, often complemented with links to different websites. Understanding how people share websites can help us understand patients' needs in online health communities and improve how peer patients share health information online. Our goal was to understand (1) what kinds of websites are shared, (2) information quality of the shared websites, (3) who shares websites, (4) community differences in website-sharing behavior, and (5) the contexts in which patients share websites. We aimed to find practical applications and implications of website-sharing practices in online health communities. We used regular expressions to extract URLs from 10 WebMD online health communities. We then categorized the URLs based on their top-level domains. We counted the number of trust codes (eg, accredited agencies' formal evaluation and PubMed authors' institutions) for each website to assess information quality. We used descriptive statistics to determine website-sharing activities. To understand the context of the URL being discussed, we conducted a simple random selection of 5 threads that contained at least one post with URLs from each community. Gathering all other posts in these threads resulted in 387 posts for open coding analysis with the goal of understanding motivations and situations in which website sharing occurred. We extracted a total of 25,448 websites. The majority of the shared websites were .com (59.16%, 15,056/25,448) and WebMD internal (23.2%, 5905/25,448) websites; the least shared websites were social media websites (0.15%, 39/25,448). High-posting community members and moderators posted more websites with trust codes than low-posting community members did. The heart disease community had the highest percentage of websites containing trust codes compared to other communities. Members used websites to
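The first two pipeline steps described above, regex URL extraction from posts followed by top-level-domain categorization, can be sketched as follows. The regular expression and helper names are illustrative assumptions, not the study's actual code:

```python
import re
from collections import Counter
from urllib.parse import urlparse

# A simple URL pattern; the study's exact regular expression is not given,
# so this one is an illustrative assumption.
URL_RE = re.compile(r'https?://[^\s<>"\']+')

def extract_urls(posts):
    """Pull URLs out of free-text forum posts."""
    urls = []
    for post in posts:
        urls.extend(URL_RE.findall(post))
    return urls

def categorize_by_tld(urls):
    """Count URLs by top-level domain (.com, .org, .gov, ...)."""
    counts = Counter()
    for url in urls:
        host = urlparse(url).netloc.lower()
        tld = '.' + host.rsplit('.', 1)[-1] if '.' in host else '(none)'
        counts[tld] += 1
    return counts

posts = ["See https://www.webmd.com/heart for details",
         "Try http://example.org/info and https://www.nih.gov/study"]
print(categorize_by_tld(extract_urls(posts)))
```

The same per-domain counts, normalized by the total, give the percentage figures reported in the abstract.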

  17. Description-based and experience-based decisions: individual analysis

    Directory of Open Access Journals (Sweden)

    Andrey Kudryavtsev

    2012-05-01

Full Text Available We analyze behavior in two basic classes of decision tasks: description-based and experience-based. In particular, we compare the prediction power of a number of decision learning models in both kinds of tasks. Unlike most previous studies, we focus on individual, rather than aggregate, behavioral characteristics. We carry out an experiment involving a battery of both description- and experience-based choices between two mixed binary prospects made by each of the participants, and employ a number of formal models for explaining and predicting participants' choices: Prospect Theory (PT; Kahneman and Tversky, 1979), the Expectancy-Valence model (EVL; Busemeyer and Stout, 2002), and three combinations of these well-established models. We document that the PT and the EVL models are best for predicting people's decisions in description- and experience-based tasks, respectively, which is not surprising as these two models are designed specially for these kinds of tasks. Furthermore, we find that models involving linear weighting of gains and losses perform better in both kinds of tasks, from the point of view of generalizability and individual parameter consistency. We therefore conclude that, overall, when both prospects are mixed, the assumption of diminishing sensitivity does not improve the models' prediction power for individual decision-makers. Finally, for some of the models' parameters, we document consistency at the individual level between description- and experience-based tasks.

  18. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative … method for detection of early events in lipid oxidation in milk to predict shelf-life …

  19. Simulation analysis of wastes gasification technologies

    Directory of Open Access Journals (Sweden)

    Stępień Leszek

    2017-01-01

Full Text Available Each year a significant growth in the amount of wastes generated is observed. Due to this fact, technologies enabling the utilization of wastes are needed. One of the ways to utilize wastes is thermal conversion. The most widely used technology for thermal conversion is gasification, which makes it possible to produce syngas that can be either combusted or directed to further synthesis to produce methanol or liquid fuels. There are several commercially available technologies for gasifying wastes. The first part of this study gives a general description of the waste gasification process. Furthermore, an analysis and comparison of commercially available gasification technologies is presented, including their process arrangement, limits and capabilities. The second part of the study is dedicated to the development of a thermodynamic model of waste gasification. The model includes three zones of gasification reactors: drying, gasification and, eventually, ash melting. A modified Gibbs minimization method is used to simulate the gasification process. The model is capable of predicting final gas composition as a function of temperature or equivalence ratio. Calculations are performed for a specified average waste composition and different equivalence ratios of air to discuss their influence on the performance of gasification (temperature of the process and gas composition). Finally, the model makes it possible to calculate the total energy balance of the process as well as the gasification and final gas temperatures.
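The equivalence ratio that drives such parametric calculations is the ratio of air actually supplied to the stoichiometric air demand of the waste. A minimal sketch of that bookkeeping, using standard combustion stoichiometry; the waste composition and all numbers below are invented for illustration, not taken from the study:

```python
def stoichiometric_air(c, h, o):
    """Stoichiometric air demand [kg air per kg fuel] from the fuel's C, H, O
    mass fractions (dry, ash-free): O2 demand = C/12 + H/4 - O/32 kmol/kg."""
    kmol_o2 = c / 12.0 + h / 4.0 - o / 32.0
    kg_o2 = kmol_o2 * 32.0        # molar mass of O2 is 32 kg/kmol
    return kg_o2 / 0.233          # air is about 23.3% O2 by mass

def equivalence_ratio(air_supplied, c, h, o):
    """ER < 1 marks the sub-stoichiometric regime in which gasification runs."""
    return air_supplied / stoichiometric_air(c, h, o)

# Invented average waste composition (mass fractions) for illustration
c, h, o = 0.50, 0.06, 0.35
print(round(stoichiometric_air(c, h, o), 2), "kg air / kg waste")
print("ER =", round(equivalence_ratio(1.5, c, h, o), 2))
```

Sweeping the supplied air while holding the composition fixed reproduces the kind of ER sweep the study uses to examine process temperature and gas composition.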

  20. Fractional-Order Nonlinear Systems Modeling, Analysis and Simulation

    CERN Document Server

    Petráš, Ivo

    2011-01-01

"Fractional-Order Nonlinear Systems: Modeling, Analysis and Simulation" presents a study of fractional-order chaotic systems accompanied by Matlab programs for simulating their state space trajectories, which are shown in the illustrations in the book. The description of the chaotic systems is clearly presented and their analysis and numerical solution are done in an easy-to-follow manner. Simulink models for the selected fractional-order systems are also presented. Readers will understand the fundamentals of the fractional calculus, how real dynamical systems can be described using fractional derivatives and fractional differential equations, how such equations can be solved, and how to simulate and explore chaotic systems of fractional order. The book is addressed to mathematicians, physicists, engineers, and other scientists interested in chaos phenomena or in fractional-order systems. It can be used in courses on dynamical systems, control theory, and applied mathematics at graduate or postgraduate level. ...
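The fractional derivatives such systems are built on are commonly approximated numerically from the Grünwald-Letnikov definition. The book itself uses Matlab; the Python sketch below is an independent illustration of the idea, checked against the known half-derivative of f(t) = t, namely t^(1-α)/Γ(2-α):

```python
import math

def gl_weights(alpha, n):
    """Grünwald-Letnikov binomial weights w_j = (-1)^j * C(alpha, j),
    computed by the standard recurrence w_j = w_{j-1} * (1 - (alpha+1)/j)."""
    w = [1.0]
    for j in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / j))
    return w

def gl_derivative(f_vals, alpha, h):
    """Numerical fractional derivative of order alpha on a uniform grid."""
    n = len(f_vals)
    w = gl_weights(alpha, n)
    return [sum(w[j] * f_vals[k - j] for j in range(k + 1)) / h ** alpha
            for k in range(n)]

# Check against the analytic result D^alpha t = t^(1-alpha) / Gamma(2-alpha)
h, alpha = 0.001, 0.5
t = [k * h for k in range(2001)]          # grid from 0 to 2
num = gl_derivative(t, alpha, h)
exact = t[-1] ** (1 - alpha) / math.gamma(2 - alpha)
print(round(num[-1], 3), round(exact, 3))
```

The approximation is first-order in the step size h, so refining the grid drives the numerical value toward the analytic one.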

  1. Descriptive Analysis in Education: A Guide for Researchers. NCEE 2017-4023

    Science.gov (United States)

    Loeb, Susanna; Dynarski, Susan; McFarland, Daniel; Morris, Pamela; Reardon, Sean; Reber, Sarah

    2017-01-01

    Whether the goal is to identify and describe trends and variation in populations, create new measures of key phenomena, or describe samples in studies aimed at identifying causal effects, description plays a critical role in the scientific process in general and education research in particular. Descriptive analysis identifies patterns in data to…

  2. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  3. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. It introduces the concept of discrete-event Monte Carlo simulation, the most commonly used methodology for modeli...

  4. AN ANALYSIS REGARDING DESCRIPTIVE DIMENSIONS OF BRAND EQUITY

    Directory of Open Access Journals (Sweden)

    Ioan MOISESCU

    2007-01-01

    Full Text Available The competitive potential of any company is significantly influenced by the brands held in the company’s portfolio. Brands are definitely valuable marketing assets. As the brand is a central element of any marketing strategy it is essential to be aware of the descriptive dimensions of its equity. This paper tries to outline these dimensions as follows: brand loyalty, brand awareness, brand perceived quality, brand personality, brand image, brand identity and brand associations, as analyzed in the specialized literature. Identifying and comparing different approaches regarding each brand equity dimension and revealing interdependencies between these dimensions, focusing on the importance of scientifically determining their role in generating a long-term increase in marketing efforts efficiency, are among the main objectives of this paper.

  5. MASSCLEAN - MASSive CLuster Evolution and ANalysis Package - Description and Tests

    CERN Document Server

    Popescu, Bogdan

    2008-01-01

    We present MASSCLEAN, a new, sophisticated and robust stellar cluster image and photometry simulation package. This package is able to create color-magnitude diagrams and standard FITS images in any of the traditional optical and near-infrared bands based on cluster characteristics input by the user, including but not limited to distance, age, mass, radius and extinction. At the limit of very distant, unresolved clusters, we have checked the integrated colors created in MASSCLEAN against those from other single stellar population models with consistent results. We have also tested models which provide a reasonable estimate of the field star contamination in images and color-magnitude diagrams. We demonstrate the package by simulating images and color-magnitude diagrams of well known massive Milky Way clusters and compare their appearance to real data. Because the algorithm populates the cluster with a discrete number of tenable stars, it can be used as part of a Monte Carlo Method to derive the probabilistic ...

  6. Feature separability analysis for SAR ATR using data description method

    Science.gov (United States)

    Guo, Weiwei; Du, Xiaoyong; Hu, WeiDong; Yu, Wenxian

    2007-11-01

Feature extraction and selection play an important role in radar target recognition. This paper focuses on evaluating feature separability for SAR ATR and selecting the best subset of features. In detail, fifteen features extracted from T72, BTR70 and BMP2 in the MSTAR standard public dataset are examined, which are divided into seven categories: standard deviation, fractal dimension, weighted-rank fill ratio, size-related features, contrast-based features, count feature, projection feature, and moment features. Since the number of samples is small, a new separability criterion based on the overlap degree of each pair of class regions is proposed to assess the separability of these features. Here the class region is described by the support vector data description (SVDD) method for good generalization. Based on the proposed criterion, a forward feature selection method is adopted to choose the best subset of features. Because of the strong variability of the features with aspect, the features are analyzed under different aspect sectors within the 360° angle range, stepped by 15°, 30°, and 60°, respectively. Experiments using the MSTAR dataset validate the criterion, and the best subset of features is determined.
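The procedure pairs a region-overlap separability criterion with greedy forward selection. The sketch below keeps that structure but substitutes a crude bounding-box overlap for the SVDD-described class regions, so the criterion, data, and names are all illustrative stand-ins rather than the paper's method:

```python
import numpy as np

def overlap_score(x_a, x_b, feats):
    """Fraction of each class's samples falling inside the other class's
    bounding box over the selected features: a crude stand-in for the
    SVDD-based region-overlap degree used in the paper."""
    feats = list(feats)
    a, b = x_a[:, feats], x_b[:, feats]
    def inside(pts, ref):
        lo, hi = ref.min(axis=0), ref.max(axis=0)
        return np.mean(np.all((pts >= lo) & (pts <= hi), axis=1))
    return 0.5 * (inside(a, b) + inside(b, a))

def forward_select(x_a, x_b, k):
    """Greedy forward selection: repeatedly add the feature whose inclusion
    gives the smallest class-region overlap."""
    selected = []
    remaining = list(range(x_a.shape[1]))
    for _ in range(k):
        best = min(remaining, key=lambda f: overlap_score(x_a, x_b, selected + [f]))
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
x_a = rng.normal(0.0, 1.0, size=(50, 3))
x_b = rng.normal(0.0, 1.0, size=(50, 3))
x_b[:, 1] += 6.0                        # make feature 1 highly separating
print(forward_select(x_a, x_b, 2))
```

Smaller overlap means better separability, so the selection correctly picks the shifted feature first in this toy setting.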

  7. Students' views on the block evaluation process: A descriptive analysis.

    Science.gov (United States)

    Pakkies, Ntefeleng E; Mtshali, Ntombifikile G

    2016-03-30

Higher education institutions have executed policies and practices intended to determine and promote good teaching. Students' evaluation of the teaching and learning process is seen as one measure of evaluating the quality and effectiveness of instruction and courses. Policies and procedures guiding this process are discernible in universities, but this is often not the case for nursing colleges. To analyse and describe the views of nursing students on block evaluation, and how feedback obtained from this process was managed. A quantitative descriptive study was conducted amongst nursing students (n = 177) in their second to fourth year of training from one nursing college in KwaZulu-Natal. A questionnaire was administered by the researcher and data were analysed using the Statistical Package for the Social Sciences Version 19.0. The response rate was 145 (81.9%). The participants perceived the aim of block evaluation as improving the quality of teaching and enhancing their experiences as students. They questioned the significance of their input as stakeholders given that they had never been consulted about the development or review of the evaluation tool, or the administration process; and they often did not receive feedback from the evaluations they participated in. The college management should develop a clear organisational structure with supporting policies and operational guidelines for administering the evaluation process. The administration, implementation procedures, reporting of results and follow-up mechanisms should be made transparent and communicated to all concerned. Reports and actions related to these evaluations should provide feedback into relevant courses or programmes.

  8. Students’ views on the block evaluation process: A descriptive analysis

    Directory of Open Access Journals (Sweden)

    Ntefeleng E. Pakkies

    2016-02-01

Full Text Available Background: Higher education institutions have executed policies and practices intended to determine and promote good teaching. Students' evaluation of the teaching and learning process is seen as one measure of evaluating the quality and effectiveness of instruction and courses. Policies and procedures guiding this process are discernible in universities, but this is often not the case for nursing colleges. Objective: To analyse and describe the views of nursing students on block evaluation, and how feedback obtained from this process was managed. Method: A quantitative descriptive study was conducted amongst nursing students (n = 177) in their second to fourth year of training from one nursing college in KwaZulu-Natal. A questionnaire was administered by the researcher and data were analysed using the Statistical Package for the Social Sciences Version 19.0. Results: The response rate was 145 (81.9%). The participants perceived the aim of block evaluation as improving the quality of teaching and enhancing their experiences as students. They questioned the significance of their input as stakeholders given that they had never been consulted about the development or review of the evaluation tool, or the administration process; and they often did not receive feedback from the evaluations they participated in. Conclusion: The college management should develop a clear organisational structure with supporting policies and operational guidelines for administering the evaluation process. The administration, implementation procedures, reporting of results and follow-up mechanisms should be made transparent and communicated to all concerned. Reports and actions related to these evaluations should provide feedback into relevant courses or programmes. Keywords: Student evaluation of teaching; perceptions; undergraduate nursing students; evaluation process

  9. The Tuscan Mobile Simulation Program: a description of a program for the delivery of in situ simulation training.

    Science.gov (United States)

    Ullman, Edward; Kennedy, Maura; Di Delupis, Francesco Dojmi; Pisanelli, Paolo; Burbui, Andrea Giuliattini; Cussen, Meaghan; Galli, Laura; Pini, Riccardo; Gensini, Gian Franco

    2016-09-01

    Simulation has become a critical aspect of medical education. It allows health care providers the opportunity to focus on safety and high-risk situations in a protected environment. Recently, in situ simulation, which is performed in the actual clinical setting, has been used to recreate a more realistic work environment. This form of simulation allows for better team evaluation as the workers are in their traditional roles, and can reveal latent safety errors that often are not seen in typical simulation scenarios. We discuss the creation and implementation of a mobile in situ simulation program in emergency departments of three hospitals in Tuscany, Italy, including equipment, staffing, and start-up costs for this program. We also describe latent safety threats identified in the pilot in situ simulations. This novel approach has the potential to both reduce the costs of simulation compared to traditional simulation centers, and to expand medical simulation experiences to providers and healthcare organizations that do not have access to a large simulation center.

  10. HOURS: Simulation and analysis software for the KM3NeT

    Science.gov (United States)

    Tsirigotis, A. G.; Leisos, A.; Tzamarias, S. E.

    2017-02-01

    The Hellenic Open University Reconstruction & Simulation (HOURS) software package contains a realistic simulation package of the detector response of very large (km3-scale) underwater neutrino telescopes, including an accurate description of all the relevant physical processes, the production of signal and background as well as several analysis strategies for triggering and pattern recognition, event reconstruction, tracking and energy estimation. HOURS also provides tools for simulating calibration techniques and other studies for estimating the detector sensitivity to several neutrino sources.

  11. Holistic Nursing Simulation: A Concept Analysis.

    Science.gov (United States)

    Cohen, Bonni S; Boni, Rebecca

    2016-11-28

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of the patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognizing the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer patient care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in design and development of holistic nursing simulation.

  12. Manpower Analysis Using Discrete Simulation

    Science.gov (United States)

    2015-12-01

… using Simkit, a widely available library based in the Java programming language for building Discrete Event Simulation (DES) models. By overriding … intervals (i.e., quarterly), while holding attrition negligible. For the purposes of modeling each new accession to the system, the Arrival …

  13. Statistical Analysis of Random Simulations : Bootstrap Tutorial

    NARCIS (Netherlands)

    Deflandre, D.; Kleijnen, J.P.C.

    2002-01-01

The bootstrap is a simple but versatile technique for the statistical analysis of random simulations. This tutorial explains the basics of that technique, and applies it to the well-known M/M/1 queuing simulation. In that numerical example, different responses are studied. For some responses, bootstrap …
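The basic mechanics can be sketched in a few lines: simulate M/M/1 waiting times via the Lindley recursion, then form a percentile-bootstrap confidence interval for the mean wait. This toy version resamples individual observations and therefore ignores their autocorrelation; it illustrates the bootstrap machinery under that simplifying assumption and is not the tutorial's actual procedure:

```python
import random

def mm1_waits(lam, mu, n, seed=0):
    """Waiting times of n customers in an M/M/1 queue via Lindley's
    recursion W[k+1] = max(0, W[k] + S[k] - A[k+1])."""
    rng = random.Random(seed)
    w, waits = 0.0, []
    for _ in range(n):
        waits.append(w)
        w = max(0.0, w + rng.expovariate(mu) - rng.expovariate(lam))
    return waits

def bootstrap_ci(data, stat, b=1000, alpha=0.10, seed=1):
    """Percentile bootstrap confidence interval for stat(data)."""
    rng = random.Random(seed)
    reps = sorted(stat(rng.choices(data, k=len(data))) for _ in range(b))
    return reps[int(b * alpha / 2)], reps[int(b * (1 - alpha / 2)) - 1]

waits = mm1_waits(lam=0.8, mu=1.0, n=2000)   # traffic intensity rho = 0.8
mean = lambda xs: sum(xs) / len(xs)
print("90% CI for mean wait:", bootstrap_ci(waits, mean))
```

At rho = 0.8 the theoretical mean wait in queue is rho/(mu - lam) = 4, so the interval should land in that neighborhood for a long run.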

  14. Descriptive Analysis of Teachers' Responses to Problem Behavior Following Training

    Science.gov (United States)

    Addison, Laura; Lerman, Dorothea C.

    2009-01-01

    The procedures described by Sloman et al. (2005) were extended to an analysis of teachers' responses to problem behavior after they had been taught to withhold potential sources of positive and negative reinforcement following instances of problem behavior. Results were consistent with those reported previously, suggesting that escape from child…

  15. Audio description and corpus analysis of popular music

    NARCIS (Netherlands)

    Van Balen, J.M.H.

    2016-01-01

    In the field of sound and music computing, only a handful of studies are concerned with the pursuit of new musical knowledge. There is a substantial body of corpus analysis research focused on new musical insight, but almost all of it deals with symbolic data: scores, chords or manual annotations. I

  16. Description and Preliminary Training Evaluation of an Arc Welding Simulator. Research Report SRR 73-23.

    Science.gov (United States)

    Abrams, Macy L.; And Others

    A prototype arc welding training simulator was designed to provide immediate, discriminative feedback and the capacity for concentrated practice. Two randomly selected groups of welding trainees were compared to evaluate the simulator, one group being trained using the simulator and the other using conventional practice. Preliminary data indicated…

  17. A description of persistent climatic anomalies in a 1000-year climatic model simulation

    Science.gov (United States)

    Hunt, B. G.

The Mark 2 version of the CSIRO coupled global climatic model has been used to generate a 1000-year simulation of natural (i.e. unforced) climatic variability representative of "present conditions". The annual mean output from the simulation has been used to investigate the occurrence of decadal and longer trends over the globe for a number of climatic variables. Here trends are defined to be periods of years with a climatic anomaly of a given sign. The analysis reveals substantial differences between the trend characteristics of the various climatic variables. Trends longer than 12 years' duration were unusual for rainfall. Such trends were fairly uniformly distributed over the globe and had an asymmetry in the rate of occurrence for wet or dry conditions. On the other hand, trends in surface wind stress, and especially the atmospheric screen temperature, were of longer duration but primarily confined to oceanic regions. The trends in the atmospheric screen temperature could be traced deep into the oceanic mixed layer, implying large changes in oceanic thermal inertia. This thermal inertia then constituted an important component of the 'memory' of the climatic system. While the geographic region associated with a given trend could be identified over several adjacent grid boxes of the model, regional plots for individual years of the trend revealed a range of variations, suggesting that a consistent forcing mechanism may not be responsible for a trend at a given location. Typical return periods for 12-year rainfall trends were once in 1000 years, highlighting the rarity of such events. Using a looser definition of a trend revealed that drying trends of up to 50 years' duration were also possible, attributable solely to natural climatic variability. Significant (20% to 40%) rainfall reductions per year can be associated with a long-term drying trend, hence such events are of considerable climatic significance. It can take more than 100 years for the hydrologic losses …
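Under this definition a trend is simply a run of consecutive years whose anomaly keeps one sign, which makes such trends easy to tabulate from an annual series. A small sketch; the rainfall series here is invented for illustration:

```python
def anomaly_runs(series):
    """Lengths of consecutive same-sign anomaly runs (the paper's notion of
    a 'trend'). Anomaly = departure from the series' long-term mean."""
    mean = sum(series) / len(series)
    anoms = [x - mean for x in series]
    runs, length, prev = [], 0, None
    for a in anoms:
        sign = a > 0
        if prev is None or sign == prev:
            length += 1
        else:
            runs.append((prev, length))
            length = 1
        prev = sign
    runs.append((prev, length))
    return runs

# Invented annual rainfall totals (mm) for illustration
rain = [900, 950, 980, 1010, 1040, 1060, 1020, 990, 970, 940]
for wet, n in anomaly_runs(rain):
    print('wet' if wet else 'dry', n, 'years')
```

Counting how often runs exceed a given length across many grid boxes yields return-period estimates of the kind quoted in the abstract.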

  18. Integrating software architectures for distributed simulations and simulation analysis communities.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Fellig, Daniel; Linebarger, John Michael; Moore, Patrick Curtis; Sa, Timothy J.; Hawley, Marilyn F.

    2005-10-01

The one-year Software Architecture LDRD (No. 79819) was a cross-site effort between Sandia California and Sandia New Mexico. The purpose of this research was to further develop and demonstrate integrating software architecture frameworks for distributed simulation and distributed collaboration in the homeland security domain. The integrated frameworks were initially developed through the Weapons of Mass Destruction Decision Analysis Center (WMD-DAC), sited at SNL/CA, and the National Infrastructure Simulation & Analysis Center (NISAC), sited at SNL/NM. The primary deliverable was a demonstration of both a federation of distributed simulations and a federation of distributed collaborative simulation analysis communities in the context of the same integrated scenario, which was the release of smallpox in San Diego, California. To our knowledge this was the first time such a combination of federations under a single scenario has ever been demonstrated. A secondary deliverable was the creation of the standalone GroupMeld™ collaboration client, which uses the GroupMeld™ synchronous collaboration framework. In addition, a small pilot experiment that used both integrating frameworks allowed a greater range of crisis management options to be performed and evaluated than would have been possible without the use of the frameworks.

  19. Historical rammed earth process description thanks to micromorphological analysis

    OpenAIRE

    HAMARD, Erwan; Cammas, Cécilia; Fabbri, Antonin; Razakamanantsoa, Andry; Cazacliu, Bogdan; MOREL, Jean Claude

    2016-01-01

Rammed earth was traditionally used in western European countries before industrial building materials replaced it during the 20th century. Construction strategies developed by former builders were dictated by locally available construction materials and engendered local constructive cultures. Unfortunately, this knowledge was orally transmitted and is lost today. The rediscovery of these cultures can provide answers to modern rammed earth construction processes. Micromorphological analysis of ear...

  20. Finite Element Analysis Applied to Dentoalveolar Trauma: Methodology Description

    OpenAIRE

    2011-01-01

    Dentoalveolar traumatic injuries are among the clinical conditions most frequently treated in dental practice. However, few studies so far have addressed the biomechanical aspects of these events, probably as a result of difficulties in carrying out satisfactory experimental and clinical studies as well as the unavailability of truly scientific methodologies. The aim of this paper was to describe the use of finite element analysis applied to the biomechanical evaluation of dentoalveolar traum...

  1. Functional description of the Airlift Deployment Analysis System (ADANS)

    Energy Technology Data Exchange (ETDEWEB)

    Harrison, G.; Southworth, F.; Sexton, A.; Hilliard, M.; Kraemer, R. (Oak Ridge National Lab., TN (USA)); Russell, D.L.; Holcomb, M.; Wood, T.S.; Brenner, H.; Jacobi, J. (Tennessee Univ., Knoxville, TN (USA))

    1991-05-01

The Airlift Deployment Analysis System (ADANS) is an automated system that will provide Headquarters Military Airlift Command (HQMAC) and the Numbered Air Forces (NAFs) with planning, scheduling, and analytical tools for peacetime and contingency airlift operations. ADANS will consist of an algorithms subsystem for airlift planning, scheduling, and analysis; a relational database management system (RDBMS); a user-friendly front-end subsystem; and communications software. ADANS will be completed by October 1992. It will be developed in three increments to provide an initial operating capability as quickly as possible. At the end of each increment, an operational airlift planning, scheduling, and analysis system will be installed. ADANS will provide automated tools for MAC mission support allocation, airlift scheduling, load allocation, and analysis of the airlift system. The ADANS scheduling algorithms and database will operate on the Deployment Flow Computer System (DFCS) at HQMAC, Scott Air Force Base. The DFCS will consist of Honeywell DPS 90/92 Tandem and Honeywell DPS 90/92 mainframe computers. User workstations will interface with the DFCS through two local area networks, one classified and one unclassified. The classified DFCS will be connected via a high-speed bus to HQMAC's "System 1" computer, a node on the Worldwide Military Command and Control System Intercomputer Network for communications with the Joint Deployment System and the Joint Operations Planning System. Systems currently under development that will interface with ADANS include the Joint Operation Planning and Execution System and the Global Decision Support System (GDSS). MAC Command and Control Information Processing System data will be available through GDSS. This command and control information will be accessed by ADANS so that analysts will have data on resources used within MAC's airlift system during plan and schedule development. 80 figs., 8 tabs.

  2. Simulation modeling and analysis with Arena

    Energy Technology Data Exchange (ETDEWEB)

    Tayfur Altiok; Benjamin Melamed [Rutgers University, NJ (United States). Department of Industrial and Systems Engineering

    2007-06-15

    This textbook treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of the popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Chapter 13.3.3 is on coal loading operations on barges/tugboats.
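The discrete-event methodology the book builds on reduces to a clock driven by a future-event list. A minimal sketch in Python (not Arena; the single-server FIFO queue and its deterministic timings are illustrative assumptions):

```python
import heapq

def simulate_queue(arrival_interval, service_time, horizon):
    """Minimal discrete-event simulation of a single-server FIFO queue.

    Deterministic inter-arrival and service times keep the example
    reproducible; a real study would draw them from distributions.
    """
    events = [(0.0, "arrival")]   # future-event list: (time, kind)
    busy_until = 0.0              # when the server next becomes free
    served = 0
    while events:
        time, kind = heapq.heappop(events)
        if time > horizon:
            break
        if kind == "arrival":
            heapq.heappush(events, (time + arrival_interval, "arrival"))
            start = max(time, busy_until)        # wait for the server
            busy_until = start + service_time
            heapq.heappush(events, (busy_until, "departure"))
        else:                                    # departure
            served += 1
    return served
```

With arrivals every 2 time units, a 1-unit service time, and a horizon of 10, the server completes five customers.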

  3. Sensitivity Analysis of Fire Dynamics Simulation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Nielsen, Peter V.; Petersen, Arnkell J.

    2007-01-01

    In case of fire dynamics simulation, requirements to reliable results are most often very high due to the severe consequences of erroneous results. At the same time it is a well-known fact that fire dynamics simulation constitutes rather complex physical phenomena which, apart from flow and energy equations, require solution of the issues of combustion and gas radiation, to mention a few. This paper performs a sensitivity analysis of a fire dynamics simulation on a benchmark case where measurement results are available for comparison. The analysis is performed using the method of Elementary Effects.

  4. Simulation of nitrogen-limited crop growth with SWAP/WOFOST: process descriptions and user manual

    NARCIS (Netherlands)

    Groenendijk, Piet; Boogaard, Hendrik; Heinen, Marius; Kroes, J.G.; Supit, Iwan; Wit, de Allard

    2016-01-01

    This report describes a soil nitrogen module (Soil-N), which is combined with the agro-hydrological model, SWAP, and the crop growth model, WOFOST. The core of the Soil-N module is a description of the nitrogen cycle, which is coupled to the organic matter cycle based upon the RothC-26.3 model.

  5. Multipath routing and multiple description coding in ad-hoc networks: A simulation study

    NARCIS (Netherlands)

    Díaz, I.F.; Epema, D.; Jongh, J. de

    2004-01-01

    The nature of wireless multihop ad-hoc networks makes it a challenge to offer connections of an assured quality. In order to improve the performance of such networks, multipath routing in combination with Multiple Description Coding (MDC) has been proposed. By splitting up streams of multimedia traffic

  6. Impacts of a change in vegetation description on simulated European summer present-day and future climates

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez, E.; Gaertner, M.A.; Gallardo, C.; Padorno, E.; Castro, M. [Universidad de Castilla-La Mancha (UCLM), Facultad de Ciencias del Medio Ambiente, Toledo (Spain); Arribas, A. [Universidad de Castilla-La Mancha (UCLM), Facultad de Ciencias del Medio Ambiente, Toledo (Spain); Met Office, Exeter, Devon (United Kingdom)

    2007-08-15

    This paper analyzes the soil-atmosphere feedbacks and uncertainties under current (1960-1990) and plausible future climate conditions (2070-2100, using the A2 greenhouse gas emission scenario). For this purpose, two vegetation descriptions differing only in the grassland and grass-with-trees proportion in some parts of the domain have been created. The combination of these two different climate scenarios and two vegetation descriptions defines four different 30-year experiments, which have been completed using a regional climate model. The domain is centered around the Mediterranean basin and covers most of Europe. The study focuses on the summer season when there are major differences between the two vegetation descriptions and when the impact of land-surface processes on precipitation is largest. Present climate experiments show large evapotranspiration differences over areas where vegetation changes have taken place. Precipitation increases (up to 3 mm day{sup -1} in some regions) follow evapotranspiration increases, although with a more complex spatial structure. These results indicate a high sensitivity at regional scales of summer precipitation processes to vegetation changes. Future climate simulations show very similar changes to those observed in the current climate experiments. This indicates that the impacts of climate change are relatively independent of the land-cover descriptions used in this study. (orig.)

  7. Regional hydrogeological simulations for Forsmark - numerical modelling using DarcyTools. Preliminary site description Forsmark area version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-12-15

    A numerical model is developed on a regional scale (hundreds of square kilometres) to study the zone of influence for variable-density groundwater flow that affects the Forsmark area. Transport calculations are performed by particle tracking from a local-scale release area (a few square kilometres) to test the sensitivity to different hydrogeological uncertainties and the need for far-field realism. The main objectives of the regional flow modelling were to achieve the following: I. Palaeo-hydrogeological understanding: An improved understanding of the palaeohydrogeological conditions is necessary in order to gain credibility for the site descriptive model in general and the hydrogeological description in particular. This requires modelling of the groundwater flow from the last glaciation up to the present day, with comparisons against measured TDS and other hydro-geochemical measures. II. Simulation of flow paths: The simulation and visualisation of flow paths from a tentative repository area is a means for describing the role of the current understanding of the modelled hydrogeological conditions in the target volume, i.e. the conditions of primary interest for Safety Assessment. Of particular interest here is demonstration of the need for detailed far-field realism in the numerical simulations. The motivation for a particular model size (and resolution) and set of boundary conditions for a realistic description of the recharge and discharge connected to the flow at repository depth is an essential part of the groundwater flow path simulations. The numerical modelling was performed by two separate modelling teams, the ConnectFlow Team and the DarcyTools Team. The work presented in this report was based on the computer code DarcyTools developed by Computer-aided Fluid Engineering. DarcyTools is a kind of equivalent porous media (EPM) flow code specifically designed to treat flow and salt transport in sparsely fractured crystalline rock intersected by transmissive

  8. The Balanced Scorecard in Portuguese Private Organizations: A Descriptive Analysis

    Directory of Open Access Journals (Sweden)

    Patrícia Rodrigues Quesado

    2012-06-01

    The understanding that the existing models applied to business performance evaluation were obsolete and could induce companies to take erroneous decisions led to the development of information and management control systems that reflect the evolution of non-financial or qualitative key success factors, such as the Balanced Scorecard (BSC). Nowadays, the BSC is one of the most popular management accounting tools. However, reports of high failure rates call for a closer analysis of the challenges and key issues that private organizations have confronted in its implementation. Consequently, more than two decades after the creation of the model, and despite the introduction of some improvements, investment in BSC projects within the corporate and academic environments can still be confirmed. Thus, in order to find out if Portuguese organizations know and implement the BSC, we sent a questionnaire to 549 private organizations, obtaining a response rate of 28.2%. The results suggest that although the majority of respondents know the BSC, its implementation in these organizations is limited and very recent.

  9. Severe Tuberculosis Requiring Intensive Care: A Descriptive Analysis

    Science.gov (United States)

    Figueiredo Dias, Paulo; Ferreira, Alcina Azevedo; Xerinda, Sandra Margarida; Lima Alves, Carlos; Sarmento, António Carlos; dos Santos, Lurdes Campos

    2017-01-01

    Background. This study aims to describe the characteristics of tuberculosis (TB) patients requiring intensive care and to determine the in-hospital mortality and the associated predictive factors. Methods. Retrospective cohort study of all TB patients admitted to the ICU of the Infectious Diseases Department of Centro Hospitalar de São João (Porto, Portugal) between January 2007 and July 2014. Comorbid diagnoses, clinical features, radiological and laboratory investigations, and outcomes were reviewed. Univariate analysis was performed to identify risk factors for death. Results. We included 39 patients: median age was 52.0 years and 74.4% were male. Twenty-one patients (53.8%) died during hospital stay (15 in the ICU). The diagnosis of isolated pulmonary TB, a positive smear for acid-fast bacilli and a positive PCR for Mycobacterium tuberculosis in patients with pulmonary disease, severe sepsis/septic shock, acute renal failure and Multiple Organ Dysfunction Syndrome on admission, the need for mechanical ventilation or vasopressor support, hospital-acquired infection, use of adjunctive corticotherapy, smoking, and alcohol abuse were significantly associated with mortality (p < 0.05). Conclusion. This cohort of TB patients requiring intensive care presented a high mortality rate. Most risk factors for mortality were related to organ failure, but others could be attributed to delay in the diagnostic and therapeutic approach, important targets for intervention. PMID:28250986

  10. Leadership communication styles: a descriptive analysis of health care professionals

    Directory of Open Access Journals (Sweden)

    Rogers R

    2012-06-01

    Rebekah Rogers, School of Communication, East Carolina University, NC, USA. Abstract: The study of leadership in health care is important for many reasons. Health care leaders will inevitably have an impact on the lives of many people, as individuals rely on physicians and nurses during some of the most critical moments in their lives. This paper presents a broad overview of a research study conducted over the past year and highlights its general conclusions. In this study, I examined the leadership styles of health care administrators and those of physicians and nurses who chair departments. Thorough analysis yielded three clear themes: viewpoints on leadership, decision making, and relationships. Physicians' viewpoints on leadership varied; however, it was assumed that they knew they were leaders. Nurses seemed to be in a category of their own, in which it was common for them to use the term “servant leadership.” Results from the hospital administrators suggested that they were always thinking “big picture leadership.” Leadership is a working component of every job and it is important for people to become as educated as possible about their own communication style. Keywords: leadership, communication, health care

  11. Concentration in the Greek private hospital sector: a descriptive analysis.

    Science.gov (United States)

    Boutsioli, Zoe

    2007-07-01

    Over the last 20 years, governments all around the world have attempted to boost the role of markets and competition in health care industries in order to increase efficiency and reduce costs. The increased competition and the significant implications on costs and prices of health care services resulted in health care industries being transformed. Large firms are merging and acquiring other firms. If this trend continues, few firms will dominate the health care markets. In this study, I use the simple concentration ratio (CR) for the largest 4, 8 and 20 companies to measure the concentration of Greek private hospitals during the period 1997-2004. Also, the Gini coefficient for inequality is used. For the two categories of hospitals used, (a) general and neuropsychiatric and (b) obstetric/gynaecological, it is evident that the top four firms of the first category accounted for 43% of sales in 1997, and 52% in 2004, while the four largest firms of the second category accounted for almost 83% in 1997, and 81% in 2004. Also, the Gini coefficient increases over the 8-year period examined, from 0.69 in 1997 to 0.82 in 2004. It shows that the market for private health care services has become less equal, in the sense that fewer private hospitals and clinics hold more and more of the share of total sales. From a cross-industry analysis it is clear that the private hospital sector has the highest concentration rate. Finally, it appears that the market structure of the private hospitals in Greece more closely resembles an oligopoly than monopolistic competition, since very few firms dominate the market.
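The two summary statistics used in this study have a compact computational form: the k-firm concentration ratio is the share of total sales held by the k largest firms, and the Gini coefficient is computed from the sorted sales vector. A short sketch (the sales figures are invented for illustration):

```python
def concentration_ratio(sales, k):
    """CR_k: share of total sales held by the k largest firms."""
    top_k = sorted(sales, reverse=True)[:k]
    return sum(top_k) / sum(sales)

def gini(sales):
    """Gini coefficient of the sales distribution:
    0 = perfectly equal shares, (n-1)/n = one firm holds everything."""
    x = sorted(sales)                                   # ascending
    n = len(x)
    weighted = sum((i + 1) * v for i, v in enumerate(x))
    return 2 * weighted / (n * sum(x)) - (n + 1) / n
```

For sales of [50, 30, 10, 5, 5], CR_4 is 0.95; a market of four equal firms has a Gini of 0.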

  12. Description and characterization of a novel method for partial volume simulation in software breast phantoms.

    Science.gov (United States)

    Chen, Feiyu; Bakic, Predrag R; Maidment, Andrew D A; Jensen, Shane T; Shi, Xiquan; Pokrajac, David D

    2015-10-01

    A modification to our previous simulation of breast anatomy is proposed to improve the quality of simulated x-ray projection images. The image quality is affected by the voxel size of the simulation. Large voxels can cause notable spatial quantization artifacts; small voxels extend the generation time and increase the memory requirements. An improvement in image quality is achievable without reducing voxel size by the simulation of partial volume averaging, in which voxels containing more than one simulated tissue type are allowed. The linear x-ray attenuation coefficient of a voxel is, thus, the sum of the linear attenuation coefficients weighted by the voxel subvolume occupied by each tissue type. A local planar approximation of the boundary surface is employed. In the two-material case, the partial volume in each voxel is computed by decomposition into up to four simple geometric shapes. In the three-material case, by application of the Gauss-Ostrogradsky theorem, the 3D partial volume problem is converted into one of a few simpler 2D surface area problems. We illustrate the benefits of the proposed methodology on simulated x-ray projections. An efficient encoding scheme is proposed for the type and proportion of simulated tissues in each voxel. Monte Carlo simulation was used to evaluate the quantitative error of our approximation algorithms.
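The weighting step described above is straightforward once the subvolume fractions are known (computing those fractions from the planar boundary is the substance of the paper's geometric decomposition). A minimal sketch of the mixing rule, with illustrative attenuation coefficients:

```python
def voxel_attenuation(fractions, mus):
    """Effective linear x-ray attenuation of a partial-volume voxel:
    the attenuation coefficients of the tissue types present, weighted
    by the subvolume fraction each type occupies."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("subvolume fractions must sum to 1")
    return sum(f * mu for f, mu in zip(fractions, mus))

# A voxel 70% one tissue (mu = 0.2/cm) and 30% another (mu = 0.5/cm);
# the coefficients are made up for illustration.
mu_eff = voxel_attenuation([0.7, 0.3], [0.2, 0.5])
```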

  13. CONDENSED MONTE-CARLO SIMULATIONS FOR THE DESCRIPTION OF LIGHT TRANSPORT

    NARCIS (Netherlands)

    GRAAFF, R; KOELINK, MH; DEMUL, FFM; ZIJLSTRA, WG; DASSEL, ACM; AARNOUDSE, JG

    1993-01-01

    A novel method, condensed Monte Carlo simulation, is presented that applies the results of a single Monte Carlo simulation for a given albedo mu(s)/(mu(a) + mu(s)) to obtaining results for other albedos; mu(s) and mu(a) are the scattering and absorption coefficients, respectively. The method require
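The core trick, as described, is that a single baseline run can be re-used for other albedos. A hedged sketch of the reweighting idea: if a detected photon underwent n scattering events in the baseline run at albedo a0, its contribution at a new albedo a is scaled by (a/a0)^n. The function below assumes per-photon scattering counts were recorded; any path-length rescaling the published method performs is omitted here.

```python
def rescale_albedo(scatter_counts, a0, a):
    """Estimate the mean detected photon weight at albedo `a` by
    reweighting photons from one baseline simulation at albedo `a0`.
    Each photon's weight is (a/a0)**n, with n its number of scattering
    events in the baseline run."""
    weights = [(a / a0) ** n for n in scatter_counts]
    return sum(weights) / len(weights)
```

Reweighting to the baseline albedo itself returns 1.0, a quick sanity check.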

  14. Simulation on design-based and model-based methods in descriptive analysis of complex samples

    Institute of Scientific and Technical Information of China (English)

    李镒冲; 于石成; 赵寅君; 姜勇; 王丽敏; 张梅; 蒋炜; 包鹤龄; 周脉耕

    2015-01-01

    Objective: To compare the performance of design-based and model-based methods in the descriptive analysis of complex samples. Methods: Using mean systolic blood pressure (SBP) and the prevalence of raised blood pressure from the 2010 China chronic disease and risk factor surveillance as material, 1 000 samples (each of 2 000 subjects) were drawn by simulated multistage random sampling; response rates that increased with age were imposed so that the sample age structure deviated from that of the target population. Taking the mean squared error (MSE) and the probability that the 95% confidence interval (CI) covers the population parameter as evaluation criteria, the design-based method, the conventional model-based method, and the multilevel model were compared for the descriptive analysis of means and rates. Results: In estimating mean SBP, the MSE was 6.41 for the conventional method, 1.38 for the design-based method, and 5.86 for the multilevel model, the design-based method performing best; the probabilities that the 95% CIs of the three methods covered the population parameter were 24.7%, 97.5% and 84.3%, respectively, so both the conventional method and the multilevel model inflated the type I error of statistical inference. In estimating the prevalence of raised blood pressure, the MSE of the design-based method was 4.80, better than the conventional method (20.9) and the multilevel model (17.2); the probability that the conventional method's 95% CI covered the population parameter was only 29.4%, and that of the multilevel model 86.4%, both lower than the design-based method (97.3%). Conclusion: For the descriptive analysis of complex sampling data whose sample structure deviates systematically from the target population, the design-based method outperforms the conventional method and the multilevel model in both unbiasedness of estimation and validity of statistical inference, and should be the method of first choice.
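The design-based estimator evaluated in this study corrects for age-biased response by weighting each subject by the inverse of their inclusion (response) probability. A minimal sketch of that idea with a hypothetical two-age-group sample (the full analysis would use the complete multistage design information):

```python
def unweighted_mean(values):
    """Conventional (model-based, i.i.d.) estimate: ignores the design."""
    return sum(values) / len(values)

def design_weighted_mean(values, incl_probs):
    """Design-based estimate: weight each unit by 1 / inclusion probability
    (a Horvitz-Thompson style ratio estimator)."""
    weights = [1.0 / p for p in incl_probs]
    return sum(w * y for w, y in zip(weights, values)) / sum(weights)

# Hypothetical sample: one young subject (SBP 120, response prob 0.2) and
# four older subjects (SBP 140, response prob 0.8) from a half-young,
# half-old population whose true mean SBP is 130.
sbp   = [120, 140, 140, 140, 140]
probs = [0.2, 0.8, 0.8, 0.8, 0.8]
```

The unweighted mean is 136, biased upward by the over-response of older subjects; the design-weighted mean recovers the population value of 130.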

  15. Passive-solar simulation analysis

    Energy Technology Data Exchange (ETDEWEB)

    1982-12-05

    A passive solar heating component (TANKWALL) was developed. TANKWALL is a fiberglass water wall, selective surfaced, solar absorption, thermal storage tank. Testing of the thermal performance of TANKWALL was performed which included a series of room and instrumentation calibration tests and several comparison experiments where TANKWALL was operated side-by-side with other passive solar configurations. Relevant effective heat transfer values (solar absorptance, capacitance, conductance) were derived using a thermal network computer model and data from the above experiments. Then using these heat transfer values, three hour-by-hour computer simulations were run for a proto-typical passive solar house using various solar systems and standard hourly weather data. The subject house had a Building Load Coefficient of 8899.2 Btu/H, a vertical south facing 200 sq. ft. collector, and a Load/Collector Ratio of 44.5. It was found that the TANKWALL with selective surface performs significantly better in the Dayton, Ohio area than either the painted 12 inch Trombe wall or the painted TANKWALL with R-4.5 night insulation.
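The sizing numbers quoted above are internally consistent: the Load/Collector Ratio is the Building Load Coefficient divided by the collector area (units as given in the abstract; the function name is illustrative):

```python
def load_collector_ratio(building_load_coefficient, collector_area_sqft):
    """LCR: building heating load per square foot of south-facing
    collector, used to enter LCR tables for passive solar estimates."""
    return building_load_coefficient / collector_area_sqft

# Values from the abstract: BLC of 8899.2 and a 200 sq. ft. collector.
lcr = load_collector_ratio(8899.2, 200.0)   # ~44.5, matching the stated LCR
```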

  16. A Survey on how Description Logic Ontologies Benefit from Formal Concept Analysis

    CERN Document Server

    Sertkaya, Baris

    2011-01-01

    Although the notion of a concept as a collection of objects sharing certain properties, and the notion of a conceptual hierarchy, are fundamental to both Formal Concept Analysis and Description Logics, the ways concepts are described and obtained differ significantly between these two research areas. Despite these differences, there have been several attempts to bridge the gap between the two formalisms, and attempts to apply methods from one field in the other. The present work aims to give an overview of the research done in combining Description Logics and Formal Concept Analysis.
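The FCA notion of a concept mentioned above has a compact computational core: two derivation operators mapping a set of objects to its shared attributes and back. A small sketch over a toy formal context (the objects and attributes are invented for illustration):

```python
def intent(objs, context):
    """Attributes shared by every object in objs (FCA derivation operator)."""
    if not objs:
        return set().union(*context.values())   # empty set derives to all attributes
    return set.intersection(*(context[o] for o in objs))

def extent(attrs, context):
    """Objects that possess every attribute in attrs."""
    return {o for o, has in context.items() if set(attrs) <= has}

# Toy formal context: objects mapped to their attribute sets.
context = {
    "sparrow": {"bird", "flies"},
    "penguin": {"bird"},
    "bat": {"flies"},
}
```

A pair (A, B) with intent(A) == B and extent(B) == A is a formal concept; here ({"sparrow"}, {"bird", "flies"}) is one, since each side is the derivation of the other.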

  17. Usage of a Responsible Gambling Tool: A Descriptive Analysis and Latent Class Analysis of User Behavior.

    Science.gov (United States)

    Forsström, David; Hesser, Hugo; Carlbring, Per

    2016-09-01

    Gambling is a common pastime around the world. Most gamblers can engage in gambling activities without negative consequences, but some run the risk of developing an excessive gambling pattern. Excessive gambling has severe negative economic and psychological consequences, which makes the development of responsible gambling strategies vital to protecting individuals from these risks. One such strategy is responsible gambling (RG) tools. These tools track an individual's gambling history and supply personalized feedback, and might be one way to decrease excessive gambling behavior. However, research is lacking in this area and little is known about the usage of these tools. The aim of this article is to describe user behavior and to investigate whether there are different subclasses of users by conducting a latent class analysis. The user behaviour of 9528 online gamblers who voluntarily used an RG tool was analysed. Number of visits to the site, self-tests made, and advice used were the observed variables included in the latent class analysis. Descriptive statistics show that overall the functions of the tool had a high initial usage and a low repeated usage. Latent class analysis yielded five distinct classes of users: self-testers, multi-function users, advice users, site visitors, and non-users. Multinomial regression revealed that classes were associated with different risk levels of excessive gambling. The self-testers and multi-function users used the tool to a higher extent and were found to have a greater risk of excessive gambling than the other classes.

  18. Stochastic simulation algorithms and analysis

    CERN Document Server

    Asmussen, Soren

    2007-01-01

    Sampling-based computational methods have become a fundamental part of the numerical toolset of practitioners and researchers across an enormous number of different applied domains and academic disciplines. This book provides a broad treatment of such sampling-based methods, as well as accompanying mathematical analysis of the convergence properties of the methods discussed. The reach of the ideas is illustrated by discussing a wide range of applications and the models that have found wide usage. The first half of the book focuses on general methods, whereas the second half discusses model-specific algorithms. Given the wide range of examples, exercises and applications, students, practitioners and researchers in probability, statistics, operations research, economics, finance and engineering, as well as biology, chemistry and physics, will find the book of value.

  19. Regional hydrogeological simulations using CONNECTFLOW. Preliminary site description. Laxemar subarea - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hunter, Fiona; Jackson, Peter; McCarthy, Rachel [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2006-04-15

    The main objective of this study is to support the development of a preliminary Site Description of the Laxemar subarea on a regional-scale based on the available data of November 2004 (Data Freeze L1.2). A more specific objective of this study is to assess the role of both known and less quantified hydrogeological conditions in determining the present-day distribution of saline groundwater in the Laxemar subarea on a regional-scale. An improved understanding of the palaeo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale, as well as predictions of future hydrogeological conditions. Another objective is to assess the flow-paths from the local-scale model domain, based on the present-day flow conditions, to assess the distribution of discharge and recharge areas connected to the flow at the approximate repository depth to inform the Preliminary Safety Evaluation. Significant new features incorporated in the modelling include: a depth variation in hydraulic properties within the deformation zones; a dependence on rock domain and depth in the rock mass properties in regional-scale models; a more detailed model of the overburden in terms of a layered system of spatially variable thickness made up of several different types of Quaternary deposits has been implemented; and several variants on the position of the watertable have been tried. The motivation for introducing a dependence on rock domain was guided by the hydrogeological interpretation with the aim of honouring the observed differences in hydraulic properties measured at the boreholes.

  20. Application of three-dimensional simulation at lecturing on descriptive geometry

    Directory of Open Access Journals (Sweden)

    Tel'noy Viktor Ivanovich

    2014-05-01

    Teaching descriptive geometry has its own characteristics. One needs not only to impart a certain amount of knowledge on the subject to students, but also to develop their spatial imagination and their skills of logical thinking. Practice of teaching the discipline showed that students face serious difficulties in the process of its study. This is due to the relatively low level of their schooling in geometry and technical drawing, and a lack of spatial imagination. They find it difficult to imagine the geometrical image of the object of study and mentally convert it on the plane. Because of this, there is a need to find ways to teach the discipline «Descriptive Geometry» effectively at university. In the context of global informatization and computerization of the educational process, the use of graphics programs for the development of design documentation and 3D modeling is one of the most promising applications of information technology in solving these problems. With the help of three-dimensional models the best visibility in the classroom is achieved. When conducting lectures on descriptive geometry, it is advisable to use three-dimensional modeling not only as a didactic means (a means of demonstration), but also as a method of teaching (a learning tool) for dealing with various graphics tasks. Bearing this in mind, the essence of the implementation of 3D modeling is revealed with the aim of better understanding of the algorithms for solving both positional and metric tasks using spatial representation of graphic constructions. It is shown that the possibility to consider the built model from different angles is of particular importance, as well as the use of transparency properties for illustrating the results of solving geometric problems. Using 3D models together with their display on the plane, as well as text information, promotes better assimilation and more lasting memorization of the

  1. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report concern two issues mainly, (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations.

  2. Evaluation of bitterness in white wine applying descriptive analysis, time-intensity analysis, and temporal dominance of sensations analysis.

    Science.gov (United States)

    Sokolowsky, Martina; Fischer, Ulrich

    2012-06-30

    Bitterness in wine, especially in white wine, is a complex and sensitive topic, as it is a persistent sensation with negative connotations for consumers. However, the molecular basis for bitter taste in white wines is still widely unknown. At the same time, studies dealing with bitterness have to cope with the temporal dynamics of bitter perception. The most common method to describe bitter taste is the static measurement amongst other attributes during a descriptive analysis. A less frequently applied method, time-intensity analysis, evaluates the temporal gustatory changes focusing on bitterness alone. The most recently developed multidimensional approach, the temporal dominance of sensations method, reveals the temporal dominance of bitter taste in relation to other attributes. In order to compare the results obtained with these different sensory methodologies, 13 commercial white wines were evaluated by the same panel. To facilitate a statistical comparison, parameters were extracted from bitterness curves obtained from time-intensity and temporal dominance of sensations analysis and were compared to bitter intensity as well as bitter persistency based on descriptive analysis. Analysis of variance differentiated the wines significantly regarding all measured bitterness parameters obtained from the three sensory techniques. Comparing the information of all sensory parameters by multiple factor analysis and correlation, each technique provided additional valuable information regarding the complex bitter perception in white wine.
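The parameter extraction mentioned for the bitterness curves can be illustrated with the usual time-intensity summaries: maximum intensity, time to maximum, and area under the curve. A sketch with an invented curve (the actual parameter set used by the panel is not specified here):

```python
def ti_parameters(times, intensities):
    """Summarise a time-intensity curve: maximum intensity (Imax),
    the time at which it occurs (Tmax), and the trapezoidal area
    under the curve (AUC), three common TI parameters."""
    imax = max(intensities)
    tmax = times[intensities.index(imax)]
    auc = sum((t1 - t0) * (i0 + i1) / 2.0
              for t0, t1, i0, i1 in zip(times, times[1:],
                                        intensities, intensities[1:]))
    return imax, tmax, auc

# Invented curve: bitterness rises to 4 at t = 1 s, then decays by t = 3 s.
imax, tmax, auc = ti_parameters([0, 1, 2, 3], [0, 4, 2, 0])
```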

  3. Description and analysis of the debris flows occurred during 2008 in the Eastern Pyrenees

    Directory of Open Access Journals (Sweden)

    M. Portilla

    2010-07-01

    Rainfall-triggered landslides taking place in the Spanish Eastern Pyrenees have usually been analysed on a regional scale. Most research focussed either on terrain susceptibility or on the characteristics of the critical rainfall, neglecting a detailed analysis of individual events. In contrast to other mountainous regions, research on debris flow has only been performed marginally and associated hazard has mostly been neglected.

    In this study, five debris flows, which occurred in 2008, are selected; and site specific descriptions and analysis regarding geology, morphology, rainfall data and runout were performed. The results are compared with worldwide data and some conclusions on hazard assessment are presented.

    The five events can be divided into two in-channel debris flows and three landslide-triggered debris flows. The in-channel generated debris flows exceeded 10 000 m3, which are unusually large mass movements compared to historic events which occurred in the Eastern Pyrenees. In contrast, the other events mobilised total volumes less than 2000 m3. The geomorphologic analysis showed that the studied events emphasize similar patterns when compared to published data focussing on slope angle in the initiation zone or catchment area.

    Rainfall data revealed that all debris flows were triggered by high intensity-short duration rainstorms during the summer season. Unfortunately, existing rainfall thresholds in the Eastern Pyrenees consider long-lasting rainfall, usually occurring in autumn/winter. Therefore, new thresholds should be established taking into account the rainfall peak intensity in mm/h, which seems to be a much more relevant factor for summer than the event's total precipitation.

    The runout analysis of the 2008 debris flows confirms the trend that larger volumes generally induce higher mobility. The numerical simulation of the Riu Runer event shows that its dynamic behaviour

  4. A descriptive analysis of studies on behavioural treatment of drooling (1970-2005).

    NARCIS (Netherlands)

    Burg, J.J. van der; Didden, R.; Jongerius, P.H.; Rotteveel, J.J.

    2007-01-01

    A descriptive analysis was conducted on studies on the behavioural treatment of drooling (published between 1970 and 2005). The 17 articles that met the inclusion criteria described 53 participants (mean age 14y 7mo, [SD 4y 9mo]; range 6-28y). Sex of 87% of the participants was reported: 28 male, 18

  6. Effects of a Training Package to Improve the Accuracy of Descriptive Analysis Data Recording

    Science.gov (United States)

    Mayer, Kimberly L.; DiGennaro Reed, Florence D.

    2013-01-01

    Functional behavior assessment is an important precursor to developing interventions to address a problem behavior. Descriptive analysis, a type of functional behavior assessment, is effective in informing intervention design only if the gathered data accurately capture relevant events and behaviors. We investigated a training procedure to improve…

  7. Description language for the modelling and analysis of temporal change of instrumentation and control system structures

    Energy Technology Data Exchange (ETDEWEB)

    Goering, Markus Heinrich

    2013-10-25

    The utilisation of computer-based I and C, as a result of the technological advancements in the computer industry, represents an up-to-date challenge for I and C engineers in nuclear power plants throughout the world. In comparison with the time-proven, hard-wired I and C, the engineering must consider the novel characteristics of computer-based technology during the implementation; these consist primarily of higher performance and the utilisation of software. On the one hand, this allows for implementing more complex I and C functions and integrating several I and C functions onto single components; on the other hand, the minimisation of the CCF probability is of high priority to the engineering. Furthermore, the engineering must take the implementation of the deterministic safety concept for the I and C design into consideration. This includes engineering the redundancy, diversity, physical separation, and independence design features, and is complemented by the analysis of the I and C design with respect to the superposition of pre-defined event sequences and postulated failure combinations, so as to secure the safe operation of the nuclear power plant. The focus of this thesis is on the basic principles of engineering, i.e. the description languages and methods which the engineering relies on for a highly qualitative and efficient computer-based I and C implementation. The analysis of the deterministic safety concept and of the characteristics of computer-based I and C yields the relevant technical requirements for the engineering; these are combined with the general structuring principles of standard IEC 81346 and the extended description language evaluation criteria, which are based on the guideline VDI/VDE-3681, resulting in target criteria for evaluating description languages. The analysis and comparison of existing description languages reveals that no description language satisfactorily fulfils all target criteria, which is constituted in the

  8. Power vectors: an application of Fourier analysis to the description and statistical analysis of refractive error.

    Science.gov (United States)

    Thibos, L N; Wheeler, W; Horner, D

    1997-06-01

    The description of sphero-cylinder lenses is approached from the viewpoint of Fourier analysis of the power profile. It is shown that the familiar sine-squared law leads naturally to a Fourier series representation with exactly three Fourier coefficients, representing the natural parameters of a thin lens. The constant term corresponds to the mean spherical equivalent (MSE) power, whereas the amplitude and phase of the harmonic correspond to the power and axis of a Jackson cross-cylinder (JCC) lens, respectively. Expressing the Fourier series in rectangular form leads to the representation of an arbitrary sphero-cylinder lens as the sum of a spherical lens and two cross-cylinders, one at axis 0 degree and the other at axis 45 degrees. The power of these three component lenses may be interpreted as (x,y,z) coordinates of a vector representation of the power profile. Advantages of this power vector representation of a sphero-cylinder lens for numerical and graphical analysis of optometric data are described for problems involving lens combinations, comparison of different lenses, and the statistical distribution of refractive errors.
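    The decomposition described above is commonly written as M = S + C/2, J0 = −(C/2)·cos 2α, J45 = −(C/2)·sin 2α, where (S, C, α) are the sphere, cylinder and axis of the prescription. A minimal sketch under the assumption of this standard sign convention (the function name is illustrative, not from the paper):

```python
import math

def power_vector(sphere, cylinder, axis_deg):
    """Convert a sphero-cylinder prescription (S, C, axis) to the
    power-vector coordinates (M, J0, J45) described in the abstract:
    M is the mean spherical equivalent, J0 and J45 the powers of the
    Jackson cross-cylinder components at axes 0 and 45 degrees."""
    a = math.radians(axis_deg)
    m = sphere + cylinder / 2.0                 # mean spherical equivalent
    j0 = -(cylinder / 2.0) * math.cos(2 * a)    # JCC component at axis 0
    j45 = -(cylinder / 2.0) * math.sin(2 * a)   # JCC component at axis 45
    return m, j0, j45

# Example: a -2.00 DS / -1.00 DC x 180 prescription
m, j0, j45 = power_vector(-2.0, -1.0, 180)
```

Because (M, J0, J45) are rectangular coordinates, lens combination reduces to vector addition and population statistics of refractive error can be computed component-wise, which is the practical advantage the abstract points out.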

  9. Ambrosia artemisiifolia L. pollen simulations over the Euro-CORDEX domain: model description and emission calibration

    Science.gov (United States)

    Liu, Li; Solmon, Fabien; Giorgi, Filippo; Vautard, Robert

    2014-05-01

    Ragweed (Ambrosia artemisiifolia L.) is a highly allergenic invasive plant. Its pollen can be transported over large distances and has been recognized as a significant cause of hayfever and asthma (D'Amato et al., 2007). In the context of the ATOPICA EU program we are studying the links between climate, land use and ecological changes and the ragweed pollen emissions and concentrations. For this purpose, we implemented a pollen emission/transport module in the RegCM4 regional climate model in collaboration with ATOPICA partners. The Abdus Salam International Centre for Theoretical Physics (ICTP) regional climate model, RegCM4, was adapted to incorporate the pollen emissions from the ORCHIDEE (French) Global Land Surface Model and a pollen tracer model describing pollen convective transport, turbulent mixing, and dry and wet deposition over extensive domains, using consistent assumptions regarding the transport of multiple species (Fabien et al., 2008). We performed two families of recent-past simulations on the Euro-CORDEX domain (simulations for future conditions are being considered). Hindcast simulations (2000-2011) were driven by the ERA-Interim re-analyses and designed to best simulate airborne pollen over past periods; they were calibrated with part of the observations and verified by comparison with the remaining observations. Historical simulations (1985-2004) were driven by HadGEM CMIP5 output and designed to serve as a baseline for comparison with future airborne concentrations as obtained from climate and land-use scenarios. To reduce the uncertainties in the ragweed pollen emission, an assimilation-like method (Rouïl et al., 2009) was used to calibrate the release based on airborne pollen observations. The observations were divided into two groups, used for calibration and validation separately. A wide range of possible calibration coefficients was tested for each calibration station, bringing the bias between observations and simulations within an admissible value

  10. Xanthogranulomatous Pyelonephritis Can Simulate a Complex Cyst: Case Description and Review of Literature

    Directory of Open Access Journals (Sweden)

    Salvatore Butticè

    2014-05-01

    Xanthogranulomatous pyelonephritis is a rare and peculiar form of chronic pyelonephritis and is generally associated with renal lithiasis. Its incidence is higher in females. The peculiarity of this disease is that it requires a differential diagnosis, because it can often simulate dramatic pathologic conditions. In fact, cases in association with squamous cell carcinoma of the kidney have also been described in the literature. The radiologic clinical findings simulate renal masses, sometimes in association with caval thrombus. We describe a case of xanthogranulomatous pyelonephritis with the radiologic aspects of a complex cyst of Bosniak class III in a 40-year-old man.

  11. Descriptive Research

    DEFF Research Database (Denmark)

    Wigram, Anthony Lewis

    2003-01-01

    Descriptive research is described by Lathom-Radocy and Radocy (1995) to include Survey research, ex post facto research, case studies and developmental studies. Descriptive research also includes a review of the literature in order to provide both quantitative and qualitative evidence of the effect...... starts will allow effect size calculations to be made in order to evaluate effect over time. Given the difficulties in undertaking controlled experimental studies in the creative arts therapies, descriptive research methods offer a way of quantifying effect through descriptive statistical analysis...

  12. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Virotta, Francesco

    2012-02-21

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation, where we find that our estimate of the exponential autocorrelation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a)-improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles, both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation values of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is intended to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10)·τ_exp. This is the typical case when simulating close to the continuum limit, where the computational cost of producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
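    The error-analysis problem the abstract describes is usually quantified through the integrated autocorrelation time, τ_int = 1/2 + Σ_t ρ(t), which inflates the naive statistical error of a Monte Carlo average. A minimal estimator with a fixed summation window is sketched below; this is an illustration of the standard definition, not the publicly available analysis code the abstract refers to:

```python
import numpy as np

def tau_int(samples, window):
    """Estimate the integrated autocorrelation time of a Monte Carlo
    time series by summing the normalized autocorrelation function
    rho(t) up to a fixed window: tau_int = 0.5 + sum_t rho(t)."""
    x = np.asarray(samples, dtype=float)
    x = x - x.mean()
    n = len(x)
    var = np.dot(x, x) / n
    tau = 0.5
    for t in range(1, window + 1):
        rho = np.dot(x[:-t], x[t:]) / ((n - t) * var)
        tau += rho
    return tau

# For uncorrelated data tau_int is close to 0.5; chains affected by
# slow modes (e.g. near the continuum limit) give much larger values.
rng = np.random.default_rng(0)
iid = rng.standard_normal(200_000)
t_iid = tau_int(iid, window=50)
```

In practice the window must be chosen self-consistently (e.g. via the Madras-Sokal criterion), precisely because truncating too early underestimates the slow-mode contribution that this thesis is concerned with.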

  13. Alpal, a Program to Generate Physics Simulation Codes from Natural Descriptions

    Science.gov (United States)

    Cook, G. O.

    A Livermore Physics Applications Language (ALPAL), a new computer language, is described. ALPAL is a tool that generates a Fortran code module from a natural description of a physics model. This capability gives the computational physicist a significant productivity boost. While ALPAL is a working computer program, significant additions are being made to it. Some of the factors that make ALPAL an important tool are: first, it eliminates many sources of errors; second, it permits building program modules with far greater speed than is otherwise possible; third, it provides a means of specifying many numerical algorithms; and fourth, it is a language that is close to a journal-style presentation of physics models and numerical methods for solving them. In sum, ALPAL is designed to magnify the abilities and creativity of computational physicists.

  14. Preliminary site description: Groundwater flow simulations. Simpevarp area (version 1.1) modelled with CONNECTFLOW

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Worth, David [Serco Assurance Ltd, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden); Holmen, Johan [Golder Associates, Stockholm (Sweden)

    2004-08-01

    The main objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater at the Simpevarp and Laxemar sites. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Descriptive Model in general and the Site Hydrogeological Description in particular. This is to serve as a basis for describing the present hydrogeological conditions as well as predictions of future hydrogeological conditions. This objective implies a testing of: geometrical alternatives in the structural geology and bedrock fracturing, variants in the initial and boundary conditions, and parameter uncertainties (i.e. uncertainties in the hydraulic property assignment). This testing is necessary in order to evaluate the impact on the groundwater flow field of the specified components and to promote proposals for further investigations of the hydrogeological conditions at the site. The general methodology for modelling transient salt transport and groundwater flow using CONNECTFLOW that was developed for Forsmark has been applied successfully also for Simpevarp. Because of time constraints, only a key set of variants was performed, focussing on the influences of DFN model parameters, the kinematic porosity, and the initial condition. The salinity data in deep boreholes available at the time of the project were too limited to allow a good calibration exercise. However, the model predictions are compared with the available data from KLX01 and KLX02 below. Once more salinity data are available it may be possible to draw more definite conclusions based on the differences between variants. At the moment, though, the differences should just be used to understand the sensitivity of the models to various input parameters.

  15. TRANSIMS: Transportation analysis and simulation system

    Energy Technology Data Exchange (ETDEWEB)

    Smith, L.; Beckman, R.; Baggerly, K. [and others]

    1995-07-01

    This document summarizes the TRansportation ANalysis and SIMulation System (TRANSIMS) Project, the system's major modules, and the project's near-term plans. TRANSIMS will employ advanced computational and analytical techniques to create an integrated regional transportation systems analysis environment. The simulation environment will include a regional population of individual travelers and freight loads with travel activities and plans, whose individual interactions will be simulated on the transportation system, and whose environmental impact will be determined. We will develop an interim operational capability (IOC) for each major TRANSIMS module during the five-year program. When the IOC is ready, we will complete a specific case study to confirm the IOC features, applicability, and readiness.

  16. Simulation and analysis of flexibly jointed manipulators

    Science.gov (United States)

    Murphy, Steve H.; Wen, John T.; Saridis, George M.

    1990-01-01

    Modeling, simulation, and analysis of robot manipulators with non-negligible joint flexibility are studied. A recursive Newton-Euler model of the flexibly jointed manipulator is developed with many advantages over the traditional Lagrange-Euler methods. The Newton-Euler approach leads to a method for the simulation of a flexibly jointed manipulator in which the number of computations grows linearly with the number of links. Additionally, any function for the flexibility between the motor and link may be used permitting the simulation of nonlinear effects, such as backlash, in a uniform manner for all joints. An analysis of the control problems for flexibly jointed manipulators is presented by converting the Newton-Euler model to a Lagrange-Euler form. The detailed structure available in the model is used to examine linearizing controllers and shows the dependency of the control on the choice of flexible model and structure of the manipulator.

  17. Design and Analysis of Simulation Experiments

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2015-01-01

    This is a new edition of Kleijnen’s advanced expository book on statistical methods for the Design and Analysis of Simulation Experiments (DASE). Altogether, this new edition has approximately 50% new material not in the original book. More specifically, the author has made significant changes to

  18. Critical Discourse Analysis, Description, Explanation, Causes: Foucault's Inspiration Versus Weber's Perspiration

    Directory of Open Access Journals (Sweden)

    Gary Wickham

    2007-05-01

    The FOUCAULTian governmentality approach, in relying on a teleology—the ultimate purpose of human endeavour is the quest for ever-growing human reason, a reason that is the universal basis of moral judgements, especially moral judgements about political and legal actions—leads not to description, explanation and the possible identification of causes, but to critique, to the inappropriate conflation of, on the one hand, description, explanation and the identification of causes with, on the other, political criticisms sourced in the teleology. Drawing on some of WEBER's methodological insights, an argument is developed that critical discourse analysis, in taking on the FOUCAULTian approach, gives up the best traditions of description, explanation and the identification of causes in favour of the expression, in many different forms, of the teleology. URN: urn:nbn:de:0114-fqs070246

  19. Computer Modeling and Simulation of Geofluids: A General Review and Sample Description

    Institute of Scientific and Technical Information of China (English)

    Hu Wenxuan; Duan Zhenhao; et al.

    1997-01-01

    Thermodynamic properties of fluids are essential for understanding the geochemical behavior of various processes. The paper introduces the most up-to-date computer modeling and simulation methods in the study of the thermodynamics of geofluids, including semi-empirical models (such as equations of state) and molecular dynamics and Monte Carlo simulation. A well-established semi-empirical model can interpolate and extrapolate experimental data and yield much physicochemical information. Computer modeling may produce "experimental data" even under experimentally difficult conditions. These methods provide an important quantitative basis for the study of geological fluid systems.

  20. Software for Simulation of Power Plant Processes. Part B - Program Description and Application

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2002-01-01

    Modelling of energy systems has become increasingly important. In particular, the dynamic behaviour is critical when operating the systems closer to the limits (whether of the process, the materials, the emissions or the economics, etc.). This enforces strong requirements on both the models and ...... of the energy system simulator DNA and a short tricky example showing that too simple models may result in unexpected problems....

  1. Radiographic simulations and analysis for ASCI

    Energy Technology Data Exchange (ETDEWEB)

    Aufderheide, M.; Stone, D.; VonWittenau, A.

    1998-12-18

    In this paper, the authors describe their work on developing quantitatively accurate radiographic simulation and analysis tools for ASCI hydro codes. They have extended the ability of HADES, the code which simulates radiography through a mesh, to treat the complex meshes used in ASCI calculations. The ultimate goal is to allow direct comparison between experimental radiographs and full-physics simulated radiographs of ASCI calculations. They describe the ray-tracing algorithm they have developed for fast, accurate simulation of dynamic radiographs with the meshes used in ALE3D, an LLNL ASCI code. Spectral effects and material compositions are included. In addition to the newness of the mesh types, the distributed nature of domain-decomposed problems requires special treatment by the radiographic code. Because of the size of such problems, they have parallelized the radiographic simulation in order to have quick turnaround time. Presently, this is done using the domain decomposition from the hydro code. They demonstrate good parallel scaling as the size of the problem is increased. They show a comparison between an experimental radiograph of a high-explosive detonation and a simulated radiograph of an ALE3D calculation. They conclude with a discussion of future work.
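    The core operation of a radiographic simulation like the one described is line-integral attenuation: each ray accumulates exp(−Σ μ_i ℓ_i) over the mesh zones it crosses, where μ_i is the attenuation coefficient of zone i and ℓ_i the path length through it. A minimal monoenergetic sketch of that step is shown below; this is an illustration of the Beer-Lambert law, not HADES's actual algorithm, which additionally handles spectra, material compositions and domain-decomposed ALE3D meshes:

```python
import math

def transmitted_intensity(i0, segments):
    """Attenuate a ray of initial intensity i0 through a sequence of
    (mu, length) segments using the Beer-Lambert law:
    I = i0 * exp(-sum(mu_i * l_i))."""
    optical_depth = sum(mu * length for mu, length in segments)
    return i0 * math.exp(-optical_depth)

# A ray crossing 2 cm of mu = 0.5/cm material and 1 cm of mu = 1.0/cm:
i = transmitted_intensity(1.0, [(0.5, 2.0), (1.0, 1.0)])
```

The expensive part in practice is not this formula but the ray tracing itself, i.e. finding the (μ, ℓ) segments for every detector pixel through a complex mesh, which is why the paper parallelizes over the hydro code's domain decomposition.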

  2. Pandora - a simulation tool for safety assessments. Technical description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Ekstroem, Per-Anders (Facilia AB (Sweden))

    2010-12-15

    This report documents a flexible simulation tool, Pandora, used in several post-closure safety assessments in both Sweden and Finland to assess the radiological dose to man due to releases from radioactive waste repositories. Pandora allows the user to build compartment models to represent the migration and fate of radionuclides in the environment. The tool simplifies the implementation and simulation of radioecological biosphere models in which there is a large set of radionuclides and input variables. Pandora gains many benefits from being based on the well-known technical computing software MATLAB, and especially on its interactive graphical environment Simulink. MATLAB/Simulink is a highly flexible tool used for simulations of practically any type of dynamic system; it is widely used, continuously maintained, and often upgraded. By basing the tool on this commercial software package, we gain both the graphical interface provided by Simulink and the ability to access the advanced numerical equation-solving routines in MATLAB. Since these numerical methods are well established and quality assured in their MATLAB implementation, the solution methods used in Pandora can be considered to have a high level of quality assurance. The structure of Pandora provides clarity in the model format, which means the model assists its own documentation: the model can be understood by inspecting its structure. With the introduction of the external tool Pandas (Pandora assessment tool), version handling and an integrated way of performing the entire calculation chain have been added. Instead of depending on other commercial statistical software such as @Risk, probabilistic assessments can now be performed within the tool.
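    A compartment model of the kind Pandora implements is a set of coupled first-order ODEs: the inventory of each compartment changes through transfer fluxes and radioactive decay. A minimal two-compartment sketch with forward-Euler time stepping is given below as an illustration only; Pandora itself builds such models graphically in Simulink and solves them with MATLAB's numerical routines, and all names and rates here are hypothetical:

```python
def simulate_compartments(n1, n2, k12, lam, dt, steps):
    """Two-compartment radionuclide model: material transfers from
    compartment 1 to compartment 2 at rate k12 (1/yr) while both
    compartments decay with radioactive decay constant lam (1/yr).
    Integrated with a simple forward-Euler scheme."""
    for _ in range(steps):
        flux = k12 * n1                    # transfer 1 -> 2
        n1 += (-flux - lam * n1) * dt      # losses from compartment 1
        n2 += (flux - lam * n2) * dt       # gain minus decay in compartment 2
    return n1, n2

# 100 units initially in compartment 1; no decay, pure transfer:
n1, n2 = simulate_compartments(100.0, 0.0, k12=0.1, lam=0.0, dt=0.001, steps=10_000)
# after t = 10 yr, n1 is close to 100*exp(-1) and the total is conserved
```

A production tool would of course use a stiff ODE solver rather than explicit Euler, which is exactly the benefit the report attributes to building on MATLAB.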

  3. Three column intermittent simulated moving bed chromatography: 1. Process description and comparative assessment.

    Science.gov (United States)

    Jermann, Simon; Mazzotti, Marco

    2014-09-26

    The three-column intermittent simulated moving bed (3C-ISMB) process is a new type of multi-column chromatographic process for binary separations and can be regarded as a modification of the I-SMB process commercialized by Nippon Rensui Corporation. In contrast to conventional I-SMB, it enables the use of only three instead of four columns without compromising product purity or throughput. The novel mode of operation is characterized by intermittent feeding and product withdrawal as well as by partial recycling of the weakly retained component from section III to section I. Owing to the smaller number of columns with respect to conventional I-SMB, higher internal flow rates can be applied without violating pressure-drop constraints. Therefore, the application of 3C-ISMB allows for a higher throughput whilst using a smaller number of columns. As a result, the productivity, given in terms of throughput per unit time and unit volume of stationary phase, can be significantly increased. In this contribution, we describe the new process concept in detail and analyze its cyclic steady-state behavior through an extensive simulation study. The latter shows that 3C-ISMB can be easily designed by Triangle Theory even under highly non-linear conditions. The simple process design is an important advantage over other advanced SMB-like processes. Moreover, the simulation study demonstrates the superior performance of 3C-ISMB: productivity increases by roughly 60% with respect to conventional I-SMB without a significant penalty in solvent consumption.

  4. Description and evaluation of GMXe: a new aerosol submodel for global simulations (v1)

    Directory of Open Access Journals (Sweden)

    K. J. Pringle

    2010-09-01

    We present a new aerosol microphysics and gas-aerosol partitioning submodel (Global Modal-aerosol eXtension, GMXe) implemented within the ECHAM/MESSy Atmospheric Chemistry model (EMAC, version 1.8). The submodel is computationally efficient and is suitable for medium- to long-term simulations with global and regional models. The aerosol size distribution is treated using 7 log-normal modes and has the same microphysical core as the M7 submodel (Vignati et al., 2004).

    The main developments in this work are: (i) the extension of the aerosol emission routines and the M7 microphysics, so that an increased (and variable) number of aerosol species can be treated (new species include sodium and chloride, and potentially magnesium, calcium, and potassium); (ii) the coupling of the aerosol microphysics to a choice of treatments of gas/aerosol partitioning to allow the treatment of semi-volatile aerosol; and (iii) the implementation and evaluation of the developed submodel within the EMAC model of atmospheric chemistry.

    Simulated concentrations of black carbon, particulate organic matter, dust, sea spray, sulfate and ammonium aerosol are shown to be in good agreement with observations (for all species at least 40% of modeled values are within a factor of 2 of the observations). The distribution of nitrate aerosol is compared to observations in both clean and polluted regions. Concentrations in polluted continental regions are simulated quite well, but there is a general tendency to overestimate nitrate, particularly in coastal regions (geometric mean of modelled values / geometric mean of observed data ≈ 2). In all regions considered, more than 40% of nitrate concentrations are within a factor of two of the observations. Marine nitrate concentrations are well captured, with 96% of modeled values within a factor of 2 of the observations.
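    The evaluation statistics quoted above — the fraction of modeled values within a factor of 2 of the paired observations, and the ratio of geometric means — are simple to compute. A minimal sketch, assuming strictly positive paired concentrations (function names are illustrative):

```python
import math

def factor_of_two_fraction(modeled, observed):
    """Fraction of modeled values within a factor of 2 of the paired observation."""
    ok = sum(1 for m, o in zip(modeled, observed) if 0.5 <= m / o <= 2.0)
    return ok / len(modeled)

def geometric_mean_ratio(modeled, observed):
    """Ratio of the geometric mean of modeled values to that of observed values."""
    def gm(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs))
    return gm(modeled) / gm(observed)

# Hypothetical paired concentrations (same units):
mod = [2.0, 1.0, 8.0, 0.5]
obs = [1.0, 1.0, 1.0, 1.0]
f = factor_of_two_fraction(mod, obs)   # 3 of the 4 pairs are within a factor of 2
r = geometric_mean_ratio(mod, obs)
```

Geometric (rather than arithmetic) means are the natural choice here because aerosol concentrations are approximately log-normally distributed, so a ratio of geometric means of ≈ 2 reads directly as a typical factor-of-two overestimate.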

  5. Development of a compartment model based on CFD simulations for description of mixing in bioreactors

    Directory of Open Access Journals (Sweden)

    Crine, M.

    2010-01-01

    Understanding and modeling the complex interactions between biological reactions and hydrodynamics is a key problem when dealing with bioprocesses. It is fundamental to be able to accurately predict the hydrodynamic behavior of bioreactors of different sizes and its interaction with the biological reaction. CFD can provide detailed modeling of hydrodynamics and mixing. However, it is computationally intensive, especially when reactions are taken into account. Another way to predict hydrodynamics is the use of "compartment" or "multi-zone" models, which are much less demanding in computation time than CFD. However, compartments and the fluxes between them are often defined by considering global quantities not representative of the flow. To overcome the limitations of these two methods, a solution is to combine compartment modeling and CFD simulations. Therefore, the aim of this study is to develop a methodology for proposing a compartment model based on CFD simulations of a bioreactor. The flow rate between two compartments can be easily computed from the velocity fields obtained by CFD. The difficulty lies in defining the zones in such a way that they can be considered perfectly mixed. The creation of the model compartments from CFD cells can be achieved manually or automatically. Manual zoning consists in aggregating CFD cells according to the user's wishes. Automatic zoning defines compartments as regions within which the values of one or several properties are uniform with respect to a given tolerance. Both manual and automatic zoning methods have been developed and compared by simulating the mixing of an inert scalar. For the automatic zoning, several algorithms and different flow properties have been tested as criteria for compartment creation.
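    The automatic zoning idea described above — aggregate neighbouring CFD cells while a chosen flow property stays uniform within a tolerance — can be sketched as greedy region growing. The toy below works on a 1-D array of cell values; the merging criterion (max − min spread) and all names are illustrative assumptions, not the algorithms tested in the study:

```python
def automatic_zoning(cell_values, tolerance):
    """Aggregate consecutive CFD cells into compartments: a cell joins
    the current compartment while the spread of property values inside
    it (max - min) stays within the given tolerance; otherwise a new
    compartment is started."""
    zones = []
    current = [cell_values[0]]
    for v in cell_values[1:]:
        lo = min(min(current), v)
        hi = max(max(current), v)
        if hi - lo <= tolerance:
            current.append(v)   # still "perfectly mixed" within tolerance
        else:
            zones.append(current)
            current = [v]       # start a new compartment
    zones.append(current)
    return zones

# Hypothetical axial velocity magnitudes (m/s) along a column of cells:
velocities = [0.10, 0.11, 0.12, 0.55, 0.56, 0.20]
zones = automatic_zoning(velocities, tolerance=0.05)
```

In a real 3-D mesh the same idea is applied over cell adjacency graphs, and the inter-compartment flow rates are then integrated from the CFD velocity field across the compartment boundaries, as the abstract notes.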

  6. Computer simulation of magnetic resonance angiography imaging: model description and validation.

    Directory of Open Access Journals (Sweden)

    Artur Klepaczko

    With the development of medical imaging modalities and image processing algorithms, there arises a need for methods of their comprehensive quantitative evaluation. In particular, this concerns the algorithms for vessel tracking and segmentation in magnetic resonance angiography images. The problem can be approached by using synthetic images, where the true geometry of vessels is known. This paper presents a framework for computer modeling of MRA imaging and the results of its validation. The new model incorporates blood flow simulation within the MR signal computation kernel. The proposed solution is unique, especially with respect to the interface between the flow and image formation processes. Furthermore, it utilizes the concept of particle tracing. The particles reflect the flow of the fluid they are immersed in and are assigned magnetization vectors with temporal evolution controlled by MR physics. Such an approach ensures flexibility, as the designed simulator is able to reconstruct flow profiles of any type. The proposed model is validated in a series of experiments with physical and digital flow phantoms. The synthesized 3D images contain various features (including artifacts) characteristic of the time-of-flight protocol and exhibit remarkable correlation with data acquired in a real MR scanner. The obtained results support the primary goal of the conducted research, i.e. establishing a reference technique for the quantified validation of MR angiography image processing algorithms.

  7. Brookhaven solar-heat-pump simulator: technical description and experimental results

    Energy Technology Data Exchange (ETDEWEB)

    Catan, M.

    1982-07-01

    The series solar-assisted heat pump (SAHP) system has the potential to deliver heat with very high seasonal coefficients of performance (coefficient of performance or COP is the ratio of useful heat delivered to electrical power consumed). This potential rests on the ability of the heat-pump component to use the high source temperatures available from the solar-collector component to deliver heat with a COP which rises monotonically with source temperature. The Brookhaven National Laboratory (BNL) Heat Pump Simulator has played an important role in a program aimed at demonstrating the feasibility of building simple potentially inexpensive heat pumps for use in SAHP systems. Basically the work described here consists of the following: (1) The construction and testing of a laboratory heat pump built from conventional components and characterized by a very desirable COP versus source temperature profile. (2) The testing of two prototype SAHPs built by heat-pump manufacturers under contract to DOE. (3) Detailed component and control tests aimed at establishing improvements in the SAHP prototypes. The paper describes, in some detail, the BNL Heat Pump Simulator, a versatile instrument used to test heat pumps and heat-pump subcomponents under transient and steady-state conditions.

  8. Quantification and Standardized Description of Color Vision Deficiency Caused by Anomalous Trichromats—Part I: Simulation and Measurement

    Directory of Open Access Journals (Sweden)

    Seungji Yang

    2008-02-01

    The MPEG-21 Multimedia Framework allows visually impaired users to have improved access to visual content by enabling content adaptation techniques such as color compensation. However, one important issue is how to create and interpret the standardized CVD descriptions when making use of generic color vision tests. In Part I of our study to tackle this issue, we present a novel computerized hue test (CHT) to examine and quantify CVD, which allows reproducing and manipulating test colors for the purposes of computer simulation and analysis of CVD. Both objective evaluation via color difference measurement and subjective evaluation via clinical experiment showed that the CHT works well as a color vision test: it is highly correlated with the Farnsworth-Munsell 100 Hue (FM100H) test and allows for a more elaborate and correct color reproduction than the FM100H test.

  9. Confirmation via Analogue Simulation: A Bayesian Analysis

    CERN Document Server

    Dardashti, Radin; Thebault, Karim P Y; Winsberg, Eric

    2016-01-01

    Analogue simulation is a novel mode of scientific inference found increasingly within modern physics, and yet all but neglected in the philosophical literature. Experiments conducted upon a table-top 'source system' are taken to provide insight into features of an inaccessible 'target system', based upon a syntactic isomorphism between the relevant modelling frameworks. An important example is the use of acoustic 'dumb hole' systems to simulate gravitational black holes. In a recent paper it was argued that there exist circumstances in which confirmation via analogue simulation can obtain; in particular when the robustness of the isomorphism is established via universality arguments. The current paper supports these claims via an analysis in terms of Bayesian confirmation theory.
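
The Bayesian machinery invoked here reduces, in its simplest binary form, to a posterior update; the sketch below (with made-up probabilities, purely to illustrate the direction of the update) shows evidence E confirming hypothesis H whenever E is more likely under H than under its negation:

```python
def bayes_update(prior, likelihood_h, likelihood_not_h):
    """Posterior P(H|E) over the binary partition {H, not-H}."""
    evidence = likelihood_h * prior + likelihood_not_h * (1.0 - prior)
    return likelihood_h * prior / evidence

# Made-up numbers: the analogue evidence E is twice as likely if the
# target-system hypothesis H holds, so observing E raises P(H).
prior = 0.3
posterior = bayes_update(prior, likelihood_h=0.8, likelihood_not_h=0.4)
print(posterior > prior)  # -> True: E confirms H
```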

  10. Observationally-Motivated Analysis of Simulated Galaxies

    CERN Document Server

    Miranda, M S; Gibson, B K

    2015-01-01

    The spatial and temporal relationships between stellar age, kinematics, and chemistry are a fundamental tool for uncovering the physics driving galaxy formation and evolution. Observationally, these trends are derived using carefully selected samples isolated via the application of appropriate magnitude, colour, and gravity selection functions of individual stars; conversely, the analysis of chemodynamical simulations of galaxies has traditionally been restricted to the age, metallicity, and kinematics of `composite' stellar particles comprised of open cluster-mass simple stellar populations. As we enter the Gaia era, it is crucial that this approach changes, with simulations confronting data in a manner which better mimics the methodology employed by observers. Here, we use the SynCMD synthetic stellar populations tool to analyse the metallicity distribution function of a Milky Way-like simulated galaxy, employing an apparent magnitude plus gravity selection function similar to that employed by the ...

  11. Stochastic analysis for finance with simulations

    CERN Document Server

    Choe, Geon Ho

    2016-01-01

    This book is an introduction to stochastic analysis and quantitative finance; it includes both theoretical and computational methods. Topics covered are stochastic calculus, option pricing, optimal portfolio investment, and interest rate models. Also included are simulations of stochastic phenomena, numerical solutions of the Black–Scholes–Merton equation, Monte Carlo methods, and time series. Basic measure theory is used as a tool to describe probabilistic phenomena. The level of familiarity with computer programming is kept to a minimum. To make the book accessible to a wider audience, some background mathematical facts are included in the first part of the book and also in the appendices. This work attempts to bridge the gap between mathematics and finance by using diagrams, graphs and simulations in addition to rigorous theoretical exposition. Simulations are not only used as the computational method in quantitative finance, but they can also facilitate an intuitive and deeper understanding of theoret...
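
As a flavour of the computational side the book covers, here is a minimal sketch (standard-library Python, illustrative parameters of our choosing) comparing a Monte Carlo estimate of a European call price with the closed-form Black–Scholes–Merton value:

```python
import math
import random

def mc_european_call(s0, k, r, sigma, t, n_paths, seed=42):
    """Monte Carlo price of a European call under geometric Brownian motion."""
    rng = random.Random(seed)
    drift = (r - 0.5 * sigma ** 2) * t
    vol = sigma * math.sqrt(t)
    payoff_sum = 0.0
    for _ in range(n_paths):
        s_t = s0 * math.exp(drift + vol * rng.gauss(0.0, 1.0))
        payoff_sum += max(s_t - k, 0.0)
    return math.exp(-r * t) * payoff_sum / n_paths

def bs_call(s0, k, r, sigma, t):
    """Closed-form Black-Scholes-Merton call price, for comparison."""
    d1 = (math.log(s0 / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
    return s0 * cdf(d1) - k * math.exp(-r * t) * cdf(d2)

mc = mc_european_call(100.0, 100.0, 0.05, 0.2, 1.0, 100_000)
exact = bs_call(100.0, 100.0, 0.05, 0.2, 1.0)
print(round(mc, 2), round(exact, 2))  # the two values should agree closely
```

With 100 000 paths the Monte Carlo estimate typically lands within a few cents of the exact price; the sampling error shrinks as 1/sqrt(n_paths).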

  12. Detector Description Framework in LHCb

    CERN Document Server

    Ponce, Sébastien

    2003-01-01

    The Gaudi architecture and framework are designed to provide a common infrastructure and environment for simulation, filtering, reconstruction and analysis applications. In this context, a Detector Description Service was developed in LHCb in order to also provide easy and coherent access to the description of the experimental apparatus. This service centralizes all information about the detector, including geometry, materials, alignment, calibration, structure and controls. From the proof of concept given by the first functional implementation of this service late in 2000, the Detector Description Service has grown and has become one of the major components of the LHCb software, shared among all applications, including simulation, reconstruction, analysis and visualization. We describe here the full and functional implementation of the service. We stress the ease of customization and extension of the detector description by the user, and the seamless integration with condition databases in order to handle ...

  13. Spectral evolution of weakly nonlinear random waves: kinetic description vs direct numerical simulations

    Science.gov (United States)

    Annenkov, Sergei; Shrira, Victor

    2016-04-01

    We study numerically the long-term evolution of water wave spectra without wind forcing, using three different models, aiming at understanding the role of different sets of assumptions. The first model is the classical Hasselmann kinetic equation (KE). We employ the WRT code kindly provided by G. van Vledder. Two other models are new. As the second model, we use the generalised kinetic equation (gKE), derived without the assumption of quasi-stationarity. Thus, unlike the KE, the gKE is valid in cases when a wave spectrum is changing rapidly (e.g. at the initial stage of evolution of a narrow spectrum). However, the gKE employs the same statistical closure as the KE. The third model is based on the Zakharov integrodifferential equation for water waves and does not depend on any statistical assumptions. Since the Zakharov equation plays the role of the primitive equation of the theory of wave turbulence, we refer to this model as direct numerical simulation of spectral evolution (DNS-ZE). For initial conditions, we choose two narrow-banded spectra with the same frequency distribution (a JONSWAP spectrum with high peakedness γ = 6) and different degrees of directionality. These spectra are from the set of observations collected in a directional wave tank by Onorato et al (2009). Spectrum A is very narrow in angle (corresponding to N = 840 in the cos^N directional model). Spectrum B is initially wider in angle (corresponding to N = 24). Short-term evolution of both spectra (O(10^2) wave periods) has been studied numerically by Xiao et al (2013) using two other approaches (a broad-band modified nonlinear Schrödinger equation and direct numerical simulation based on the high-order spectral method). We use these results to verify the initial stage of our DNS-ZE simulations. However, the advantage of the DNS-ZE method is that it allows the study of long-term spectral evolution (up to O(10^4) periods), which was previously possible only with the KE. In the short-term evolution
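
The JONSWAP form used for the initial conditions can be sketched compactly; the following illustrative implementation (parameter values are ours, not those of the cited runs) uses the high peakedness γ = 6 and verifies that the spectral peak sits at the peak frequency:

```python
import math

def jonswap(f, fp, gamma=6.0, alpha=0.01, g=9.81):
    """JONSWAP frequency spectrum S(f) with peak-enhancement factor gamma."""
    sigma = 0.07 if f <= fp else 0.09                # standard spectral widths
    peak_shape = math.exp(-((f - fp) ** 2) / (2.0 * sigma ** 2 * fp ** 2))
    pm = alpha * g ** 2 * (2.0 * math.pi) ** -4 * f ** -5 * math.exp(-1.25 * (fp / f) ** 4)
    return pm * gamma ** peak_shape

fp = 0.5                                    # peak frequency in Hz (illustrative)
freqs = [0.01 * i for i in range(20, 151)]  # 0.2 .. 1.5 Hz grid
peak_f = max(freqs, key=lambda f: jonswap(f, fp))
print(abs(peak_f - fp) < 0.05)  # -> True: spectral peak sits at fp
```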

  14. Tropospheric aerosol microphysics simulation with assimilated meteorology: model description and intermodel comparison

    Directory of Open Access Journals (Sweden)

    W. Trivitayanurak

    2007-10-01

    Full Text Available We implement the TwO-Moment Aerosol Sectional (TOMAS microphysics module into GEOS-CHEM, a CTM driven by assimilated meteorology. TOMAS has 30 size sections covering 0.01–10 μm diameter with conservation equations for both aerosol mass and number. The implementation enables GEOS-CHEM to simulate aerosol microphysics, size distributions, mass and number concentrations. The model system is developed for sulfate and sea-salt aerosols, a year-long simulation has been performed, and results are compared to observations. Additionally, a model intercomparison was carried out involving global models with sectional microphysics: GISS GCM-II' and GLOMAP. Comparison with marine boundary layer observations of CN and CCN(0.2%) shows that all models perform well with average errors of 30–50%. However, all models underpredict CN by up to 42% between 15° S and 45° S while overpredicting CN by up to 52% between 45° N and 60° N, which could be due to the sea-salt emission parameterization and the assumed size distribution of primary sulfate emission, respectively. Model intercomparison at the surface shows that GISS GCM-II' and GLOMAP, each compared against GEOS-CHEM, both predict 40% higher CN and predict 20% and 30% higher CCN(0.2%) on average, respectively. Major discrepancies are due to different emission inventories and transport. Budget comparison shows GEOS-CHEM predicts the lowest global CCN(0.2%) due to microphysical growth being a factor of 2 lower than in other models because of lower SO2 availability. These findings stress the need for accurate meteorological inputs and updated emission inventories when evaluating global aerosol microphysics models.
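
Sectional schemes of this kind discretize the size range into a fixed grid of bins; the sketch below is a generic illustration of a logarithmically spaced diameter grid (TOMAS itself defines its 30 sections by mass doubling, which this simplified diameter-space version does not reproduce):

```python
def log_size_bins(d_min, d_max, n_bins):
    """Edges of n_bins logarithmically spaced size sections over [d_min, d_max]."""
    ratio = (d_max / d_min) ** (1.0 / n_bins)
    return [d_min * ratio ** i for i in range(n_bins + 1)]

# 30 sections spanning 0.01-10 micrometre diameter, as in the text.
edges = log_size_bins(0.01, 10.0, 30)
print(len(edges), round(edges[0], 4), round(edges[-1], 4))  # -> 31 0.01 10.0
```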

  15. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    Science.gov (United States)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
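
The complex-variable approach used for the DYMORE sensitivities is the standard complex-step derivative technique; a minimal sketch (with a made-up response function standing in for a structural solver) looks like this:

```python
def complex_step_derivative(f, x, h=1e-30):
    """f'(x) ~= Im(f(x + i*h)) / h: no subtractive cancellation, so h can be tiny."""
    return f(complex(x, h)).imag / h

# Hypothetical smooth response standing in for a structural solver output.
response = lambda z: z ** 3 + 2.0 * z
x = 1.7
approx = complex_step_derivative(response, x)
print(abs(approx - (3.0 * x ** 2 + 2.0)) < 1e-9)  # -> True, matches 3x^2 + 2
```

Unlike finite differences, the step size h can be taken near machine-minimum without loss of accuracy, which is what makes the method attractive for verifying adjoint sensitivities.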

  16. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O' Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  17. Simulation study of passive target motion analysis

    Institute of Scientific and Technical Information of China (English)

    HU Youfeng; JIAO Binli

    2003-01-01

    In this paper, the problem of underwater passive target motion analysis (TMA) in three dimensions is discussed using measurements of passive bearings, elevation and frequency, on the condition that the acoustic source and observer are in different horizontal planes. Simulation results with both the PLE (pseudo-linear estimation) and MLE (maximum likelihood estimation) methods show that the TMA method is effective in an oceanic environment. The error covariance curves tend to the Cramer-Rao lower bounds.

  18. National Infrastructure Simulation and Analysis Center Overview

    Energy Technology Data Exchange (ETDEWEB)

    Berscheid, Alan P. [Los Alamos National Laboratory

    2012-07-30

    The National Infrastructure Simulation and Analysis Center (NISAC) mission is to: (1) improve the understanding, preparation, and mitigation of the consequences of infrastructure disruption; (2) provide a common, comprehensive view of U.S. infrastructure and its response to disruptions, at a scale and resolution appropriate to the issues and to all threats; and (3) build an operations-tested DHS capability to respond quickly to urgent infrastructure protection issues.

  19. Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.

    Science.gov (United States)

    Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M

    2016-06-01

    The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and the standard puncture test using a hollow probe with increased shearing properties. The use of descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles for characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was measured using the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that initial slope, maximum force, and work after maximum force measurements all correlated well with the sensory attributes crisp and firm. These findings confirm that maximum force correlates well not only with firmness in watermelon, but with crispness as well. The initial slope parameter also captures the sensory crispness of watermelon, but is not as practical to measure in the field as maximum force. The work after maximum force parameter is thought to reflect cellular arrangement and membrane integrity, which in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, whereas descriptive analysis was correct 54% of the time.
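
Pairwise correlation of this kind is a Pearson coefficient between an instrumental parameter and the mean panel score; a sketch with invented illustrative numbers (not the study's data):

```python
import math

def pearson_r(x, y):
    """Pearson correlation between an instrumental parameter and a sensory score."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented illustrative numbers, not the study's data: maximum puncture
# force (N) against the panel's firmness score for five samples.
max_force = [12.1, 14.3, 15.0, 17.8, 19.2]
firmness = [4.2, 5.1, 5.0, 6.3, 6.9]
print(pearson_r(max_force, firmness) > 0.9)  # -> True: strongly correlated
```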

  20. Molecular dynamics simulation of water in and around carbon nanotubes: A coarse-grained description

    Science.gov (United States)

    Pantawane, Sanwardhini; Choudhury, Niharendu

    2016-05-01

    In the present study, we intend to investigate the behaviour of water in and around hydrophobic open-ended carbon nanotubes (CNTs) using a coarse-grained, core-softened model potential for water. The model potential considered here for water has recently been shown to successfully reproduce dynamic, thermodynamic and structural anomalies of water. The aim of the study is to understand the incarceration of this coarse-grained water in a single-file carbon nanotube. In order to examine the effect of fluid-water van der Waals interaction on the structure of the fluid in and around the nanotube, we have simulated three different CNT-water systems with varying degrees of solute-water dispersion interaction. The analyses of the radial one-particle density profiles reveal varying degrees of permeation and wetting of the CNT interior depending on the degree of fluid-solute attractive van der Waals interaction. A peak in the radial density profile slightly off the nanotube axis signifies a zigzag chain of water molecules around the CNT axis. The average number of water molecules inside the CNT has been shown to increase with the increase in fluid-water attractive dispersion interaction.

  1. Molecular dynamics simulation of water in and around carbon nanotubes: A coarse-grained description

    Energy Technology Data Exchange (ETDEWEB)

    Pantawane, Sanwardhini [Department of Physics, UM-DAE Centre for Excellence in Basic Sciences, University of Mumbai, Kalina Campus, Mumbai 400098 (India); Choudhury, Niharendu, E-mail: nihcho@barc.gov.in [Theoretical Chemistry Section, Bhabha Atomic Research Centre, Mumbai 400 085 (India)

    2016-05-23

    In the present study, we intend to investigate the behaviour of water in and around hydrophobic open-ended carbon nanotubes (CNTs) using a coarse-grained, core-softened model potential for water. The model potential considered here for water has recently been shown to successfully reproduce dynamic, thermodynamic and structural anomalies of water. The aim of the study is to understand the incarceration of this coarse-grained water in a single-file carbon nanotube. In order to examine the effect of fluid-water van der Waals interaction on the structure of the fluid in and around the nanotube, we have simulated three different CNT-water systems with varying degrees of solute-water dispersion interaction. The analyses of the radial one-particle density profiles reveal varying degrees of permeation and wetting of the CNT interior depending on the degree of fluid-solute attractive van der Waals interaction. A peak in the radial density profile slightly off the nanotube axis signifies a zigzag chain of water molecules around the CNT axis. The average number of water molecules inside the CNT has been shown to increase with the increase in fluid-water attractive dispersion interaction.

  2. Simulating snow maps for Norway: description and statistical evaluation of the seNorge snow model

    Directory of Open Access Journals (Sweden)

    T. M. Saloranta

    2012-11-01

    Full Text Available Daily maps of snow conditions have been produced in Norway with the seNorge snow model since 2004. The seNorge snow model operates at 1 × 1 km resolution, uses gridded observations of daily temperature and precipitation as its input forcing, and simulates, among other variables, snow water equivalent (SWE), snow depth (SD), and snow bulk density (ρ). In this paper the set of equations contained in the seNorge model code is described and a thorough spatiotemporal statistical evaluation of the model performance over 1957–2011 is made using the two major sets of extensive in situ snow measurements that exist for Norway. The evaluation results show that the seNorge model generally overestimates both SWE and ρ, and that the overestimation of SWE increases with elevation throughout the snow season. However, the R² values for model fit are 0.60 for (log-transformed) SWE and 0.45 for ρ, indicating that after removal of the detected systematic model biases (e.g. by recalibrating the model or expressing snow conditions in relative units) the model performs rather well. The seNorge model provides a relatively simple, not very data-demanding, yet nonetheless process-based method to construct snow maps of high spatiotemporal resolution. It is an especially well suited alternative for operational snow mapping in regions with rugged topography and large spatiotemporal variability in snow conditions, as is the case in mountainous Norway.
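
The evaluation logic described (remove the systematic bias, then score the fit with R²) can be sketched as follows, with invented SWE values standing in for the seNorge output:

```python
import math

def r_squared(obs, sim):
    """Coefficient of determination of simulated vs observed values."""
    mean_obs = sum(obs) / len(obs)
    ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
    ss_tot = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - ss_res / ss_tot

# Invented SWE values (mm), log-transformed as in the paper; the simulated
# series carries a systematic positive bias that is removed before scoring.
obs = [math.log(v) for v in [50.0, 80.0, 120.0, 200.0, 310.0]]
sim = [math.log(v) for v in [65.0, 95.0, 150.0, 240.0, 380.0]]
bias = sum(s - o for s, o in zip(sim, obs)) / len(obs)
sim_debiased = [s - bias for s in sim]
print(r_squared(obs, sim_debiased) > r_squared(obs, sim))  # -> True
```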

  3. Description of EQSAM4: gas-liquid-solid partitioning model for global simulations

    Science.gov (United States)

    Metzger, S.; Steil, B.; Xu, L.; Penner, J. E.; Lelieveld, J.

    2011-10-01

    We introduce version 4 of the EQuilibrium Simplified Aerosol Model (EQSAM4), which is part of our aerosol chemistry-microphysics module (GMXe) and chemistry-climate model (EMAC). We focus on the relative humidity of deliquescence (RHD) based water uptake of atmospheric aerosols, as this is important for atmospheric chemistry and climate modeling, e.g. to calculate the aerosol optical depth (AOD). Since the main EQSAM4 applications will involve large-scale, long-term and high-resolution atmospheric chemistry-climate modeling with EMAC, computational efficiency is an important requirement. EQSAM4 parameterizes the composition and water uptake of multicomponent atmospheric aerosols by considering the gas-liquid-solid partitioning of single and mixed solutes. EQSAM4 builds on analytical, and hence CPU-efficient, aerosol hygroscopic growth parameterizations to compute the aerosol liquid water content (AWC). The parameterizations are described in the companion paper (Metzger et al., 2011) and only require a compound-specific coefficient νi to derive the single solute molality and the AWC for the whole range of water activity (aw). νi is pre-calculated and applied during runtime by using internal look-up tables. Here, the EQSAM4 equilibrium model is described and compared to the more explicit thermodynamic model ISORROPIA II. Both models are embedded in EMAC/GMXe. Box model inter-comparisons, including the reference model E-AIM, and global simulations with EMAC show that gas-particle partitioning, including semi-volatiles and water, is in good agreement. A more comprehensive box model inter-comparison of EQSAM4 with EQUISOLV II is the subject of the revised publication of Xu et al. (2009), i.e. Xu et al. (2011).

  4. Loch Linnhe '94: Test operations description and on-site analysis, US activities

    Energy Technology Data Exchange (ETDEWEB)

    Mantrom, D.D.

    1994-11-01

    A field experiment named Loch Linnhe '94 (LL94) is described. This experiment was conducted in upper Loch Linnhe, Scotland, in September 1994, as an exercise involving UK and US investigators, under the Joint UK/US Radar Ocean Imaging Program. This experiment involved a dual-frequency, dual-polarization hillside real aperture radar operated by the UK, Lawrence Livermore National Laboratory's (LLNL) current meter array (CMA), in-water hydrodynamic sensors, and meteorological measurements. The primary measurements involved imaging ship-generated and ambient internal waves by the radar and the CMA. This report documents test operations from a US perspective and presents on-site analysis results derived by US investigators. The rationale underlying complementary radar and CMA measurements is described. Descriptions of the test site, platforms, and major US instrument systems are given. A summary of test operations and examples of radar, CMA, water column profile, and meteorological data are provided. A description of the rather extensive analysis of these data performed at the LL94 test site is presented. The products of this analysis are presented and some implications for further analysis and future experiments are discussed. All experimental objectives were either fully or partially met. Powerful on-site analysis capabilities generated many useful products and helped improve subsequent data collection. Significant further data analysis is planned.

  5. Regional hydrogeological simulations. Numerical modelling using ConnectFlow. Preliminary site description Simpevarp sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hoch, Andrew; Hunter, Fiona; Jackson, Peter [Serco Assurance, Risley (United Kingdom); Marsic, Niko [Kemakta Konsult, Stockholm (Sweden)

    2005-02-01

    The objective of this study is to support the development of a preliminary Site Description of the Simpevarp area on a regional scale based on the available data of August 2004 (Data Freeze S1.2) and the previous Site Description. A more specific objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater in the Simpevarp area on a regional scale. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local scale as well as predictions of future hydrogeological conditions. Other key objectives were to identify the model domain required to simulate regional flow and solute transport at the Simpevarp area and to incorporate a new geological model of the deformation zones produced for Version S1.2. Another difference from Version S1.1 is the increased effort invested in conditioning the hydrogeological property models to the fracture boremap and hydraulic data. A new methodology was developed for interpreting the discrete fracture network (DFN) by integrating the geological description of the DFN (GeoDFN) with the hydraulic test data from Posiva Flow-Log and Pipe-String System double-packer techniques to produce a conditioned Hydro-DFN model. This was done in a systematic way that addressed uncertainties associated with the assumptions made in interpreting the data, such as the relationship between fracture transmissivity and length. Consistent hydraulic data were only available for three boreholes, and therefore only relatively simplistic models were proposed, as there is not sufficient data to justify extrapolating the DFN away from the boreholes based on rock domain, for example.
Significantly, a far greater quantity of hydro-geochemical data was available for calibration in the

  6. Description and Application of a Model for Simulating Regional Nitrogen Cycling and Calculating Nitrogen Flux

    Institute of Scientific and Technical Information of China (English)

    ZHENG Xunhua; LIU Chunyan; HAN Shenghui

    2008-01-01

    A regional nitrogen cycle model, named IAP-N, was designed for simulating regional nitrogen (N) cycling and calculating N fluxes flowing among cultivated soils, crops, and livestock, as well as human, atmospheric and other systems. The conceptual structure and the calculation methods and procedures of this model are described in detail. All equations of the model are presented. In addition, definitions of all the involved variables and parameters are given. An application of the model in China at the national scale is presented. In this example, annual surpluses of consumed synthetic N fertilizer; emissions of nitrous oxide (N2O), ammonia (NH3) and nitrogen oxide (NOx); N loss from agricultural lands due to leaching and runoff; and sources and sinks of anthropogenic reactive N (Nr) were estimated for the period 1961-2004. The model estimates show that surpluses of N fertilizer started to occur in the mid 1990s and amounted to 5.7 Tg N yr^-1 in the early 2000s. N2O emissions related to agriculture were estimated as 0.69 Tg N yr^-1 in 2004, of which 58% was released directly from N added to agricultural soils. Total NH3 and NOx emissions in 2004 amounted to 4.7 and 4.9 Tg N yr^-1, respectively. About 3.9 Tg N yr^-1 of N was estimated to have flowed out of the cultivated soil layer in 2004, which accounted for 33% of applied synthetic N fertilizer. Anthropogenic Nr sources changed from 2.8 (1961) to 28.1 Tg N yr^-1 (2004), while removal (sinks) changed from 2.1 to 8.4 Tg N yr^-1. The ratio of anthropogenic Nr sources to sinks was only 1.4 in 1961 but 3.3 in 2004. Further development of the IAP-N model is suggested to focus upon: (a) inter-comparison with other regional N models; (b) overcoming the limitations of the current model version, such as adaptation to other regions, a high-resolution database, and so on; and (c) developing the capacity to estimate the safe threshold of anthropogenic Nr source to sink ratios.

  7. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing, with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser-brazed joints is the seam's visual quality, which satisfies the highest requirements. However, the laser beam brazing process is very complex and its process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input and heat transfer as well as the fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are shown, as an example, for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  8. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2013-05-01

    Full Text Available HYSYS process simulator is used for the analysis of an existing HF stripping column in the LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, Van Laar, Antoine, and Zudkevitch-Joffee) affect the results within 0.1–58% variation for most cases. The simulated results show that about 4% of the paraffin (C10 and C11) is present at the top stream, which may cause a problem in the LAB production plant. The major variations were noticed for the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, because the results correspond with real plant operation data.

  9. Descriptive analysis of staff satisfaction and turnover intention in a Malaysian University

    Science.gov (United States)

    Sidik, Mohamad Hazeem; Hamid, Mohd Rashid Ab; Ibrahim, Abdullah

    2017-05-01

    This paper discusses the descriptive analysis of staff satisfaction in an education organisation. The study employed a cross-sectional design involving a total of 1042 respondents from a university on the east coast of Malaysia. The survey covers six dimensions of staff satisfaction: leadership, staff involvement, workload, self-development, working environment and communication. The analysis of the mean scores reveals that the staff enjoyed a moderate level of satisfaction, and the findings of the study generally support past findings in the literature. This study paves the way for an in-depth investigation of staff satisfaction at the university under study.

  10. Multi-level model for 2D human motion analysis and description

    Science.gov (United States)

    Foures, Thomas; Joly, Philippe

    2003-01-01

    This paper deals with the proposition of a model for human motion analysis in a video. Its main characteristic is that it adapts itself automatically to the current resolution, the actual quality of the picture, or the level of precision required by a given application, owing to its possible decomposition into several hierarchical levels. The model is region-based to address some analysis processing needs. The top level of the model is defined with only 5 ribbons, which can be cut into sub-ribbons according to a given (or an expected) level of detail. The matching process between the model and the current picture consists in the comparison of the extracted subject shape with a graphical rendering of the model built on the basis of some computed parameters. The comparison is processed using a chamfer matching algorithm. In our developments, we intend to realize a platform of interaction between a dancer and tools synthesizing abstract motion pictures and music under the conditions of a real-time dialogue between a human and a computer. Consequently, we use this model with a view to motion description rather than motion recognition: no a priori gestures are assumed to be recognized, as no particular application is specifically targeted. The resulting description will be made following a Description Scheme compliant with the movement notation called "Labanotation".

  11. Scientific tourism communication in Brazil: Descriptive analysis of national journals from 1990 to 2012

    Directory of Open Access Journals (Sweden)

    Glauber Eduardo de Oliveira Santos

    2013-04-01

    This paper provides a descriptive analysis of 2,126 articles published in 20 Brazilian tourism journals from 1990 to 2012. It offers a comprehensive and objective picture of these journals, contributing to the debate about editorial policies, as well as to a broader understanding of the Brazilian academic research developed in this period. The study analyses the evolution of the number of published papers and descriptive statistics about the length of articles, titles and abstracts. Authors with the largest number of publications and the most recurrent keywords are identified. The integration level among journals is analyzed, pointing out which publications are closest to the center of the Brazilian tourism scientific publishing network.

  12. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion in the market of cham-cham, a traditional Indian dairy product is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production and to find out the attributes that govern much of variation in sensory scores of this product using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p quantitative descriptive analysis for identifying and measuring attributes of cham-cham that contribute most to its sensory acceptability.
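
    As a toy illustration of the PCA step, the principal components of two correlated sensory attributes can be obtained in closed form from their 2×2 covariance matrix. The panel scores and attribute names below are hypothetical assumptions, not data from the study.

```python
import math

def pca_2d(x, y):
    """Closed-form PCA for two attributes: eigenvalues of the 2x2 sample
    covariance matrix, largest first (illustrative sketch only)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((v - mx) ** 2 for v in x) / (n - 1)                   # var(x)
    c = sum((v - my) ** 2 for v in y) / (n - 1)                   # var(y)
    b = sum((u - mx) * (v - my) for u, v in zip(x, y)) / (n - 1)  # cov(x, y)
    mid = (a + c) / 2
    rad = math.sqrt(((a - c) / 2) ** 2 + b ** 2)
    return mid + rad, mid - rad

# Hypothetical panel scores for two attributes (e.g. sweetness, firmness).
sweet = [6.1, 6.8, 5.9, 7.2, 6.5]
firm = [4.0, 4.9, 3.8, 5.3, 4.4]
l1, l2 = pca_2d(sweet, firm)
explained = l1 / (l1 + l2)  # share of variance captured by the first PC
```

    With strongly correlated attributes, the first component captures most of the variance, which is exactly what PCA exploits to summarise many sensory attributes in a few dimensions.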

  13. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  14. Sample Analysis at Mars Instrument Simulator

    Science.gov (United States)

    Benna, Mehdi; Nolan, Tom

    2013-01-01

    The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of a multi-purpose, full-scale numerical model of a flight instrument with the purpose of supplementing or even eliminating entirely the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical modules. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs to this synthetic C&DH are mapped to virtual sensors and command lines that mimic in their structure and connectivity the layout of the instrument harnesses. This module executes

  15. Automated methods of textual content analysis and description of text structures

    CERN Document Server

    Chýla, Roman

    Universal Semantic Language (USL) is a semi-formalized approach for the description of knowledge (a knowledge representation tool). The idea of USL was introduced by Vladimir Smetacek in the system called SEMAN, which was used for keyword extraction tasks in the former Information centre of the Czechoslovak Republic. However, due to the dissolution of the centre in the early 1990s, the system was lost. This thesis reintroduces the idea of USL in a new context of quantitative content analysis. First we introduce the historical background and the problems of semantics and knowledge representation, semes, semantic fields, semantic primes and universals. The basic methodology of content analysis studies is illustrated on the example of three content analysis tools, and we describe the architecture of a new system. The application was built specifically for USL discovery but it can work also in the context of classical content analysis. It contains Natural Language Processing (NLP) components and employs the algorith...

  16. Grammatical Conversion of Descriptive Narrative - an application of discourse analysis in conceptual modelling

    Directory of Open Access Journals (Sweden)

    Bruce Calway

    1996-05-01

    Fact-oriented conceptual modelling begins with the search for facts about a universe of discourse (UoD). These facts may be obtained from many sources, including information systems reports, tables, manuals and descriptive narrative, both verbal and written. This paper presents some initial findings that support the use of discourse analysis techniques as an approach to developing elementary fact-based sentences for information systems conceptual schema development from written text. Although this discussion paper only considers the NIAM (fact-oriented) conceptual schema modelling method, the IS087 report from which the research case study is taken describes other conceptual methods to which the research contained in this paper could be applicable (e.g. Entity Relationship analysis). The case study could be modelled exactly in the form in which the text is initially found, but grammatical analysis focuses consideration on alternative, potentially better, expressions of a sentence, a theme which is described and demonstrated. As a result of applying grammatical sentence simplification with co-ordinate clause splitting, each sentence could be expressed as a complete, finite, independent collection of simple declarative statements. The techniques described provide, at a minimum, a discourse analysis of descriptive narrative which retains its meaning and contextual integrity while providing a simplified and independent clause representation for input to the fact-oriented conceptual schema modelling procedure.
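
    The co-ordinate clause splitting described above can be sketched as a naive string transformation: each sentence is cut at coordinating conjunctions so that every part becomes a simple declarative statement suitable for elementary fact extraction. The function and conjunction list are illustrative assumptions, not the authors' procedure.

```python
import re

# Split at coordinating conjunctions (optionally preceded by a comma).
# The conjunction list is an assumption for illustration.
CONJUNCTIONS = r",?\s+\b(?:and|but|or|so)\b\s+"

def split_coordinate_clauses(sentence):
    # Drop the final period, split, then re-emit each clause as a
    # capitalised standalone declarative sentence.
    parts = re.split(CONJUNCTIONS, sentence.strip().rstrip("."))
    return [p[0].upper() + p[1:] + "." for p in parts if p]

facts = split_coordinate_clauses(
    "Each employee works in a department and each department has a manager.")
# → ["Each employee works in a department.", "Each department has a manager."]
```

    Each resulting clause can then be fed independently into the fact-oriented conceptual schema modelling procedure.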

  17. Psychological analysis of primary school pupils self-description in a computer game

    Directory of Open Access Journals (Sweden)

    I. D. Spirina

    2017-06-01

    Objective. The aim of this study was to reveal the specific impact of computer games on the consciousness of primary school children. Materials and Methods. 30 children aged 6 to 11 years were examined. Qualitative methods were used to describe the children's computer game experience, following the main stages of structured phenomenological research. A questionnaire for children's self-description in a computer game was developed, and a qualitative analysis of these descriptions was conducted. Results. Analysis of the descriptions revealed difficulty in separating "true" from "false", idiosyncratic use of personal pronouns, the absence of a proper distinction between the "Self" as a game character and the "Self" of the child as a whole, attribution of the properties of living creatures to virtual "opponents" or "partners", and confusion of temporal and spatial terms when the children described the game. The children described only the outer plane of the game, such as plot, "events", "actions" and the difficulties occurring in the game, without reflecting any emotions. While describing the "events" occurring in the game, the children were not able to focus on themselves either at the time or during the game. Conclusions. The involvement of a child in a computer game causes, first of all, a disorder in the functioning of the emotional sphere, in which emotions are not understood by the child. Discrepancies in how the children described themselves, their nature and the trends of their favourite games indicate disturbances in the forming of the child's self-attitude and self-esteem. While playing a computer game, a special "operation mode" of the child's mind emerges in which the impact of the irreal image on the child's mind can distort the natural course of cognitive and emotional reflection of reality.

  18. The carbon cycle in the Australian Community Climate and Earth System Simulator (ACCESS-ESM1) - Part 1: Model description and pre-industrial simulation

    Science.gov (United States)

    Law, Rachel M.; Ziehn, Tilo; Matear, Richard J.; Lenton, Andrew; Chamberlain, Matthew A.; Stevens, Lauren E.; Wang, Ying-Ping; Srbinovsky, Jhan; Bi, Daohua; Yan, Hailin; Vohralik, Peter F.

    2017-07-01

    Earth system models (ESMs) that incorporate carbon-climate feedbacks represent the present state of the art in climate modelling. Here, we describe the Australian Community Climate and Earth System Simulator (ACCESS)-ESM1, which comprises atmosphere (UM7.3), land (CABLE), ocean (MOM4p1), and sea-ice (CICE4.1) components with OASIS-MCT coupling, to which ocean and land carbon modules have been added. The land carbon model (as part of CABLE) can optionally include both nitrogen and phosphorous limitation on the land carbon uptake. The ocean carbon model (WOMBAT, added to MOM) simulates the evolution of phosphate, oxygen, dissolved inorganic carbon, alkalinity and iron with one class of phytoplankton and zooplankton. We perform multi-centennial pre-industrial simulations with a fixed atmospheric CO2 concentration and different land carbon model configurations (prescribed or prognostic leaf area index). We evaluate the equilibration of the carbon cycle and present the spatial and temporal variability in key carbon exchanges. Simulating leaf area index results in a slight warming of the atmosphere relative to the prescribed leaf area index case. Seasonal and interannual variations in land carbon exchange are sensitive to whether leaf area index is simulated, with interannual variations driven by variability in precipitation and temperature. We find that the response of the ocean carbon cycle shows reasonable agreement with observations. While our model overestimates surface phosphate values, the global primary productivity agrees well with observations. Our analysis highlights some deficiencies inherent in the carbon models and where the carbon simulation is negatively impacted by known biases in the underlying physical model and consequent limits on the applicability of this model version. We conclude the study with a brief discussion of key developments required to further improve the realism of our model simulation.

  19. The carbon cycle in the Australian Community Climate and Earth System Simulator (ACCESS-ESM1) – Part 1: Model description and pre-industrial simulation

    Directory of Open Access Journals (Sweden)

    R. M. Law

    2017-07-01

    Earth system models (ESMs) that incorporate carbon–climate feedbacks represent the present state of the art in climate modelling. Here, we describe the Australian Community Climate and Earth System Simulator (ACCESS-ESM1), which comprises atmosphere (UM7.3), land (CABLE), ocean (MOM4p1), and sea-ice (CICE4.1) components with OASIS-MCT coupling, to which ocean and land carbon modules have been added. The land carbon model (as part of CABLE) can optionally include both nitrogen and phosphorous limitation on the land carbon uptake. The ocean carbon model (WOMBAT, added to MOM) simulates the evolution of phosphate, oxygen, dissolved inorganic carbon, alkalinity and iron with one class of phytoplankton and zooplankton. We perform multi-centennial pre-industrial simulations with a fixed atmospheric CO2 concentration and different land carbon model configurations (prescribed or prognostic leaf area index). We evaluate the equilibration of the carbon cycle and present the spatial and temporal variability in key carbon exchanges. Simulating leaf area index results in a slight warming of the atmosphere relative to the prescribed leaf area index case. Seasonal and interannual variations in land carbon exchange are sensitive to whether leaf area index is simulated, with interannual variations driven by variability in precipitation and temperature. We find that the response of the ocean carbon cycle shows reasonable agreement with observations. While our model overestimates surface phosphate values, the global primary productivity agrees well with observations. Our analysis highlights some deficiencies inherent in the carbon models and where the carbon simulation is negatively impacted by known biases in the underlying physical model and consequent limits on the applicability of this model version. We conclude the study with a brief discussion of key developments required to further improve the realism of our model simulation.

  20. The carbon cycle in the Australian Community Climate and Earth System Simulator (ACCESS-ESM1) – Part 1: Model description and pre-industrial simulation

    Directory of Open Access Journals (Sweden)

    R. M. Law

    2015-09-01

    Earth System Models (ESMs) that incorporate carbon-climate feedbacks represent the present state of the art in climate modelling. Here, we describe the Australian Community Climate and Earth System Simulator (ACCESS-ESM1) that combines existing ocean and land carbon models into the physical climate model to simulate exchanges of carbon between the land, atmosphere and ocean. The land carbon model can optionally include both nitrogen and phosphorous limitation on the land carbon uptake. The ocean carbon model simulates the evolution of nitrate, oxygen, dissolved inorganic carbon, alkalinity and iron with one class of phytoplankton and zooplankton. From two multi-centennial simulations of the pre-industrial period with different land carbon model configurations, we evaluate the equilibration of the carbon cycle and present the spatial and temporal variability in key carbon exchanges. For the land carbon cycle, leaf area index is simulated reasonably, and seasonal carbon exchange is well represented. Interannual variations of land carbon exchange are relatively large, driven by variability in precipitation and temperature. We find that the response of the ocean carbon cycle shows reasonable agreement with observations and very good agreement with existing Coupled Model Intercomparison Project (CMIP5) models. While our model overestimates surface nitrate values, the primary productivity agrees well with observations. Our analysis highlights some deficiencies inherent in the carbon models and where the carbon simulation is negatively impacted by known biases in the underlying physical model. We conclude the study with a brief discussion of key developments required to further improve the realism of our model simulation.

  1. The carbon cycle in the Australian Community Climate and Earth System Simulator (ACCESS-ESM1) - Part 1: Model description and pre-industrial simulation

    Science.gov (United States)

    Law, R. M.; Ziehn, T.; Matear, R. J.; Lenton, A.; Chamberlain, M. A.; Stevens, L. E.; Wang, Y. P.; Srbinovsky, J.; Bi, D.; Yan, H.; Vohralik, P. F.

    2015-09-01

    Earth System Models (ESMs) that incorporate carbon-climate feedbacks represent the present state of the art in climate modelling. Here, we describe the Australian Community Climate and Earth System Simulator (ACCESS)-ESM1 that combines existing ocean and land carbon models into the physical climate model to simulate exchanges of carbon between the land, atmosphere and ocean. The land carbon model can optionally include both nitrogen and phosphorous limitation on the land carbon uptake. The ocean carbon model simulates the evolution of nitrate, oxygen, dissolved inorganic carbon, alkalinity and iron with one class of phytoplankton and zooplankton. From two multi-centennial simulations of the pre-industrial period with different land carbon model configurations, we evaluate the equilibration of the carbon cycle and present the spatial and temporal variability in key carbon exchanges. For the land carbon cycle, leaf area index is simulated reasonably, and seasonal carbon exchange is well represented. Interannual variations of land carbon exchange are relatively large, driven by variability in precipitation and temperature. We find that the response of the ocean carbon cycle shows reasonable agreement with observations and very good agreement with existing Coupled Model Intercomparison Project (CMIP5) models. While our model overestimates surface nitrate values, the primary productivity agrees well with observations. Our analysis highlights some deficiencies inherent in the carbon models and where the carbon simulation is negatively impacted by known biases in the underlying physical model. We conclude the study with a brief discussion of key developments required to further improve the realism of our model simulation.

  2. A Descriptive Analysis of Oral Health Systematic Reviews Published 1991–2012: Cross Sectional Study

    Science.gov (United States)

    Saltaji, Humam; Cummings, Greta G.; Armijo-Olivo, Susan; Major, Michael P.; Amin, Maryam; Major, Paul W.; Hartling, Lisa; Flores-Mir, Carlos

    2013-01-01

    Objectives To identify all systematic reviews (SRs) published in the domain of oral health research and describe them in terms of their epidemiological and descriptive characteristics. Design Cross sectional, descriptive study. Methods An electronic search of seven databases was performed from inception through May 2012; bibliographies of relevant publications were also reviewed. Studies were considered for inclusion if they were oral health SRs defined as therapeutic or non-therapeutic investigations that studied a topic or an intervention related to dental, oral or craniofacial diseases/disorders. Data were extracted from all the SRs based on a number of epidemiological and descriptive characteristics. Data were analysed descriptively for all the SRs, within each of the nine dental specialities, and for Cochrane and non-Cochrane SRs separately. Results 1,188 oral health (126 Cochrane and 1062 non-Cochrane) SRs published from 1991 through May 2012 were identified, encompassing the nine dental specialties. Over half (n = 676; 56.9%) of the SRs were published in specialty oral health journals, with almost all (n = 1,178; 99.2%) of the SRs published in English and almost none of the non-Cochrane SRs (n = 11; 0.9%) consisting of updates of previously published SRs. 75.3% of the SRs were categorized as therapeutic, with 64.5% examining non-drug interventions, while approximately half (n = 150/294; 51%) of the non-therapeutic SRs were classified as epidemiological SRs. The SRs included a median of 15 studies, with a meta-analysis conducted in 43.6%, in which a median of 9 studies/1 randomized trial were included in the largest meta-analysis conducted. Funding was received for 25.1% of the SRs, including nearly three-quarters (n = 96; 76.2%) of the Cochrane SRs. Conclusion Epidemiological and descriptive characteristics of the 1,188 oral health SRs varied across the nine dental specialties and by SR category (Cochrane vs. non-Cochrane). There is a

  3. Sensory characterization of a ready-to-eat sweetpotato breakfast cereal by descriptive analysis

    Science.gov (United States)

    Dansby, M. A.; Bovell-Benjamin, A. C.

    2003-01-01

    The sweetpotato [Ipomoea batatas (L.) Lam], an important industry in the United States, has been selected as a candidate crop to be grown on future long-duration space missions by NASA. Raw sweetpotato roots were processed into flour, which was used to formulate ready-to-eat breakfast cereal (RTEBC). Twelve trained panelists evaluated the sensory attributes of the extruded RTEBC using descriptive analysis. The samples were significantly different (Psensory attributes, which could be used to differentiate the appearance, texture, and flavor of sweetpotato RTEBC, were described. The data could be used to optimize the RTEBC and for designing studies to test its consumer acceptance.

  4. Description of the Mountain Cloud Chemistry Program version of the PLUVIUS MOD 5.0 reactive storm simulation model

    Energy Technology Data Exchange (ETDEWEB)

    Luecken, D.J.; Whiteman, C.D.; Chapman, E.G.; Andrews, G.L.; Bader, D.C.

    1987-07-01

    Damage to forest ecosystems on mountains in the eastern United States has prompted a study conducted for the US Environmental Protection Agency's Mountain Cloud Chemistry Program (MCCP). This study has led to the development of a numerical model called MCCP PLUVIUS, which has been used to investigate the chemical transformations and cloud droplet deposition in shallow, nonprecipitating orographic clouds. The MCCP PLUVIUS model was developed as a specialized version of the existing PLUVIUS MOD 5.0 reactive storm model. It is capable of simulating aerosol scavenging, nonreactive gas scavenging, aqueous-phase SO2 reactions, and cloud water deposition. A description of the new model is provided along with information on model inputs and outputs, as well as suggestions for its further development. The MCCP PLUVIUS incorporates a new method to determine the depth of the layer of air which flows over a mountaintop to produce an orographic cloud event. It provides a new method for calculating hydrogen ion concentrations, and provides updated expressions and values for solubility, dissociation and reaction rate constants.

  5. Metabolic control analysis of biochemical pathways based on a thermokinetic description of reaction rates

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1997-01-01

    Metabolic control analysis is a powerful technique for the evaluation of flux control within biochemical pathways. Its foundation is the elasticity coefficients and the flux control coefficients (FCCs). On the basis of a thermokinetic description of reaction rates it is here shown...... affinity. This parameter can often be determined from experiments in vitro. The methodology is applicable only to the analysis of simple two-step pathways, but in many cases larger pathways can be lumped into two overall conversions. In cases where this cannot be done it is necessary to apply an extension...... be much more widely applied, although it was originally based on linearized kinetics. The methodology of determining elasticity coefficients directly from pool levels is illustrated with an analysis of the first two steps of the biosynthetic pathway of penicillin. The results compare well with previous...
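
    For a lumped two-step pathway with one intermediate, the flux control coefficients (FCCs) follow directly from the two elasticities via the standard summation and connectivity theorems. The sketch below shows this generic MCA algebra, not the paper's thermokinetic parameterisation; the numeric elasticities are illustrative assumptions.

```python
def flux_control_coefficients(eps1, eps2):
    """FCCs for a linear two-step pathway with one intermediate S.
    eps1, eps2: elasticities of steps 1 and 2 with respect to S
    (eps1 is normally negative due to product inhibition, eps2 positive).
    Standard MCA result from the summation and connectivity theorems:
        C1 = eps2 / (eps2 - eps1),  C2 = -eps1 / (eps2 - eps1)."""
    d = eps2 - eps1
    return eps2 / d, -eps1 / d

# Illustrative elasticities: step 1 strongly inhibited by its product.
c1, c2 = flux_control_coefficients(-0.8, 0.4)
assert abs(c1 + c2 - 1.0) < 1e-12  # summation theorem: C1 + C2 = 1
# c1 = 1/3, c2 = 2/3 → step 2 holds most of the flux control here
```

    In the thermokinetic setting, the elasticities themselves would be derived from pool levels and thermodynamic affinities measured in vitro, after which the same algebra applies.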

  6. 40 CFR 86.162-00 - Approval of alternative air conditioning test simulations and descriptions of AC1 and AC2.

    Science.gov (United States)

    2010-07-01

    ... Procedures § 86.162-00 Approval of alternative air conditioning test simulations and descriptions of AC1 and AC2. The alternative air conditioning test procedures AC1 and AC2 are approved by the Administrator... requirements of paragraph (a) of this section and meet the requirements of § 86.163-00. Air conditioning tests...

  7. Optimization and Simulation in Drug Development - Review and Analysis

    DEFF Research Database (Denmark)

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical...... development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context of the pharmaceutical industry....

  8. Optimization and Simulation in Drug Development - Review and Analysis

    OpenAIRE

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context ...

  9. The coupled chemistry-climate model LMDz-REPROBUS: description and evaluation of a transient simulation of the period 1980–1999

    Directory of Open Access Journals (Sweden)

    F. Lott

    2008-06-01

    We present a description and evaluation of the Chemistry-Climate Model (CCM) LMDz-REPROBUS, which couples interactively the extended version of the Laboratoire de Météorologie Dynamique General Circulation Model (LMDz GCM) and the stratospheric chemistry module of the REactive Processes Ruling the Ozone BUdget in the Stratosphere (REPROBUS) model. The transient simulation evaluated here covers the period 1980–1999. The introduction of an interactive stratospheric chemistry module improves the model dynamical climatology, with a substantial reduction of the temperature biases in the lower tropical stratosphere. However, at high latitudes in the Southern Hemisphere, a negative temperature bias, already present in the GCM version albeit with a smaller magnitude, leads to an overestimation of the ozone depletion and its vertical extent in the CCM. This in turn contributes to maintaining low polar temperatures in the vortex, delaying the break-up of the vortex and the recovery of polar ozone. The latitudinal and vertical variation of the mean age of air compares favourably with estimates derived from long-lived species measurements, though the model mean age of air is 1–3 years too young in the middle stratosphere. The model also reproduces the observed "tape recorder" in tropical total hydrogen (= H2O + 2 × CH4), but its propagation is about 30% too fast and its signal fades away slightly too quickly. The analysis of the global distributions of CH4 and N2O suggests that the subtropical transport barriers are correctly represented in the simulation. LMDz-REPROBUS also reproduces fairly well most of the spatial and seasonal variations of the stratospheric chemical species, in particular ozone. However, because of the Antarctic cold bias, large discrepancies are found for most species at high latitudes in the Southern Hemisphere during spring and early summer. In the Northern Hemisphere, polar ozone depletion and its variability are underestimated.

  10. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  11. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs

  12. Analysis and simulation of straw fuel logistics

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Agricultural Engineering

    1998-12-31

    Straw is a renewable biomass that has a considerable potential to be used as fuel in rural districts. This bulky fuel is, however, produced over large areas and must be collected during a limited amount of days and taken to the storages before being ultimately transported to heating plants. Thus, a well thought-out and cost-effective harvesting and handling system is necessary to provide a satisfactory fuel at competitive costs. Moreover, high-quality non-renewable fuels are used in these operations. To be sustainable, the energy content of these fuels should not exceed the energy extracted from the straw. The objective of this study is to analyze straw as fuel in district heating plants with respect to environmental and energy aspects, and to improve the performance and reduce the costs of straw handling. Energy, exergy and emergy analyses were used to assess straw as fuel from an energy point of view. The energy analysis showed that the energy balance is 12:1 when direct and indirect energy requirements are considered. The exergy analysis demonstrated that the conversion step is ineffective, whereas the emergy analysis indicated that large amounts of energy have been used in the past to form the straw fuel (the net emergy yield ratio is 1.1). A dynamic simulation model, called SHAM (Straw HAndling Model), has also been developed to investigate handling of straw from the fields to the plant. The primary aim is to analyze the performance of various machinery chains and management strategies in order to reduce the handling costs and energy needs. The model, which is based on discrete event simulation, takes both weather and geographical conditions into account. The model has been applied to three regions in Sweden (Svaloev, Vara and Enkoeping) in order to investigate the prerequisites for straw harvest at these locations. The simulations showed that straw has the best chances to become a competitive fuel in south Sweden. It was also demonstrated that costs can be
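
    The discrete event simulation approach underlying SHAM can be illustrated with a heavily simplified sketch: a single truck serves baled loads as they become ready, with the event list held in a priority queue. The numbers and the single-truck assumption are purely illustrative; SHAM itself also models weather, geography and full machinery chains.

```python
import heapq

def simulate_straw_transport(load_ready_times, trip_minutes):
    """Toy discrete-event sketch of field-to-plant straw haulage:
    one truck serves baled loads in the order they become ready.
    Returns the delivery completion time of each load."""
    events = list(load_ready_times)
    heapq.heapify(events)               # event list ordered by time
    truck_free = 0.0
    deliveries = []
    while events:
        ready = heapq.heappop(events)   # next load becomes ready
        start = max(ready, truck_free)  # truck may still be on the road
        truck_free = start + trip_minutes
        deliveries.append(truck_free)
    return deliveries

done = simulate_straw_transport([0, 10, 20, 30], trip_minutes=25)
# → [25, 50, 75, 100]: the single truck, not the fields, limits throughput
```

    Extending such a loop with stochastic weather windows, multiple machines and storage buffers is what turns this toy into a model like SHAM, from which handling costs and energy needs can be estimated per machinery chain.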

  13. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2012-09-01

    Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution’s suitability according to set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to resolve the facility layout problem. New equations are proposed for measuring geometric indices. The analysis revealed redundant indices and showed that a minimum number of indices covering the overall quality criteria may be used when selecting alternative solutions.

  14. Analysis and simulation of XPM intensity modulation

    Institute of Scientific and Technical Information of China (English)

    Jing Huang; Jianquan Yao

    2005-01-01

    Based on the split-step Fourier method and small-signal analysis, an improved analytical solution describing the cross-phase modulation (XPM) intensity is derived. It suppresses the spurious XPM intensity modulation efficiently over the whole transmission fiber and therefore agrees more closely with practical results. Furthermore, it is convenient to use because it is independent of channel separation, and the dispersion and nonlinear effects interact through the XPM intensity. A criterion for selecting the step size is described in which the derived XPM intensity modulation is taken into account. It is a non-uniform distribution method, and the simulation accuracy is improved when the step size is determined by the improved XPM intensity.
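    The split-step Fourier method this analysis builds on alternates between a dispersion step applied in the frequency domain and a nonlinear phase step applied in the time domain. A minimal single-channel sketch is given below; the symmetric splitting, the function name and the default fiber parameters are illustrative assumptions, not the authors' improved scheme:

```python
import numpy as np

def split_step_fourier(a0, dt, dz, n_steps, beta2=-21e-27, gamma=1.3e-3):
    """Propagate a field envelope a0 (time samples, spacing dt) along a fiber
    with the symmetric split-step Fourier method. beta2 [s^2/m] models group
    velocity dispersion, gamma [1/(W*m)] the Kerr nonlinearity; the default
    values are merely illustrative."""
    n = a0.size
    w = 2 * np.pi * np.fft.fftfreq(n, d=dt)            # angular-frequency grid
    d_half = np.exp(1j * (beta2 / 2) * w**2 * (dz / 2))  # half-step dispersion operator
    a = a0.astype(complex)
    for _ in range(n_steps):
        a = np.fft.ifft(d_half * np.fft.fft(a))          # half dispersion step
        a = a * np.exp(1j * gamma * np.abs(a)**2 * dz)   # full nonlinear phase step
        a = np.fft.ifft(d_half * np.fft.fft(a))          # half dispersion step
    return a
```

    Both sub-steps are phase-only (unitary), so pulse energy is conserved exactly; halving the step size until the output stops changing is the usual accuracy check, which a non-uniform step-size criterion like the one described above refines.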

  15. A novel thermodynamic state recursion method for description of nonideal nonlinear chromatographic process of frontal analysis.

    Science.gov (United States)

    Liu, Qian; OuYang, Liangfei; Liang, Heng; Li, Nan; Geng, Xindu

    2012-06-01

    A novel thermodynamic state recursion (TSR) method, based on a nonequilibrium thermodynamic path described in the Lagrangian-Eulerian representation, is presented to simulate the whole chromatographic process of frontal analysis using the spatial distribution of solute bands in time series, like a series of images. TSR differs from the current numerical methods, which use partial differential equations in the Eulerian representation. The novel method is used to simulate the nonideal, nonlinear hydrophobic interaction chromatography (HIC) processes of lysozyme and myoglobin under discrete complex boundary conditions. The results show that the simulated breakthrough curves agree well with the experimental ones. The apparent diffusion coefficient and the Langmuir isotherm parameters of the two proteins in HIC are obtained by the state recursion inverse method. Due to its time-domain and Markov characteristics, TSR is applicable to the design and online control of nonlinear multicolumn chromatographic systems.
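    To convey the idea of simulating frontal analysis by recursing the column state forward in time (rather than solving Eulerian partial differential equations), the sketch below uses a much simpler Craig-type plate recursion with a Langmuir isotherm q = qmax·b·c/(1+b·c). This is a generic textbook scheme with invented parameter names, not the authors' TSR method:

```python
import numpy as np

def craig_frontal(n_cells, n_steps, c_feed, qmax, b, vm=1.0, vs=1.0):
    """Discrete state-recursion (Craig plate) sketch of frontal analysis.
    Each step: shift the mobile phase one cell forward (feeding c_feed at the
    inlet), then re-equilibrate every cell between mobile (c) and stationary
    (q) phases under the Langmuir isotherm q = qmax*b*c/(1+b*c), conserving
    the mass m = c*vm + q*vs per cell. Returns the outlet breakthrough curve."""
    c = np.zeros(n_cells)                        # mobile-phase concentrations
    q = np.zeros(n_cells)                        # stationary-phase concentrations
    outlet = []
    for _ in range(n_steps):
        outlet.append(c[-1])
        c = np.concatenate(([c_feed], c[:-1]))   # convective shift, feed at inlet
        m = c * vm + q * vs                      # total mass per cell
        # positive root of vm*b*c^2 + (vm + vs*qmax*b - m*b)*c - m = 0
        A, B, C = vm * b, vm + vs * qmax * b - m * b, -m
        c = (-B + np.sqrt(B ** 2 - 4 * A * C)) / (2 * A)
        q = qmax * b * c / (1 + b * c)
    return np.array(outlet)
```

    Each iteration yields a complete spatial snapshot of the column, so the breakthrough curve is read off directly from the recursion, mirroring the "series of images" view described above.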

  16. Data entry skills in a computer-based spread sheet amongst postgraduate medical students: A simulation based descriptive assessment

    Directory of Open Access Journals (Sweden)

    Amir Maroof Khan

    2014-01-01

    Full Text Available Background: In India, research work in the form of a thesis is a mandatory requirement for postgraduate (PG) medical students. Data entry in a computer-based spread sheet is one of the important basic skills for research, which has not yet been studied. This study was conducted to assess the data entry skills of the 2nd-year PG medical students of a medical college of North India. Materials and Methods: A cross-sectional, descriptive study was conducted among 111 second-year PG students by using four simulated filled case record forms and a computer-based spread sheet in which data entry was to be carried out. Results: On a scale of 0-10, only 17.1% of the students scored more than seven. The specific sub-skills that were found to be lacking in more than half of the respondents were as follows: inappropriate coding (93.7%), long variable names (51.4%), coding not being done for all the variables (76.6%), missing values entered in a non-uniform manner (84.7%) and two variables entered in the same column in the case of blood pressure readings (80.2%). Conclusion: PG medical students were not found to be proficient in data entry skills, and this can act as a barrier to doing research. As this is the first study of its kind in India, more research is needed to understand this issue and then include this as-yet neglected aspect in teaching research methodology to medical students.

  17. Advanced Modeling, Simulation and Analysis (AMSA) Capability Roadmap Progress Review

    Science.gov (United States)

    Antonsson, Erik; Gombosi, Tamas

    2005-01-01

    Contents include the following: NASA capability roadmap activity. Advanced modeling, simulation, and analysis overview. Scientific modeling and simulation. Operations modeling. Multi-spectral sensing (UV-gamma). System integration. M and S environments and infrastructure.

  18. An educational model for ensemble streamflow simulation and uncertainty analysis

    National Research Council Canada - National Science Library

    AghaKouchak, A; Nakhjiri, N; Habib, E

    2013-01-01

    ...) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity...

  19. Augmenting health care failure modes and effects analysis with simulation

    DEFF Research Database (Denmark)

    Staub-Nielsen, Ditte Emilie; Dieckmann, Peter; Mohr, Marlene

    2014-01-01

    This study explores whether simulation plays a role in health care failure mode and effects analysis (HFMEA); it does this by evaluating whether additional data are found when a traditional HFMEA is augmented with simulation. Two multidisciplinary teams identified vulnerabilities in a process...... for deeper analysis. The study indicates that simulation has a role in HFMEA. Both ways of using simulation seemed feasible, and our results are not conclusive in selecting one over the other....

  20. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study.

    Science.gov (United States)

    Vaismoradi, Mojtaba; Turunen, Hannele; Bondas, Terese

    2013-09-01

    Qualitative content analysis and thematic analysis are two commonly used approaches in data analysis of nursing research, but boundaries between the two have not been clearly specified. In other words, they are being used interchangeably and it seems difficult for the researcher to choose between them. In this respect, this paper describes and discusses the boundaries between qualitative content analysis and thematic analysis and presents implications to improve the consistency between the purpose of related studies and the method of data analyses. This is a discussion paper, comprising an analytical overview and discussion of the definitions, aims, philosophical background, data gathering, and analysis of content analysis and thematic analysis, and addressing their methodological subtleties. It is concluded that in spite of many similarities between the approaches, including cutting across data and searching for patterns and themes, their main difference lies in the opportunity for quantification of data. It means that measuring the frequency of different categories and themes is possible in content analysis with caution as a proxy for significance. © 2013 Wiley Publishing Asia Pty Ltd.

  1. Passion fruit juice with different sweeteners: sensory profile by descriptive analysis and acceptance.

    Science.gov (United States)

    Rocha, Izabela Furtado de Oliveira; Bolini, Helena Maria André

    2015-03-01

    This study evaluated the effect of different sweeteners on the sensory profile, acceptance, and drivers of preference of passion fruit juice samples sweetened with sucrose, aspartame, sucralose, stevia, cyclamate/saccharin blend 2:1, and neotame. Sensory profiling was performed by 12 trained assessors using quantitative descriptive analysis (QDA). Acceptance tests (appearance, aroma, flavor, texture and overall impression) were performed with 124 consumers of tropical fruit juice. Samples with sucrose, aspartame and sucralose showed similar sensory profile (P fruit flavor affected positively and sweet aftertaste affected negatively the acceptance of the samples. Samples sweetened with aspartame, sucralose, and sucrose presented higher acceptance scores for the attributes flavor, texture, and overall impression, with no significant (P fruit juice.

  2. A descriptive analysis of overviews of reviews published between 2000 and 2011.

    Directory of Open Access Journals (Sweden)

    Lisa Hartling

    Full Text Available BACKGROUND: Overviews of systematic reviews compile data from multiple systematic reviews (SRs) and are a new method of evidence synthesis. OBJECTIVES: To describe the methodological approaches in overviews of interventions. DESIGN: Descriptive study. METHODS: We searched 4 databases from 2000 to July 2011; we handsearched Evidence-based Child Health: A Cochrane Review Journal. We defined an overview as a study that: stated a clear objective; examined an intervention; used explicit methods to identify SRs; collected and synthesized outcome data from the SRs; and intended to include only SRs. We did not restrict inclusion by population characteristics (e.g., adults or children only). Two researchers independently screened studies and applied eligibility criteria. One researcher extracted data with verification by a second. We conducted a descriptive analysis. RESULTS: From 2,245 citations, 75 overviews were included. The number of overviews increased from 1 in 2000 to 14 in 2010. The interventions were pharmacological (n = 20, 26.7%), non-pharmacological (n = 26, 34.7%), or both (n = 29, 38.7%). Inclusion criteria were clearly stated in 65 overviews. Thirty-three (44%) overviews searched at least 2 databases. The majority reported the years and databases searched (n = 46, 61%) and provided key words (n = 58, 77%). Thirty-nine (52%) overviews included Cochrane SRs only. Two reviewers independently screened and completed full-text review in 29 overviews (39%). Methods of data extraction were reported in 45 (60%). Information on quality of individual studies was extracted from the original SRs in 27 (36%) overviews. Quality assessment of the SRs was performed in 28 (37%) overviews; at least 9 different tools were used. Quality of the body of evidence was assessed in 13 (17%) overviews. Most overviews provided a narrative or descriptive analysis of the included SRs. One overview conducted indirect analyses and the other conducted mixed

  3. CERENA: ChEmical REaction Network Analyzer--A Toolbox for the Simulation and Analysis of Stochastic Chemical Kinetics.

    Science.gov (United States)

    Kazeroonian, Atefeh; Fröhlich, Fabian; Raue, Andreas; Theis, Fabian J; Hasenauer, Jan

    2016-01-01

    Gene expression, signal transduction and many other cellular processes are subject to stochastic fluctuations. The analysis of these stochastic chemical kinetics is important for understanding cell-to-cell variability and its functional implications, but it is also challenging. A multitude of exact and approximate descriptions of stochastic chemical kinetics have been developed; however, tools to automatically generate the descriptions and compare their accuracy and computational efficiency are missing. In this manuscript we introduce CERENA, a toolbox for the analysis of stochastic chemical kinetics using approximations of the chemical master equation solution statistics. CERENA implements stochastic simulation algorithms and the finite state projection for microscopic descriptions of processes, the system size expansion and moment equations for meso- and macroscopic descriptions, as well as the novel conditional moment equations for a hybrid description. This unique collection of descriptions in a single toolbox facilitates the selection of appropriate modeling approaches. Unlike other software packages, the implementation of CERENA is completely general and allows, e.g., for time-dependent propensities and non-mass-action kinetics. By providing SBML import, symbolic model generation and simulation using MEX-files, CERENA is user-friendly and computationally efficient. The availability of forward and adjoint sensitivity analyses allows for further studies such as parameter estimation and uncertainty analysis. The MATLAB code implementing CERENA is freely available from http://cerenadevelopers.github.io/CERENA/.
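    As a minimal example of the microscopic (stochastic simulation algorithm) level of description that such toolboxes implement, the sketch below runs Gillespie's direct method for a simple birth-death process. It is a generic illustration, not CERENA's API:

```python
import numpy as np

def gillespie_birth_death(k_prod, k_deg, x0, t_end, rng):
    """Gillespie stochastic simulation of a birth-death process:
    0 -> X with propensity k_prod, X -> 0 with propensity k_deg * x.
    Returns the event times and the copy-number trajectory."""
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a1, a2 = k_prod, k_deg * x          # reaction propensities
        a_total = a1 + a2
        t += rng.exponential(1.0 / a_total)  # waiting time to next reaction
        if t >= t_end:
            break
        x += 1 if rng.uniform() * a_total < a1 else -1  # pick which reaction fired
        times.append(t)
        states.append(x)
    return np.array(times), np.array(states)
```

    The stationary mean of this process is k_prod/k_deg, which makes the simulation easy to sanity-check against the corresponding moment equations.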

  4. Discrete Event Simulation Modeling and Analysis of Key Leader Engagements

    Science.gov (United States)

    2012-06-01

    Discrete Event Simulation Modeling and Analysis of Key Leader Engagements, by Clifford C. Wakeman. Master's thesis, June 2012. Thesis co-advisors: Arnold H. Buss, Susan... Approved for public release; distribution is unlimited.

  5. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP): Overview and Description of Models, Simulations and Climate Diagnostics

    Science.gov (United States)

    Lamarque, J.-F.; Shindell, D. T.; Naik, V.; Plummer, D.; Josse, B.; Righi, M.; Rumbold, S. T.; Schulz, M.; Skeie, R. B.; Strode, S.; Young, P. J.; Cionni, I.; Dalsoren, S.; Eyring, V.; Bergmann, D.; Cameron-Smith, P.; Collins, W. J.; Doherty, R.; Faluvegi, G.; Folberth, G.; Ghan, S. J.; Horowitz, L. W.; Lee, Y. H.; MacKenzie, I. A.; Nagashima, T.

    2013-01-01

    The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of time slice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting composition changes and the associated radiative forcing. In this overview paper, we introduce the ACCMIP activity, the various simulations performed (with a requested set of 14) and the associated model output. The 16 ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions are responsible for a significant range across models, mostly in the case of ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results. However, models that are clear outliers are different enough from the other models to significantly affect their simulation of atmospheric chemistry.

  6. Simulation Analysis of Cylindrical Panoramic Image Mosaic

    Directory of Open Access Journals (Sweden)

    ZHU Ningning

    2017-04-01

    Full Text Available With the rise of virtual reality (VR) technology, panoramic images are more widely used. They are commonly obtained by multi-camera stitching based on homography matrices and image transformation; however, this method destroys the collinearity condition, making 3D reconstruction and other work difficult. This paper proposes a new method for cylindrical panoramic image mosaicking, which sets the number of cameras, the imaging focal length, the imaging position and the imaging attitude, simulates the mapping process of the multi-camera system, and constructs a cylindrical imaging equation from 3D points to the 2D image based on the photogrammetric collinearity equations. This cylindrical imaging equation can be used not only for panoramic stitching but also for precision analysis. Test results show: ①the method can be used for panoramic stitching under multi-camera, oblique-imaging conditions; ②the accuracy of panoramic stitching is affected by three kinds of parameter error (focal length, displacement and rotation angle), of which focal-length error can be corrected by image resampling, displacement error is closely related to object distance, and rotation-angle error is affected mainly by the number of cameras.
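    The mapping from 3D object points to a cylindrical image can be sketched as follows for an ideal, level cylinder of focal radius f. This simplified model, with assumed conventions for the u and v coordinates, stands in for the paper's collinearity-based cylindrical imaging equation:

```python
import numpy as np

def cylindrical_project(points, center, f, img_height):
    """Project 3D points onto a cylindrical image surface of radius f centered
    at `center`, with the cylinder axis along z (an idealized, level camera).
    u is the arc-length column coordinate (f times the azimuth); v is the row
    coordinate, offset so the horizon maps to the image center row."""
    p = np.asarray(points, float) - np.asarray(center, float)
    azimuth = np.arctan2(p[:, 1], p[:, 0])       # angle around the cylinder axis
    rho = np.hypot(p[:, 0], p[:, 1])             # horizontal object distance
    u = f * azimuth                              # column (arc length on cylinder)
    v = f * p[:, 2] / rho + img_height / 2.0     # row (vertical perspective)
    return u, v
```

    Because v scales with z/rho, the displacement-error sensitivity noted in result ② falls off with object distance, while u depends only on azimuth, which is why rotation-angle errors dominate the stitching seams between adjacent cameras.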

  7. The Use of Descriptive Analysis to Identify and Manipulate Schedules of Reinforcement in the Treatment of Food Refusal

    Science.gov (United States)

    Casey, Sean D.; Cooper-Brown, Linda J.; Wacker, David P.; Rankin, Barbara E.

    2006-01-01

    The feeding behaviors of a child diagnosed with failure to thrive were assessed using descriptive analysis methodology to identify the schedules of reinforcement provided by the child's parents. This analysis revealed that the child's appropriate feeding behaviors (i.e., bite acceptance, self-feeding) were on a lean schedule of positive…

  8. Relationships between Descriptive Sensory Attributes and Physicochemical Analysis of Broiler and Taiwan Native Chicken Breast Meat.

    Science.gov (United States)

    Chumngoen, Wanwisa; Tan, Fa-Jui

    2015-07-01

    Unique organoleptic characteristics such as rich flavors and chewy texture contribute to the higher popularity of native chicken in many Asian areas, while commercial broilers are well accepted due to their fast growth and higher meat yields. Sensory attributes of foods are often used to evaluate food eating quality and serve as references during the selection of foods. In this study, a three-phase descriptive sensory study was conducted to evaluate the sensory attributes of commercial broiler (BR) and Taiwan native chicken (TNC) breast meat, and to investigate correlations between these sensory attributes and instrumental measurements. The results showed that for the first bite (phase 1), TNC meat had significantly higher moisture release, hardness, springiness, and cohesiveness than BR meat. After chewing for 10 to 12 bites (phase 2), TNC meat presented significantly higher chewdown hardness and meat particle size, whereas BR meat had significantly higher cohesiveness of mass. After swallowing (phase 3), TNC meat had higher chewiness and oily mouthcoat and fewer residual loose particles than BR meat. TNC meat also provided more intense chicken flavors. This study clearly demonstrates that descriptive sensory analysis provides more detailed and more objective information about the sensory attributes of meats from various chicken breeds. Additionally, sensory textural attributes vary between BR and TNC meat, and are highly correlated with the shear force value and collagen content, which greatly influence meat eating quality. The poultry industry and scientists should be able to recognize the sensory characteristics of different chicken meats more clearly. Accordingly, based on the meat's unique sensory and physicochemical characteristics, future work might address how meat from various breeds could best satisfy consumer needs using various cooking methods.

  9. Simulation Development and Analysis of Crew Vehicle Ascent Abort

    Science.gov (United States)

    Wong, Chi S.

    2016-01-01

    Unlike the coursework I have taken thus far, which focuses on pure logic, simulation code focuses on mimicking the physical world with some approximation and can have inaccuracies or numerical instabilities. Learning from my mistake, I adopted new methods to analyze these different simulations. One method I used was to numerically plot various physical parameters using MATLAB to confirm the mechanical behavior of the system, in addition to comparing the data to the output from a separate simulation tool called FAST. By having full control over what was being output from the simulation, I could choose which parameters to change and to plot, as well as how to plot them, allowing for an in-depth analysis of the data. Another method of analysis was to convert the output data into a graphical animation. Unlike the numerical plots, where all of the physical components were displayed separately, this graphical display allows for a combined look at the simulation output that makes it much easier for one to see the physical behavior of the model. The process for converting SOMBAT output for EDGE graphical display had to be developed. With some guidance from other EDGE users, I developed a process and created a script that would easily allow one to display simulations graphically. Another limitation with the SOMBAT model was the inability for the capsule to have the main parachutes instantly deployed with a large angle between the air speed vector and the chutes' drag vector. To explore this problem, I had to learn about different coordinate frames used in Guidance, Navigation & Control (J2000, ECEF, ENU, etc.) to describe the motion of a vehicle and about Euler angles (e.g. Roll, Pitch, Yaw) to describe the orientation of the vehicle. With a thorough explanation from my mentor about the description of each coordinate frame, as well as how to use a directional cosine matrix to transform one frame to another, I investigated the problem by simulating different capsule orientations.
In the end
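    The frame transformations described above can be illustrated with a generic 3-2-1 (yaw-pitch-roll) direction cosine matrix. This is a textbook construction, not the SOMBAT or EDGE code:

```python
import numpy as np

def dcm_from_euler(roll, pitch, yaw):
    """Direction cosine matrix for a 3-2-1 (yaw, then pitch, then roll) Euler
    sequence, rotating vector components from the reference frame into the
    body frame. Angles are in radians."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    r_x = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])   # roll about x
    r_y = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])   # pitch about y
    r_z = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])   # yaw about z
    return r_x @ r_y @ r_z
```

    The matrix is orthonormal, so transforming back to the reference frame is just the transpose; chaining such matrices is how a vehicle state is carried between frames like J2000, ECEF and ENU.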

  10. Flash Profile for rapid descriptive analysis in sensory characterization of passion fruit juice

    Directory of Open Access Journals (Sweden)

    Flávia Daiana Montanuci

    2015-07-01

    Full Text Available The Flash Profile is a descriptive analysis method derived from the Free-Choice Profile, in which each taster chooses and uses his/her own words to evaluate the product while comparing several attributes. Four passion fruit juices were analyzed: two juices produced from concentrated juice, one from pulp and one from reconstituted juice; all juices had different levels of sugar, and some contained gum and dyes. This study aimed to evaluate the physicochemical properties (color, titratable acidity and solid content) as well as to perform sensory analyses, namely Flash Profile and an affective test. In the physicochemical characterization and in the Flash Profile, juice A (pulp) had the highest solid content and consistency, juice B (concentrated juice) was the least acidic, presented the lowest value of soluble solids, and had a strong passion fruit aroma and flavor, juice C (reconstituted juice) was pale yellow and showed an artificial flavor, and juice D (concentrated juice) was the most acidic, consistent with the natural flavor. In the acceptance test, all the juices scored 5-6, indicating that the panelists neither liked nor disliked them. Flash Profile proved to be an easy and rapid technique, showing a good correlation between panelists and the attributes, and confirmed the results of the physicochemical characterization.

  11. Analysis and simulation of Wiseman hypocycloid engine

    Directory of Open Access Journals (Sweden)

    Priyesh Ray

    2014-12-01

    Full Text Available This research studies an alternative to the slider-crank mechanism for internal combustion engines, which was proposed by Wiseman Technologies Inc. Their design involved replacing the crankshaft with a hypocycloid gear assembly. The unique hypocycloid gear arrangement allowed the piston and connecting rod to move in a straight line, creating a perfect sinusoidal motion without any side loads. In this work, the Wiseman hypocycloid engine was modeled in commercial engine simulation software and compared to a slider-crank engine of the same size. The engine’s performance was studied while operating on diesel, ethanol, and gasoline fuel. Furthermore, a scaling analysis of the Wiseman engine prototypes was carried out to understand how the performance of the engine is affected by increasing the output power and cylinder displacement. It was found that the existing 30cc Wiseman engine produced about 7% less power at peak speeds than the slider-crank engine of the same size. These results were concurrent with dynamometer tests performed in the past. It also produced lower torque and was about 6% less fuel efficient than the slider-crank engine. The four-stroke diesel variant of the same Wiseman engine performed better than the two-stroke gasoline version. The Wiseman engine with a contra piston (which allowed the compression ratio to be varied) showed poor fuel efficiency but produced higher torque when operating on E85 fuel. It also produced about 1.4% more power than while running on gasoline. While analyzing the effects of engine size on the Wiseman hypocycloid engine prototypes, it was found that the engines performed better in terms of power, torque, fuel efficiency, and cylinder brake mean effective pressure as the displacement increased. The 30 horsepower (HP) conceptual Wiseman prototype, while operating on E85, produced the best results in all aspects, and the diesel test for the same engine proved to be the most fuel efficient.

  12. A Description of the Clinical Proteomic Tumor Analysis Consortium (CPTAC) Common Data Analysis Pipeline.

    Science.gov (United States)

    Rudnick, Paul A; Markey, Sanford P; Roth, Jeri; Mirokhin, Yuri; Yan, Xinjian; Tchekhovskoi, Dmitrii V; Edwards, Nathan J; Thangudu, Ratna R; Ketchum, Karen A; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Stein, Stephen E

    2016-03-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) has produced large proteomics data sets from the mass spectrometric interrogation of tumor samples previously analyzed by The Cancer Genome Atlas (TCGA) program. The availability of the genomic and proteomic data is enabling proteogenomic study for both reference (i.e., contained in major sequence databases) and nonreference markers of cancer. The CPTAC laboratories have focused on colon, breast, and ovarian tissues in the first round of analyses; spectra from these data sets were produced from 2D liquid chromatography-tandem mass spectrometry analyses and represent deep coverage. To reduce the variability introduced by disparate data analysis platforms (e.g., software packages, versions, parameters, sequence databases, etc.), the CPTAC Common Data Analysis Platform (CDAP) was created. The CDAP produces both peptide-spectrum-match (PSM) reports and gene-level reports. The pipeline processes raw mass spectrometry data according to the following: (1) peak-picking and quantitative data extraction, (2) database searching, (3) gene-based protein parsimony, and (4) false-discovery rate-based filtering. The pipeline also produces localization scores for the phosphopeptide enrichment studies using the PhosphoRS program. Quantitative information for each of the data sets is specific to the sample processing, with PSM and protein reports containing the spectrum-level or gene-level ("rolled-up") precursor peak areas and spectral counts for label-free or reporter ion log-ratios for 4plex iTRAQ. The reports are available in simple tab-delimited formats and, for the PSM-reports, in mzIdentML. The goal of the CDAP is to provide standard, uniform reports for all of the CPTAC data to enable comparisons between different samples and cancer types as well as across the major omics fields.
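    Step (4) of such pipelines, false-discovery-rate-based filtering, is commonly implemented by target-decoy counting: walk down the score-sorted peptide-spectrum matches and keep targets while the decoy-to-target ratio stays below the chosen FDR. The sketch below is a generic illustration of that idea, not the actual CDAP code:

```python
def fdr_filter(psms, threshold=0.01):
    """Minimal target-decoy FDR filter sketch. `psms` is a list of
    (score, is_decoy) pairs, higher score = better. Returns the target scores
    at the most permissive cutoff whose estimated FDR (decoys/targets) is
    at or below `threshold`."""
    ranked = sorted(psms, key=lambda p: p[0], reverse=True)
    targets, decoys = 0, 0
    kept, accepted = [], []
    for score, is_decoy in ranked:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
            kept.append(score)
        if targets and decoys / targets <= threshold:
            accepted = list(kept)   # snapshot: everything so far passes the FDR
    return accepted
```

    Running the same filter over all data sets with one fixed threshold is what makes the resulting PSM reports comparable across samples and cancer types.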

  13. Can simulations of flux exchanges between the land surface and the atmosphere be improved by a more complex description of soil and plant processes?

    Science.gov (United States)

    Klein, Christian

    2013-04-01

    Recent studies show that uncertainties in regional and global climate simulations are partly caused by inadequate descriptions of soil-plant-atmosphere interactions. Therefore, we coupled the soil-plant model system Expert-N to the regional climate and weather forecast model WRF. Key features of the Expert-N model system are the simulation of water flow, heat transfer and solute transport in soils, and of the transpiration of grassland and forest stands. Particularly relevant for the improvement of regional weather forecasts are simulations of the feedback between the land surface and the atmosphere, which influences surface temperature, surface pressure and precipitation. The WRF model was modified to optionally select either the land surface model Expert-N or NOAH to simulate the exchange of water and energy fluxes between the land surface and the atmosphere for every single grid cell within the simulation domain. Whereas the standard land surface model NOAH interpolates monthly LAI input values to simulate interactions between plant and atmosphere, Expert-N simulates dynamic plant growth with respect to water and nutrient availability in the soil. In this way Expert-N can be applied to study the effect of dynamic vegetation growth simulation on regional climate simulation results. For model testing, Expert-N was used with two different soil parameterizations. The first parameterization used the USGS soil texture classification and simplifies the soil profile to one horizon (similar to the NOAH model). The second parameterization is based on the German soil texture classification

  14. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
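    The paper derives spherical-coordinate FDTD equations for the conical antenna; as a much simpler illustration of the same leapfrog time-stepping idea, the sketch below runs a free-space 1D Cartesian Yee update loop with a Gaussian soft source (normalized units and illustrative parameters, not the paper's solver):

```python
import numpy as np

def fdtd_1d(n_cells=200, n_steps=400, src_pos=100):
    """Minimal 1D free-space FDTD (Yee) simulation. E and H live on staggered
    grids and are updated alternately from each other's spatial differences;
    the fixed end cells act as perfectly conducting boundaries."""
    ez = np.zeros(n_cells)        # electric field at integer grid points
    hy = np.zeros(n_cells - 1)    # magnetic field at half-integer points
    c = 0.5                       # Courant number (stable in 1D for c <= 1)
    for n in range(n_steps):
        hy += c * np.diff(ez)                            # update H from curl of E
        ez[1:-1] += c * np.diff(hy)                      # update E from curl of H
        ez[src_pos] += np.exp(-((n - 30) / 10.0) ** 2)   # Gaussian soft source
    return ez
```

    The same leapfrog structure carries over to the spherical-coordinate update equations, with the UWB pulse taking the place of the CW excitation when broadband behavior is wanted from a single run.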

  15. Hardware synthesis from DDL description. [simulating a digital system for computerized design of large scale integrated circuits

    Science.gov (United States)

    Shiva, S. G.; Shah, A. M.

    1980-01-01

    The details of digital systems can be conveniently input into a design automation system by means of a hardware description language (HDL). The computer aided design and test (CADAT) system at NASA MSFC is used for LSI design. The digital design language (DDL) was selected as the HDL for the CADAT system. DDL translator output can be used for the hardware implementation of the digital design. Problems of selecting standard cells from the CADAT standard cell library to realize the logic implied by the DDL description of the system are addressed.

  16. Compact, accurate description of diagnostic neutral beam propagation and attenuation in a high temperature plasma for charge exchange recombination spectroscopy analysis.

    Science.gov (United States)

    Bespamyatnov, Igor O; Rowan, William L; Granetz, Robert S

    2008-10-01

    Charge exchange recombination spectroscopy on Alcator C-Mod relies on the use of the diagnostic neutral beam injector as a source of neutral particles which penetrate deep into the plasma. It employs the emission resulting from the interaction of the beam atoms with fully ionized impurity ions. To interpret the emission from a given point in the plasma as the density of emitting impurity ions, the density of beam atoms must be known. Here, an analysis of beam propagation is described which yields the beam density profile throughout the beam trajectory from the neutral beam injector to the core of the plasma. The analysis includes the effects of beam formation, attenuation in the neutral gas surrounding the plasma, and attenuation in the plasma. In the course of this work, a numerical simulation and an analytical approximation for beam divergence are developed. The description is made sufficiently compact to yield accurate results in a time consistent with between-shot analysis.
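    In its simplest form, the attenuation part of such an analysis reduces to integrating a Beer-Lambert law for the beam density along its path, n_b(s) = n_0·exp(-∫σ_eff·n_e ds). The sketch below assumes a single effective stopping cross-section and trapezoidal integration; it is an illustration of the principle, not the C-Mod analysis code:

```python
import numpy as np

def beam_density(n0, s, n_e, sigma_eff):
    """Attenuate an initial beam density n0 along path coordinates s [m]
    through a plasma with electron density profile n_e(s) [m^-3], using an
    effective stopping cross-section sigma_eff [m^2]. Returns the beam
    density at every point of s (trapezoidal rule for the optical depth)."""
    seg = sigma_eff * 0.5 * (n_e[1:] + n_e[:-1]) * np.diff(s)  # per-segment depth
    tau = np.concatenate(([0.0], np.cumsum(seg)))              # cumulative depth
    return n0 * np.exp(-tau)
```

    In practice the cross-section depends on beam energy and on the local plasma parameters, so the full analysis replaces the single sigma_eff with rate coefficients evaluated along the trajectory; the cumulative-integral structure stays the same.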

  17. A Descriptive Analysis of US Prehospital Care Response to Law Enforcement Tactical Incidents.

    Science.gov (United States)

    Aberle, Sara J; Lohse, Christine M; Sztajnkrycer, Matthew D

    2015-01-01

    Law enforcement tactical incidents involve high-risk operations that exceed the capabilities of regular, uniformed police. Despite the existence of tactical teams for 50 years, little is known about the frequency or nature of emergency medical services (EMS) response to tactical events in the United States. The purpose of this study was to perform a descriptive analysis of tactical events reported to a national EMS database. Descriptive analysis of the 2012 National Emergency Medical Services Information System (NEMSIS) Public Release research data set, containing EMS emergency response data from 41 states. A total of 17,479,328 EMS events were reported, of which 3,953 events were coded as "Activation-Tactical or SWAT Specialty Service/Response Team." The most common level of prehospital care present on scene was basic life support (55.2%). The majority (72.3%) of tactical incident activations involved a single patient; mass casualty incidents occurred in 0.5% of events. The most common EMS response locations were homes (48.4%), streets or highways (37.0%), and public buildings (6.3%). The mean age of treated patients was 44.1 years ± 22.0 years; 3.5% of tactical incident activation patients were aged 8 years or less. Injuries were coded as firearm assault in 14.8% and as chemical exposure in 8.9% of events. Cardiac arrest occurred in 5.1% of patients, with the majority (92.2%) occurring prior to EMS arrival. The primary symptoms reported by EMS personnel were pain (37.4%), change in responsiveness (13.1%), and bleeding (8.1%). Advanced airway procedures occurred in 30 patients. No patients were documented as receiving tourniquets or needle thoracostomy. Approximately 11 EMS responses in support of law enforcement tactical operations occur daily in the United States. The majority occur in homes and involve a single patient. Advanced airway procedures are required in a minority of patients. 
Cardiac arrest is rare and occurs prior to EMS response in the majority of cases.

  18. Efficiency assessment of Flash Profiling and Ranking Descriptive Analysis: a comparative study with star fruit-powdered flavored drink

    Directory of Open Access Journals (Sweden)

    Maria Eugênia de Oliveira MAMEDE

    2016-01-01

The objective of this study was to assess the efficiency of the Flash Profiling (FP) and Ranking Descriptive Analysis (RDA) methods for sensory characterization, using star fruit-flavored drink as the matrix. Sample A was used as a standard. Three other samples were prepared from sample A by adding sugar, citric acid, carboxymethylcellulose or dye. The same panel (twelve assessors) was used to carry out FP and, subsequently, RDA. The qualitative training stage used in the RDA method showed no difference in assessor performance or panel consensus compared to FP. Both methods were efficient and discriminated the samples in a similar way, in agreement with the physicochemical characterization. However, the astringent and bitter aftertaste attributes were additionally used in sample description by RDA; the latter attribute was also relevant for sample discrimination in RDA. FP was simpler and faster to apply, mainly in terms of the time spent by the assessors, but RDA provided a more comprehensive description of the samples.

  19. Generator dynamics in aeroelastic analysis and simulations

    DEFF Research Database (Denmark)

    Larsen, Torben J.; Hansen, Morten Hartvig; Iov, F.

    2003-01-01

This report contains a description of a dynamic model for a doubly-fed induction generator. The model has physical input parameters (voltage, resistance, reactance etc.) and can be used to calculate rotor and stator currents, hence active and reactive power. A perturbation method has been used to reduce the original generator model equations to a set of equations which can be solved with the same time steps as a typical aeroelastic code. The method is used to separate the fast transients of the model from the slow variations and to deduce a reduced-order expression for the slow part. Dynamic effects...
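The separation of fast transients from slow variations can be illustrated on a toy two-time-scale system (an invented example, not the generator equations themselves): eliminating the fast variable by a quasi-steady-state assumption yields a reduced slow model that can be integrated with the large time steps of an aeroelastic code.

```python
def simulate_full(y0, z0, eps, dt, n_steps):
    """Stiff coupled system: slow y' = -y + z, fast eps*z' = -z + 0.5*y.
    Explicit Euler with a small step that resolves the fast dynamics."""
    y, z = y0, z0
    for _ in range(n_steps):
        y, z = y + dt * (-y + z), z + dt * (-z + 0.5 * y) / eps
    return y

def simulate_reduced(y0, dt, n_steps):
    """Reduced slow model after eliminating the fast variable.
    Quasi-steady state z ~ 0.5*y gives y' = -0.5*y, which tolerates
    time steps ~100x larger than the full stiff system."""
    y = y0
    for _ in range(n_steps):
        y += dt * (-0.5 * y)
    return y

# Same 2.0 s horizon: the full model needs dt = 1e-4, the reduced one 1e-2.
y_full = simulate_full(1.0, 0.5, eps=1e-3, dt=1e-4, n_steps=20000)
y_red = simulate_reduced(1.0, dt=1e-2, n_steps=200)
```

The reduced model reproduces the slow decay while taking two orders of magnitude fewer steps, which is the point of the perturbation reduction described above.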

  20. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP: overview and description of models, simulations and climate diagnostics

    Directory of Open Access Journals (Sweden)

    J.-F. Lamarque

    2012-08-01

The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of timeslice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting radiative forcing and the associated composition changes. Here we introduce the various simulations performed under ACCMIP and the associated model output. The ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions lead to a significant range in emissions, mostly for ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results, but with outliers different enough to possibly affect their representation of climate impact on chemistry.

  1. A Descriptive and Explorative Case Study of a Scratch Programming Experience Involving the Creation of a Lunar Simulation/Model with Grade Six Learners

    Science.gov (United States)

    Martin, Stephen Alexander

This mixed methods descriptive and exploratory case study examined the experiences of two classes in an elementary school (39 students) using the Scratch programming language to create a lunar simulation of the Earth/Moon system over six days. Using the Computational Thinking Framework developed by Brennan and Resnick (2012), the researcher examined the computational thinking (CT) concepts and practices students were exposed to. This study finds that all of the student groups experienced at least partial success in building their simulation. All of the groups explored the CT concepts of sequence, events, parallelism, conditionals and operators while building their simulation, and more than 80% of the groups used data and loops. There is evidence that the students were involved in three of the computational practices: being incremental and iterative; testing and debugging; and abstracting and modularizing. This study offers recommendations for practice and for future research.

  2. Descriptive Analysis on Flouting and Hedging of Conversational Maxims in the “Post Grad” Movie

    Directory of Open Access Journals (Sweden)

    Nastiti Rokhmania

    2012-11-01

This research focuses on analyzing the flouting and hedging of conversational maxims in utterances used by the main characters in the "Post Grad" movie. Conversational maxims are the rules of the cooperative principle, categorized into four types: the Maxim of Quality, Maxim of Quantity, Maxim of Relevance, and Maxim of Manner. If these maxims are observed in conversation, the conversation can go smoothly. However, people often break the maxims overtly (flouting maxims) and sometimes break them covertly (hedging maxims). This research is conducted using a descriptive qualitative method based on the theory known as Grice's Maxims. The data are utterances used by the characters in the "Post Grad" movie. The data analysis reveals findings covering the formulated research questions. Maxims are flouted when the speaker breaks conversational maxims through rhetorical strategies such as tautology, metaphor, hyperbole, irony, and rhetorical questions. On the other hand, conversational maxims are hedged when the information is not totally accurate or is unclearly stated but seems informative, well founded, and relevant.

  3. A pedigree-analysis approach to the descriptive epidemiology of autosomal-recessive disorders.

    Science.gov (United States)

    Man, W Y N; Nicholas, F W; James, J W

    2007-03-17

    We describe a pedigree-analysis approach to estimating descriptive epidemiological parameters for autosomal-recessive disorders when the ancestral source of the disorder is known. We show that the expected frequency of carriers in a cohort equals the gene contribution of the ancestral source to that cohort, which is equivalent to the direct (additive) genetic relationship of that ancestor to the cohort. Also, the expected incidence of affected foetuses ranges from (1/2)F* to F*, where F* is the mean partial inbreeding coefficient (due to the ancestor) of the cohort. We applied this approach to complex vertebral malformation (CVM) in Holstein-Friesians in Australia, for which the ancestral source is a USA-born bull, Carlin-M Ivanhoe Bell. The estimated frequency of carriers was 2.47% for the 1992-born and 4.44% for the 1997-born cohort of Holstein-Friesian cows in Australia. The estimated incidence of affected foetuses/calves was considerably less than one per thousand, ranging from 0.0024 to 0.0048% for the 1992-born cohort, and from 0.0288 to 0.0576% for the 1997-born cohort. These incidences correspond to expected numbers of affected female foetuses/calves ranging from 2 to 4 for the 1992-born cohort and from 28 to 56 for the 1997-born cohort. This approach is easy to implement using software that is readily available.

  4. A Descriptive Analysis of Decision Support Systems Research Between 1990 and 2003

    Directory of Open Access Journals (Sweden)

    David Arnott

    2005-05-01

This paper is the first major report of a project that is investigating the theoretic foundations of decision support systems (DSS). The project was principally motivated by a concern for the direction and relevance of DSS research. The main areas of research focus are the decision and judgement theoretic base of the discipline, the research strategies used in published articles, and the professional relevance of DSS research. The project has analysed 926 DSS articles published in 14 major journals from 1990 to 2003. The findings indicate that DSS research is more dominated by positivist research than general information systems (in particular experiments, surveys, and descriptions of specific applications and systems), is heavily influenced by the work of Herbert Simon, is poorly grounded in contemporary judgement and decision-making research, and falls down in the identification of the nature of clients and users. Of great concern is the finding that DSS research has relatively low professional relevance. An overview of the direction of further analysis is presented.

  5. Cervical necrotizing fasciitis: descriptive, retrospective analysis of 59 cases treated at a single center.

    Science.gov (United States)

    Elander, Johanna; Nekludov, Michael; Larsson, Agneta; Nordlander, Britt; Eksborg, Staffan; Hydman, Jonas

    2016-12-01

To provide retrospective, descriptive information on patients with cervical necrotizing fasciitis treated at a single center during the years 1998-2014, and to evaluate the outcome of a newly introduced treatment strategy. Retrospective analysis of clinical data obtained from medical records; the measures were mortality, pre-morbidity, severity of illness, primary site of infection, type of bacteria, and time parameters. The observed 3-month mortality was 6/59 (10%). The most common initial foci of the infection were pharyngeal, dental or hypopharyngeal, and the most common pathogen was Streptococcus milleri within the Streptococcus anginosus group (66% of the cases). Using a treatment strategy of early surgical debridement combined with hyperbaric oxygen treatment, it is possible to reduce the mortality rate among patients suffering from cervical necrotizing fasciitis, compared with the expected mortality rate and with previous historical reports. Data indicated that early onset of hyperbaric oxygen treatment may have a positive impact on survival rate, but no factor was identified that could prognosticate outcome.

  6. Cholera outbreaks in South and Southeast Asia: descriptive analysis, 2003-2012.

    Science.gov (United States)

    Mahapatra, Tanmay; Mahapatra, Sanchita; Babu, Giridhara R; Tang, Weiming; Banerjee, Barnali; Mahapatra, Umakanta; Das, Aritra

    2014-01-01

We conducted a descriptive analysis of available information regarding the epidemiology of cholera outbreaks in South and Southeast Asia during 2003-2012. Information from 58 articles, 8 reports, and World Health Organization databases was analyzed. Overall, 113 cholera outbreaks were studied in South and Southeast Asia during the past 10 years. The majority of the outbreaks (69%) occurred in Southeast Asia, including India (52%). The highest number of outbreaks was observed in 2004 (25.7%). The most commonly identified source was contaminated water; however, in some countries the spread of cholera was facilitated via contaminated seafood (e.g., Myanmar, Thailand, and Singapore). Several genotypes and phenotypes of Vibrio cholerae, the causative agent of cholera, were identified in the outbreaks, including V. cholerae O1 El Tor (Ogawa and Inaba) and V. cholerae O139. The emergence of multidrug-resistant V. cholerae strains was a major concern. Cholera-related mortality was low across the outbreaks, except in Orissa, India (currently Odisha) during 2007, where the case fatality rate was 8.6%. Potential limitations include underreporting, discrepancies, possible exclusion of nonindexed reports, and incomprehensive search terms. The provision of safe water and proper sanitation appears critical for the control of further spread of cholera in the South Asian and Southeast Asian regions.

  7. Schedule Risk Analysis Simulator using Beta Distribution

    Directory of Open Access Journals (Sweden)

    Isha Sharma,

    2011-06-01

This paper describes an application of simulation and modelling in software risk management: a simulation-based tool which helps the manager identify high-risk areas of the software process. An endeavour has been made to build a stochastic simulator which supports decision making by identifying the critical activities that should be given due priority during the development of a software project. In response to new information or revised estimates, it may be necessary to reassign resources, cancel optional tasks, etc. Project management tools that make projections while treating decisions about tasks and resource assignments as static will not yield realistic results. The usual PERT procedure may lead to overly optimistic results, since paths which are not critical but only slightly shorter than the critical path on the basis of estimated (average) activity durations are ignored. Due to the randomness of durations, such paths could, under some combinations of activity durations, become longer than the nominally longest path, yet they would be ignored when using the PERT technique on the basis of average durations. To overcome this problem and be more realistic, the stochastic simulator generates random samples from a specific probability distribution associated with each activity of the software project, and so does not suffer from the over-optimistic estimates of plain PERT.
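The core of such a stochastic simulator can be sketched as a Monte Carlo loop over PERT-Beta activity durations (the network and all numbers below are invented for illustration; this is not the authors' tool):

```python
import random

def pert_sample(a, m, b, rng):
    """Sample a duration from the PERT-Beta distribution on [a, b]
    with mode m (shape parameters from the standard PERT fit)."""
    alpha = 1 + 4 * (m - a) / (b - a)
    beta = 1 + 4 * (b - m) / (b - a)
    return a + (b - a) * rng.betavariate(alpha, beta)

def criticality(paths, n_runs=20000, seed=1):
    """paths: {name: [(a, m, b), ...]} activities in series per path.
    Returns the fraction of runs in which each path was the longest,
    i.e. a Monte Carlo estimate of each path's criticality index."""
    rng = random.Random(seed)
    wins = {name: 0 for name in paths}
    for _ in range(n_runs):
        lengths = {name: sum(pert_sample(a, m, b, rng) for a, m, b in acts)
                   for name, acts in paths.items()}
        wins[max(lengths, key=lengths.get)] += 1
    return {name: w / n_runs for name, w in wins.items()}

# Path B is shorter than path A on average, but has a much wider spread.
paths = {"A": [(9.5, 10.0, 10.5)], "B": [(5.0, 8.0, 16.0)]}
crit = criticality(paths)
```

With these numbers, path B's mean length is well under path A's, yet the simulation assigns B a non-negligible criticality, which is exactly the effect that plain average-duration PERT misses.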

  8. Episodes of voluntary total fasting (hunger strike) in Spanish prisons: A descriptive analysis.

    Science.gov (United States)

    García-Guerrero, J; Vera-Remartínez, E J

    2015-08-01

To provide a description of the frequency and main features of the episodes of voluntary total fasting (VTF, hunger strike) taking place in Spanish prisons. Information on the episodes of VTF reported between 04/01/2013 and 03/31/2014 was gathered. Once the appropriate informed consent was given, data on social, demographic, penitentiary and clinical aspects were collected. A descriptive study of these variables, together with a bivariate analysis, was carried out by means of standard statistical techniques and binary logistic regression models, using IBM SPSS Statistics v.20 software. The study was approved by an accredited Clinical Research Ethics Committee. 354 episodes of VTF took place among an average population of 29,762 prisoners, an incidence rate of 11.9 VTF episodes per 1,000 inmates per year. Informed consent (IC) was given in 180 cases (50.8%); 114 prisoners were of Spanish nationality, and the average age was 38.7 years (95% CI 37.2-40.1). The median duration of the episodes was 3 days (IQR 1-10), with a range of 1 to 71 days. The main reason was disagreement with the decisions of treatment boards (57 cases, 31.7%). The average loss was 1.3 kg of weight (70.8 vs. 69.5; p < 0.0001) and 0.7 of BMI (24.5 vs. 23.8; p < 0.0001). 60 prisoners (33.3%) lost no weight at all and only 8 (4.4%) lost over 12% of their basal weight (8.5 kg). Ketone smell was identified in 61 cases (33.9%) and ketonuria in 63 (35%). Only one third of those who go on hunger strike in prison actually fast. Protest episodes of voluntary total fasting are fairly common in Spanish prisons, but rarely are they carried out rigorously enough to entail a risk for those who fast. Copyright © 2015. Published by Elsevier Ltd.

  9. Analysis on descriptions of precautionary statements in package inserts of medicines

    Directory of Open Access Journals (Sweden)

    Tsuchiya F

    2012-02-01

Keita Nabeta (Graduate School of Engineering and Science, Shibaura Institute of Technology, Koto-ku, Tokyo, Japan), Masaomi Kimura, Michiko Ohkura (Faculty of Engineering, Shibaura Institute of Technology, Koto-ku, Tokyo, Japan), Fumito Tsuchiya (School of Pharmacy, International University of Health and Welfare, Minato-ku, Tokyo, Japan). Background: To prevent medical accidents, users must be informed of the cautions written in medical package inserts. To realize countermeasures utilizing information systems, we must also implement a drug information database. However, this is not easy to develop, since the descriptions in package inserts are complex and their information is poorly structured. It is necessary to analyze package insert information and propose a data structure. Methods: We analyzed the descriptions of 'precautions for application' in package inserts via text mining methods. To summarize statements, we applied dependency analysis to the statements and visualized the relations between predicate words and other words. Furthermore, we extracted words representing the timing at which to execute each order. Results: We found that there are four types of statements: direct orders such as "使用する" (use), causative orders such as "使用させる" (make someone use), direct interdictions such as "使用しない" (do not use), and causative interdictions such as "使用させない" (do not make the user use). As for words representing timing, we extracted six groups, including "at the time of delivery," "at the time of preparation," "in use," "after use," and "at the time of storage." From these results, we obtained points of consideration concerning the subjects of orders in the statements and the timing of their execution. Conclusion: From the obtained knowledge, we can define the information structure used to describe the precautionary statements. It should contain information such

  10. Regional hydrogeological simulations for Forsmark - numerical modelling using CONNECTFLOW. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Cox, Ian; Hunter, Fiona; Jackson, Peter; Joyce, Steve; Swift, Ben [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2005-05-01

The Swedish Nuclear Fuel and Waste Management Company (SKB) carries out site investigations in two different candidate areas in Sweden with the objective of describing the in-situ conditions for a bedrock repository for spent nuclear fuel. The site characterisation work is divided into two phases, an initial site investigation phase (IPLU) and a complete site investigation phase (KPLU). The results of IPLU are used as a basis for deciding on a subsequent KPLU phase. On the basis of the KPLU investigations a decision is made as to whether detailed characterisation will be performed (including sinking of a shaft). An integrated component of the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central to the modelling work is the geological model, which provides the geometrical context in terms of a model of deformation zones and the rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for the other geo-disciplines (hydrogeology, hydro-geochemistry, rock mechanics, thermal properties and transport properties) are developed. Great care is taken to arrive at general consistency in the description of the various models and in the assessment of uncertainty and the possible need for alternative models. Here, a numerical model is developed on a regional scale (hundreds of square kilometres) to understand the zone of influence for groundwater flow that affects the Forsmark area. Transport calculations are then performed by particle tracking from a local-scale release area (a few square kilometres), using a greater grid resolution, to identify potential discharge areas for the site. The main objective of this study is to support the development of a preliminary Site Description of the Forsmark area on a regional scale, based on the data available as of 30 June 2004 and the previous Site Description.
A more specific

  11. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    Science.gov (United States)

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

Since the early 1950s, nursing leaders have worked diligently to build the scientific discipline of nursing, integrating theory, research and practice. Recently, the role of theory has again come into question, with some scientists claiming that nurses are not using theory to guide their research and thereby improve practice. The purposes of this descriptive study were to determine: (i) were nursing scientists' research articles in leading nursing journals based on theory? (ii) if so, were the theories nursing theories or borrowed theories? (iii) were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analyses, case studies and literature reviews. The authors used King's dynamic Interacting Systems and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the Title, Abstract, Aims, Methods, Discussion and Conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 through 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories and 377 (45%) used other theories; 776 (93%) of those who used theory integrated it into their studies, including qualitative studies, while 51 (7%) reported using theory as an organizing framework for their studies. Closer analysis revealed that theory principles were implicitly implied even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentage-wise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though it is implicitly implied. © 2010 The Authors. Scandinavian Journal of Caring

  12. EFP Warhead Missile Fuze Analysis and Simulation

    Institute of Scientific and Technical Information of China (English)

    LI Wei; FAN Ning-jun; WANG Zheng-jie

    2008-01-01

The explosive formed penetrator (EFP) warhead missile projects blast fragments in one direction, normal to the missile's longitudinal axis. Through analysis of the two constraints on EFP warhead detonation, the trajectory restriction and the attitude requirement, the concept of fuze time-delay tolerance is introduced as a measure of the timing of the EFP warhead detonation. Calculation models of fuze time-delay tolerance under the two restrictions are provided, and some crucial parameters that play important roles in the calculation under the attitude requirement are simulated. The simulation results show that the engagement plane angle, roll rate and warhead attack standoff influence the tolerance dramatically.

  13. Mathematical Analysis and Simulation of Crop Micrometeorology

    NARCIS (Netherlands)

    Chen, J.

    1984-01-01

In crop micrometeorology the transfer of radiation, momentum, heat and mass to or from a crop canopy is studied. Simulation models for these processes do exist but are not easy to handle because of their complexity and the long computing time they need. Moreover, up to now such models can only be run

  14. Dynamic Process Simulation for Analysis and Design.

    Science.gov (United States)

    Nuttall, Herbert E., Jr.; Himmelblau, David M.

A computer program for the simulation of complex continuous processes in real time in an interactive mode is described. The program is user oriented, flexible, and provides both numerical and graphic output. The program has been used in classroom teaching and computer aided design. Typical input and output are illustrated for a sample problem to…

  15. A New Baseline for the Inertial Navigation Strapdown Simulator Program. Volume 3. Program and Description and Users Guide

    Science.gov (United States)

    1978-07-01

matrix (see Section 7d for the subroutine AUP description). Extrapolate the wander angle, ALF, to mid computation cycle: DALF = (V(2) * SINLT * DT/2)/(RE... The mid-cycle wander angle is DALF = (OALF + ALF)/2. From the wander angle, ALF, are calculated CALF = COS(ALF) and SALF = SIN(ALF). The transformation terms are then calculated from XSALF = SALF + DALF * CALF and XCALF = CALF - DALF * SALF, and the computed velocity vector, NAVV, is transformed to the ENU frame: NAVV = Z(-ALF
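The XSALF/XCALF expressions are first-order small-angle extrapolations of sin(ALF + DALF) and cos(ALF + DALF); a quick numerical check with illustrative values:

```python
import math

# First-order extrapolation of the wander-angle sin/cos through a small
# increment DALF, as in the strapdown update:
#   XSALF = SALF + DALF * CALF  ~  sin(ALF + DALF)
#   XCALF = CALF - DALF * SALF  ~  cos(ALF + DALF)
ALF = 0.3      # wander angle [rad] (illustrative)
DALF = 1e-3    # small mid-cycle increment [rad] (illustrative)

SALF, CALF = math.sin(ALF), math.cos(ALF)
XSALF = SALF + DALF * CALF
XCALF = CALF - DALF * SALF

err_s = abs(XSALF - math.sin(ALF + DALF))
err_c = abs(XCALF - math.cos(ALF + DALF))
```

The truncation error is of order DALF**2/2, so for milliradian increments per computation cycle the linearized update is accurate to well below a microradian.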

  16. In situ and in-transit analysis of cosmological simulations

    Science.gov (United States)

    Friesen, Brian; Almgren, Ann; Lukić, Zarija; Weber, Gunther; Morozov, Dmitriy; Beckner, Vincent; Day, Marcus

    2016-08-01

Modern cosmological simulations have reached the trillion-element scale, rendering data storage and subsequent analysis formidable tasks. To address this circumstance, we present a new MPI-parallel approach for analysis of simulation data while the simulation runs, as an alternative to the traditional workflow consisting of periodically saving large data sets to disk for subsequent 'offline' analysis. We demonstrate this approach in the compressible gas dynamics/N-body code Nyx, a hybrid MPI+OpenMP code based on the BoxLib framework, used for large-scale cosmological simulations. We have enabled on-the-fly workflows in two different ways: one is a straightforward approach consisting of all MPI processes periodically halting the main simulation and analyzing each component of data that they own ('in situ'). The other consists of partitioning processes into disjoint MPI groups, with one performing the simulation and periodically sending data to the other 'sidecar' group, which post-processes it while the simulation continues ('in-transit'). The two groups execute their tasks asynchronously, stopping only to synchronize when a new set of simulation data needs to be analyzed. For both the in situ and in-transit approaches, we experiment with two different analysis suites with distinct performance behavior: one which finds dark matter halos in the simulation using merge trees to calculate the mass contained within iso-density contours, and another which calculates probability distribution functions and power spectra of various fields in the simulation. Both are common analysis tasks for cosmology, and both result in summary statistics significantly smaller than the original data set. We study the behavior of each type of analysis in each workflow in order to determine the optimal configuration for the different data analysis algorithms.
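The two workflows can be mimicked in miniature (a schematic stand-in, not the Nyx/BoxLib implementation): 'in situ' halts the time loop to analyze each snapshot, while 'in-transit' hands copies of snapshots to an asynchronous 'sidecar' worker over a queue, with threads standing in for MPI groups.

```python
import queue
import threading

def step(state):
    """One fake simulation step (stand-in for the gas dynamics update)."""
    return [x * 1.01 for x in state]

def analyze(state):
    """Cheap summary statistic standing in for halo finding / PDFs."""
    return sum(state) / len(state)

def run_in_situ(state, n_steps):
    """Simulation halts at every step to analyze its own data."""
    results = []
    for _ in range(n_steps):
        state = step(state)
        results.append(analyze(state))   # time loop blocked here
    return results

def run_in_transit(state, n_steps):
    """Snapshots are shipped to a sidecar worker; stepping continues."""
    q, results = queue.Queue(), []

    def sidecar():
        while True:
            snap = q.get()
            if snap is None:             # sentinel: no more snapshots
                return
            results.append(analyze(snap))

    worker = threading.Thread(target=sidecar)
    worker.start()
    for _ in range(n_steps):
        state = step(state)
        q.put(list(state))               # hand off a copy; keep stepping
    q.put(None)
    worker.join()
    return results

in_situ = run_in_situ([1.0, 2.0], 5)
in_transit = run_in_transit([1.0, 2.0], 5)
```

Both modes produce identical summary statistics; the difference is purely in when the analysis work is paid for relative to the time stepping, which is the trade-off the paper quantifies at scale.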

  17. DESCRIPTIVE STATISTICS IN COST RESEARCH: ANALYSIS OF XIV BRAZILIAN CONGRESS OF COSTS

    OpenAIRE

    Diehl, Carlos Alberto; UNISINOS; Souza, Marcos Antônio de; UNISINOS; Domingos, Laura Elaine Cabral

    2009-01-01

The objective of this article is to study the use of descriptive statistics in cost research, specifically in the papers presented at the XIV Brazilian Congress of Costs, held in 2007 in the city of João Pessoa (PB). First, a theoretical review of descriptive statistics is given, together with a presentation of the Costs Congress, held since 1994 under the organization of the Brazilian Association of Costs. Next, the methodological aspects of the study are presented, clas...

  18. A Descriptive Study of Registers Found in Spoken and Written Communication (A Semantic Analysis)

    OpenAIRE

    Nurul Hidayah

    2016-01-01

This research is a descriptive study of registers found in spoken and written communication. The type of this research is descriptive qualitative research. The data of the study are registers in spoken and written communication found in a book entitled "Communicating! Theory and Practice" and on the internet. The data can be in the form of words, phrases and abbreviations. With regard to the method of data collection, the writer uses the library method as her instrument. Th...

  19. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars

    2015-01-01

Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system could ... with the performance on the simulator (virtual-reality simulator score; p ... correct movements during self-directed training on simulators might help new bronchoscopists learn how to handle...

  20. Comparative visual analysis of 3D urban wind simulations

    Science.gov (United States)

    Röber, Niklas; Salim, Mohamed; Grawe, David; Leitl, Bernd; Böttinger, Michael; Schlünzen, Heinke

    2016-04-01

    Climate simulations are conducted in large quantity for a variety of different applications. Many of these simulations focus on global developments and study the Earth's climate system using a coupled atmosphere ocean model. Other simulations are performed on much smaller regional scales, to study very small fine grained climatic effects. These microscale climate simulations pose similar, yet also different, challenges for the visualization and the analysis of the simulation data. Modern interactive visualization and data analysis techniques are very powerful tools to assist the researcher in answering and communicating complex research questions. This presentation discusses comparative visualization for several different wind simulations, which were created using the microscale climate model MITRAS. The simulations differ in wind direction and speed, but are all centered on the same simulation domain: An area of Hamburg-Wilhelmsburg that hosted the IGA/IBA exhibition in 2013. The experiments contain a scenario case to analyze the effects of single buildings, as well as examine the impact of the Coriolis force within the simulation. The scenario case is additionally compared with real measurements from a wind tunnel experiment to ascertain the accuracy of the simulation and the model itself. We also compare different approaches for tree modeling and evaluate the stability of the model. In this presentation, we describe not only our workflow to efficiently and effectively visualize microscale climate simulation data using common 3D visualization and data analysis techniques, but also discuss how to compare variations of a simulation and how to highlight the subtle differences in between them. For the visualizations we use a range of different 3D tools that feature techniques for statistical data analysis, data selection, as well as linking and brushing.

  1. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    Science.gov (United States)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

This paper describes PDB4DNA, a new Geant4 user application based on an independent, cross-platform, free and open-source C++ library called PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. To evaluate direct damage induced on the DNA molecule by ionizing particles, the application uses an algorithm that determines the atom in the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open-source software under the Geant4 license.
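The closest-atom lookup described above is, at its core, a nearest-neighbour query over atomic coordinates. A minimal sketch, assuming a hypothetical list of named atoms (real codes such as PDBlib parse coordinates from a PDB file and would typically use spatial indexing rather than a brute-force scan):

```python
def closest_atom(atoms, point):
    """Return the (name, coords) of the atom nearest an energy-deposition point.

    `atoms` is a list of (name, (x, y, z)) tuples; a brute-force scan over
    squared distances is sufficient for small structures.
    """
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    return min(atoms, key=lambda atom: dist2(atom[1], point))

# Hypothetical mini-"molecule": three atoms with PDB-style names (invented).
atoms = [("P", (0.0, 0.0, 0.0)),
         ("O5'", (1.5, 0.0, 0.0)),
         ("C5'", (2.5, 1.0, 0.0))]
hit = closest_atom(atoms, (1.4, 0.2, 0.1))
print(hit[0])  # -> O5'
```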

  2. The correspondence between the concepts in description logics for contexts and formal concept analysis

    Institute of Scientific and Technical Information of China (English)

    MA Yue; SUI YueFei; CAO CunGen

    2012-01-01

Formal concept analysis (FCA) and description logic (DL) are meant to be formalizations of concepts. A formal concept in the former consists of its intent and extent, where the intent is the set of all the attributes shared by each object in the extent of the concept, and the extent is the set of all the objects sharing each property in the intent of the concept. A concept in the latter formalization is simply a concept name, the interpretation of which is a subset of a universe. To consider the correspondence between concepts in both formalizations, a multi-valued formal context must be represented both as a knowledge base and as a model of the DL for contexts, where concepts are decomposed into tuple concepts C, interpreted as a set of tuples, and value concepts V, interpreted as a set of attribute-value pairs. We show that there is a difference between the interpretation of concepts (V)R.V/(V)R-.C and the Galois connection between the extent/intent of formal concepts in FCA. According to the Galois connection, there should be concepts of the form +(V)R.V and +(V)R-.C interpreted in FCA, and hence the logical language L for DL is extended to L+ together with +(V) as a constructor so that +(V)R.V and +(V)R-.C are well-defined concepts. Conversely, according to the interpretation in DL there should be pseudo concepts in FCA so that the interpretation of concepts (V)R.V/(V)R-.C is the extent/intent of pseudo concepts. The correspondence between formal concepts and concepts in L+, and between pseudo concepts and concepts in L, are presented in this paper.
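The Galois connection at the heart of FCA can be made concrete with the two derivation operators on a small formal context (the objects, attributes, and incidence relation below are invented for illustration):

```python
def extent(context, objects, B):
    """Objects having every attribute in B (the ' operator on attribute sets)."""
    return {g for g in objects if all((g, m) in context for m in B)}

def intent(context, attrs, A):
    """Attributes shared by every object in A (the ' operator on object sets)."""
    return {m for m in attrs if all((g, m) in context for g in A)}

# Toy context: objects 1-3, attributes a-c, incidence as (object, attribute) pairs.
objects = {1, 2, 3}
attrs = {"a", "b", "c"}
context = {(1, "a"), (1, "b"), (2, "b"), (3, "b"), (3, "c")}

A = {1, 3}
B = intent(context, attrs, A)        # attributes common to objects 1 and 3
A2 = extent(context, objects, B)     # objects having all those attributes
# (A2, B) is a formal concept iff intent(A2) == B, which holds here.
print(B, A2)  # -> {'b'} {1, 2, 3}
```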

  3. DESCRIPTIVE ANALYSIS OF THE INTERNATIONAL MIGRATION PHENOMENON IN ROMANIA BETWEEN 1991 AND 2008

    Directory of Open Access Journals (Sweden)

    Bac Dorin Paul

    2011-07-01

Full Text Available Migration has been and remains a very important phenomenon at the global level, given not only its demographic implications but also its extremely diverse socio-economic, socio-cultural, territorial, and environmental implications. This is probably the main reason why research on migration is interdisciplinary, having strong connections with sociology, political science, history, economics, geography, demography, psychology, and law, among others. All these disciplines target different aspects of population migration, and a proper comprehension of the phenomenon requires a contribution from all of them. Although migration has been manifest since ancient times, it has never been such a universal or significant phenomenon from the socio-economic or political perspective as it is at present. International migration has both negative and positive impacts on both sending and receiving countries, in general playing a very important role in the structure and size of a country's population. Romania is no exception to this statement; furthermore, after the fall of the communist regime, migration became one of Romania's most important socio-economic phenomena. The present paper aims to analyze, in a descriptive manner and from a quantitative perspective, the international migration phenomenon in Romania between 1991 and 2008. Based on data identified in the Statistical Yearbook of Romania (2008 and 2009 editions), the analysis revealed that both immigration and emigration flows registered oscillating evolutions in the analyzed period, but the general trend of immigration was increasing, while that of emigration was decreasing. Immigration was dominated by males, by persons aged between 26 and 40, and by persons coming from the Republic of Moldova. On the other side, in the case of emigration the significant

  4. The simulation study of three typical time frequency analysis methods

    Directory of Open Access Journals (Sweden)

    Li Yifeng

    2017-01-01

Full Text Available The principles and characteristics of three typical time-frequency analysis methods, the Short-Time Fourier Transform (STFT), the wavelet transform, and the Hilbert-Huang Transform, are introduced, and the mathematical definitions, characteristics, and application ranges of these methods are pointed out. Their local time-frequency performance is then analyzed and compared through computer programming and simulation.
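The fixed time-frequency trade-off of the STFT, the first of the three methods, can be illustrated with a minimal windowed-DFT sketch (pure Python with a naive O(N²) DFT; production code would use an FFT library):

```python
import cmath, math

def stft(signal, win_len, hop):
    """Naive short-time Fourier transform: Hann window, then a DFT per frame.

    Returns a list of frames, each a list of complex DFT coefficients.
    Frequency resolution is fs / win_len, so a longer window sharpens
    frequency but blurs time -- the STFT's fixed trade-off.
    """
    hann = [0.5 - 0.5 * math.cos(2 * math.pi * n / win_len) for n in range(win_len)]
    frames = []
    for start in range(0, len(signal) - win_len + 1, hop):
        chunk = [signal[start + n] * hann[n] for n in range(win_len)]
        frames.append([sum(chunk[n] * cmath.exp(-2j * math.pi * k * n / win_len)
                           for n in range(win_len)) for k in range(win_len)])
    return frames

# A pure tone placed exactly at DFT bin 4 of a 32-sample window.
fs, win = 32, 32
x = [math.sin(2 * math.pi * 4 * n / fs) for n in range(64)]
frame0 = stft(x, win, 16)[0]
peak_bin = max(range(win // 2), key=lambda k: abs(frame0[k]))
print(peak_bin)  # -> 4
```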

  5. Appendices to the model description document for a computer program for the emulation/simulation of a space station environmental control and life support system

    Science.gov (United States)

    Yanosy, James L.

    1988-01-01

A Model Description Document for the Emulation/Simulation Computer Model has been published previously. The model consisted of a detailed model (emulation) of a SAWD CO2 removal subsystem operating with much less detailed (simulation) models of a cabin, crew, and condensing and sensible heat exchangers. The purpose was to explore the utility of such an emulation/simulation combination in the design, development, and test of a piece of ARS hardware, SAWD. Extensions to this original effort are presented. The first extension is an update of the model to reflect changes in the SAWD control logic which resulted from testing; slight changes were also made to the SAWD model to permit restarting and to improve the iteration technique. The second extension is the development of simulation models for more pieces of air and water processing equipment. Models are presented for: EDC, Molecular Sieve, Bosch, Sabatier, a new condensing heat exchanger, SPE, SFWES, Catalytic Oxidizer, and multifiltration. The third extension is the creation of two system simulations using these models: the first consists of one air and one water processing system, and the second of a potential air revitalization system.

  6. Simulation Approach to Mission Risk and Reliability Analysis Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

7. Perfil sensorial de ovos de Páscoa / Descriptive analysis of Easter eggs

    Directory of Open Access Journals (Sweden)

    Valéria P. R. MINIM

    2000-04-01

Full Text Available The Easter egg is a popular chocolate candy in egg form commercialized in Brazil during Easter. In this research, Quantitative Descriptive Analysis was applied to select the sensory attributes that best define modifications in appearance, aroma, flavor, and texture when cocoa butter equivalent (CBE) is added to Easter eggs. Samples with and without CBE were evaluated by a selected panel, and fourteen attributes best describing similarities and differences between them were defined. Term definitions, reference materials, and a consensus ballot were developed. After a training period, panelists evaluated the samples in a complete block design using a 9 cm unstructured scale. Principal Component Analysis, ANOVA, and the Tukey test (p<0.05) were applied to the data in order to select the attributes that best discriminated and characterized the samples. Samples showed significant differences (p<0.05) in all attributes. The Easter egg without CBE showed higher intensities (p<0.05) for the following descriptors: brown color, characteristic aroma, cocoa mass aroma, cocoa butter aroma, characteristic flavor, cocoa mass flavor, hardness, and brittleness.
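The Principal Component Analysis step applied to such sensory data can be sketched for the two-attribute case, where the covariance eigendecomposition is analytic (the panel scores below are invented, not taken from the study):

```python
import math

def pca_2d(data):
    """First principal component of 2-column data via the analytic
    eigendecomposition of the 2x2 sample covariance matrix.
    Assumes the two columns are actually correlated (cxy != 0)."""
    n = len(data)
    mx = sum(r[0] for r in data) / n
    my = sum(r[1] for r in data) / n
    cxx = sum((r[0] - mx) ** 2 for r in data) / (n - 1)
    cyy = sum((r[1] - my) ** 2 for r in data) / (n - 1)
    cxy = sum((r[0] - mx) * (r[1] - my) for r in data) / (n - 1)
    # Largest eigenvalue of [[cxx, cxy], [cxy, cyy]] via the quadratic formula.
    tr, det = cxx + cyy, cxx * cyy - cxy ** 2
    lam1 = tr / 2 + math.sqrt(tr ** 2 / 4 - det)
    # Corresponding (unnormalized) eigenvector: (cxy, lam1 - cxx).
    vx, vy = cxy, lam1 - cxx
    norm = math.hypot(vx, vy)
    return lam1, (vx / norm, vy / norm)

# Hypothetical panel means (brown color, cocoa aroma) for 4 samples.
scores = [(7.1, 6.8), (6.9, 6.5), (4.2, 4.0), (4.5, 4.3)]
var1, pc1 = pca_2d(scores)
print(round(pc1[0], 2), round(pc1[1], 2))  # PC1 loadings
```

Both loadings come out positive and similar in size, meaning PC1 captures the overall "with vs. without CBE" contrast that drives both attributes together.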

  8. A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    OpenAIRE

Turk, Matthew J.; Smith, Britton D.; Oishi, Jeffrey S.; Skory, Stephen; Skillman, Samuel W.; Abel, Tom; Norman, Michael L.

    2010-01-01

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. ...

  9. Shock Mechanism Analysis and Simulation of High-Power Hydraulic Shock Wave Simulator

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Xu

    2017-01-01

Full Text Available The simulation of a regular shock wave (e.g., half-sine) can be achieved with a traditional rubber shock simulator, but the practical high-power shock wave, characterized by a steep pre-peak and a gentle post-peak, is hard to realize with the same equipment. To overcome this limitation, a novel high-power hydraulic shock wave simulator based on the live-firing muzzle shock principle is proposed in the current work. The influence of the typical shock characteristic parameters on the shock force wave was investigated via both theoretical deduction and software simulation. Comparison of the obtained data with experimental results indicates that the developed hydraulic shock wave simulator can simulate the real conditions of the shocking system. Further, a similarity evaluation of the shock wave simulation was carried out based on the curvature distance, and the results show that the simulation method is reasonable and that structural optimization based on software simulation also benefits efficiency. Finally, the combination of theoretical analysis and simulation for the development of an artillery recoil tester is a comprehensive approach to the design and structural optimization of the recoil system.

  10. ORSA: Orbit Reconstruction, Simulation and Analysis

    Science.gov (United States)

    Tricarico, Pasquale

    2012-04-01

    ORSA is an interactive tool for scientific grade Celestial Mechanics computations. Asteroids, comets, artificial satellites, solar and extra-solar planetary systems can be accurately reproduced, simulated, and analyzed. The software uses JPL ephemeris files for accurate planets positions and has a Qt-based graphical user interface. It offers an advanced 2D plotting tool and 3D OpenGL viewer and the standalone numerical library liborsa and can import asteroids and comets from all the known databases (MPC, JPL, Lowell, AstDyS, and NEODyS). In addition, it has an integrated download tool to update databases.

  11. Simulation Analysis of Indoor Gas Explosion Damage

    Institute of Scientific and Technical Information of China (English)

    钱新明; 陈林顺; 冯长根

    2003-01-01

The influencing factors and process of indoor gas explosions are studied with the AutoReaGas explosion simulator. The results show that venting pressure has a great influence on indoor gas explosion damage: the higher the venting pressure, the more serious the hazard consequence. The ignition location also has an evident effect on the gas explosion damage. The explosion static overpressure would not cause major injury to persons or serious damage to structures in the case of low venting pressure (lower than 2 kPa). High-temperature combustion after the explosion is the major cause of personal injury in indoor gas explosion accidents.

  12. ROSA-V large scale test facility (LSTF) system description for the third and fourth simulated fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

Suzuki, Mitsuhiro; Nakamura, Hideo; Ohtsu, Iwao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment]; and others

    2003-03-01

The Large Scale Test Facility (LSTF) is a full-height, 1/48 volumetrically scaled test facility of the Japan Atomic Energy Research Institute (JAERI) for system integral experiments simulating the thermal-hydraulic responses, at full-pressure conditions, of an 1100 MWe-class pressurized water reactor (PWR) during small-break loss-of-coolant accidents (SBLOCAs) and other transients. The LSTF can also simulate a next-generation PWR such as the AP600 reactor. In the fifth phase of the Rig-of-Safety Assessment (ROSA-V) Program, eighty-nine experiments were conducted at the LSTF with the third simulated fuel assembly up to June 2001, and five experiments were conducted with the newly installed fourth simulated fuel assembly up to December 2002. In the ROSA-V program, various system integral experiments have been conducted to certify the effectiveness both of accident management (AM) measures in beyond-design-basis accidents (BDBAs) and of improved safety systems in next-generation reactors. In addition, various separate-effect tests have been conducted to verify and develop computer codes and analytical models that predict non-homogeneous and multi-dimensional phenomena, such as heat transfer across the steam generator U-tubes in the presence of non-condensable gases, in both current and next-generation reactors. This report presents detailed information on the LSTF system with the third and fourth simulated fuel assemblies to aid experiment planning and the analysis of experiment results. (author)

  13. Simulation and Analysis of ACDR-oriented Vertebral Endplate Cutting Process

    Directory of Open Access Journals (Sweden)

    Heqiang Tian

    2015-09-01

Full Text Available In artificial cervical disc replacement (ACDR), the polishing quality of the fitting surface between the vertebral endplate and the artificial intervertebral disc directly affects the surgical outcome. For this reason, an in-depth study of the vertebral endplate cutting process is necessary. This paper simulates and analyzes the cutting process of the vertebral endplate using numerical computation. First, a plane cutting model is established through an analysis of the cutting process to simulate single-edge cutting, and the temperature distribution is calculated using the mean heat flux of continuous cutting. Second, a constitutive model of the vertebral endplate material is established on the basis of endplate anatomy and mechanical properties, and problems concerning the simulation algorithm, frictional contact, and chip separation in the vertebral endplate cutting finite element (FE) simulation are analyzed. Finally, the stress distribution, cutting force, endplate deformation, residual stress, and cutting temperature in the vertebral endplate cutting process are analyzed, together with the simulation results. The FE simulation can simplify vertebral endplate cutting, improve analytical precision, avoid a large number of repetitive cutting experiments, and greatly reduce research cost. Moreover, it gives a full and fine description of the whole continuous, dynamic cutting process. All of this lays a solid theoretical basis for an in-depth study of the influence of different cutting parameters on endplate cutting, as well as for the improvement of surgical outcomes.

  14. Analysis of Expressive Techniques of Description in Nima Yooshij’s Poetry

    Directory of Open Access Journals (Sweden)

    محمدی محمدی

    2011-05-01

Full Text Available Nima Yooshij applied various techniques to describe diverse poetic subjects. This research analyzes the usage of these expressive techniques from two perspectives: poetic and structural. Nima applied structural expressive techniques more often than poetic ones to describe various poetic subjects. These techniques include adjectives, adverbs, adjective clauses, adjective predicates, descriptive sentences, and imaginative verbs. The usage of adjectives and descriptive sentences is more common than that of the other techniques. Among the poetic techniques, symbolic devices such as allegory, irony, metaphor, and personification rank first. The more extensive use of structural expressive techniques confirms that Nima's poetry is closer to natural expression and rhythmic prose. Key words: description, poetic and structural expressive techniques, Nima Yooshij.

  15. A Descriptive Study of Registers Found in Spoken and Written Communication (A Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Nurul Hidayah

    2016-07-01

Full Text Available This research is a descriptive qualitative study of registers found in spoken and written communication. The data are registers in spoken and written communication found in a book entitled "Communicating! Theory and Practice" and on the internet; the data take the forms of words, phrases, and abbreviations. For data collection, the writer used the library method as her instrument, relating it to the study of registers in spoken and written communication. The data were analyzed using a descriptive method: the registers were separated into formal and informal registers, and the meaning of each register was identified.

  16. Simulation of valveless micropump and mode analysis

    CERN Document Server

    Lan, W P; Wu, K C; Shih, Y C

    2008-01-01

In this work, a 3-D simulation is performed to study the solid-fluid coupling effect driven by piezoelectric materials, utilizing asymmetric obstacles to control the flow direction. The simulation results are also verified. For a micropump, it is crucial to find the optimal working frequency that produces the maximum net flow rate. The PZT plate vibrates in the first mode, which is symmetric; by adjusting the working frequency, the maximum flow rate can be obtained. For the micropump studied here, the optimal working frequency is 3.2 kHz. At a higher working frequency, say 20 kHz, the fluid-solid membrane may exhibit an intermediate mode, different from both the first and second modes; it is observed that the center of this mode drifts. Meanwhile, the results show that the vibration response lags the excitation force by a phase shift. Finally, at an even higher working frequency, say 30 kHz, a second vibration mode is observed.

  17. Aircraft vulnerability analysis by modelling and simulation

    CSIR Research Space (South Africa)

    Willers, CJ

    2014-09-01

    Full Text Available attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft...

  18. Simulation and analysis of breechblock mechanism

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

This paper analyzes the motion state of, and forces on, the warship gun breechblock cam. First, the way in which the driven cam locks the breechblock safely and reliably is obtained by analyzing the force between the driven cam and the breechblock. Then, from the geometric and kinematic relationships between the driving cam, driven cam, and stopper, the angular range of the cams and the instantaneous angular velocity of the driven cam are obtained. Finally, with this velocity specified, the velocity-acceleration curve at the head of the driven cam is obtained by simulation with the Adams software. This ensures that the driven cam can impart the required velocity to the driven part in a short time without plastic deformation.

  19. Simulating potential growth and yield of oil palm (Elaeis guineensis) with PALMSIM: Model description, evaluation and application

    NARCIS (Netherlands)

    Hoffmann, M.; Castaneda Vera, A.; Wijk, van M.T.; Giller, K.E.; Oberthür, T.; Donough, C.; Whitbread, A.M.

    2014-01-01

    Reducing the gap between water-limited potential yield and actual yield in oil palm production systems through intensification is seen as an important option for sustainably increasing palm oil production. Simulation models can play an important role in quantifying water-limited potential yield, and

  20. An Innovative Tool for Intraoperative Electron Beam Radiotherapy Simulation and Planning: Description and Initial Evaluation by Radiation Oncologists

    Energy Technology Data Exchange (ETDEWEB)

    Pascau, Javier, E-mail: jpascau@mce.hggm.es [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Departamento de Bioingenieria e Ingenieria Aeroespacial, Universidad Carlos III de Madrid, Madrid (Spain); Santos Miranda, Juan Antonio [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Calvo, Felipe A. [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Departamento de Oncologia, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Bouche, Ana; Morillo, Virgina [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); Gonzalez-San Segundo, Carmen [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Ferrer, Carlos; Lopez Tarjuelo, Juan [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); and others

    2012-06-01

Purpose: Intraoperative electron beam radiation therapy (IOERT) involves a modified strategy of conventional radiation therapy and surgery. The lack of specific planning tools limits the spread of this technique. The purpose of the present study is to describe a new simulation and planning tool and its initial evaluation by clinical users. Methods and Materials: The tool works on a preoperative computed tomography scan. A physician contours regions to be treated and protected and simulates applicator positioning, calculating isodoses and the corresponding dose-volume histograms depending on the selected electron energy. Three radiation oncologists evaluated data from 15 IOERT patients, including different tumor locations. Segmentation masks, applicator positions, and treatment parameters were compared. Results: High parameter agreement was found in the following cases: three breast and three rectal cancer, retroperitoneal sarcoma, and rectal and ovary monotopic recurrences. All radiation oncologists performed similar segmentations of tumors and high-risk areas. The average applicator position difference was 1.2 ± 0.95 cm. The remaining cancer sites showed higher deviations because of differences in the criteria for segmenting high-risk areas (one rectal, one pancreas) and different surgical access simulated (two rectal, one Ewing sarcoma). Conclusions: The results show that this new tool can be used to simulate IOERT cases involving different anatomic locations, and that preplanning has to be carried out with specialized surgical input.
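The dose-volume histograms such a tool computes can be illustrated generically: a cumulative DVH gives, for each dose level, the fraction of structure voxels receiving at least that dose (the voxel doses and bin width below are invented, not taken from the tool described):

```python
def cumulative_dvh(doses, bin_width):
    """Cumulative DVH: for each dose level d, the fraction of structure
    voxels receiving at least d Gy. `doses` holds the dose in each voxel
    of the contoured region."""
    top = max(doses)
    levels, fractions = [], []
    d = 0.0
    while d <= top:
        levels.append(d)
        fractions.append(sum(1 for v in doses if v >= d) / len(doses))
        d += bin_width
    return levels, fractions

# Toy 8-voxel structure with invented doses in Gy.
doses = [10.0, 12.0, 12.5, 11.0, 9.0, 12.2, 11.8, 10.5]
levels, frac = cumulative_dvh(doses, 1.0)
print(frac[0], frac[11])  # all voxels get >= 0 Gy; fraction getting >= 11 Gy
```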

  1. Dynamic probabilistic simulation of dairy herd management practices 1. Model description and outcome of different seasonal calving patterns.

    NARCIS (Netherlands)

    Jalvingh, A.W.; Arendonk, van J.A.M.; Dijkhuizen, A.A.

    1993-01-01

    A dynamic probabilistic model has been designed to determine the technical and economic consequences of various biological variables and management strategies concerning reproduction, replacement and calving patterns in dairy herds. The Markov chain approach is used to simulate herd dynamics. Herds
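The Markov chain approach to herd dynamics can be sketched by propagating a herd-state distribution through a transition matrix; the three states and transition probabilities below are invented for illustration and are not the model's actual parameters:

```python
def step(dist, P):
    """One Markov transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical 3-state herd model: lactation 1, lactation 2+, replaced.
# Each row sums to 1.
P = [[0.0, 0.75, 0.25],   # 1st-lactation cow: survive to 2+, or be replaced
     [0.0, 0.70, 0.30],   # older cow: stay, or be replaced
     [1.0, 0.00, 0.00]]   # replacement enters as a 1st-lactation heifer
dist = [1.0, 0.0, 0.0]    # herd starts as all first-lactation animals
for _ in range(50):       # iterate toward the steady-state herd structure
    dist = step(dist, P)
print([round(p, 3) for p in dist])  # -> [0.222, 0.556, 0.222]
```

The iteration converges to the chain's stationary distribution, which is how such a model yields a long-run herd composition for economic evaluation.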

  2. Raised fields in the Llanos de Moxos, Bolivia - description and analysis of their morphology

    Science.gov (United States)

    Rodrigues, Leonor; Lombardo, Umberto; Veit, Heinz

    2014-05-01

The impact of pre-Columbian populations on Amazonian ecosystems is being actively debated. The traditional view of the Amazon as an untouched landscape, owing to its poor soils and harsh climate, has been challenged, and the opposite idea of highly modified landscapes with complex societies is growing. Recent research has led to new questions about the agricultural strategies people developed to survive in this climate. The Llanos de Moxos, situated in the Bolivian lowlands in south-eastern Amazonia, is one important region which was densely altered and where a great variety of earthworks can be found. Among the most impressive earthworks are the raised fields: earth platforms for cultivation, of differing shape and dimension, that are elevated above the natural surface of the landscape. In contrast to the "terra preta" formations, where artefacts and amendments like charcoal and kitchen waste have been clearly identified, raised fields have proven to be artefact-poor, and studies to date have found no evidence of additional amendments which could have improved soil quality in the long term. As a result, the function and productivity of raised fields are still not well understood and are being actively discussed. Detailed investigation of raised fields located in the indigenous community of Bermeo, in the vicinity of San Ignacio de Moxos, provides data supporting a novel explanation of the pre-Columbian management of raised fields, and a chronological sequence of their utilization and abandonment. OSL dating has shown that the raised fields had been in use since as early as AD 600. Comparison of the geochemistry with a reference profile away from the raised fields showed no evidence for manure amendments deriving from kitchen waste or animal residues, suggesting a rather extensive use of these fields. Complementarily, the description of the internal morphology and laboratory analysis of these raised fields, combined with radiocarbon

  3. Simulation Modeling and Analysis of TNMCS for the B-1 Strategic Bomber

    Science.gov (United States)

    2010-06-01

Simulation Modeling and Analysis of TNMCS for the B-1 Strategic Bomber (report no. AFIT-OR-MS-ENS-10-09).

  4. Pakistani English Newspaper Paid Obituary Announcements: A Descriptive Analysis of the Transliterated Vocabulary

    Science.gov (United States)

    Chaudhry, Sajid M.; Christopher, Anne A.; Krishnasamy, Hariharan A/L N.

    2016-01-01

    The study, qualitative and descriptive in nature, examines the use of transliteration in the paid Pakistani obituary announcements authored in the English language. Primarily, it identifies the frequently used transliterated vocabulary in these linguistic messages and reconnoiters the functional relationship that emerges in and between the textual…

  5. Contribution to aroma characteristics of mutton process flavor from the enzymatic hydrolysate of sheep bone protein assessed by descriptive sensory analysis and gas chromatography olfactometry.

    Science.gov (United States)

    Zhan, Ping; Tian, Honglei; Zhang, Xiaoming; Wang, Liping

    2013-03-15

    Changes in the aroma characteristics of mutton process flavors (MPFs) prepared from sheep bone protein hydrolysates (SBPHs) with different degrees of hydrolysis (DH) were evaluated using gas chromatography-mass spectrometry (GC-MS), gas chromatography-olfactometry (GC-O), and descriptive sensory analysis (DSA). Five attributes (muttony, meaty, roasted, mouthful, and simulate) were selected to assess MPFs. The results of DSA showed a distinct difference among the control sample MPF0 and other MPF samples with added SBPHs for different DHs of almost all sensory attributes. MPF5 (DH 25.92%) was the strongest in the muttony, meaty, and roasted attributes, whereas MPF6 (DH 30.89%) was the strongest in the simulate and roasted attributes. Thirty-six compounds were identified as odor-active compounds for the evaluation of the sensory characteristics of MPFs via GC-MS-O analysis. The results of correlation analysis among odor-active compounds, molecular weight, and DSA further confirmed that the SBPH with a DH range of 25.92-30.89% may be a desirable precursor for the sensory characteristics of MPF.

  6. Modeling and simulation of HTS cables for scattering parameter analysis

    Science.gov (United States)

    Bang, Su Sik; Lee, Geon Seok; Kwon, Gu-Young; Lee, Yeong Ho; Chang, Seung Jin; Lee, Chun-Kwon; Sohn, Songho; Park, Kijun; Shin, Yong-June

    2016-11-01

Most modeling and simulation of high-temperature superconducting (HTS) cables is inadequate for high-frequency analysis, since the simulations focus on the fundamental frequency of the power grid, which does not reflect transient characteristics. However, high-frequency analysis is an essential step in studying HTS cable transients for cable protection and diagnosis. This paper therefore proposes a new approach to modeling and simulating HTS cables that derives the scattering parameters (S-parameters), an effective high-frequency analysis tool, for transient wave propagation characteristics in the high-frequency range. A parameter-sweeping method is used to validate the simulation results against measured data from a network analyzer (NA). The paper also examines the effects of the cable-to-NA connector in order to minimize the error between the simulated and measured data under ambient and superconductive conditions. Based on the proposed modeling and simulation technique, S-parameters of long-distance HTS cables can be accurately derived over a wide frequency range. The results characterize the HTS cables and will contribute to their analysis.
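As background to the S-parameter analysis, the reflection coefficient S11 of a simple terminated line follows from the impedance mismatch; this textbook relation illustrates what the measured quantity means (it is a generic sketch, not the cable model used in the paper):

```python
import math

def s11(z_load, z0=50.0):
    """Reflection coefficient S11 = (ZL - Z0) / (ZL + Z0) for a load ZL
    on a line with characteristic impedance Z0 (complex values allowed)."""
    return (z_load - z0) / (z_load + z0)

def return_loss_db(z_load, z0=50.0):
    """Return loss in dB, -20*log10(|S11|); larger means better matched."""
    return -20 * math.log10(abs(s11(z_load, z0)))

print(s11(50.0))                        # matched load -> 0.0 (no reflection)
print(abs(s11(100.0)))                  # 2:1 mismatch -> |S11| = 1/3
print(round(return_loss_db(100.0), 2))  # -> 9.54 dB
```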

  7. A computer simulation of the turbocharged turbo compounded diesel engine system: A description of the thermodynamic and heat transfer models

    Science.gov (United States)

    Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.

    1985-01-01

    A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.
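A quasi-steady compressor model of the kind described typically computes the exit temperature from the isentropic relation corrected by an isentropic efficiency; the following is a sketch under that standard textbook assumption (the inlet conditions and efficiency are invented):

```python
def compressor_exit_temp(t_in, pressure_ratio, eta_c, gamma=1.4):
    """Quasi-steady compressor exit temperature (K):
    T2 = T1 * (1 + (PR**((gamma-1)/gamma) - 1) / eta_c),
    i.e. the ideal isentropic temperature rise divided by the
    isentropic efficiency eta_c.
    """
    ideal_rise = pressure_ratio ** ((gamma - 1) / gamma) - 1
    return t_in * (1 + ideal_rise / eta_c)

# 300 K inlet air, pressure ratio 2.5, 75%-efficient compressor (all invented).
t2 = compressor_exit_temp(300.0, 2.5, 0.75)
print(round(t2, 1))  # -> 419.7
```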

  8. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis.
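    A minimal sketch of the contrast the abstract draws: a 2^2 factorial design analyzed by linear regression recovers both main effects and the interaction, which one-factor-at-a-time experimentation would miss. The simulated response and its coefficients are invented for illustration.

```python
import numpy as np

# Hypothetical simulation response with two factors (illustration only)
def simulate(x1, x2, rng):
    return 10 + 3 * x1 - 2 * x2 + 0.5 * x1 * x2 + rng.normal(0, 0.1)

rng = np.random.default_rng(0)
# 2^2 full factorial design in coded units (-1, +1), replicated 5 times
design = [(-1, -1), (-1, 1), (1, -1), (1, 1)] * 5
X = np.array([[1, x1, x2, x1 * x2] for x1, x2 in design], dtype=float)
y = np.array([simulate(x1, x2, rng) for x1, x2 in design])

# First-order regression metamodel with interaction, fitted by least squares
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimates of intercept, main effects, and interaction
```

    Because the factorial design matrix is orthogonal, each coefficient is estimated independently and with minimal variance for the given number of runs.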

  9. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic t

  10. Stochastic Analysis Method of Sea Environment Simulated by Numerical Models

    Institute of Scientific and Technical Information of China (English)

    刘德辅; 焦桂英; 张明霞; 温书勤

    2003-01-01

    This paper proposes the stochastic analysis method of sea environment simulated by numerical models, such as wave height, current field, design sea levels and longshore sediment transport. Uncertainty and sensitivity analysis of input and output factors of numerical models, their long-term distribution and confidence intervals are described in this paper.

  11. Analysis and simulation of BGK electron holes

    Directory of Open Access Journals (Sweden)

    L. Muschietti

    1999-01-01

    Full Text Available Recent observations from satellites crossing regions of magnetic-field-aligned electron streams reveal solitary potential structures that move at speeds much greater than the ion acoustic/thermal velocity. The structures appear as positive potential pulses rapidly drifting along the magnetic field, and are electrostatic in their rest frame. We interpret them as BGK electron holes supported by a drifting population of trapped electrons. Using Laplace transforms, we analyse the behavior of one phase-space electron hole. The resulting potential shapes and electron distribution functions are self-consistent and compatible with the field and particle data associated with the observed pulses. In particular, the spatial width increases with increasing amplitude. The stability of the analytic solution is tested by means of a two-dimensional particle-in-cell simulation code with open boundaries. We consider a strongly magnetized parameter regime in which the bounce frequency of the trapped electrons is much less than their gyrofrequency. Our investigation includes the influence of the ions, which in the frame of the hole appear as an incident beam, and impinge on the BGK potential with considerable energy. The nonlinear structure is remarkably resilient

  12. Analysis of the CVT Efficiency by Simulation

    Directory of Open Access Journals (Sweden)

    Valerian Croitorescu

    2011-09-01

    Full Text Available All vehicle manufacturers desire an ideal vehicle that has the highest powertrain efficiency, the best safety factor, and ease of maintenance while being environmentally friendly. These highly valued development characteristics are only reachable after countless research hours. One major powertrain component to be studied in relation to these demands is the Continuously Variable Transmission (CVT) with which a Hybrid Electric Vehicle (HEV) is equipped. The CVT can increase the overall powertrain efficiency by offering a continuum of gear ratios between established minimum and maximum limits. This paper aims to determine the losses of a CVT operating in an HEV. Using simulation, the losses were computed and the fuel economy was analyzed. The losses and their dependence on the control properties were analyzed during various modes of operation, such as electric drive, regenerative braking, and engine charging to maintain the battery state of charge. A precise determination of the losses makes it possible to reduce them by using appropriate materials for components and fluids, more efficient manufacturing and usage solutions, and an innovative control strategy.

  13. Simulating aerosol microphysics with the ECHAM/MADE GCM – Part I: Model description and comparison with observations

    Directory of Open Access Journals (Sweden)

    A. Lauer

    2005-01-01

    Full Text Available The aerosol dynamics module MADE has been coupled to the general circulation model ECHAM4 to simulate the chemical composition, number concentration, and size distribution of the global submicrometer aerosol. The present publication describes the new model system ECHAM4/MADE and presents model results in comparison with observations. The new model is able to simulate the full life cycle of particulate matter and various gaseous particle precursors, including emissions of primary particles and trace gases, advection, convection, diffusion, coagulation, condensation, nucleation of sulfuric acid vapor, aerosol chemistry, cloud processing, and size-dependent dry and wet deposition. Aerosol components considered are sulfate (SO4), ammonium (NH4), nitrate (NO3), black carbon (BC), particulate organic matter (POM), sea salt, mineral dust, and aerosol liquid water. The model is numerically efficient enough to allow long-term simulations, which is an essential requirement for application in general circulation models. Since the current study focuses on the submicrometer aerosol, a coarse mode is not simulated. The model is run in a passive mode, i.e. no feedbacks between the MADE aerosols and clouds or radiation are considered yet. This allows the effect of aerosol dynamics to be investigated without interference from feedbacks of the altered aerosols on clouds, radiation, and model dynamics. In order to evaluate the results obtained with this new model system, calculated mass concentrations, particle number concentrations, and size distributions are compared to observations. The intercomparison shows that ECHAM4/MADE is able to reproduce the major features of the geographical patterns, seasonal cycle, and vertical distributions of the basic aerosol parameters. In particular, the model performs well under polluted continental conditions in the northern hemispheric lower and middle troposphere. However, in comparatively clean remote areas, e.g. in the upper troposphere or in the southern hemispheric marine boundary layer, the current model version tends to underestimate particle number concentrations.

  14. Simulating aerosol microphysics with the ECHAM/MADE GCM - Part I: Model description and comparison with observations

    Science.gov (United States)

    Lauer, A.; Hendricks, J.; Ackermann, I.; Schell, B.; Hass, H.; Metzger, S.

    2005-12-01

    The aerosol dynamics module MADE has been coupled to the general circulation model ECHAM4 to simulate the chemical composition, number concentration, and size distribution of the global submicrometer aerosol. The present publication describes the new model system ECHAM4/MADE and presents model results in comparison with observations. The new model is able to simulate the full life cycle of particulate matter and various gaseous particle precursors, including emissions of primary particles and trace gases, advection, convection, diffusion, coagulation, condensation, nucleation of sulfuric acid vapor, aerosol chemistry, cloud processing, and size-dependent dry and wet deposition. Aerosol components considered are sulfate (SO4), ammonium (NH4), nitrate (NO3), black carbon (BC), particulate organic matter (POM), sea salt, mineral dust, and aerosol liquid water. The model is numerically efficient enough to allow long-term simulations, which is an essential requirement for application in general circulation models. Since the current study focuses on the submicrometer aerosol, a coarse mode is not simulated. The model is run in a passive mode, i.e. no feedbacks between the MADE aerosols and clouds or radiation are considered yet. This allows the effect of aerosol dynamics to be investigated without interference from feedbacks of the altered aerosols on clouds, radiation, and model dynamics. In order to evaluate the results obtained with this new model system, calculated mass concentrations, particle number concentrations, and size distributions are compared to observations. The intercomparison shows that ECHAM4/MADE is able to reproduce the major features of the geographical patterns, seasonal cycle, and vertical distributions of the basic aerosol parameters. In particular, the model performs well under polluted continental conditions in the northern hemispheric lower and middle troposphere. However, in comparatively clean remote areas, e.g. in the upper troposphere or in the southern hemispheric marine boundary layer, the current model version tends to underestimate particle number concentrations.

  15. Analysis of Medication Errors in Simulated Pediatric Resuscitation by Residents

    Directory of Open Access Journals (Sweden)

    Evelyn Porter

    2014-07-01

    Full Text Available Introduction: The objective of our study was to estimate the incidence of prescribing medication errors made specifically by trainees, and to identify factors associated with these errors, during the simulated resuscitation of a critically ill child. Methods: We analyzed data from the simulated resuscitations for the occurrence of prescribing medication errors. We performed univariate analysis of each variable against the medication error rate, followed by a separate multiple logistic regression analysis on the significant univariate variables to assess the associations. Results: We reviewed 49 simulated resuscitations. The final medication error rate for the simulation was 26.5% (95% CI 13.7%-39.3%). On univariate analysis, statistically significant findings for decreased prescribing medication error rates included senior residents in charge, presence of a pharmacist, sleeping more than 8 hours prior to the simulation, and a visual analog scale score showing more confidence in caring for critically ill children. Multiple logistic regression analysis using the above significant variables showed that only the presence of a pharmacist remained significantly associated with decreased medication error, with an odds ratio of 0.09 (95% CI 0.01-0.64). Conclusion: Our results indicate that the presence of a clinical pharmacist during the resuscitation of a critically ill child reduces the medication errors made by resident physician trainees.
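    For illustration, the odds ratio and its 95% confidence interval for a binary factor such as pharmacist presence can be computed from a 2x2 table with Woolf's log method; the counts below are hypothetical, not the study's raw data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI (Woolf's log method) for a 2x2 table:
    a = errors with pharmacist,    b = no error with pharmacist,
    c = errors without pharmacist, d = no error without pharmacist."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Illustrative counts only -- chosen to give 13 errors in 49 simulations
or_, lo, hi = odds_ratio_ci(2, 18, 11, 18)
print(round(or_, 2), round(lo, 2), round(hi, 2))
```

    An odds ratio below 1 with a confidence interval excluding 1 indicates a protective association, which is the pattern the study reports for pharmacist presence.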

  16. Physicomathematical Simulation Analysis for Small Bullets

    Directory of Open Access Journals (Sweden)

    D. P. Margaris

    2008-01-01

    Full Text Available A full six-degrees-of-freedom (6-DOF) flight dynamics model is proposed for the accurate prediction of short- and long-range trajectories of small bullets through atmospheric flight to the final impact point. The mathematical model is based on the full equations of motion set up in the no-roll body reference frame and is integrated numerically from given initial conditions at the firing site. The projectile maneuvering motion depends on the most significant force and moment variations, in addition to gravity and the Magnus effect. The computational flight analysis takes into consideration the Mach number and total angle of attack effects by means of variable aerodynamic coefficients. For the purposes of the present work, linear interpolation has been applied to the aerodynamic coefficients from the official tabulated database. The developed computational method gives satisfactory agreement with published data from verified experiments and computational codes on atmospheric projectile trajectory analysis for various initial firing conditions.
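    A much-reduced sketch of the trajectory integration idea: a planar point-mass model with quadratic drag, integrated by fourth-order Runge-Kutta, rather than the full 6-DOF equations with Magnus effect and Mach-dependent tabulated coefficients. The mass, drag coefficient, frontal area, and firing conditions are illustrative assumptions.

```python
import math

G = 9.81          # gravity, m/s^2
RHO = 1.225       # sea-level air density, kg/m^3
CD = 0.30         # assumed constant drag coefficient (illustrative)
AREA = 4.9e-5     # frontal area of a ~7.9 mm bullet, m^2
MASS = 0.0095     # bullet mass, kg (illustrative)

def deriv(state):
    x, y, vx, vy = state
    v = math.hypot(vx, vy)
    k = 0.5 * RHO * CD * AREA * v / MASS   # drag acceleration per unit velocity
    return (vx, vy, -k * vx, -k * vy - G)

def rk4_step(state, dt):
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

# Fire at 800 m/s, 5 degrees elevation; integrate until ground impact (y < 0)
state = (0.0, 0.0, 800 * math.cos(math.radians(5)), 800 * math.sin(math.radians(5)))
dt = 0.001
while state[1] >= 0:
    state = rk4_step(state, dt)
print(f"range ~ {state[0]:.0f} m")
```

    The full model replaces the constant CD with interpolated aerodynamic coefficients as functions of Mach number and total angle of attack, and adds the moment equations for attitude.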

  17. Strategic Mobility 21: Modeling, Simulation, and Analysis

    Science.gov (United States)

    2010-04-14

    Womack & Jones of the Lean Enterprise Institute (LEI). In our initial use of this methodology with Dole Foods, there were over five organizations... energy. The Value Stream Analysis Future State then designed Kaizens... Value Stream Mapping... principles described in this report are excerpted from "Learning To See" written by James Womack & Dan Jones of the Lean Enterprise Institute (LEI).

  18. Simulating aerosol microphysics with the ECHAM/MADE GCM Part I: Model description and comparison with observations

    Science.gov (United States)

    Lauer, A.; Hendricks, J.; Ackermann, I.; Schell, B.; Hass, H.; Metzger, S.

    2005-09-01

    The aerosol dynamics module MADE has been coupled to the general circulation model ECHAM4 to simulate the chemical composition, number concentration, and size distribution of the global submicrometer aerosol. The present publication describes the new model system ECHAM4/MADE and presents model results in comparison with observations. The new model is able to simulate the full life cycle of particulate matter and various gaseous precursors including emissions of primary particles and trace gases, advection, convection, diffusion, coagulation, condensation, nucleation of sulfuric acid vapor, aerosol chemistry, cloud processing, and size-dependent dry and wet deposition. Aerosol components considered are sulfate (SO4), ammonium (NH4), nitrate (NO3), black carbon (BC), particulate organic matter (POM), sea salt, mineral dust, and aerosol liquid water. The model is numerically efficient enough to allow long-term simulations, which is an essential requirement for application in general circulation models. In order to evaluate the results obtained with this new model system, calculated mass concentrations, particle number concentrations, and size distributions are compared to observations. The intercomparison shows that ECHAM4/MADE is able to reproduce the major features of the geographical patterns, seasonal cycle, and vertical distributions of the basic aerosol parameters. In particular, the model performs well under polluted continental conditions in the northern hemispheric lower and middle troposphere. However, in comparatively clean remote areas, e.g. in the upper troposphere or in the southern hemispheric marine boundary layer, the current model version tends to underestimate particle number concentrations.

  19. Modular Architecture for Sensor Systems (MASS) : description, analysis, simulation, and implementation.

    Energy Technology Data Exchange (ETDEWEB)

    Stark, Douglas P.; Davis, Jesse Zehring; Edmonds, Nicholas

    2004-11-01

    A particular engineering aspect of distributed sensor networks that has not received adequate attention is the system level hardware architecture of the individual nodes of the network. A novel hardware architecture based on an idea of task specific modular computing is proposed to provide for both the high flexibility and low power consumption required for distributed sensing solutions. The power consumption of the architecture is mathematically analyzed against a traditional approach, and guidelines are developed for application scenarios that would benefit from using this new design. Furthermore a method of decentralized control for the modular system is developed and analyzed. Finally, a few policies for power minimization in the decentralized system are proposed and analyzed.

  20. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality, and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first is retrospective: the cost of poor quality and the production process are calculated on the basis of historical data. The second approach uses the probabilistic characteristics of the input variables by means of simulation and provides a prospective view of the cost of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
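    The prospective, simulation-based approach can be sketched as a Monte Carlo propagation of input distributions to a cost output; all distributions and cost figures below are illustrative assumptions, not data from the case study.

```python
import random
import statistics

random.seed(42)

def annual_poor_quality_cost():
    """One Monte Carlo trial of poor-quality cost (all input distributions
    are illustrative, not taken from the case study)."""
    units = random.triangular(9000, 11000, 10000)   # annual production volume
    defect_rate = random.betavariate(2, 50)         # fraction defective
    scrap_share = random.uniform(0.2, 0.5)          # scrapped vs reworked split
    cost_scrap = random.normalvariate(25, 3)        # cost per scrapped unit
    cost_rework = random.normalvariate(8, 1)        # cost per reworked unit
    defects = units * defect_rate
    return defects * (scrap_share * cost_scrap + (1 - scrap_share) * cost_rework)

trials = [annual_poor_quality_cost() for _ in range(20000)]
mean = statistics.mean(trials)
p95 = statistics.quantiles(trials, n=20)[-1]        # 95th percentile
print(f"mean cost ~ {mean:.0f}, 95th percentile ~ {p95:.0f}")
```

    The spread between the mean and the upper percentile is what the tornado and sensitivity charts then decompose by input variable.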

  1. Simulation Analysis of Divertor Performance in EAST

    Institute of Scientific and Technical Information of China (English)

    Zhu Sizheng; Zha Xuejun

    2005-01-01

    A detailed study of the divertor performance in the EAST has been conducted for both its double null and single null configurations. The results of the application of the SOLPS (B2/Eirene) code package to the analysis of the EAST divertor are summarized. Here we concentrate on the effects of the increased geometrical closure and variation in the magnetic topology on the behavior of divertor plasmas. The results of numerical predictions for the EAST divertor's operational window are also described in this paper.

  2. A Description for Rock Joint Roughness Based on Terrestrial Laser Scanner and Image Analysis

    Science.gov (United States)

    Ge, Yunfeng; Tang, Huiming; Eldin, M. A. M. Ez; Chen, Pengyu; Wang, Liangqing; Wang, Jinge

    2015-11-01

    Shear behavior of a rock mass greatly depends upon the rock joint roughness, which is generally characterized by anisotropy, a scale effect, and an interval effect. A new index able to capture all three features, namely the brightness area percentage (BAP), is presented to express roughness based on synthetic illumination of a digital terrain model derived from a terrestrial laser scanner (TLS). Since only the tiny planes facing opposite to the shear direction contribute to resistance during shear failure, these planes are recognized through image processing, taking advantage of the fact that they appear brighter than other planes under the same light source. A comparison with existing roughness indexes and two case studies are presented to test the performance of the BAP description. The results reveal that the rock joint roughness estimated by the presented description matches existing roughness methods well and displays wider applicability.
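    A toy version of the idea behind BAP: treat facets whose outward normals oppose the shear direction as "lit" and report their area percentage. The synthetic surface and the simple facing criterion are assumptions for illustration; the actual method uses synthetic illumination of a TLS-derived terrain model and image processing.

```python
import numpy as np

def brightness_area_percentage(z, dx, shear_dir):
    """Sketch of a BAP-like index: percentage of facets whose outward
    normal has a horizontal component opposing the shear direction.
    z: 2-D height grid (m), dx: grid spacing (m), shear_dir: unit 2-vector."""
    gy, gx = np.gradient(z, dx)        # surface slopes along rows and columns
    nx, ny = -gx, -gy                  # horizontal components of facet normals
    # facets facing against the shear direction contribute resistance
    facing = nx * (-shear_dir[0]) + ny * (-shear_dir[1])
    return 100.0 * np.mean(facing > 0)

# Synthetic sinusoidal joint surface (illustrative), sheared along +x
x = np.linspace(0, 0.1, 200)
z = 0.002 * np.sin(2 * np.pi * x / 0.02)
surface = np.tile(z, (50, 1))
bap = brightness_area_percentage(surface, x[1] - x[0], (1.0, 0.0))
print(bap)
```

    For this symmetric surface the index is near 50%; real joint surfaces are asymmetric, which is why BAP varies with shear direction and captures anisotropy.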

  3. Combining Elastic Network Analysis and Molecular Dynamics Simulations by Hamiltonian Replica Exchange.

    Science.gov (United States)

    Zacharias, Martin

    2008-03-01

    Coarse-grained elastic network models (ENM) of proteins can be used efficiently to explore the global mobility of a protein around a reference structure. A new Hamiltonian replica exchange molecular dynamics (H-RexMD) method has been designed that effectively combines information extracted from an ENM analysis with atomic-resolution MD simulations. The ENM analysis is used to construct a distance-dependent penalty (flooding or biasing) potential that can drive the structure away from its current conformation in directions compatible with the ENM model. Various levels of the penalty or biasing potential are added to the force field description of the MD simulation along the replica coordinate, with one replica running with the original force field. By focusing the penalty potential on the relevant soft degrees of freedom, the method avoids the rapid increase in replica number with system size that conventional (temperature) RexMD simulations require to cover a desired temperature range. Application to domain motions in lysozyme of bacteriophage T4 and to peptide folding indicates significantly improved conformational sampling compared to conventional MD simulations.
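    The replica-swap step common to Hamiltonian replica exchange variants can be sketched with the standard Metropolis criterion between neighboring biasing levels. The bias energies and temperature below are toy values, and the ENM-derived penalty potential itself is not modeled here.

```python
import math
import random

def hrex_swap_accept(beta, vi_xi, vi_xj, vj_xi, vj_xj, rng=random):
    """Metropolis acceptance for a Hamiltonian replica exchange swap.
    vi_xj = bias potential of replica i evaluated on configuration j, etc.
    (Generic H-RexMD criterion; the ENM-based bias is not modeled here.)"""
    delta = beta * ((vi_xj + vj_xi) - (vi_xi + vj_xj))
    return delta <= 0 or rng.random() < math.exp(-delta)

random.seed(1)
beta = 1.0 / (0.0083145 * 300)   # 1/(kB*T) in mol/kJ at 300 K
# Toy bias energies: replica 0 unbiased, replica 1 carries a small penalty
accepted = sum(hrex_swap_accept(beta, 0.0, 0.0, 2.0, 1.5) for _ in range(10000))
print(accepted / 10000)
```

    In a full implementation the swap is attempted periodically between adjacent replicas, and accepted swaps exchange configurations so that each one eventually visits the unbiased force field.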

  4. The Application of Simulation in Large Energy System Analysis

    Directory of Open Access Journals (Sweden)

    S.M. Divakaruni

    1985-10-01

    Full Text Available The Modular Modeling System (MMS) developed by the Electric Power Research Institute (EPRI) provides an efficient, economical, and user-friendly computer code to engineers involved in the analysis of nuclear and fossil power plants. MMS will complement existing codes in the areas of nuclear and fossil power plant systems simulation. This paper provides a synopsis of MMS code features, development objectives, usage, and results of fossil and nuclear plant simulation.

  5. Toward Verifying Fossil Fuel CO2 Emissions with the CMAQ Model: Motivation, Model Description and Initial Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Zhen; Bambha, Ray P.; Pinto, Joseph P.; Zeng, Tao; Boylan, Jim; Huang, Maoyi; Lei, Huimin; Zhao, Chun; Liu, Shishi; Mao, Jiafu; Schwalm, Christopher R.; Shi, Xiaoying; Wei, Yaxing; Michelsen, Hope A.

    2014-03-14

    Motivated by the urgent need for emission verification of CO2 and other greenhouse gases, we have developed a regional CO2 simulation with CMAQ over the contiguous U.S. Model sensitivity experiments have been performed using three different sets of inputs for net ecosystem exchange (NEE) and two fossil fuel emission inventories, to understand the roles of fossil fuel emissions, atmosphere-biosphere exchange, and transport in regulating the spatial and diurnal variability of near-surface CO2, and to characterize the well-known 'signal-to-noise' problem, i.e. the interference of the biosphere with the interpretation of atmospheric CO2 observations. It is found that differences in the meteorological conditions of different urban areas contribute strongly to the contrast in concentrations. The uncertainty of NEE, as measured by the difference among the three NEE inputs, has a notable impact on the regional distribution of CO2 simulated by CMAQ. Larger NEE uncertainty and impact are found over eastern U.S. urban areas than along the western coast. A comparison with tower CO2 measurements at the Boulder Atmospheric Observatory (BAO) shows that the CMAQ model using hourly varying, high-resolution CO2 emissions from the Vulcan inventory and CarbonTracker-optimized NEE reasonably reproduces the observed diurnal profile, whereas switching to different NEE inputs significantly degrades model performance. The spatial distribution of CO2 is found to correlate with NOx, SO2, and CO, due to their similarity in emission sources and transport processes. These initial results from CMAQ demonstrate the power of a state-of-the-art CTM in helping interpret CO2 observations and verify fossil fuel emissions. The ability to simulate CO2 in CMAQ will also facilitate investigations of the utility of traditionally regulated pollutants and other species as tracers for CO2 source attribution.

  6. Toward verifying fossil fuel CO2 emissions with the CMAQ model: motivation, model description and initial simulation.

    Science.gov (United States)

    Liu, Zhen; Bambha, Ray P; Pinto, Joseph P; Zeng, Tao; Boylan, Jim; Huang, Maoyi; Lei, Huimin; Zhao, Chun; Liu, Shishi; Mao, Jiafu; Schwalm, Christopher R; Shi, Xiaoying; Wei, Yaxing; Michelsen, Hope A

    2014-04-01

    Motivated by the question of whether and how a state-of-the-art regional chemical transport model (CTM) can facilitate characterization of CO2 spatiotemporal variability and verify CO2 fossil-fuel emissions, we for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate CO2. This paper presents methods, input data, and initial results for CO2 simulation using CMAQ over the contiguous United States in October 2007. Modeling experiments have been performed to understand the roles of fossil-fuel emissions, biosphere-atmosphere exchange, and meteorology in regulating the spatial distribution of CO2 near the surface over the contiguous United States. Three sets of net ecosystem exchange (NEE) fluxes were used as input to assess the impact of uncertainty of NEE on CO2 concentrations simulated by CMAQ. Observational data from six tall tower sites across the country were used to evaluate model performance. In particular, at the Boulder Atmospheric Observatory (BAO), a tall tower site that receives urban emissions from Denver CO, the CMAQ model using hourly varying, high-resolution CO2 fossil-fuel emissions from the Vulcan inventory and Carbon Tracker optimized NEE reproduced the observed diurnal profile of CO2 reasonably well but with a low bias in the early morning. The spatial distribution of CO2 was found to correlate with NO(x), SO2, and CO, because of their similar fossil-fuel emission sources and common transport processes. These initial results from CMAQ demonstrate the potential of using a regional CTM to help interpret CO2 observations and understand CO2 variability in space and time. The ability to simulate a full suite of air pollutants in CMAQ will also facilitate investigations of their use as tracers for CO2 source attribution. This work serves as a proof of concept and the foundation for more comprehensive examinations of CO2 spatiotemporal variability and various uncertainties in the future. 

  7. Analysis of the spiral structure in a simulated galaxy

    CERN Document Server

    Mata-Chávez, Dolores; Puerari, Ivânio

    2014-01-01

    We analyze the spiral structure that results in a numerical simulation of a galactic disk with stellar and gaseous components evolving in a potential that includes an axisymmetric halo and bulge. We perform a second simulation without the gas component to observe how it affects the spiral structure in the disk. To quantify this, we use a Fourier analysis and obtain values for the pitch angle and the velocity of the self-excited spiral pattern of the disk. The results show a tighter spiral in the simulation with gaseous component. The spiral structure is consistent with a superposition of waves, each with a constant pattern velocity in given radial ranges.
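    The Fourier measurement of pitch angle can be sketched in one dimension: for particles in the (ln r, θ) plane, the radial wavenumber that maximizes the m=2 amplitude determines the pitch angle of a two-armed pattern. The synthetic spiral below has a known 15° pitch; the grid ranges and noise level are arbitrary choices.

```python
import numpy as np

def pitch_angle_m2(r, theta, p_grid):
    """For an m=2 pattern, find the radial wavenumber p maximizing
    |A(p)| with A(p) = sum_j exp(-i (p ln r_j + 2 theta_j)).
    The pitch angle i then follows from tan(i) = 2 / |p_max|."""
    u = np.log(r)
    A = np.abs(np.exp(-1j * (np.outer(p_grid, u) + 2 * theta)).sum(axis=1))
    p_max = p_grid[np.argmax(A)]
    return np.degrees(np.arctan(2.0 / abs(p_max)))

# Synthetic two-armed logarithmic spiral with a 15-degree pitch angle
rng = np.random.default_rng(3)
i_true = np.radians(15.0)
r = rng.uniform(1.0, 5.0, 4000)
arm = rng.integers(0, 2, 4000) * np.pi           # two arms, pi apart
theta = np.log(r) / np.tan(i_true) + arm + rng.normal(0, 0.05, 4000)
p_grid = np.linspace(-30, -1, 600)
print(pitch_angle_m2(r, theta, p_grid))          # recovers roughly 15 degrees
```

    Applied to simulation snapshots in radial annuli over time, the phase of the peak mode also yields the pattern speed, which is how a superposition of waves with different pattern velocities can be diagnosed.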

  8. Descriptions and Implementations of DL_F Notation: A Natural Chemical Expression System of Atom Types for Molecular Simulations.

    Science.gov (United States)

    Yong, Chin W

    2016-08-22

    DL_F Notation is an easy-to-understand, standardized atom-typesetting expression for molecular simulations covering a range of organic force field (FF) schemes such as OPLSAA, PCFF, and CVFF. It is implemented within DL_FIELD, a software program that facilitates the setting up of molecular FF models for the DL_POLY molecular dynamics simulation software. By making use of the Notation, a single core conversion module (the DL_F conversion Engine) implemented within DL_FIELD can be used to analyze a molecular structure and determine the atom types for a given FF scheme. Users only need to provide the molecular input structure in a simple xyz format, and DL_FIELD can produce the necessary force field file for DL_POLY automatically. Consistent with the development concept of DL_FIELD, which emphasizes robustness and user friendliness, the Engine provides a single-step solution for setting up complex FF models. This allows users to switch seamlessly from one of the above-mentioned FF schemes to another while providing consistent atom typing that is expressed in a natural chemical sense.

  9. An automated parallel simulation execution and analysis approach

    Science.gov (United States)

    Dallaire, Joel D.; Green, David M.; Reaper, Jerome H.

    2004-08-01

    State-of-the-art simulation computing requirements are continually approaching and then exceeding the performance capabilities of existing computers. This trend remains true even with huge yearly gains in processing power and general computing capabilities, since simulation scope and fidelity often increase as well. Accordingly, simulation studies often expend days or weeks executing a single test case. Compounding the problem, stochastic models often require execution of each test case with multiple random number seeds to provide valid results. Many techniques have been developed to improve the performance of simulations without sacrificing model fidelity: optimistic simulation, distributed simulation, parallel multi-processing, and the use of supercomputers such as Beowulf clusters. An approach and prototype toolset have been developed that augment existing optimization techniques to improve multiple-execution timelines. This approach, similar in concept to the SETI@home experiment, makes maximum use of otherwise idle licenses and computers, which can be geographically distributed. Using a publish/subscribe architecture, simulation executions are dispatched to distributed machines for execution. Simulation results are then processed, collated, and transferred to a single site for analysis.
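    The dispatch-and-collate idea can be sketched in-process with a task queue feeding worker threads that stand in for distributed machines; the function and variable names are invented for illustration, and a real deployment would use a networked publish/subscribe broker instead of a local queue.

```python
import queue
import threading
import random

def run_simulation(case_id, seed):
    """Stand-in for one stochastic simulation execution (illustrative)."""
    rng = random.Random(seed)
    return case_id, seed, rng.gauss(100.0, 5.0)

def worker(tasks, results):
    while True:
        task = tasks.get()
        if task is None:           # sentinel: no more work for this worker
            tasks.task_done()
            return
        results.put(run_simulation(*task))
        tasks.task_done()

tasks, results = queue.Queue(), queue.Queue()
# Publish: each test case runs with several random-number seeds
for case_id in range(4):
    for seed in range(5):
        tasks.put((case_id, seed))

threads = [threading.Thread(target=worker, args=(tasks, results)) for _ in range(3)]
for t in threads:
    t.start()
for _ in threads:                  # one sentinel per worker
    tasks.put(None)
for t in threads:
    t.join()

# Collate results at a single site for analysis
collected = [results.get() for _ in range(results.qsize())]
print(len(collected))
```

    Because each (case, seed) execution is independent, adding workers shortens the multiple-execution timeline roughly linearly until the task supply is exhausted.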

  10. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Science.gov (United States)

    Wu, Xuezhong; Wang, Haoxu; Xie, Liqiang; Dong, Peitao

    2014-03-01

    Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of the working modes were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were verified by testing a prototype gyroscope fabricated by micro-electromechanical systems (MEMS) technology. The frequencies of the drive mode and the sense mode can therefore be matched through structure optimization and simulation analysis, which is helpful in the design of a high-sensitivity quartz micromachined gyroscope.

  11. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Directory of Open Access Journals (Sweden)

    Xuezhong Wu

    2014-02-01

    Full Text Available Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of the working modes were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were verified by testing a prototype gyroscope fabricated by micro-electromechanical systems (MEMS) technology. The frequencies of the drive mode and the sense mode can therefore be matched through structure optimization and simulation analysis, which is helpful in the design of a high-sensitivity quartz micromachined gyroscope.

  12. High-Performance Astrophysical Simulations and Analysis with Python

    CERN Document Server

    Turk, Matthew J

    2011-01-01

    The usage of the high-level scripting language Python has enabled new mechanisms for data interrogation, discovery and visualization of scientific data. We present yt, an open source, community-developed astrophysical analysis and visualization toolkit for data generated by high-performance computing (HPC) simulations of astrophysical phenomena. Through a separation of responsibilities in the underlying Python code, yt allows data generated by incompatible, and sometimes even directly competing, astrophysical simulation platforms to be analyzed in a consistent manner, focusing on physically relevant quantities rather than quantities native to astrophysical simulation codes. We present its mechanisms for data access, its capabilities for MPI-parallel analysis, and its implementation as an in situ analysis and visualization tool.

  13. Dispersion analysis techniques within the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    The Space Vehicle Dynamics Simulation (SVDS) program was evaluated as a dispersion analysis tool. The Linear Error Analysis (LEA) post-processor was examined in detail, and simulation techniques for conducting a dispersion analysis using the SVDS were considered. The LEA processor is a tool for correlating trajectory dispersion data developed by simulating 3-sigma uncertainties as single-error-source cases. The processor combines trajectory and performance deviations by a root-sum-square (RSS) process and develops a covariance matrix for the deviations. Results are used in dispersion analyses for the baseline reference and orbiter flight test missions. As part of this study, LEA results were verified by: (A) hand-calculating the RSS data and the elements of the covariance matrix for comparison with the LEA-processor-computed data; and (B) comparing results with previous error analyses. The LEA comparisons and verification are made at main engine cutoff (MECO).
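    The LEA-style combination can be sketched directly: deviations from single-error-source runs are combined by RSS, which equals the square root of the diagonal of the summed outer-product covariance when the sources are independent. The deviation values below are invented for illustration.

```python
import numpy as np

# Hypothetical 3-sigma state deviations at MECO from three single-error-source
# runs (columns: position_x, position_y, velocity); all values are illustrative.
deviations = np.array([
    [120.0, -40.0, 1.5],    # source 1: e.g. thrust dispersion
    [-60.0,  90.0, -0.8],   # source 2: e.g. Isp dispersion
    [30.0,   20.0, 2.1],    # source 3: e.g. winds
])

# Covariance of the combined deviations (sources assumed independent)
cov = sum(np.outer(d, d) for d in deviations)

# RSS combination = square root of the covariance diagonal
rss = np.sqrt(np.diag(cov))
print(rss)
```

    The off-diagonal covariance terms capture the correlations between state deviations induced by each error source, which the scalar RSS values alone do not show.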

  14. Social interaction, globalization and computer-aided analysis a practical guide to developing social simulation

    CERN Document Server

    Osherenko, Alexander

    2014-01-01

    This thorough, multidisciplinary study discusses the findings of social interaction and social simulation using understandable global examples. It shows the reader how to acquire intercultural data, illustrating each step with descriptive comments and program code.

  15. Simulation of the secondary electrons energy deposition produced by proton beams in PMMA: influence of the target electronic excitation description

    Science.gov (United States)

    Dapor, Maurizio; Abril, Isabel; de Vera, Pablo; Garcia-Molina, Rafael

    2015-06-01

    We have studied the radial dependence of the energy deposition of the secondary electrons generated by swift proton beams incident with energies T = 50 keV-5 MeV on poly(methylmethacrylate) (PMMA). Two different approaches have been used to model the electronic excitation spectrum of PMMA through its energy loss function (ELF), namely the extended-Drude ELF and the Mermin ELF. The singly differential cross section and the total cross section for ionization, as well as the average energy of the generated secondary electrons, show sizeable differences at T ⩽ 0.1 MeV when evaluated with these two ELF models. In order to determine the radial distribution around the proton track of the energy deposited by the cascade of secondary electrons, a simulation has been performed that follows the motion of the electrons through the target, taking into account both the inelastic interactions (electronic ionizations and excitations, electron-phonon interactions, and electron trapping by polaron creation) and the elastic interactions. The radial distribution of the energy deposited by the secondary electrons around the proton track shows notable differences between the simulations performed with the extended-Drude ELF and the Mermin ELF, the former being more spread out (and, therefore, less peaked) than the latter. The deposited energy distributions are most intense and sharply peaked for proton beams incident with T ~ 0.1-1 MeV. We have also studied the influence on the radial distribution of deposited energy of using the full energy distribution of secondary electrons generated by proton impact versus a single value (namely, the average of the distribution); our results show that the differences between the two simulations become important for proton energies larger than ~0.1 MeV. The results presented in this work have potential applications in materials science, as well as hadron therapy (due to the use of PMMA as a tissue phantom) in order to properly consider the
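
    The kind of track-structure simulation described above can be caricatured in a few lines: secondary electrons start on the proton track, random-walk in the transverse plane with an exponential free path, and deposit a fixed energy per inelastic event. This is a purely illustrative Monte Carlo sketch with made-up parameters, not the authors' code or cross sections.

```python
import math
import random

def radial_dose(n_electrons=2000, steps=50, mfp=1.0, e_per_event=1.0,
                bins=10, r_max=20.0, seed=1):
    """Histogram energy deposited by random-walking electrons vs. radius."""
    random.seed(seed)
    dose = [0.0] * bins
    width = r_max / bins
    for _ in range(n_electrons):
        x = y = 0.0
        for _ in range(steps):
            theta = random.uniform(0.0, 2.0 * math.pi)
            step = random.expovariate(1.0 / mfp)  # exponential free path
            x += step * math.cos(theta)
            y += step * math.sin(theta)
            r = math.hypot(x, y)
            if r < r_max:
                dose[int(r / width)] += e_per_event  # inelastic event here
    return dose

dose = radial_dose()
# Deposited energy is concentrated near the track and falls off with radius.
assert dose[0] > dose[-1]
```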

  16. Big Data Visual Analytics for Exploratory Earth System Simulation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Steed, Chad A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ricciuto, Daniel M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thornton, Peter E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wang, Dali [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shi, Xiaoying [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-12-01

    Rapid increases in high performance computing are feeding the development of larger and more complex data sets in climate research, which sets the stage for so-called big data analysis challenges. However, conventional climate analysis techniques are inadequate in dealing with the complexities of today's data. In this paper, we describe and demonstrate a visual analytics system, called the Exploratory Data analysis ENvironment (EDEN), with specific application to the analysis of complex earth system simulation data sets. EDEN represents the type of interactive visual analysis tools that are necessary to transform data into insight, thereby improving critical comprehension of earth system processes. In addition to providing an overview of EDEN, we describe real-world studies using both point ensembles and global Community Land Model Version 4 (CLM4) simulations.

  17. Programmer's manual for IOSYM: an input-oriented simulation language for continuous systems. Volume 2: subprogram description

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.M.

    1981-06-01

    IOSYM is an extension of the GASP IV simulation language. It permits systems which are sequences of continuous processes to be modeled graphically. Normally the system can be described by data input only. The language permits stochastic sequencing and termination criteria for processes and allows crossing conditions for ending operations that are more general than those of GASP IV. Extensive capability exists for conditional branching and logical modification of the network. IOSYM has been used to model the cost of geothermal drilling, where the various costly processes of drilling are represented by IOSYM operations. The language is much more general, however; it retains more of GASP IV's discrete-event capabilities and permits easy modeling of continuous processes.

  18. An attempt for a unified description of mechanical testing on Zircaloy-4 cladding subjected to simulated LOCA transient

    Directory of Open Access Journals (Sweden)

    Desquines Jean

    2016-01-01

    During a Loss Of Coolant Accident (LOCA), an important safety requirement is that the reflooding of the core by the emergency core cooling system should not lead to a complete rupture of the fuel rods. Several types of mechanical tests are usually performed in the industry to determine the degree of cladding embrittlement, such as ring compression tests or four-point bending of rodlets. Many other tests can be found in the open literature. However, there is presently no real intrinsic understanding of the failure conditions in these tests that would allow translation of the results from one kind of mechanical testing to another. The present study is an attempt to provide a unified description of the failure that does not directly depend on the tested geometry. This effort aims at providing a better understanding of the link between several existing safety criteria relying on very different mechanical testing. To achieve this objective, the failure mechanisms of pre-oxidized and pre-hydrided cladding samples are characterized by comparing the behavior in two different mechanical tests: the Axial Tensile (AT) test and the “C”-shaped Ring Compression Test (CCT). The failure of samples in both cases can be described by the usual linear elastic fracture mechanics theory. Using interrupted mechanical tests, metallographic examinations have shown that a set of parallel cracks nucleates at the inner and outer surfaces of the samples just before failure, crossing both the oxide layer and the oxygen-rich alpha layer. The stress intensity factors for the multiple-crack geometry are determined for both AT and CCT samples using finite element calculations. After each mechanical test performed on high-temperature steam-oxidized samples, metallography is then used to individually determine the crack depth and crack spacing. Using these two important parameters and considering the applied load at fracture, the stress intensity factor at failure is derived for each tested
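
    The linear elastic fracture mechanics reasoning above can be illustrated with the textbook surface-crack relation K_I = Y · sigma · sqrt(pi · a): failure is predicted when K_I reaches the fracture toughness K_Ic. The geometry factor and toughness below are placeholder values for illustration, not the study's measured data.

```python
import math

def stress_intensity(sigma_mpa, a_m, geometry_factor=1.12):
    """K_I = Y * sigma * sqrt(pi * a) for a surface crack of depth a."""
    return geometry_factor * sigma_mpa * math.sqrt(math.pi * a_m)

def fails(sigma_mpa, a_m, k_ic=15.0):
    """Failure criterion K_I >= K_Ic (K_Ic in MPa*sqrt(m), illustrative)."""
    return stress_intensity(sigma_mpa, a_m) >= k_ic

# A deeper crack reaches the critical stress intensity at a lower stress,
# which is why measured crack depth enters the unified failure description.
assert stress_intensity(100.0, 4e-4) < stress_intensity(100.0, 1e-3)
```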

  19. Simulation and Analysis of Uncooled Microbolometer for Serial Readout Architecture

    Directory of Open Access Journals (Sweden)

    Musaed Alhussein

    2016-01-01

    A detailed thermal behavior and theoretical analysis of uncooled resistive microbolometer is presented along with the proposed thermal imager simulator. An accurate model of a thermal detector is required to design a readout circuit that can compensate for the noise due to process variability and self-heating. This paper presents a realistic simulation model of microbolometer that addresses the fixed pattern noise, Johnson noise, and self-heating. Different simulations were performed to study the impact of infrared power and bias power on the performance of microbolometers. The microbolometers were biased with different bias currents along with different thermal parameters of the reference microbolometer to analyze the impact of self-heating on the thermal image. The proposed thermal imager simulator is used as a tool to visually analyze the impact of noise on the quality of a thermal image. This simulator not only helps in compensating the noise prior to the implementation in Analog Design Environment, but also can be used as a platform to explore different readout architectures. In this work, serial readout architecture was simulated with a row of blind microbolometers that served as a reference. Moreover, the algorithm for the proposed thermal imager simulator is presented.
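
    The self-heating effect discussed above can be sketched with the standard lumped thermal model of a microbolometer, C dT/dt = P_bias + P_IR - G (T - T0), integrated by forward Euler. The parameter values are illustrative, not the paper's; the point is that bias power alone already lifts the membrane temperature, so readout must separate it from the much smaller IR signal.

```python
def bolometer_temperature(p_bias, p_ir, g=1e-7, c=1e-9, t0=300.0,
                          dt=1e-4, steps=20000):
    """Euler-integrate C dT/dt = P_bias + P_IR - G*(T - T0); return final T."""
    t = t0
    for _ in range(steps):
        t += dt * (p_bias + p_ir - g * (t - t0)) / c
    return t

# Self-heating: 1 uW of bias power gives a 10 K rise by itself (P/G),
# while 10 nW of IR power adds only 0.1 K on top of it.
t_dark = bolometer_temperature(p_bias=1e-6, p_ir=0.0)
t_ir = bolometer_temperature(p_bias=1e-6, p_ir=1e-8)
```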

  20. Medical repatriation of migrant farm workers in Ontario: a descriptive analysis.

    Science.gov (United States)

    Orkin, Aaron M; Lay, Morgan; McLaughlin, Janet; Schwandt, Michael; Cole, Donald

    2014-07-01

    Approximately 40 000 migrant farm workers are employed annually in Canada through temporary foreign worker programs. Workers experiencing health conditions that prevent ongoing work are normally repatriated to their home country, which raises concerns about human rights and health equity. In this study, we present data on the reasons for medical repatriation of migrant farm workers in Ontario. In this retrospective descriptive study, we examined medical repatriation data from Foreign Agricultural Resource Management Services, a non-profit corporation managing the contracts of more than 15 000 migrant farm workers in Ontario annually. We extracted repatriation and demographic data for workers from 2001-2011. Physician volunteers used a validated system to code the reported reasons for medical repatriation. We conducted descriptive analyses of the dominant reasons for repatriation and rates of repatriation. During 2001-2011, 787 repatriations occurred among 170 315 migrant farm workers arriving in Ontario (4.62 repatriations per 1000 workers). More than two-thirds of repatriated workers were aged 30-49 years. Migrant farm workers were most frequently repatriated for medical or surgical reasons (41.3%) and external injuries including poisoning (25.5%). This study provides quantitative health data related to a unique and vulnerable occupational group. Our findings reinforce existing knowledge regarding occupational hazards and health conditions among migrant farm workers. Medical repatriation of migrant farm workers merits further examination as a global health equity concern.
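
    The headline rate can be reproduced directly from the counts given in the abstract:

```python
# Repatriation rate per 1000 workers, from the reported counts.
repatriations = 787
workers = 170315
rate_per_1000 = repatriations / workers * 1000
assert round(rate_per_1000, 2) == 4.62  # matches the reported 4.62 per 1000
```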

  1. Descriptive analysis of individual and community factors among African American youths in urban public housing.

    Science.gov (United States)

    Nebbitt, Von E; Williams, James Herbert; Lombe, Margaret; McCoy, Henrika; Stephens, Jennifer

    2014-07-01

    African American adolescents are disproportionately represented in urban public housing developments. These neighborhoods are generally characterized by high rates of poverty, crime, violence, and disorganization. Although evidence is emerging on youths in these communities, little is known about their depressive symptoms, perceived efficacy, or frequency of substance use and sex-risk behavior. Further, even less is known about their exposure to community and household violence, their parents' behavior, or their sense of connection to their communities. Using a sample of 782 African American adolescents living in public housing neighborhoods located in four large U.S. cities, this article attempts to rectify the observed gap in knowledge by presenting a descriptive overview of their self-reported depressive symptoms; self-efficacy; frequencies of delinquent and sexual-risk behavior; and alcohol, tobacco, and other drug use. The self-reported ratings of their parents' behavior as well as their exposure to community and household violence are presented. Analytic procedures include descriptive statistics and mean comparisons between genders and across research cities. Results suggest several differences between genders and across research sites. However, results are not very different from national data. Implications for social work practice are discussed.

  2. Multifractal analysis and simulation of multifractal random walks

    Science.gov (United States)

    Schmitt, Francois G.; Huang, Yongxiang

    2016-04-01

    Multifractal time series, characterized by scale invariance and large fluctuations at all scales, are found in many fields of natural and applied sciences, e.g. in many geophysical fields such as atmospheric and oceanic turbulence, hydrology, and earth sciences. Here we consider a quite general type of multifractal time series, called multifractal random walks: non-stationary stochastic processes with intermittent stationary increments. We first briefly recall how such time series can be analyzed and characterized, using structure functions and arbitrary-order Hilbert spectral analysis. We then discuss the simulation approach. The main objective is to provide a stochastic process generating time series having the same multiscale properties. We review recent works on this topic, and provide stochastic simulations in order to verify the theoretical predictions. In the lognormal framework we provide an h - μ plane expressing the scale-invariant properties of these simulations. The theoretical plane is compared to simulation results.
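
    A minimal sketch of the structure-function analysis mentioned above: S_q(tau) = ⟨|X(t+tau) - X(t)|^q⟩ scales as tau^zeta(q). For ordinary Brownian motion (no intermittency) zeta(q) = q H with H = 1/2, while a lognormal multifractal walk bends the line to zeta(q) = qH - (mu/2)(q^2 - q). The sketch below estimates zeta(2) ≈ 1 on a Brownian path as the non-intermittent reference case.

```python
import math
import random

def structure_function(x, tau, q):
    """S_q(tau) = mean of |X(t+tau) - X(t)|^q over the series."""
    diffs = [abs(x[i + tau] - x[i]) ** q for i in range(len(x) - tau)]
    return sum(diffs) / len(diffs)

# Build a Brownian random walk (stationary, non-intermittent increments).
random.seed(0)
x = [0.0]
for _ in range(20000):
    x.append(x[-1] + random.gauss(0.0, 1.0))

# Estimate zeta(2) from the slope between two scales; expect ~1 for H = 1/2.
s_small = structure_function(x, 4, 2)
s_large = structure_function(x, 64, 2)
zeta2 = math.log(s_large / s_small) / math.log(64 / 4)
```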

  3. Numerical simulation analysis of Guixi copper flash smelting furnace

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A numerical simulation analysis for reactions of chalcopyrite and pyrite particles coupled with momentum, heat and mass transfer between the particle and gas in a flash smelting furnace is presented. In the simulation, the equations governing the gas flow are solved numerically by the Euler method. The particle phase is introduced into the gas flow by the particle-source-in-cell (PSIC) technique. Predictions including the fluid flow field, temperature field, concentration field of the gas phase and the tracks of particles have been obtained by the numerical simulation. The visualized results show that the reaction of sulfide particles is almost completed in the upper zone of the shaft, within 1.5 m of the central jet distributor (CJD) type concentrate burner. The simulation results are in good agreement with data obtained from a series of experiments and tests in the plant, and the error is less than 2%.

  4. Template-based data entry for general description in medical records and data transfer to data warehouse for analysis.

    Science.gov (United States)

    Matsumura, Yasushi; Kuwata, Shigeki; Yamamoto, Yuichiro; Izumi, Kazunori; Okada, Yasushi; Hazumi, Michihiro; Yoshimoto, Sachiko; Mineno, Takahiro; Nagahama, Munetoshi; Fujii, Ayumi; Takeda, Hiroshi

    2007-01-01

    General descriptions in medical records are so diverse that they are usually entered as free text into an electronic medical record, and the resulting data analysis is often difficult. We developed and implemented a template-based data entry module and data analyzing system for general descriptions. We developed a template with a tree structure, whose content master and entered patient data are simultaneously expressed in XML. The entered structured data are converted to narrative form for easy reading. This module was implemented in the EMR system, and is used in 35 hospitals as of October 2006. So far, 3725 templates (3242 concepts) have been produced. The data in XML and the narrative text data are stored in the EMR database. The XML data are retrieved, and then the patient data are extracted, to be stored in the data warehouse (DWH). We developed a search assisting system that enables users to find objective data in the DWH without requiring complicated SQL. By using this method, general descriptions in medical records can be structured and made available for clinical research.
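
    The template idea can be sketched with Python's standard xml.etree module: one XML instance holds the structured entry, a renderer produces the narrative form for reading, and a query function plays the role of DWH extraction. The element and attribute names here are invented for illustration, not the system's actual schema.

```python
import xml.etree.ElementTree as ET

# A toy structured entry; labels and values are hypothetical.
record = ET.fromstring(
    "<template name='chest_pain'>"
    "<item label='onset'>2 hours ago</item>"
    "<item label='severity'>7/10</item>"
    "</template>"
)

def to_narrative(tmpl):
    """Render the structured entry as readable narrative text."""
    parts = [f"{item.get('label')}: {item.text}" for item in tmpl.findall("item")]
    return "; ".join(parts)

def extract(tmpl, label):
    """Pull one labeled value out, as a DWH query would."""
    for item in tmpl.findall("item"):
        if item.get("label") == label:
            return item.text

narrative = to_narrative(record)  # "onset: 2 hours ago; severity: 7/10"
```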

  5. Tank waste remediation system simulation analysis retrieval model

    Energy Technology Data Exchange (ETDEWEB)

    Fordham, R.A.

    1996-09-30

    The goal of simulation is to test the consequences of assumptions. For the TWRS SIMAN Retrieval Model, the specific assumptions are primarily defined with respect to waste processing and transfer timing. The model tracks 73 chemical constituents from underground waste tanks to glass; yet, the detailed chemistry and complete set of unit operations of the TWRS process flow sheet are represented only at the level necessary to define the waste processing and transfer logic and to estimate the feed composition for the treatment facilities. Therefore, the model should not be regarded as a substitute for the TWRS process flow sheet. Practically, the model functions as a dynamic extension of the flow sheet model. The following sections present the description, assumptions, architecture, and evaluation of the TWRS SIMAN Retrieval Model. Section 2 describes the model in terms of an overview of the processes represented. Section 3 presents the assumptions for the simulation model. Specific assumptions and parameter values used in the model are provided for waste retrieval, pretreatment, low-level waste (LLW) immobilization, and high-level waste (HLW) immobilization functions. Section 4 describes the model in terms of its functional architecture to define a basis for a systematic evaluation of the model. Finally, Section 5 documents an independent test and evaluation of the model's performance (i.e., the verification and validation). Additionally, Appendix A gives a complete listing of the tank inventory used. Appendix B documents the verification and validation plan that was used for the (Section 5) evaluation work. A description and listing of all the model variables is given in Appendix C along with a complete source listing.

  6. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models … and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper briefly describes the author's experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...
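
    The comparison in question can be illustrated with a deliberately tiny example: for a two-component parallel system, the unavailability has the closed form a fault-tree AND gate would give (both components must fail), and a simple stochastic simulation reproduces it. This is a generic sketch with assumed failure probabilities, not a model from the paper.

```python
import random

def simulated_unavailability(p_fail_a, p_fail_b, trials=200000, seed=42):
    """Monte Carlo estimate: fraction of trials where both components fail."""
    random.seed(seed)
    failures = sum(
        1 for _ in range(trials)
        if random.random() < p_fail_a and random.random() < p_fail_b
    )
    return failures / trials

analytic = 0.1 * 0.2                      # fault-tree AND gate: 0.02
simulated = simulated_unavailability(0.1, 0.2)
assert abs(simulated - analytic) < 0.005  # simulation agrees with the formula
```

The simulation route becomes attractive precisely where the analytic route does not: repair times, dependencies, and operator actions that fault trees cannot express directly.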

  7. Theoretical Analysis and Simulation of Jacking Procedure of Pantadome System

    Institute of Scientific and Technical Information of China (English)

    WANG Xiaodun; SHI Yongjiu; WANG Yuanqing; Kawaguchi Mamoru

    2005-01-01

    In order to clarify the principle of the Pantadome lifting process and lay a theoretical foundation for practical applications, the core idea of the Pantadome was introduced, which is to make a structure become a mechanism by temporarily removing some members during the process of construction. An abstract motion model was built. By determining the change of the coordinates of the hinge joints and of each point of the structure, simulation analysis of the mechanical motion of the Pantadome was realized. A general program that simulates the lifting process of the Pantadome was then developed in the AutoCAD environment with the AutoLisp language. By completing the theoretical analysis of the lifting process of the Pantadome, three-dimensional simulation of the lifting process was realized, and it has been successfully applied to the bidding work of practical engineering projects.

  8. Simulation Modeling and Analysis of Operator-Machine Ratio

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Based on a simulation model of a semiconductor manufacturer, an operator-machine ratio (OMR) analysis is made using work study and time study. Through sensitivity analysis, it is found that labor utilization decreases with the increase of lot size. Meanwhile, the analysis identifies that the OMR for this company should be improved from 1:3 to 1:5. An application result shows that the proposed model can effectively improve the OMR by 33%.

  9. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    Science.gov (United States)

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.

  10. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  11. I PASS: an interactive policy analysis simulation system.

    Science.gov (United States)

    Doug Olson; Con Schallau; Wilbur Maki

    1984-01-01

    This paper describes an interactive policy analysis simulation system (IPASS) that can be used to analyze the long-term economic and demographic effects of alternative forest resource management policies. The IPASS model is a dynamic analytical tool that forecasts growth and development of an economy. It allows the user to introduce changes in selected parameters based...

  12. Nursing students' evaluation of a new feedback and reflection tool for use in high-fidelity simulation - Formative assessment of clinical skills. A descriptive quantitative research design.

    Science.gov (United States)

    Solheim, Elisabeth; Plathe, Hilde Syvertsen; Eide, Hilde

    2017-09-04

    Clinical skills training is an important part of nurses' education programmes. Clinical skills are complex. A common understanding of what characterizes clinical skills and learning outcomes needs to be established. The aim of the study was to develop and evaluate a new reflection and feedback tool for formative assessment. The study has a descriptive quantitative design. 129 students participated who were at the end of the first year of a Bachelor degree in nursing. After high-fidelity simulation, data were collected using a questionnaire with 19 closed-ended and 2 open-ended questions. The tool stimulated peer assessment and enabled students to be more thorough in what to assess as an observer of clinical skills. The tool provided a structure for self-assessment and made visible items that are important to be aware of in clinical skills. This article adds to the simulation literature and provides a tool that is useful in enhancing peer learning, which is essential for nurses in practice. The tool has potential for enabling students to learn about reflection and to develop skills for guiding others in practice after they have graduated.

  13. Unified Modeling Language description of the object-oriented multi-scale adaptive finite element method for Step-and-Flash Imprint Lithography Simulations

    Science.gov (United States)

    Paszyński, Maciej; Gurgul, Piotr; Sieniek, Marcin; Pardo, David

    2010-06-01

    In the first part of the paper we present the multi-scale simulation of Step-and-Flash Imprint Lithography (SFIL), a modern patterning process. The simulation utilizes the hp adaptive Finite Element Method (hp-FEM) coupled with a Molecular Statics (MS) model. Thus, we consider a multi-scale problem, with molecular statics applied in the areas of the mesh where the highest accuracy is required, and continuous linear elasticity with a thermal expansion coefficient applied in the remaining part of the domain. The degrees of freedom from macro-scale element nodes located on the macro-scale side of the interface have been identified with particles from nano-scale elements located on the nano-scale side of the interface. In the second part of the paper we present a Unified Modeling Language (UML) description of the resulting multi-scale application (hp-FEM coupled with MS). We investigated classical, procedural codes from the point of view of the object-oriented (O-O) programming paradigm. The discovered hierarchical structure of classes and algorithms makes the UML project as independent of the spatial dimension of the problem as possible. The O-O UML project was defined at an abstract level, independent of the programming language used.

  14. The description of dense hydrogen with Wave Packet Molecular Dynamics (WPMD) simulations; Die Beschreibung von dichtem Wasserstoff mit der Methode der Wellenpaket-Molekulardynamik (WPMD)

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, B.

    2006-10-10

    In this work the wave packet molecular dynamics (WPMD) method is presented and applied to dense hydrogen. In the WPMD method the electrons are described by a Slater determinant of periodic Gaussian wave packets. Each single-particle wave function is parametrised by 8 coordinates, which can be interpreted as the position and momentum, the width, and its conjugate momentum. The equations of motion for these coordinates can be derived from a time-dependent variational principle. Equilibrium properties can be ascertained by a Monte Carlo simulation. With the now completely implemented antisymmetrisation, the simulation yields fundamentally different behavior for dense hydrogen compared to earlier simplified models. The results show a phase transition to metallic hydrogen with a higher density than in the molecular phase. This behavior has large implications for, e.g., the physics of giant planets. This work describes the model used and explains in particular the calculation of the energy and forces. The periodicity of the wave function leads to a description in Fourier space. The antisymmetrisation is done by matrix operations. Moreover, the numerical implementation is described in detail to allow further development of the code. The results provided in this work give the equation of state in the temperature range 300 K-50000 K and density range 10{sup 23}-10{sup 24} cm{sup -3}, corresponding to pressures of 1 GPa-1000 GPa. In a phase diagram the phase transition to metallic hydrogen can be read off. The electrical conductivity of both phases is determined. (orig.)
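
    The wave-packet dynamics underlying WPMD can be illustrated with the simplest case, a single free Gaussian packet in atomic units (hbar = m = 1): the variational equations of motion give the width coordinate sigma an effective repulsive force sigma'' = 1/(4 sigma^3), whose solution is the textbook spreading law sigma(t)^2 = sigma0^2 + t^2/(4 sigma0^2). The sketch below integrates this single-coordinate equation numerically and checks it against the analytic result; it is a pedagogical toy, not the thesis code.

```python
import math

def spread_width(sigma0, t_end, dt=1e-4):
    """Semi-implicit Euler integration of sigma'' = 1/(4 sigma^3)."""
    sigma, v = sigma0, 0.0
    for _ in range(int(t_end / dt)):
        v += dt / (4.0 * sigma ** 3)  # acceleration from quantum pressure
        sigma += v * dt
    return sigma

sigma0, t_end = 1.0, 2.0
numeric = spread_width(sigma0, t_end)
analytic = math.sqrt(sigma0 ** 2 + t_end ** 2 / (4.0 * sigma0 ** 2))
# The integrated width matches the analytic free-packet spreading law.
assert abs(numeric - analytic) < 1e-2
```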

  15. Descriptive Analysis of Writing Composition from the Ideas to the Paragraph

    Directory of Open Access Journals (Sweden)

    Titik Nurrohmah

    2016-06-01

    What are the ways to discover ideas to write about? How can those ideas be organized into a paragraph? To answer these questions, the writer conducted a qualitative study whose object is writing composition, from discovering ideas to organizing them into a paragraph. The writer used a descriptive method. For obtaining the data, the writer used the library method as the instrument and used secondary sources constituting secondhand information, such as reference books. In analyzing the data, the writer used expository writing. Several ways of discovering ideas emerged from this study: remembering experiences, getting people's opinions about a particular subject by giving evidence, finding a great deal by asking other people about their experiences, and going to the library to get ideas. To organize the ideas into a paragraph, one has to follow several steps, such as selecting a subject, planning the composition, and making an outline.

  16. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems.

  17. Radioactive Solid Waste Storage and Disposal at Oak Ridge National Laboratory, Description and Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bates, L.D.

    2001-01-30

    Oak Ridge National Laboratory (ORNL) is a principal Department of Energy (DOE) research institution operated by the Union Carbide Corporation - Nuclear Division (UCC-ND) under direction of the DOE Oak Ridge Operations Office (DOE-ORO). The Laboratory was established in east Tennessee, near what is now the city of Oak Ridge, in the mid 1940s as a part of the World War II effort to develop a nuclear weapon. Since its inception, disposal of radioactively contaminated materials, both solid and liquid, has been an integral part of Laboratory operations. The purpose of this document is to provide a detailed description of the ORNL Solid Waste Storage Areas, to describe the practice and procedure of their operation, and to address the health and safety impacts and concerns of that operation.

  18. [Professor Xu Fu-song's traditional Chinese medicine protocols for male diseases: A descriptive analysis].

    Science.gov (United States)

    Liu, Cheng-yong; Xu, Fu-song

    2015-04-01

    To analyze the efficacy and medication principles of Professor Xu Fu-song's traditional Chinese medicine (TCM) protocols for male diseases, we reviewed and descriptively analyzed the unpublished complete medical records of 100 male cases treated by Professor Xu Fu-song with his TCM protocols from 1978 to 1992. The 100 cases involved 32 male diseases, most of which were difficult and complicated cases. The drug compliance was 95%. Each prescription was made up of 14 traditional Chinese drugs on average. The cure rate was 32%, and the effective rate was 85%. Professor Xu Fu-song advanced and proved some new theories and therapeutic methods. His TCM protocols can be applied to a wide range of male diseases, mostly complicated, and are characterized by accurate differentiation of symptoms and signs, high drug compliance, and excellent therapeutic efficacy.

  19. Protein Data Bank Japan (PDBj): updated user interfaces, resource description framework, analysis tools for large structures

    Science.gov (United States)

    Kinjo, Akira R.; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki

    2017-01-01

The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with the RESTful web services and the backend relational database that supports them. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. PMID:27789697

  20. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems.

  1. The International Migration in the EU. A Descriptive Analysis Focused on Romania

    Directory of Open Access Journals (Sweden)

    Raluca Mariana Grosu

    2013-08-01

Migration represents one of the main means humans have chosen for improving their standards of living. Even though it is an important phenomenon manifested since ancient times, migration has never been so much in the attention of scholars and policy makers as it is in present times, especially for its implications in different areas such as demography, economy, sociology, and politics. Migration is also a vital component of contemporary society and at the same time plays a key role in the development of regions, from various perspectives such as economic, social, or cultural. Taking into consideration the previously outlined framework, the present paper aims at analyzing in a descriptive manner the international migration phenomenon in the European Union (EU) countries between 2006 and 2010, in order to highlight the frame in which Romania is placed from the perspective of the quantitative dimension of international migration.

  2. A Summary Description of a Computer Program Concept for the Design and Simulation of Solar Pond Electric Power Generation Systems

    Science.gov (United States)

    1984-01-01

The plant concept comprises a solar pond electric power generation subsystem, an electric power transformer and switchyard, a large solar pond, a water treatment plant, and numerous storage and evaporation ponds. Because a solar pond stores thermal energy over a long period of time, plant operation at any point in time is dependent upon past operation and future perceived generation plans. This time, or past-history, factor introduces a new dimension into the design process. The design optimization of a plant must go beyond examination of operational state points and consider the seasonal variations in solar input, solar pond energy storage, and the desired plant annual duty-cycle profile. Models or design tools will be required to optimize a plant design. These models should be developed to include a proper but not excessive level of detail. The model should be targeted to a specific objective and not conceived as a do-everything analysis tool, i.e., system design and not gradient-zone stability.

  3. Advanced Thermal Simulator Testing: Thermal Analysis and Test Results

    Science.gov (United States)

    Bragg-Sitton, Shannon M.; Dickens, Ricky; Dixon, David; Reid, Robert; Adams, Mike; Davis, Joe

    2008-01-01

Work at the NASA Marshall Space Flight Center seeks to develop high fidelity, electrically heated thermal simulators that represent fuel elements in a nuclear reactor design to support non-nuclear testing applicable to the development of a space nuclear power or propulsion system. Comparison between the fuel pins and thermal simulators is made at the outer fuel clad surface, which corresponds to the outer sheath surface in the thermal simulator. The thermal simulators that are currently being tested correspond to a SNAP derivative reactor design that could be applied for Lunar surface power. These simulators are designed to meet the geometric and power requirements of a proposed surface power reactor design, accommodate testing of various axial power profiles, and incorporate embedded instrumentation. This paper reports the results of thermal simulator analysis and testing in a bare element configuration, which does not incorporate active heat removal, and testing in a water-cooled calorimeter designed to mimic the heat removal that would be experienced in a reactor core.

  4. Dispersion analysis and linear error analysis capabilities of the space vehicle dynamics simulation program

    Science.gov (United States)

    Snow, L. S.; Kuhn, A. E.

    1975-01-01

    Previous error analyses conducted by the Guidance and Dynamics Branch of NASA have used the Guidance Analysis Program (GAP) as the trajectory simulation tool. Plans are made to conduct all future error analyses using the Space Vehicle Dynamics Simulation (SVDS) program. A study was conducted to compare the inertial measurement unit (IMU) error simulations of the two programs. Results of the GAP/SVDS comparison are presented and problem areas encountered while attempting to simulate IMU errors, vehicle performance uncertainties and environmental uncertainties using SVDS are defined. An evaluation of the SVDS linear error analysis capability is also included.

  5. Analysis and Description of HOLTIN Service Provision for AECG monitoring in Complex Indoor Environments

    Directory of Open Access Journals (Sweden)

    Francisco Falcone

    2013-04-01

In this work, a novel ambulatory ECG monitoring device developed in-house, called HOLTIN, is analyzed when operating in complex indoor scenarios. The HOLTIN system is described, from the technological platform level to its functional model. In addition, an analysis of the wireless channel behavior, which enables ubiquitous operation, is performed using an in-house 3D ray launching simulation code. The effect of human body presence is taken into account by a novel simplified model embedded within the 3D ray launching code. Simulation as well as measurement results are presented, showing good agreement. These results may aid in the adequate deployment of this novel device to automate conventional medical processes, increasing the coverage radius and optimizing energy consumption.

  6. Simulated spectra for QA/QC of spectral analysis software

    Energy Technology Data Exchange (ETDEWEB)

    Jackman, K. R. (Kevin R.); Biegalski, S. R.

    2004-01-01

Monte Carlo simulated spectra have been developed to test the peak analysis algorithms of several spectral analysis software packages. Using MCNP 5, generic sample spectra were generated in order to perform ANSI N42.14 standard spectral tests on Canberra Genie-2000, Ortec GammaVision, and UniSampo. The reference spectra were generated in MCNP 5 using an F8 (pulse height) tally with a model of an actual germanium detector used in counting; the model matches the detector's resolution, energy calibration, and efficiency. The simulated spectra and detector model were found to be useful in testing the reliability and performance of modern spectral analysis software tools. The software packages were analyzed and found to be in compliance with the ANSI N42.14 tests of the peak-search and peak-fitting algorithms. This method of using simulated spectra can be used to perform the ANSI N42.14 tests on the reliability and performance of spectral analysis programs in the absence of standard radioactive materials.
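The validation approach described in this record, checking peak-analysis algorithms against spectra whose contents are known exactly, can be illustrated with a toy example. The sketch below is not the MCNP 5 workflow from the record: the peak positions, areas, and the deliberately naive local-maximum search are all hypothetical, meant only to show why a synthetic spectrum with known ground truth makes a useful QA/QC target.

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_spectrum(n_ch=2048, peaks=((500, 5000, 3.0), (1200, 2000, 4.0)), bg=20.0):
    """Build a Poisson-sampled spectrum: flat background plus Gaussian peaks.
    Each peak is (centroid channel, net area in counts, sigma in channels)."""
    ch = np.arange(n_ch)
    expected = np.full(n_ch, bg, dtype=float)
    for c0, area, sigma in peaks:
        expected += area / (sigma * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((ch - c0) / sigma) ** 2)
    return rng.poisson(expected)

def find_peaks(spec, window=5, min_significance=5.0):
    """Naive peak search: keep local maxima whose net counts exceed
    min_significance times the statistical uncertainty of the background."""
    found = []
    for i in range(window, len(spec) - window):
        region = spec[i - window:i + window + 1]
        if spec[i] != region.max():
            continue
        bg = 0.5 * (spec[i - window] + spec[i + window])  # crude background estimate
        net = spec[i] - bg
        if bg > 0 and net > min_significance * np.sqrt(bg):
            found.append(i)
    merged = []  # merge maxima closer than one window into a single centroid
    for c in found:
        if merged and c - merged[-1] <= window:
            merged[-1] = (merged[-1] + c) // 2
        else:
            merged.append(c)
    return merged

spec = synthetic_spectrum()
peaks = find_peaks(spec)
print(peaks)  # centroids near channels 500 and 1200
```

Because the true centroids are known by construction, any miss or spurious detection is unambiguous evidence about the search algorithm, which is the same logic the record applies with full detector physics.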

  7. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
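The two ingredients the record names, a probability distribution for service-request demand and a constrained resource pool, are the essence of a discrete event simulation. The following minimal M/M/c-style sketch (generic, with made-up rates; not the authors' framework) shows how a few lines of event-driven code already expose the capacity/response-time trade-off:

```python
import heapq
import random

def simulate(n_servers, arrival_rate, service_rate, n_requests, seed=1):
    """Minimal multi-server discrete-event simulation: requests arrive with
    exponential inter-arrival times, queue FCFS for a fixed pool of servers,
    and hold a server for an exponential service time. Returns the mean
    response (wait + service) time."""
    rng = random.Random(seed)
    t = 0.0
    free_at = [0.0] * n_servers          # earliest time each server becomes free
    heapq.heapify(free_at)
    total_response = 0.0
    for _ in range(n_requests):
        t += rng.expovariate(arrival_rate)      # next arrival event
        server_free = heapq.heappop(free_at)    # earliest-available server
        start = max(t, server_free)             # wait if all servers are busy
        finish = start + rng.expovariate(service_rate)
        heapq.heappush(free_at, finish)
        total_response += finish - t
    return total_response / n_requests

# At fixed load, adding servers should reduce the mean response time.
slow = simulate(n_servers=2, arrival_rate=5.0, service_rate=3.0, n_requests=20000)
fast = simulate(n_servers=8, arrival_rate=5.0, service_rate=3.0, n_requests=20000)
print(round(slow, 3), round(fast, 3))
```

Replacing the exponential draws with per-request-type distributions and the server pool with scheduled virtualized resources turns this toy into the kind of model the record describes.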

  8. Variable-density groundwater flow simulations and particle tracking. Numerical modelling using DarcyTools. Preliminary site description of the Simpevarp area, version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Stockholm (Sweden); Stigsson, Martin; Berglund, Sten [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-12-01

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden, Forsmark and Simpevarp. The investigations started in 2002 and have been planned since the late 1990s. The work presented here investigates the possibility of using hydrogeochemical measurements in deep boreholes to reduce parameter uncertainty in a regional modelling of groundwater flow in fractured rock. The work was conducted with the aim of improving the palaeohydrogeological understanding of the Simpevarp area and to give recommendations to the preparations of the next version of the Preliminary Site Description (1.2). The study is based on a large number of numerical simulations of transient variable density groundwater flow through a strongly heterogeneous and anisotropic medium. The simulations were conducted with the computer code DarcyTools, the development of which has been funded by SKB. DarcyTools is a flexible porous media code specifically designed to treat groundwater flow and salt transport in sparsely fractured crystalline rock and it is noted that some of the features presented in this report are still under development or subjected to testing and verification. The simulations reveal the sensitivity of the results to different hydrogeological modelling assumptions, e.g. the sensitivity to the initial groundwater conditions at 10,000 BC, the size of the model domain and boundary conditions, and the hydraulic properties of deterministically and stochastically modelled deformation zones. The outcome of these simulations was compared with measured salinities and calculated relative proportions of different water types (mixing proportions) from measurements in two deep core drilled boreholes in the Laxemar subarea. In addition to the flow simulations, the statistics of flow related transport parameters were calculated for particle flowpaths from repository depth to ground surface for two subareas within the

  9. Rethinking Sensitivity Analysis of Nuclear Simulations with Topology

    Energy Technology Data Exchange (ETDEWEB)

    Dan Maljovec; Bei Wang; Paul Rosen; Andrea Alfonsi; Giovanni Pastore; Cristian Rabiti; Valerio Pascucci

    2016-01-01

In nuclear engineering, understanding the safety margins of the nuclear reactor via simulations is arguably of paramount importance in predicting and preventing nuclear accidents. It is therefore crucial to perform sensitivity analysis to understand how changes in the model inputs affect the outputs. Modern nuclear simulation tools rely on numerical representations of the sensitivity information -- inherently lacking in visual encodings -- offering limited effectiveness in communicating and exploring the generated data. In this paper, we design a framework for sensitivity analysis and visualization of multidimensional nuclear simulation data using partition-based, topology-inspired regression models and report on its efficacy. We rely on the established Morse-Smale regression technique, which allows us to partition the domain into monotonic regions where easily interpretable linear models can be used to assess the influence of inputs on the output variability. The underlying computation is augmented with an intuitive and interactive visual design to effectively communicate sensitivity information to the nuclear scientists. Our framework is being deployed into the multi-purpose probabilistic risk assessment and uncertainty quantification framework RAVEN (Reactor Analysis and Virtual Control Environment). We evaluate our framework using a simulation dataset studying nuclear fuel performance.
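The core idea of Morse-Smale regression, partition the domain into monotonic regions and fit an easily interpretable linear model in each, can be illustrated in one dimension. This is a drastic simplification (the real technique operates on the Morse-Smale complex of a multidimensional scalar field) with a hypothetical piecewise-linear response, intended only to show how per-region slopes become local sensitivities:

```python
import numpy as np

def monotonic_segments(y):
    """Split indices into maximal runs where y is monotone increasing or
    decreasing -- a 1-D analogue of Morse-Smale partitioning."""
    breaks = [0]
    direction = 0
    for i in range(1, len(y)):
        d = np.sign(y[i] - y[i - 1])
        if d != 0 and direction != 0 and d != direction:
            breaks.append(i - 1)   # local extremum: start a new segment
            direction = d
        elif d != 0:
            direction = d
    breaks.append(len(y) - 1)
    return [(breaks[k], breaks[k + 1]) for k in range(len(breaks) - 1)]

def segment_sensitivities(x, y):
    """Fit a linear model on each monotone segment; the slope is an easily
    interpreted local sensitivity of the output to the input."""
    out = []
    for lo, hi in monotonic_segments(y):
        slope, intercept = np.polyfit(x[lo:hi + 1], y[lo:hi + 1], 1)
        out.append((x[lo], x[hi], slope))
    return out

x = np.linspace(0.0, 2.0, 201)
y = np.where(x < 1.0, 2.0 * x, 2.0 - 3.0 * (x - 1.0))   # rises, then falls
sens = segment_sensitivities(x, y)
for lo, hi, slope in sens:
    print(f"[{lo:.2f}, {hi:.2f}] slope ~ {slope:+.2f}")
```

In the paper's setting the same partition-then-fit step happens per Morse-Smale cell of the simulation response, and the visualization layer communicates those per-cell slopes to the analyst.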

  10. Hierarchical Visual Analysis and Steering Framework for Astrophysical Simulations

    Institute of Scientific and Technical Information of China (English)

    肖健; 张加万; 原野; 周鑫; 纪丽; 孙济洲

    2015-01-01

A framework for accelerating modern long-running astrophysical simulations is presented, which is based on a hierarchical architecture where computational steering in the high-resolution run is performed under the guide of knowledge obtained in the gradually refined ensemble analyses. Several visualization schemes for facilitating ensemble management, error analysis, parameter grouping and tuning are also integrated owing to the pluggable modular design. The proposed approach is prototyped based on the Flash code, and it can be extended by introducing user-defined visualization for specific requirements. Two real-world simulations, i.e., stellar wind and supernova remnant, are carried out to verify the proposed approach.

  11. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered to obtain the deformation law of the rubber shock absorber. The locations of fatigue were identified from the simulation results. The results showed that the fatigue position is distributed in the corner of the shock absorber, that the degree of deformation increases with increasing compression speed, and that the hardness of the rubber material is proportional to deformation.

  12. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Directory of Open Access Journals (Sweden)

    Lefeuvre Stéphane

    2017-01-01

The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis (MCA) simulations. The third part presents some MCA results for a VES16 battery module. Finally, the reader will see some other simulations that were performed by re-using the battery model for another Saft battery cell type (MP XTD) for a specific space application at high temperature.

  13. Computer simulation of ion beam analysis of laterally inhomogeneous materials

    Energy Technology Data Exchange (ETDEWEB)

    Mayer, M.

    2016-03-15

The program STRUCTNRA for the simulation of ion beam analysis charged-particle spectra from arbitrary two-dimensional distributions of materials is described. The code is validated by comparison to experimental backscattering data from a silicon grating on tantalum at different orientations and incident angles. Simulated spectra for several types of rough thin layers and a chessboard-like arrangement of materials, as an example of a multi-phase agglomerate material, are presented. Ambiguities between backscattering spectra from two-dimensional and one-dimensional sample structures are discussed.

  14. Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals

    DEFF Research Database (Denmark)

    Rabøl, Louise Isager; Østergaard, Doris; Jensen, Brian Bjørn

    2011-01-01

    incidents. The objective of this study is to review RCA reports (RCAR) for characteristics of verbal communication errors between hospital staff in an organisational perspective. Method Two independent raters analysed 84 RCARs, conducted in six Danish hospitals between 2004 and 2006, for descriptions...... and characteristics of verbal communication errors such as handover errors and error during teamwork. Results Raters found description of verbal communication errors in 44 reports (52%). These included handover errors (35 (86%)), communication errors between different staff groups (19 (43%)), misunderstandings (13...... units and consults from other specialties, were particularly vulnerable processes. Conclusion With the risk of bias in mind, it is concluded that more than half of the RCARs described erroneous verbal communication between staff members as root causes of or contributing factors of severe patient safety...

  15. Simulation and analysis of conjunctive use with MODFLOW's farm process.

    Science.gov (United States)

    Hanson, R T; Schmid, W; Faunt, C C; Lockwood, B

    2010-01-01

The extension of MODFLOW onto the landscape with the Farm Process (MF-FMP) facilitates fully coupled simulation of the use and movement of water from precipitation, streamflow and runoff, groundwater flow, and consumption by natural and agricultural vegetation throughout the hydrologic system at all times. This allows for more complete analysis of conjunctive use water-resource systems than previously possible with MODFLOW by combining relevant aspects of the landscape with the groundwater and surface water components. This analysis is accomplished using distributed cell-by-cell supply-constrained and demand-driven components across the landscape within "water-balance subregions" composed of one or more model cells that can represent a single farm, a group of farms, or other hydrologic or geopolitical entities. Simulations of micro-agriculture in the Pajaro Valley and macro-agriculture in the Central Valley are used to demonstrate the utility of MF-FMP. For Pajaro Valley, the simulation of an aquifer storage and recovery system and related coastal water distribution system to supplant coastal pumpage was analyzed subject to climate variations and additional supplemental sources such as local runoff. For the Central Valley, analysis of conjunctive use from different hydrologic settings of northern and southern subregions shows how and when precipitation, surface water, and groundwater are important to conjunctive use. The examples show that through MF-FMP's ability to simulate natural and anthropogenic components of the hydrologic cycle, the distribution and dynamics of supply and demand can be analyzed, understood, and managed. Such analyses of conjunctive use would be difficult without embedding these components in the simulation, since their interactions are difficult to estimate a priori.
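The "supply-constrained and demand-driven" accounting that MF-FMP performs per water-balance subregion can be caricatured as a priority-ordered allocation. The function below is an illustrative sketch only; the source priorities, names, and units are hypothetical and this is not the actual Farm Process algorithm:

```python
def allocate_farm_water(demand, precipitation, sw_allocation, gw_capacity):
    """Demand-driven, supply-constrained accounting for one water-balance
    subregion: satisfy demand from precipitation first, then surface-water
    deliveries, then groundwater pumping up to capacity; whatever remains
    is unmet demand."""
    use_precip = min(demand, precipitation)
    remaining = demand - use_precip
    use_sw = min(remaining, sw_allocation)
    remaining -= use_sw
    use_gw = min(remaining, gw_capacity)
    remaining -= use_gw
    return {"precip": use_precip, "surface": use_sw,
            "groundwater": use_gw, "unmet": remaining}

budget = allocate_farm_water(demand=100.0, precipitation=30.0,
                             sw_allocation=40.0, gw_capacity=20.0)
print(budget)  # {'precip': 30.0, 'surface': 40.0, 'groundwater': 20.0, 'unmet': 10.0}
```

Running such a balance cell by cell, with the groundwater term coupled back into the flow model, is what lets the full code show when precipitation, surface water, or groundwater dominates conjunctive use.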

  16. Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals.

    Science.gov (United States)

    Rabøl, Louise Isager; Andersen, Mette Lehmann; Østergaard, Doris; Bjørn, Brian; Lilja, Beth; Mogensen, Torben

    2011-03-01

    Poor teamwork and communication between healthcare staff are correlated to patient safety incidents. However, the organisational factors responsible for these issues are unexplored. Root cause analyses (RCA) use human factors thinking to analyse the systems behind severe patient safety incidents. The objective of this study is to review RCA reports (RCAR) for characteristics of verbal communication errors between hospital staff in an organisational perspective. Two independent raters analysed 84 RCARs, conducted in six Danish hospitals between 2004 and 2006, for descriptions and characteristics of verbal communication errors such as handover errors and error during teamwork. Raters found description of verbal communication errors in 44 reports (52%). These included handover errors (35 (86%)), communication errors between different staff groups (19 (43%)), misunderstandings (13 (30%)), communication errors between junior and senior staff members (11 (25%)), hesitance in speaking up (10 (23%)) and communication errors during teamwork (8 (18%)). The kappa values were 0.44-0.78. Unproceduralized communication and information exchange via telephone, related to transfer between units and consults from other specialties, were particularly vulnerable processes. With the risk of bias in mind, it is concluded that more than half of the RCARs described erroneous verbal communication between staff members as root causes of or contributing factors of severe patient safety incidents. The RCARs rich descriptions of the incidents revealed the organisational factors and needs related to these errors.

  17. Transport Catastrophe Analysis as an Alternative to a Monofractal Description: Theory and Application to Financial Crisis Time Series

    Directory of Open Access Journals (Sweden)

    Sergey A. Kamenshchikov

    2014-01-01

The goal of this investigation was to overcome limitations of a persistency analysis, introduced by Benoit Mandelbrot for monofractal Brownian processes: nondifferentiability, the Brownian nature of the process, and a linear memory measure. We have extended the sense of a Hurst factor by consideration of a phase diffusion power law. It was shown that precatastrophic stabilization, as an indicator of bifurcation, leads to a new minimum of momentary phase diffusion, while bifurcation causes an increase of the momentary transport. The efficiency of a diffusive analysis has been experimentally compared to the Reynolds stability model application. An extended Reynolds parameter has been introduced as an indicator of phase transition. A combination of diffusive and Reynolds analyses has been applied to describe a time series of Dow Jones Industrial weekly prices during the world financial crisis of 2007-2009. Diffusive and Reynolds parameters showed extreme values in October 2008, when the mortgage crisis was recorded. A combined R/D description allowed short-memory and long-memory shifts in market evolution to be distinguished. It is concluded that a systematic large-scale failure of the financial system began in October 2008 and started fading in February 2009.
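A diffusion power law of the kind the record builds on relates the mean squared displacement of an integrated series to the lag, MSD(L) ~ L^(2H), so the Hurst-type exponent H can be read off a log-log slope. The sketch below is a generic estimator of this exponent, not the authors' R/D machinery; the lag grid and sample size are arbitrary choices:

```python
import numpy as np

def diffusion_exponent(increments, lags=(1, 2, 4, 8, 16, 32)):
    """Estimate a Hurst-type exponent H from the power-law growth of the
    mean squared displacement: <(x(t+L) - x(t))^2> ~ L^(2H)."""
    x = np.cumsum(increments)          # integrate returns into a "price path"
    msd = [np.mean((x[lag:] - x[:-lag]) ** 2) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return slope / 2.0                 # the MSD exponent is 2H

rng = np.random.default_rng(0)
h_uncorrelated = diffusion_exponent(rng.normal(size=100000))
print(round(h_uncorrelated, 2))  # close to 0.5 for Brownian-like increments
```

For uncorrelated (Brownian-like) increments H is near 0.5; persistent series push H above 0.5 and anti-persistent ones below, which is the deviation a persistency analysis of market data looks for.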

  18. New analysis for consistency among markers in the study of genetic diversity: development and application to the description of bacterial diversity

    Directory of Open Access Journals (Sweden)

    Bailly Xavier

    2007-09-01

Background: The development of post-genomic methods has dramatically increased the amount of qualitative and quantitative data available to understand how ecological complexity is shaped. Yet, new statistical tools are needed to use these data efficiently. In support of sequence analysis, diversity indices were developed to take into account both the relative frequencies of alleles and their genetic divergence. Furthermore, a method for describing inter-population nucleotide diversity has recently been proposed and named the double principal coordinate analysis (DPCoA), but this procedure can only be used with one locus. In order to tackle the problem of measuring and describing nucleotide diversity with more than one locus, we developed three versions of multiple DPCoA by using three ordination methods: multiple co-inertia analysis, STATIS, and multiple factorial analysis. Results: This combination of methods allows (i) testing and describing differences in patterns of inter-population diversity among loci, and (ii) defining the best compromise among loci. These methods are illustrated by the analysis of both simulated data sets, which include ten loci evolving under a stepping stone model and a locus evolving under an alternative population structure, and a real data set focusing on the genetic structure of two nitrogen-fixing bacteria, which is influenced by geographical isolation and host specialization. All programs needed to perform multiple DPCoA are freely available. Conclusion: Multiple DPCoA allows the evaluation of the impact of various loci in the measurement and description of diversity. This method is general enough to handle a large variety of data sets. It complements existing methods such as the analysis of molecular variance or other analyses based on linkage disequilibrium measures, and is very useful to study the impact of various loci on the measurement of diversity.
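A diversity index that "takes into account both the relative frequencies of alleles and their genetic divergence", as the record describes, is exemplified by Rao's quadratic entropy, the index underlying DPCoA. A minimal sketch with made-up frequencies and pairwise distances:

```python
import numpy as np

def rao_quadratic_entropy(freqs, distances):
    """Rao's quadratic entropy: sum_ij p_i p_j d_ij, a diversity index that
    weights allele frequencies by their pairwise (e.g. nucleotide) divergence."""
    p = np.asarray(freqs, dtype=float)
    d = np.asarray(distances, dtype=float)
    return float(p @ d @ p)

# Three alleles: the first two are near-identical, the third is divergent,
# so most of the diversity comes from the rare but distant third allele.
p = [0.4, 0.4, 0.2]
d = np.array([[0.00, 0.01, 0.30],
              [0.01, 0.00, 0.30],
              [0.30, 0.30, 0.00]])
print(round(rao_quadratic_entropy(p, d), 4))  # 0.0992
```

Unlike frequency-only indices (e.g. Gini-Simpson, which is this formula with all off-diagonal distances set to 1), the same frequencies yield a larger value when the alleles are more divergent, which is the property DPCoA decomposes across populations.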

  19. Brain Based Learning in Science Education in Turkey: Descriptive Content and Meta Analysis of Dissertations

    Science.gov (United States)

    Yasar, M. Diyaddin

    2017-01-01

    This study aimed at performing content analysis and meta-analysis on dissertations related to brain-based learning in science education to find out the general trend and tendency of brain-based learning in science education and find out the effect of such studies on achievement and attitude of learners with the ultimate aim of raising awareness…

  20. A descriptive analysis of the climbing mechanics of a mountain goat (Oreamnos americanus).

    Science.gov (United States)

    Lewinson, Ryan T; Stefanyshyn, Darren J

    2016-12-01

    The mountain goat (Oreamnos americanus) is one of the most extraordinary mountaineers in the animal kingdom. While observational descriptions exist to indicate factors that may influence their climbing ability, these have never been assessed biomechanically. Here, we describe whole-body motion of a mountain goat during ascent of a 45° incline based on a video recording in the Canadian Rocky Mountains, and discuss the results in a mechanical context. During the push-off phase, the hindlimb extended and the forelimb was tucked close to the torso. During the pull-up phase, the hindlimb was raised near to the torso, while the forelimb humerus seemed to "lock" in a constant position relative to the torso, allowing the elbow to be held in close proximity to the whole-body center of mass. Extension of the elbow and carpal joints resulted in a vertical translation of the center of mass up the mountain slope. Based on the observations from this naturalistic study, hypotheses for future controlled studies of mountain goat climbing mechanics are proposed.

  1. A descriptive analysis of patients presenting to psychosexual clinic at a tertiary care center

    Directory of Open Access Journals (Sweden)

    Rohit Verma

    2013-01-01

Background: Psychosexual problems are a very common presentation, be it with psychiatric or physical illness, but there are very few studies available on psychosexual disorders, especially in the Indian context. Indian society is deeply ingrained in customs, and several misconceptions, myths, prejudices, and social taboos are attached to sex, which makes it further difficult to tackle. Objectives: The aim of the current study was to descriptively analyze the nature of sexual disorders in a tertiary care center. Materials and Methods: The current retrospective chart review included 698 consecutive subjects seeking treatment for their psychosexual problems at the Sexual Clinic, Department of Psychiatry, Dr. Ram Manohar Hospital, New Delhi (between 2006 and 2010). Results: This study observed erectile dysfunction (ED) (29.5%), premature ejaculation (PME) (24.6%), Dhat syndrome (DS) (18.1%), and ED with PME (17.5%) as the common sexual dysfunctions leading to treatment seeking. DS was the major complaint among younger and unmarried individuals. We observed more married individuals seeking treatment for sexual disorders. Conclusions: These findings provide important information on a relatively under-researched area.

  2. Ampelographic Description and Sanitary Analysis of Four Istrian Grapevine Varieties (Vitis vinifera L.)

    Directory of Open Access Journals (Sweden)

    Djordano Persuric

    2014-02-01

The Istrian Peninsula, one of the five districts within the viticultural region of Coastal Croatia, offers great geological, relief, and climatic diversity and varied production conditions. This research studied autochthonous varieties and their sanitary status in old vineyards. Considering the age of the vineyards, ten locations were chosen where four autochthonous varieties, ‘Malvasia istarska’, ‘Teran’, ‘Borgonja’, and ‘Pergola velika’, were identified using ampelographic description according to OIV descriptors. Morphological characteristics of the chosen varieties were described using OIV parameters, and the must was chemically analysed (pH value, sugar content, titratable acidity). High intra-cultivar variability was found for the weight of a single bunch, especially for ‘Teran’. There were also differences in the sugar content of must, particularly for ‘Pergola velika’. Must pH was low for all varieties, with predominantly low acidity values. The sanitary status of vines was determined by testing plant samples for the presence of three grapevine viruses (GLRaV-1, GLRaV-3 and GFLV) using DAS-ELISA. The percentage of infection for GFLV was 55.6%, while for GLRaV-1 and GLRaV-3 it was 61.1%. Results showed that some morphological characteristics differ from those described in the literature. With the purpose of preserving the biodiversity of autochthonous varieties and for future research, healthy propagation material will be collected and planted in the collection field of autochthonous varieties at the Institute of Agriculture and Tourism, Poreč.

  3. Descriptive distribution and phylogenetic analysis of feline infectious peritonitis virus isolates of Malaysia

    Directory of Open Access Journals (Sweden)

    Arshad Habibah

    2010-01-01

The descriptive distribution and phylogeny of feline coronaviruses (FCoVs) were studied in cats suspected of having feline infectious peritonitis (FIP) in Malaysia. Ascitic fluids and/or biopsy samples were subjected to a reverse transcription polymerase chain reaction (RT-PCR) targeted for a conserved region of the 3' untranslated region (3'UTR) of the FCoV genome. Eighty-nine percent of the sampled animals were positive for the presence of FCoV. Among the FCoV-positive cats, 80% were males and 64% were below 2 years of age. The FCoV-positive cases included 56% domestic short hair (DSH), 40% Persian, and 4% Siamese cats. The nucleotide sequences of 10 selected amplified products from FIP cases were determined. The sequence comparison revealed that the field isolates had 96% homology with a few point mutations. The extent of homology decreased to 93% when compared with reference strains. The overall branching pattern of the phylogenetic tree showed two distinct clusters, where all Malaysian isolates fall into one main genetic cluster. These findings provided the first genetic information on FCoV in Malaysia.

  4. Descriptive distribution and phylogenetic analysis of feline infectious peritonitis virus isolates of Malaysia.

    Science.gov (United States)

    Sharif, Saeed; Arshad, Siti S; Hair-Bejo, Mohd; Omar, Abdul R; Zeenathul, Nazariah A; Fong, Lau S; Rahman, Nor-Alimah; Arshad, Habibah; Shamsudin, Shahirudin; Isa, Mohd-Kamarudin A

    2010-01-06

    The descriptive distribution and phylogeny of feline coronaviruses (FCoVs) were studied in cats suspected of having feline infectious peritonitis (FIP) in Malaysia. Ascitic fluids and/or biopsy samples were subjected to a reverse transcription polymerase chain reaction (RT-PCR) targeting a conserved region of the 3' untranslated region (3'UTR) of the FCoV genome. Eighty-nine percent of the sampled animals were positive for FCoV. Among the FCoV-positive cats, 80% were males and 64% were below 2 years of age. The FCoV-positive cases comprised 56% domestic short hair (DSH), 40% Persian, and 4% Siamese cats. The nucleotide sequences of 10 selected amplified products from FIP cases were determined. Sequence comparison revealed that the field isolates had 96% homology, with a few point mutations. The extent of homology decreased to 93% when compared with reference strains. The overall branching pattern of the phylogenetic tree showed two distinct clusters, with all Malaysian isolates falling into one main genetic cluster. These findings provide the first genetic information on FCoV in Malaysia.
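    The pairwise homology figures above (96% among field isolates, 93% against reference strains) come from comparing aligned sequences position by position. A minimal sketch of percent-identity computation over an ungapped alignment; the fragments below are illustrative stand-ins, not the Malaysian isolates:

    ```python
    def percent_identity(seq_a: str, seq_b: str) -> float:
        """Percent identity between two pre-aligned, equal-length sequences."""
        if len(seq_a) != len(seq_b):
            raise ValueError("sequences must be aligned to equal length")
        # Count matching positions, ignoring alignment gaps ("-")
        matches = sum(1 for a, b in zip(seq_a, seq_b) if a == b and a != "-")
        return 100.0 * matches / len(seq_a)

    # Two illustrative 10-base fragments differing by one point mutation
    isolate_1 = "ACGTTAGCCA"
    isolate_2 = "ACGTTAGCCT"
    print(percent_identity(isolate_1, isolate_2))  # 90.0
    ```

    Real pipelines would first produce the alignment itself (e.g. with a dynamic-programming aligner) and handle gap columns more carefully; this sketch only shows the identity calculation.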

  5. Adriatic calcarean sponges (Porifera, Calcarea), with the description of six new species and a richness analysis

    Directory of Open Access Journals (Sweden)

    Michelle Klautau

    2016-03-01

    Full Text Available In this study we analyze the calcarean sponge diversity of the Adriatic Sea, the type locality of some of the first described species of calcarean sponges. Morphological and molecular approaches are combined for taxonomic identification. Our results reveal six species new to science and provisionally endemic to the Adriatic Sea (Ascandra spalatensis sp. nov., Borojevia croatica sp. nov., Leucandra falakra sp. nov., L. spinifera sp. nov., Paraleucilla dalmatica sp. nov., and Sycon ancora sp. nov.), one species previously known only from the Southwestern Atlantic (Clathrina conifera), and three already known from the Adriatic Sea (Ascaltis reticulum, Borojevia cerebrum, and Clathrina primordialis). We confirm the presence of the alien species Paraleucilla magna in the Adriatic and again record Clathrina blanca, C. clathrus, and C. rubra. We emend the description of the genus Ascaltis, propose a lectotype for Borojevia cerebrum, and synonymise B. decipiens with B. cerebrum. A checklist of all calcarean species previously and currently known from the Adriatic Sea (39 species) is given. The Central Adriatic is indicated as the sector with the richest calcarean sponge fauna; however, the biodiversity of this class is underestimated in the whole Adriatic Sea and new systematic surveys are desirable.

  6. Pakistani English Newspaper Paid Obituary Announcements: A Descriptive Analysis of the Transliterated Vocabulary

    Directory of Open Access Journals (Sweden)

    Sajid M. Chaudhry

    2016-08-01

    Full Text Available The study, qualitative and descriptive in nature, examines the use of transliteration in paid Pakistani obituary announcements authored in English. Primarily, it identifies the frequently used transliterated vocabulary in these linguistic messages and explores the functional relationship that emerges in and between the textual moves of these announcements through the linkage created by the transliterated words and phrases. Additionally, the study sheds light on the motives of the authors of these announcements for opting for this lexical borrowing. Data for this purpose come from two prominent Pakistani English newspapers: The Dawn and The News International. The study concludes that the transliterated vocabulary used in Pakistani English obituary announcements is a need-based, religiously and culturally inspired lexical borrowing that not only helps the authors of these texts convey their intended messages effectively but also enhances the exactness and spontaneity of the contents of these announcements. Keywords: Obituary Announcement, Transliteration, Lexical borrowing, Source language, Target language, Cognitive synonyms

  7. Initial Data Analysis Results for ATD-2 ISAS HITL Simulation

    Science.gov (United States)

    Lee, Hanbong

    2017-01-01

    To evaluate the operational procedures and information requirements for the core functional capabilities of the ATD-2 project, such as the tactical surface metering tool, the APREQ-CFR procedure, and data element exchanges between ramp and tower, human-in-the-loop (HITL) simulations were performed in March 2017. This presentation shows the initial data analysis results from the HITL simulations. Various airport performance metrics were analyzed and compared across the different runway configurations and metering values in the tactical surface scheduler. These metrics include gate holding time, taxi-out time, runway throughput, queue size and wait time in queue, and TMI flight compliance. In addition to the metering value, other factors affecting airport performance in the HITL simulation, including run duration, runway changes, and TMI constraints, are also discussed.

  8. Slide track analysis of eight contemporary hip simulator designs.

    Science.gov (United States)

    Calonius, Olof; Saikko, Vesa

    2002-11-01

    In an earlier paper, the authors presented a new method for computing slide tracks in the relative motion between the femoral head and acetabular cup of total hip prostheses. For the first time, computed tracks were verified experimentally and with an alternative method of computation. Besides being an efficient way to illustrate hip kinematics, the shapes of the slide tracks are known to be of fundamental importance to the wear behaviour of prostheses. The verified method was now applied to eight contemporary hip simulator designs. The use of correct motion waveforms and an Euler sequence of rotations in each case was again found to be essential. Considerable differences were found between the simulators. For instance, the shapes of the tracks drawn by the resultant contact force included a circle, ellipse, irregular oval, leaf, twig, and straight line. Correct computation of tracks for the most widely used hip simulator, known as biaxial, was made possible by the insight that the device is actually three-axial. Slide track patterns have now been computed for virtually all contemporary hip simulators, both for the heads and for the cups. This comparative analysis forms a valuable basis for studies on the relationship between the type of multidirectional motion and wear. Such studies can produce useful information for the design of joint simulators and improve the understanding of wear phenomena in prosthetic joints.
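    The track computation described above applies an ordered (Euler) sequence of rotations driven by motion waveforms and traces the cup contact point on the head surface. A minimal NumPy sketch; the sinusoidal flexion-extension, abduction-adduction, and rotation amplitudes, the rotation order, and the 16 mm head radius are illustrative assumptions, not the parameters of any particular simulator:

    ```python
    import numpy as np

    def rx(a):  # rotation about x (flexion-extension), radians
        c, s = np.cos(a), np.sin(a)
        return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

    def ry(a):  # rotation about y (abduction-adduction)
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

    def rz(a):  # rotation about z (internal-external rotation)
        c, s = np.cos(a), np.sin(a)
        return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

    R_HEAD = 16.0                     # head radius, mm (illustrative)
    t = np.linspace(0.0, 1.0, 361)    # one gait cycle
    fe = np.radians(25.0 * np.sin(2 * np.pi * t))            # flexion-extension
    aa = np.radians(7.0 * np.sin(2 * np.pi * t + np.pi / 2)) # abduction-adduction
    ir = np.radians(10.0 * np.sin(2 * np.pi * t + np.pi))    # int-ext rotation

    p_cup = np.array([0.0, 0.0, -R_HEAD])  # fixed contact point in the cup frame
    # Euler sequence Rz @ Ry @ Rx; the track is the contact point expressed
    # in head coordinates, i.e. the path it slides over the head surface.
    track = np.array([(rz(c) @ ry(b) @ rx(a)).T @ p_cup
                      for a, b, c in zip(fe, aa, ir)])
    # Slide distance per cycle = summed chord lengths along the track
    slide = np.linalg.norm(np.diff(track, axis=0), axis=1).sum()
    ```

    With periodic waveforms the track closes on itself after one cycle; changing the amplitudes, phases, or Euler order changes the track shape (line, ellipse, oval, ...), which is exactly the sensitivity the paper analyses.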

  9. SARA (System ARchitects Apprentice): Modeling, analysis, and simulation support for design of concurrent systems

    Energy Technology Data Exchange (ETDEWEB)

    Estrin, G.; Fenchel, R.S.; Razouk, R.R.; Vernon, M.K.

    1986-02-01

    An environment to support designers in the modeling, analysis and simulation of concurrent systems is described. It is shown how a fully nested structure model supports multilevel design and focuses attention on the interfaces between the modules which serve to encapsulate behavior. Using simple examples, the paper indicates how a formal graph model can be used to model behavior in three domains: control flow, data flow, and interpretation. The effectiveness of the explicit environment model in SARA is discussed, and the capabilities to analyze correctness and evaluate performance of a system model are demonstrated. A description of the integral help designed into SARA shows how the designer can be offered consistent use of any new tool introduced to support the design process.

  10. A Spectral Multiscale Method for Wave Propagation Analysis: Atomistic-Continuum Coupled Simulation

    CERN Document Server

    Patra, Amit K; Ganguli, Ranjan

    2014-01-01

    In this paper, we present a new multiscale method which is capable of coupling atomistic and continuum domains for high-frequency wave propagation analysis. The problem of non-physical wave reflection, which occurs due to the change in system description across the interface between two scales, can be satisfactorily overcome by the proposed method. We propose an efficient spectral domain decomposition of the total fine-scale displacement along with a potent macroscale equation in the Laplace domain to eliminate the spurious interfacial reflection. We use a Laplace transform based spectral finite element method to model the macroscale, which accurately provides the required dynamic responses of the outer atoms of the simulated microscale region. The proposed multiscale model shows excellent agreement with full molecular dynamics (MD) results. Numerical experiments of wave propagation in a 1D harmonic lattice, a 1D lattice with Lennard-Jones potential, a ...

  11. Toward crustacean without chemicals: a descriptive analysis of consumer response using price comparisons.

    Science.gov (United States)

    Okpala, Charles Odilichukwu R; Bono, Gioacchino; Pipitone, Vito; Vitale, Sergio; Cannizzaro, Leonardo

    2016-01-01

    To date, there seems to be limited-to-zero emphasis about how consumers perceive crustacean products subject to either chemical and or non-chemical preservative treatments. In addition, studies that investigated price comparisons of crustacean products subject to either chemical or chemical-free preservative methods seem unreported. This study focused on providing some foundational knowledge about how consumers perceive traditionally harvested crustaceans that are either chemical-treated and or free of chemicals, incorporating price comparisons using a descriptive approach. The study design employed a questionnaire approach via interview using a computer-assisted telephone system and sampled 1,540 participants across five key locations in Italy. To actualize consumer sensitivity, 'price' was the focus given its crucial role as a consumption barrier. Prior to this, variables such as demographic characteristics of participants, frequency of purchasing, quality attributes/factors that limit the consumption of crustaceans were equally considered. By price comparisons, consumers are likely to favor chemical-free (modified atmosphere packaging) crustacean products amid a price increase of up to 15%. But, a further price increase such as by 25% could markedly damage consumers' feelings, which might lead to a considerable number opting out in favor of either chemical-treated or other seafood products. Comparing locations, the studied variables showed no statistical differences (p>0.05). On the contrary, the response weightings fluctuated across the studied categories. Both response weightings and coefficient of variation helped reveal more about how responses deviated per variable categories. 
This study has revealed some foundational knowledge about how consumers perceive traditionally harvested crustaceans that were either chemical-treated or subject to chemical-free preservative up to price sensitivity using Italy as a reference case, which is applicable to other parts

  12. Toward crustacean without chemicals: a descriptive analysis of consumer response using price comparisons

    Directory of Open Access Journals (Sweden)

    Charles Odilichukwu R. Okpala

    2016-10-01

    Full Text Available Background: To date, there seems to be limited-to-zero emphasis about how consumers perceive crustacean products subject to either chemical and or non-chemical preservative treatments. In addition, studies that investigated price comparisons of crustacean products subject to either chemical or chemical-free preservative methods seem unreported. Objective: This study focused on providing some foundational knowledge about how consumers perceive traditionally harvested crustaceans that are either chemical-treated and or free of chemicals, incorporating price comparisons using a descriptive approach. Design: The study design employed a questionnaire approach via interview using a computer-assisted telephone system and sampled 1,540 participants across five key locations in Italy. To actualize consumer sensitivity, ‘price’ was the focus given its crucial role as a consumption barrier. Prior to this, variables such as demographic characteristics of participants, frequency of purchasing, quality attributes/factors that limit the consumption of crustaceans were equally considered. Results: By price comparisons, consumers are likely to favor chemical-free (modified atmosphere packaging crustacean products amid a price increase of up to 15%. But, a further price increase such as by 25% could markedly damage consumers’ feelings, which might lead to a considerable number opting out in favor of either chemical-treated or other seafood products. Comparing locations, the studied variables showed no statistical differences (p>0.05. On the contrary, the response weightings fluctuated across the studied categories. Both response weightings and coefficient of variation helped reveal more about how responses deviated per variable categories. Conclusions: This study has revealed some foundational knowledge about how consumers perceive traditionally harvested crustaceans that were either chemical-treated or subject to chemical-free preservative up to price

  13. Information Security Analysis Using Game Theory and Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Schlicher, Bob G [ORNL]; Abercrombie, Robert K [ORNL]

    2012-01-01

    Information security analysis can be performed using game theory implemented in dynamic simulations of agent-based models (ABMs). Such simulations can be verified against the results of game-theoretic analysis and then used to explore larger-scale, real-world scenarios involving multiple attackers, defenders, and information assets. Our approach addresses imperfect information and scalability, allowing us to overcome limitations of current stochastic game models. Such models consider only perfect information, assuming that the defender is always able to detect attacks; assume that the state transition probabilities are fixed before the game; assume that the players' actions are always synchronous; and most are not scalable with the size and complexity of the systems under consideration. Our use of ABMs yields results from selected experiments that demonstrate the proposed approach and provide a quantitative measure for realistic information systems and their related security scenarios.
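    The verification step described, checking agent-based simulation output against a game-theoretic expectation, can be illustrated with a toy attacker-defender model with imperfect detection. The probabilities and step count below are illustrative assumptions, not values from the ORNL work:

    ```python
    import random

    random.seed(42)
    P_ATTACK = 0.3    # per-step probability an attacker strikes an asset
    P_DETECT = 0.6    # per-step probability the defender detects a strike
    STEPS = 100_000

    compromises = 0
    for _ in range(STEPS):
        # Imperfect-information round: an attack succeeds only when the
        # defender's (unreliable) detector fails to fire.
        if random.random() < P_ATTACK and random.random() >= P_DETECT:
            compromises += 1

    simulated = compromises / STEPS
    analytic = P_ATTACK * (1 - P_DETECT)  # game-theoretic expected rate, 0.12
    ```

    Agreement between `simulated` and `analytic` on this reduced case is the kind of check that licenses scaling the ABM up to multi-agent scenarios that the closed-form game no longer covers.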

  14. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of Information and Intelligent Systems Engineering, Hiroshima (Japan)]

    2004-07-01

    This paper focuses on the optimal analysis of worldwide recycling activities associated with managing logistics and production in global manufacturing, whose operations stretch across national boundaries. The globally integrated logistics and recycling strategies consist of the home country and two free-trading economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly costs, tax rates, local content rules, and regulations. Moreover, an optimal analysis of the globally integrated value chain was developed by applying simulation optimization as a decision-making tool. The simulation model was developed and analyzed using ProModel packages, and the results help identify some of the conditions required to make well-performing logistics and recycling plans in a worldwide collaborative manufacturing environment. (orig.)

  15. Visualization and analysis of eddies in a global ocean simulation

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Sean J [Los Alamos National Laboratory]; Hecht, Matthew W [Los Alamos National Laboratory]; Petersen, Mark [Los Alamos National Laboratory]; Strelitz, Richard [Los Alamos National Laboratory]; Maltrud, Mathew E [Los Alamos National Laboratory]; Ahrens, James P [Los Alamos National Laboratory]; Hlawitschka, Mario [UC DAVIS]; Hamann, Bernd [UC DAVIS]

    2010-10-15

    Eddies at a scale of approximately one hundred kilometers have been shown to be surprisingly important to understanding large-scale transport of heat and nutrients in the ocean. Due to difficulties in observing the ocean directly, the behavior of eddies below the surface is not very well understood. To fill this gap, we employ a high-resolution simulation of the ocean developed at Los Alamos National Laboratory. Using large-scale parallel visualization and analysis tools, we produce three-dimensional images of ocean eddies, and also generate a census of eddy distribution and shape averaged over multiple simulation time steps, resulting in a world map of eddy characteristics. As expected from observational studies, our census reveals a higher concentration of eddies at the mid-latitudes than at the equator. Our analysis further shows that mid-latitude eddies are thicker, within a range of 1000-2000 m, while equatorial eddies are less than 100 m thick.
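    A census of the kind described, counting and sizing coherent features in a gridded field, can be sketched on synthetic data by thresholding vorticity and labelling connected regions. The field below is synthetic (two Gaussian vortices), not output of the LANL model, and the threshold is an illustrative assumption:

    ```python
    import numpy as np

    def add_gaussian_eddy(grid, cx, cy, amp, sigma):
        """Superpose a Gaussian vorticity patch onto the field in place."""
        y, x = np.indices(grid.shape)
        grid += amp * np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * sigma ** 2))

    def count_eddies(vort, thresh):
        """Count 4-connected regions where |vorticity| exceeds thresh."""
        mask = np.abs(vort) > thresh
        seen = np.zeros_like(mask, dtype=bool)
        count = 0
        for i in range(mask.shape[0]):
            for j in range(mask.shape[1]):
                if mask[i, j] and not seen[i, j]:
                    count += 1
                    stack = [(i, j)]          # flood-fill one eddy
                    while stack:
                        a, b = stack.pop()
                        if (0 <= a < mask.shape[0] and 0 <= b < mask.shape[1]
                                and mask[a, b] and not seen[a, b]):
                            seen[a, b] = True
                            stack += [(a + 1, b), (a - 1, b),
                                      (a, b + 1), (a, b - 1)]
        return count

    vort = np.zeros((100, 100))
    add_gaussian_eddy(vort, 25, 30, 1.0, 4.0)    # cyclonic eddy
    add_gaussian_eddy(vort, 70, 60, -0.8, 5.0)   # anticyclonic eddy
    print(count_eddies(vort, 0.1))  # 2
    ```

    A production census (as in the paper) would work in three dimensions, use a physically motivated criterion such as the Okubo-Weiss parameter, and accumulate per-latitude statistics of eddy radius and thickness across time steps.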

  16. Simulation and Analysis of Converging Shock Wave Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey, Scott D. [Los Alamos National Laboratory]; Shashkov, Mikhail J. [Los Alamos National Laboratory]

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  17. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop manufacturing, but certain facilities make it suitable for FMS as well as production line manufacturing. This type of simulation is very useful in analysing any changes that occur in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, throughput, the amount of WIP (work in process), costs, or net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Unlike other tools for the analysis of manufacturing systems, simulation takes into consideration uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects a job that is late at one machine has on the remaining machines in its route through the layout. It is these effects that cause production plans not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
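    The event-list mechanism at the core of tools like SIMMEK can be sketched in a few lines: entity processes are broken into time-stamped events kept in a priority queue, and the clock jumps from event to event. The single-machine FIFO job shop below is an illustrative toy, not SIMMEK itself:

    ```python
    import heapq

    def simulate(arrivals, service_time):
        """Single-machine FIFO job shop driven by a time-ordered event list.

        Returns a dict mapping job index -> completion time.
        """
        events = [(t, j, "arrive", j) for j, t in enumerate(arrivals)]
        heapq.heapify(events)
        seq = len(events)              # tie-breaker for simultaneous events
        waiting, machine_free, done = [], True, {}
        while events:
            t, _, kind, job = heapq.heappop(events)
            if kind == "arrive":
                waiting.append(job)
            else:                      # "done": record completion, free machine
                done[job] = t
                machine_free = True
            if machine_free and waiting:
                nxt = waiting.pop(0)   # FIFO dispatch rule
                machine_free = False
                heapq.heappush(events, (t + service_time, seq, "done", nxt))
                seq += 1
        return done

    completions = simulate(arrivals=[0.0, 2.0, 4.0], service_time=3.0)
    print(completions)  # {0: 3.0, 1: 6.0, 2: 9.0}
    ```

    A full tool adds stochastic arrival and service times, multiple machines and routes, and statistics collection on throughput, WIP, and queue lengths; the event-list kernel stays essentially the same.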

  18. Complex Network Characteristics and Invulnerability Simulating Analysis of Supply Chain

    OpenAIRE

    Hui-Huang Chen; Ai-Min Lin

    2012-01-01

    To study the characteristics of complex supply chains, an invulnerability analysis method based on complex network theory is proposed. The topological structure and dynamic characteristics of the complex supply chain network were analyzed. The network was found to exhibit general characteristics of complex networks, including small-world and scale-free properties. A simulation experiment was made on the invulnerability of the supply chain network...

  19. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a testing program with a facility at JPL for testing system parts and subsystems. The magnetic modeling, simulation and analysis capability was set up and performed by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to or in lieu of magnetic tests. Because of the sensitive nature of the fields and particles scientific measurements being conducted by the JUNO space mission to Jupiter, the imposition of stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interferences. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a cost-effective path to a magnetically clean spacecraft. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation and analysis activities used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system in applied external magnetic fields.

  20. Simulation methodology development for rotating blade containment analysis

    Institute of Scientific and Technical Information of China (English)

    Qing HE; Hai-jun XUAN; Lian-fang LIAO; Wei-rong HONG; Rong-ren WU

    2012-01-01

    An experimental and numerical investigation of aeroengine blade/case containment is presented. Blade-out containment capability analysis is an essential step in new aeroengine design, but containment tests are time-consuming and incur significant costs; thus, developing a short-period and low-cost numerical method is warranted. Using explicit nonlinear dynamic finite element analysis software, the present study numerically investigated the high-speed impact process for simulated blade containment tests carried out on a high-speed spin testing facility. A number of simulations were conducted using finite element models with different mesh sizes and different values of both the contact penalty factor and the friction coefficient. Detailed comparisons between the experimental and numerical results reveal that the mesh size and the friction coefficient have a considerable impact on the results produced. It is shown that a finer mesh predicts a lower containment capability of the case, which is closer to the test data. A larger value of the friction coefficient also predicts lower containment capability. However, the contact penalty factor has little effect on the simulation results if it is large enough to avoid false penetration.
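    Mesh-size sensitivity of the kind reported above is often quantified with a grid-convergence study, extrapolating the mesh-converged value from solutions at two mesh spacings. A minimal Richardson-extrapolation sketch; the synthetic values stand in for containment-capability results, and the convergence order p is an assumption that would normally be estimated from three meshes:

    ```python
    def richardson(f_coarse, f_fine, r, p):
        """Extrapolate the mesh-converged value from results at two mesh sizes.

        f_coarse, f_fine : solution values at spacings h and h/r
        r : mesh refinement ratio (> 1)
        p : assumed order of convergence
        """
        return f_fine + (f_fine - f_coarse) / (r ** p - 1)

    # Synthetic second-order behaviour: f(h) = 10.0 + 3.0 * h**2
    f_h  = 10.0 + 3.0 * 1.0 ** 2   # coarse mesh, h = 1.0  -> 13.0
    f_h2 = 10.0 + 3.0 * 0.5 ** 2   # fine mesh,   h = 0.5  -> 10.75
    print(richardson(f_h, f_h2, r=2, p=2))  # 10.0
    ```

    For contact-impact problems the observed order can be well below the formal order of the element, which is one reason the paper compares several mesh sizes directly against test data rather than relying on extrapolation alone.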

  1. Sugarcane spirit market share simulation: an application of conjoint analysis

    Directory of Open Access Journals (Sweden)

    João de Deus Souza Carneiro

    2012-12-01

    Full Text Available This study evaluated the influence of packaging and labeling attributes of sugarcane spirit on consumers' behavior by applying the results of conjoint analysis in sugarcane spirit market share simulation. First, a conjoint analysis was performed to estimate each consumer's part-worths for selected sugarcane spirit packaging and labeling attributes. These part-worths were then used in a market share simulation based on the maximum utility model. It was observed that some packaging and labeling attributes affected consumers' purchase intention and that most consumers showed a similar preference pattern for these attributes. These consumers preferred the Seleta brand, bottled in 700 mL clear glass bottles with a metal screw cap, bearing a label illustration unrelated to the sugarcane spirit production process and the information "aged 36 months in oak barrels". This study also showed that conjoint analysis, and the use of its results in market share simulation, are important tools for better understanding consumer behavior towards the intention to purchase sugarcane spirit.
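    The maximum utility (first-choice) model referenced above assigns each respondent to the product whose summed part-worths are highest, and the market share is the fraction of respondents choosing each product. A sketch with illustrative attributes and part-worth values, not the study's estimates:

    ```python
    # Competing products described by attribute levels (illustrative)
    products = {
        "A": {"brand": "Seleta", "bottle": "clear_glass", "cap": "metal_screw"},
        "B": {"brand": "Other",  "bottle": "amber_glass", "cap": "cork"},
    }

    # part_worths[respondent][attribute][level] -> utility contribution
    part_worths = [
        {"brand": {"Seleta": 1.2, "Other": 0.1},
         "bottle": {"clear_glass": 0.5, "amber_glass": 0.3},
         "cap": {"metal_screw": 0.4, "cork": 0.2}},
        {"brand": {"Seleta": 0.2, "Other": 0.9},
         "bottle": {"clear_glass": 0.1, "amber_glass": 0.6},
         "cap": {"metal_screw": 0.1, "cork": 0.5}},
        {"brand": {"Seleta": 0.8, "Other": 0.3},
         "bottle": {"clear_glass": 0.6, "amber_glass": 0.2},
         "cap": {"metal_screw": 0.3, "cork": 0.1}},
    ]

    def market_share(products, part_worths):
        """First-choice rule: each respondent 'buys' the max-utility product."""
        votes = {name: 0 for name in products}
        for pw in part_worths:
            utility = {name: sum(pw[attr][level] for attr, level in spec.items())
                       for name, spec in products.items()}
            votes[max(utility, key=utility.get)] += 1
        n = len(part_worths)
        return {name: v / n for name, v in votes.items()}

    print(market_share(products, part_worths))
    ```

    Alternatives to the first-choice rule (e.g. logit share-of-preference models) soften the winner-takes-all assumption; the study's use of the maximum utility model keeps the simulation deterministic given the estimated part-worths.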

  2. Description of a user-oriented geographic information system - The resource analysis program

    Science.gov (United States)

    Tilmann, S. E.; Mokma, D. L.

    1980-01-01

    This paper describes the Resource Analysis Program, an applied geographic information system. Several applications are presented which utilized soil and other natural resource data to develop integrated maps and data analyses. These applications demonstrate the methods of analysis and the philosophy of approach used in the mapping system. The applications are evaluated against four major needs of a functional mapping system: data capture, data libraries, data analysis, and mapping and data display. These four criteria are then used to describe an effort to develop the next generation of applied mapping systems. This approach uses inexpensive microcomputers for field applications and should prove to be a viable entry point for users heretofore unable or unwilling to venture into applied computer mapping.

  3. Descriptive analysis of neurological in-hospital consultations in a tertiary hospital.

    Science.gov (United States)

    Aller-Alvarez, J S; Quintana, M; Santamarina, E; Álvarez-Sabín, J

    2017-04-01

    In-hospital consultations (IHC) are essential in clinical practice in tertiary hospitals. The aim of this study is to analyse the impact of neurological IHCs. One-year retrospective descriptive study of neurological IHCs conducted from May 2013 to April 2014 at our tertiary hospital. A total of 472 patients were included (mean age, 62.1 years; male patients, 56.8%) and 24.4% had previously been evaluated by a neurologist. Patients were hospitalised a median of 18 days and 19.7% had been referred by another hospital. The departments requesting the most in-hospital consultations were intensive care (20.1%), internal medicine (14.4%), and cardiology (9.1%). Reasons for requesting an IHC were stroke (26.9%), epilepsy (20.6%), and confusional states (7.6%). An on-call neurologist evaluated 41.9% of the patients. The purpose of the IHC was to provide a diagnosis in 56.3% and treatment in 28.2% of the cases; 69.5% of the patients required additional tests. Treatment was adjusted in 18.9% of patients and additional drugs were administered to 27.3%. While 62.1% of cases required no additional IHCs, 11% required further assessment, and 4.9% were transferred to the neurology department. Of the patient total, 16.9% died during hospitalisation (in 37.5%, the purpose of the consultation was to certify brain death); 45.6% were referred to the neurology department at discharge and 6.1% visited the emergency department due to neurological impairment within 6 months of discharge. IHCs facilitate diagnosis and management of patients with neurological diseases, which may help reduce the number of visits to the emergency department. On-call neurologists are essential in tertiary hospitals, and they are frequently asked to diagnose brain death. Copyright © 2015 Sociedad Española de Neurología. Publicado por Elsevier España, S.L.U. All rights reserved.

  4. Coronary Artery Disease: A Descriptive Analysis of Risk Factors: Before and After Treatment

    Directory of Open Access Journals (Sweden)

    S Dinkar, Suresh Rao, M Vakamudi, R Saldanha, KR Balakrishnan

    2010-12-01

    Full Text Available With the increasing life span of man, the number of ageing people is increasing, and with it the number of diseases affecting them, among which is atherosclerotic coronary artery disease. Coronary revascularization began in the 1960s through the pioneering efforts of David Sabiston and Kolessov. This is a retrospective descriptive study. A total of 1050 patients underwent beating-heart surgery from 1998 to November 2002, of whom 852 were analysed for results and prognostic outcomes. Mean age was 57.8 years (range 31-80 years) with an M:F ratio of 7:1. The pre-operative parameters studied were DM, HTN, hyperlipidemia, family history of CAD, smoking, and past history of MI. Other parameters included pulmonary disease, chronic renal failure, CVA, APD, PVD, etc. 55.75% of patients were diabetic, 53.99% were hypertensive, and 24.4% had a history of hyperlipidemia. Family history was positive in 12.9% of the patients, 25% were smokers, and 44% had a history of previous MI. The average number of diseased vessels was 2.34, with triple vessel disease being most common. 6.6% had a history of pulmonary disease, 7.4% had pre-operative renal failure, and 2% had a past history of CV stroke. Overall in-hospital mortality was 1.4%. Relative risks for mortality, morbidity, new-onset renal failure, CVA, arrhythmias, and CCF were calculated. Mean hospital stay was 9.83 days (range 6-41 days); mean ICU stay was 74.3 hours (range 73-700 hours). Usage of blood and blood products was significantly lower. Freedom from complications was 93%. LVEF < 40%, age > 70 years, and high diastolic PA pressure (> 15 mm Hg) were found to be significant risk factors for mortality. Females were found to be 2.6 times more at risk of mortality and of developing complications compared to males. Patients with a previous history of MI were more at risk of developing complications, increasing their hospital and ICU stay.
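    The relative risks mentioned above come from 2x2 exposure-outcome tables: the risk of the outcome in the exposed group divided by the risk in the unexposed group. A sketch with illustrative counts, not the study's data:

    ```python
    def relative_risk(exposed_events, exposed_total,
                      unexposed_events, unexposed_total):
        """RR = risk in the exposed group / risk in the unexposed group."""
        risk_exposed = exposed_events / exposed_total
        risk_unexposed = unexposed_events / unexposed_total
        return risk_exposed / risk_unexposed

    # Illustrative: mortality of 10/100 among patients with LVEF < 40%
    # versus 5/100 among patients with preserved LVEF
    print(relative_risk(10, 100, 5, 100))  # 2.0
    ```

    An RR of 1 indicates no association; confidence intervals (typically via the log-RR standard error) are needed before declaring a risk factor significant, as the study does.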

  5. Simulation and analysis of resin flow in injection machine screw

    Institute of Scientific and Technical Information of China (English)

    Ling-feng LI; Samir MEKID

    2008-01-01

    A method for simulating and analyzing resin flow in a screw is presented to help control some of the problems that may affect the efficiency and quality of the product in existing injection machine screws. A physical model of the screw is established to represent the stress, the strain, the relationship between velocity and stress, and the temperature of the cells. In this paper, a working case is considered in which the velocity and temperature distributions at any section of the flow are obtained. Analysis of the computational results shows the ability to master various parameters depending on the specifications.

  6. Consistent simulation of bromine chemistry from the marine boundary layer to the stratosphere, Part I: model description, sea salt aerosols and pH

    Directory of Open Access Journals (Sweden)

    A. Kerkweg

    2008-04-01

    Full Text Available This is the first article of a series presenting a detailed analysis of bromine chemistry simulated with the atmospheric chemistry general circulation model ECHAM5/MESSy. Release from sea salt is an important bromine source, hence the model explicitly calculates aerosol chemistry and phase partitioning for coarse mode aerosol particles. Many processes including chemical reaction rates are influenced by the particle size distribution, and aerosol associated water strongly affects the aerosol pH. Knowledge of the aerosol pH is important as it determines the aerosol chemistry, e.g., the efficiency of sulphur oxidation and bromine release. Here, we focus on the simulated sea salt aerosol size distribution and the coarse mode aerosol pH.

    A comparison with available field data shows that the simulated aerosol distributions agree reasonably well within the range of measurements. In spite of the small number of aerosol pH measurements and the uncertainty in its experimental determination, the simulated aerosol pH compares well with the observations. The aerosol pH ranges from alkaline aerosol in areas of strong production down to pH values of 1 over regions of medium sea salt production and high levels of gas phase acids, mostly polluted regions over the oceans in the northern hemisphere.

  7. Consistent simulation of bromine chemistry from the marine boundary layer to the stratosphere – Part 1: Model description, sea salt aerosols and pH

    Directory of Open Access Journals (Sweden)

    A. Kerkweg

    2008-10-01

    Full Text Available This is the first article of a series presenting a detailed analysis of bromine chemistry simulated with the atmospheric chemistry general circulation model ECHAM5/MESSy. Release from sea salt is an important bromine source, hence the model explicitly calculates aerosol chemistry and phase partitioning for coarse mode aerosol particles. Many processes including chemical reaction rates are influenced by the particle size distribution, and aerosol associated water strongly affects the aerosol pH. Knowledge of the aerosol pH is important as it determines the aerosol chemistry, e.g., the efficiency of sulphur oxidation and bromine release. Here, we focus on the simulated sea salt aerosol size distribution and the coarse mode aerosol pH.

    A comparison with available field data shows that the simulated aerosol distributions agree reasonably well within the range of measurements. In spite of the small number of aerosol pH measurements and the uncertainty in its experimental determination, the simulated aerosol pH compares well with the observations. The aerosol pH ranges from alkaline aerosol in areas of strong production down to pH-values of 1 over regions of medium sea salt production and high levels of gas phase acids, mostly polluted regions over the oceans in the Northern Hemisphere.

  8. A Descriptive Analysis of Health-Related Infomercials: Implications for Health Education and Media Literacy

    Science.gov (United States)

    Hill, Susan C.; Lindsay, Gordon B.; Thomsen, Steve R.; Olsen, Astrid M.

    2003-01-01

    Media literacy education helps individuals become discriminating consumers of health information. Consumers are less likely to purchase useless health products if they are informed of misleading and deceptive advertising methods. The purpose of this study was to conduct a content analysis of health-related TV infomercials. An instrument…

  9. The Training of the Battalion Staff Intelligence Officer: A Descriptive Analysis and Sample Program.

    Science.gov (United States)

    1993-11-01

    the fundamental skills and expertise required to perform their jobs (Thompson, Thompson, Pleban, & Valentine, 1991; Goldsmith & Hodges, 1987). Staff...Staff training analysis by Thompson & Thompson (Dec 8, 1992) has revealed systemic training deficiencies. Surveys of Armor Officer Advanced Course...under demanding CTC conditions and stated the need for home station training with equally demanding conditions (Thompson, Thompson, Pleban, & Valentine

  10. National Aviation Fuel Scenario Analysis Program (NAFSAP). Volume I. Model Description. Volume II. User Manual.

    Science.gov (United States)

    1980-03-01

    NATIONAL AVIATION FUEL SCENARIO ANALYSIS PROGRAM, VOLUME I: MODEL DESCRIPTION; VOLUME II: USER...executes post processor which translates results of the graphics program to machine readable code used by the pen plotter) cr (depressing the carriage

  11. On the necessity of stochastic material descriptions in the computational analysis of soils

    NARCIS (Netherlands)

    Gutierrez, M.A.; Borst, R. de

    1999-01-01

    The necessity of considering stochastic imperfections in the numerical analysis of localisation phenomena in soils is demonstrated by means of a biaxial compression test on a viscoplastic material. The material strength, the Young's modulus and the softening modulus are considered to be random fields

  12. Quantitative and Descriptive Comparison of Four Acoustic Analysis Systems: Vowel Measurements

    Science.gov (United States)

    Burris, Carlyn; Vorperian, Houri K.; Fourakis, Marios; Kent, Ray D.; Bolt, Daniel M.

    2014-01-01

    Purpose: This study examines accuracy and comparability of 4 trademarked acoustic analysis software packages (AASPs): Praat, WaveSurfer, TF32, and CSL by using synthesized and natural vowels. Features of AASPs are also described. Method: Synthesized and natural vowels were analyzed using each of the AASP's default settings to secure 9…

  13. Mathematical Description of Wafer-1, a Three-Dimensional Code for LWR Fuel Performance Analysis

    DEFF Research Database (Denmark)

    Kjær-Pedersen, Niels

    1975-01-01

    This article describes in detail the mathematical formulation used in the WAFER-1 code, which is presently used for three-dimensional analysis of LWR fuel pin performance. The code aims at a prediction of the local stress-strain history in the cladding, especially with regard to the ridging pheno...

  14. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sperling, M.; Shreve, D.C.

    1978-12-01

    The computer code DELPHI is an interactive English language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operating instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal, and finally a listing of the code.

  15. Simulation and template generation for LISA Pathfinder Data Analysis

    Science.gov (United States)

    Rais, Boutheina; Grynagier, Adrien; Diaz-Aguiló, Marc; Armano, Michele

    The LISA Pathfinder (LPF) mission is a technology demonstration mission which aims at testing a number of critical technical challenges that the future LISA (gravitational wave detection in space) mission will face: LPF can be seen as a complex laboratory experiment in space. It is therefore critical to be able to define which measurements and which actuations will be applied during the scientific part of the mission. The LISA Technology Package (LTP), part of ESA's hardware contribution to LPF, hence underlines the importance of developing an appropriate simulation tool to test these strategies before launch and to analyse the dynamical behaviour of the system during the mission. The detailed model of the simulation can be used in an off-line mode for further planning: correct estimation of timeline priorities, risk factors, duty cycles, and data analysis readiness. The LISA Technology Package Data Analysis (LTPDA) team has developed an object-oriented MATLAB toolbox for general data analysis needs. However, to meet the specific needs of the LPF mission, a template generation tool has been developed. It provides a recognizable data pattern, avoiding the risk of missing the model during the mission's analysis. The aim of the template generator tool is to provide tools to analyse the LTP system modelled as a State Space Model (SSM). The SSM class, the subject of this poster, includes these tools within the LTPDA toolbox. It can be used to generate the time-domain response for any given actuation and/or noise, the frequency response using Bode diagrams, and the steady state of the system. It allows the user to project noises onto system outputs to obtain output spectra for given input noise spectra. This class is sufficiently general to be used with a variety of systems once the SSM of the system is provided in the library. Furthermore, one of the main objectives of the data analysis for LPF (the estimation of different parameters of the system), can be achieved by a new

  16. Analytical considerations and dimensionless analysis for a description of particle interactions in high pressure processes

    Science.gov (United States)

    Rauh, Cornelia; Delgado, Antonio

    2010-12-01

    High pressures of up to several hundreds of MPa are utilized in a wide range of applications in chemical, bio-, and food engineering, aiming at selective control of (bio-)chemical reactions. Non-uniformity of process conditions may threaten the safety and quality of the resulting products because processing conditions such as pressure, temperature, and treatment history are crucial for the course of (bio-)chemical reactions. Therefore, thermofluid-dynamical phenomena during the high pressure process have to be examined, and numerical tools to predict process uniformity and to optimize the processes have to be developed. Recently applied mathematical models and numerical simulations of laboratory and industrial scale high pressure processes investigating the mentioned crucial phenomena are based on continuum balancing models of thermofluid dynamics. Nevertheless, biological systems are complex fluids containing the relevant (bio-)chemical compounds (enzymes and microorganisms). These compounds are particles that interact with the surrounding medium and between each other. This contribution deals with thermofluid-dynamical interactions of the relevant particulate (bio-)chemical compounds (enzymes and microorganisms) with the surrounding fluid. By consideration of characteristic time and length scales and particle forces, the motion of the (bio-)chemical compounds is characterized.
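
    The abstract characterizes particle motion through characteristic time and length scales and particle forces. A standard dimensionless group for such particle-fluid interactions is the Stokes number, the ratio of the particle relaxation time to the flow time scale; the sketch below uses hypothetical values for a micron-sized microorganism in a water-like fluid (the classical Stokes-drag estimate, offered as illustration rather than taken from the paper):

```python
def stokes_number(rho_p, d_p, mu, U, L):
    # St = tau_p / tau_f, with particle relaxation time
    # tau_p = rho_p * d_p^2 / (18 * mu) (Stokes drag) and
    # flow time scale tau_f = L / U.
    tau_p = rho_p * d_p ** 2 / (18.0 * mu)
    tau_f = L / U
    return tau_p / tau_f

# Hypothetical values: 1 um particle, water-like viscosity,
# 1 cm/s convection over a 10 cm vessel.
St = stokes_number(rho_p=1100.0, d_p=1e-6, mu=1e-3, U=0.01, L=0.1)
print(f"St = {St:.2e}")  # St << 1: the particle closely follows the fluid
```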

  17. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    Energy Technology Data Exchange (ETDEWEB)

    FORESTER,JOHN A.; BLEY,DENNIS C.; COOPER,SUSANE; KOLACZKOWSKI,ALAN M.; THOMPSON,CATHERINE; RAMEY-SMITH,ANN; WREATHALL,JOHN

    2000-07-18

    This paper describes the most recent version of a human reliability analysis (HRA) method called ``A Technique for Human Event Analysis'' (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided and critical and unique aspects of the revised method are discussed.

  18. Non-Archimedean mathematical analysis methods in description of deformation of structurally inhomogeneous geomaterials

    Science.gov (United States)

    Lavrikov, SV; Mikenina, OA; Revuzhenko, AF

    2017-02-01

    Under analysis is an approach to mathematical modeling of structurally inhomogeneous rocks considering structural hierarchy and internal self-balanced stresses. The fields of stresses and strains at various scale levels of the rock mass medium are characterized using non-Archimedean analysis methods. It is shown that such a model describes the accumulation of elastic energy in the form of internal self-balanced stresses on a micro-scale. A finite element algorithm and a computer program are developed to solve plane boundary-value problems. Calculated data on compression of a rock specimen are reported. The paper shows that the behavior of plastic strain zones largely depends on the pre-set initial micro-stresses.

  19. Description of practice as an ambulatory care nurse: psychometric properties of a practice-analysis survey.

    Science.gov (United States)

    Baghi, Heibatollah; Panniers, Teresa L; Smolenski, Mary C

    2007-01-01

    Changes within nursing demand that a specialty conduct periodic, appropriate practice analyses to continually validate itself against preset standards. This study explicates practice analysis methods using ambulatory care nursing as an exemplar. Data derived from a focus group technique were used to develop a survey that was completed by 499 ambulatory care nurses. The validity of the instrument was assessed using principal components analysis; reliability was estimated using Cronbach's alpha coefficient. The focus group with ambulatory care experts produced 34 knowledge and activity statements delineating ambulatory care nursing practice. The survey data produced five factors accounting for 71% of variance in the data. The factors were identified as initial patient assessment, professional nursing issues and standards, client care management skills, technical/clinical skills, and system administrative operations. It was concluded that practice analyses delineate a specialty and provide input for certification examinations aimed at measuring excellence in a field of nursing.
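
    The reliability estimate named in the abstract, Cronbach's alpha, is alpha = k/(k-1) * (1 - sum of item variances / variance of total scores) for k items. A minimal stdlib sketch with invented survey ratings (the real survey had 34 statements and 499 respondents):

```python
from statistics import variance

def cronbach_alpha(items):
    # items: one list of scores per survey item, with respondents in
    # the same order in every list.
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]
    item_var = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var / variance(totals))

# Invented responses from 5 nurses to 3 items on a 1-5 scale
items = [[4, 5, 3, 4, 5],
         [4, 4, 3, 5, 5],
         [3, 5, 2, 4, 4]]
print(round(cronbach_alpha(items), 2))
```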

  20. Description of a Portable Wireless Device for High-Frequency Body Temperature Acquisition and Analysis

    Science.gov (United States)

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

    We describe a device for dual-channel body temperature monitoring. The device can operate as a real-time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility. PMID:22408473
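
    The quoted figures (1 sample per minute per channel, two channels, up to 10 days of stand-alone logging) imply a storage requirement that is easy to work out; a sketch of the arithmetic:

```python
SAMPLES_PER_MINUTE = 1
CHANNELS = 2
LOG_DAYS = 10

# Samples accumulated over a full stand-alone logging period
samples_per_channel = SAMPLES_PER_MINUTE * 60 * 24 * LOG_DAYS
total_samples = samples_per_channel * CHANNELS
print(samples_per_channel, total_samples)  # 14400 28800
```

    So the internal memory must hold at least 28,800 readings (plus timestamps, or a known start time and fixed sampling interval).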

  1. Detailed description and user's manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high burnup fuel behavior in normal operation and power transient conditions. In the high burnup region, phenomena occur which are different in quality from those expected for the extension of behaviors in the mid-burnup region. To analyze these phenomena, EXBURN-I has been formed by the incorporation of such new models as pellet thermal conductivity change, burnup-dependent FP gas release rate, and cladding oxide layer growth to the basic structure of low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, models, and materials properties. Also, it includes a detailed input manual and sample output, etc. (author). 55 refs.

  2. A Computational Approach for Probabilistic Analysis of LS-DYNA Water Impact Simulations

    Science.gov (United States)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2010-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those addressed in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. Because of the computational cost, these tools are often used to evaluate specific conditions and rarely used for statistical analysis. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. For this problem, response surface models are used to predict the system time responses to a water landing as a function of capsule speed, direction, attitude, water speed, and water direction. Furthermore, these models can also be used to ascertain the adequacy of the design in terms of probability measures. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
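
    As a toy illustration of the response-surface idea (not the authors' actual model, which spans capsule speed, direction, attitude, water speed, and water direction), the sketch below fits a one-variable quadratic surface to a handful of invented "simulation" results and then interpolates cheaply at an untried condition:

```python
def fit_quadratic(xs, ys):
    # Least-squares fit of y = c0 + c1*x + c2*x^2 via the normal equations.
    S = [sum(x ** k for x in xs) for k in range(5)]
    A = [[S[0], S[1], S[2]],
         [S[1], S[2], S[3]],
         [S[2], S[3], S[4]]]
    b = [sum(y * x ** k for x, y in zip(xs, ys)) for k in range(3)]
    for i in range(3):  # Gauss-Jordan elimination
        pivot = A[i][i]
        A[i] = [v / pivot for v in A[i]]
        b[i] /= pivot
        for j in range(3):
            if j != i:
                f = A[j][i]
                A[j] = [vj - f * vi for vj, vi in zip(A[j], A[i])]
                b[j] -= f * b[i]
    return b  # [c0, c1, c2]

# Invented training data standing in for a few expensive LS-DYNA runs:
# peak load factor (g) versus capsule impact speed (m/s).
speeds = [6, 8, 10, 12, 14]
loads = [2.68, 3.72, 5.0, 6.52, 8.28]
c0, c1, c2 = fit_quadratic(speeds, loads)

def predict(x):
    return c0 + c1 * x + c2 * x * x

print(round(predict(11), 2))  # cheap surrogate prediction at an untried speed
```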

  3. Thermodynamic Analysis of Catalytic Cracking Reactions as the First Stage in the Development of Mathematical Description

    OpenAIRE

    Nazarova, Galina Yurievna; Ivanchina, Emilia Dmitrievna; Ivashkina, Elena Nikolaevna; Kiseleva, Svetlana; Stebeneva, Valeria

    2015-01-01

    In this work thermodynamic analysis of catalytic cracking reaction involving the high molecular weight hydrocarbons was carried out using quantum chemical method of calculation realized in Gaussian software. The method of calculation is DFT (Density Functional Theory), theoretical approximation is B3LYP model, 3-21G basis. The list of catalytic cracking reactions for calculation was prepared on the basis of the theoretical data about catalytic cracking, laboratory and experimental data from t...

  4. A Descriptive Analysis of Relationship between Extraversion-introversion and Behavior and Lifestyle Patterns among Engineering Students in India

    Directory of Open Access Journals (Sweden)

    Abhinaw Kumar Singh

    2014-05-01

    Full Text Available The present study focuses on a descriptive statistical analysis of the relationship between extraversion and introversion tendencies and various factors that can affect extraversion-introversion among engineering students in India. The factors included native place, age, grades in English, use of social networking websites, etc. The results indicated some interesting trends: for example, more participants from cities were extraverts, as compared to those from metros and villages. The younger generation and people from North India were mostly extraverted, while ageing makes people introverted with time. More of those with better marks in English were extraverts, and extraverts were, in general, able to speak a greater number of languages as well. Heavy use of Facebook for communication was found to be similar between extraverts and introverts. It was concluded that the study results can be used to gain a better understanding of the relationship between extraversion/introversion and the different variables used in the study.

  5. Self-care 3 months after attending chronic obstructive pulmonary disease patient education: a qualitative descriptive analysis

    DEFF Research Database (Denmark)

    Mousing, Camilla Askov; Lomborg, Kirsten

    2012-01-01

    Purpose: The authors performed a qualitative descriptive analysis to explore how group patient education influences the self-care of patients with chronic obstructive pulmonary disease. Patients and methods: In the period 2009–2010, eleven patients diagnosed with chronic obstructive pulmonary....... Talking to health care professionals focused the patients' attention on their newly acquired skills and the research interview made them more aware of their enhanced self-care. Conclusion: Patients' self-care may be enhanced through group education, even though the patients are not always able to see...... the immediate outcome. Some patients may require professional help to implement their newly acquired knowledge and skills in everyday life. A planned dialogue concentrating on self-care in everyday life 3 months after finishing the course may enhance patients' awareness and appraisal of their newly acquired...

  6. The application of the SXF lattice description and the UAL software environment to the analysis of the LHC

    CERN Document Server

    Fischer, W; Ptitsyn, V I

    1999-01-01

    A software environment for accelerator modeling has been developed which includes the UAL (Unified Accelerator Library), a collection of accelerator physics libraries with a Perl interface for scripting, and the SXF (Standard eX-change Format), a format for accelerator description which extends the MAD sequence by including deviations from design values. SXF interfaces have been written for several programs, including MAD9 and MAD8 via the doom database, Cosy, TevLat and UAL itself, which includes Teapot++. After an overview of the software we describe the application of the tools to the analysis of the LHC lattice stability, in the presence of alignment and coupling errors, and to the correction of the first turn and closed orbit in the machine. (7 refs).

  7. A Descriptive Analysis of Music Therapists' Perceptions of Delivering Services in Inclusive Settings: A Challenge to the Field.

    Science.gov (United States)

    Jones; Cardinal

    1999-01-01

    The purpose of this study was to examine the perceptions of music therapists toward inclusion (providing services within general education settings) and to determine their willingness to provide their services in these settings. A questionnaire was sent to 560 music therapists of which 373 responded (67%). A descriptive analysis indicated that although the vast majority of music therapists are providing their services in a segregated setting, they (a) overwhelmingly know about inclusion, (b) perceive benefits to clients with and without disabilities, and (c) are willing to provide their services within an inclusive setting. Why then do therapists so overwhelmingly provide their services in noninclusive settings? Possible answers to this question as well as the challenge this creates to the field of music therapy are discussed.

  8. Self-care 3 months after attending chronic obstructive pulmonary disease patient education: a qualitative descriptive analysis

    DEFF Research Database (Denmark)

    Mousing, Camilla A; Lomborg, Kirsten

    2012-01-01

    Purpose: The authors performed a qualitative descriptive analysis to explore how group patient education influences the self-care of patients with chronic obstructive pulmonary disease. Patients and methods: In the period 2009–2010, eleven patients diagnosed with chronic obstructive pulmonary....... Talking to health care professionals focused the patients' attention on their newly acquired skills and the research interview made them more aware of their enhanced self-care. Conclusion: Patients' self-care may be enhanced through group education, even though the patients are not always able to see...... disease completed an 8-week group education program in a Danish community health center. The patients were interviewed 3 months after completion of the program. Findings: Patients reported that their knowledge of chronic obstructive pulmonary disease had increased, that they had acquired tools to handle...

  9. Description, field test and data analysis of a controlled-source EM system (EM-60). [Leach Hot Springs, Grass Valley

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, H.F.; Goldstein, N.E.; Hoversten, M.; Oppliger, G.; Riveros, C.

    1978-10-01

    The three sections describe the transmitter, the receiver, and data interpretations and indicate the advances made toward the development of a large moment electromagnetic (EM) system employing a magnetic dipole source. A brief description is given of the EM-60 transmitter, its general design, and the consideration involved in the selection of a practical coil size and weight for routine field operations. A programmable, multichannel, multi-frequency, phase-sensitive receiver is described. A field test of the EM-60, the data analysis and interpretation procedures, and a comparison between the survey results and the results obtained using other electrical techniques are presented. The Leach Hot Springs area in Grass Valley, Pershing County, Nevada, was chosen for the first field site at which the entire system would be tested. The field tests showed the system capable of obtaining well-defined sounding curves (amplitude and phase of magnetic fields) from 1 kHz down to 0.1 Hz. (MHR)

  10. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    Science.gov (United States)

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
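
    A minimal stdlib sketch of the bootstrap construction described above: resample each prey type's signature library, average, mix by a known diet, and renormalize. The three-fatty-acid libraries and the diet are invented for illustration; real signatures contain dozens of fatty acids.

```python
import random

def pseudo_predator(prey_sigs, diet, n_boot, rng):
    # prey_sigs: prey type -> list of signatures (proportions summing to 1)
    # diet:      prey type -> known diet proportion
    # n_boot:    bootstrap sample size per prey type
    n_fa = len(next(iter(prey_sigs.values()))[0])
    sig = [0.0] * n_fa
    for prey, prop in diet.items():
        boot = [rng.choice(prey_sigs[prey]) for _ in range(n_boot)]
        mean = [sum(s[i] for s in boot) / n_boot for i in range(n_fa)]
        for i in range(n_fa):
            sig[i] += prop * mean[i]
    total = sum(sig)  # renormalize so the pseudo-signature sums to 1
    return [v / total for v in sig]

rng = random.Random(42)
prey_sigs = {"seal": [[0.50, 0.30, 0.20], [0.60, 0.25, 0.15]],
             "fish": [[0.20, 0.50, 0.30], [0.25, 0.45, 0.30]]}
diet = {"seal": 0.7, "fish": 0.3}
sig = pseudo_predator(prey_sigs, diet, n_boot=100, rng=rng)
print([round(v, 3) for v in sig])
```

    The `n_boot` argument is exactly the bootstrap sample size that, as the abstract argues, has usually been chosen arbitrarily and should instead be set by an objective algorithm.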

  11. An advanced machining simulation environment employing workpiece structural analysis

    Directory of Open Access Journals (Sweden)

    A.A. Becker

    2006-04-01

    Full Text Available Purpose: The study aims to reduce the surface dimensional error due to part deflection during the machining of thin wall structures, and thus reduce machining costs and lead times by producing “right first time” components. Design/methodology/approach: The proposed simulation environment involves a data model, an analytical force prediction model, a material removal model and an FE analysis commercial software package. It focuses on the development of the simulation environment with a multi-level machining error compensation approach. Findings: The developed simulation environment can predict and reduce the form error, which is a limitation of the existing approaches. Research limitations/implications: The energy consumption, temperature change and residual stress are not studied in this research. Practical implications: The developed method provides a platform to deliver new functionality for machining process simulation. The convergence of the proposed integrated system can be achieved quickly after only a few iterations, which makes the methodology reliable and efficient. Originality/value: The study offers an opportunity to satisfy tight tolerances, eliminate hand-finishing processes and assure part-to-part accuracy right first time, which is a limitation of previous approaches.

  12. Numerical Zooming Between a NPSS Engine System Simulation and a One-Dimensional High Compressor Analysis Code

    Science.gov (United States)

    Follen, Gregory; auBuchon, M.

    2000-01-01

    Within NASA's High Performance Computing and Communication (HPCC) program, NASA Glenn Research Center is developing an environment for the analysis/design of aircraft engines called the Numerical Propulsion System Simulation (NPSS). NPSS focuses on the integration of multiple disciplines such as aerodynamics, structures, and heat transfer along with the concept of numerical zooming from zero-dimensional to one-, two-, and three-dimensional component engine codes. In addition, the NPSS is refining the computing and communication technologies necessary to capture complex physical processes in a timely and cost-effective manner. The vision for NPSS is to create a "numerical test cell" enabling full engine simulations overnight on cost-effective computing platforms. Of the different technology areas that contribute to the development of the NPSS Environment, the subject of this paper is a discussion on numerical zooming between an NPSS engine simulation and higher fidelity representations of the engine components (fan, compressor, burner, turbines, etc.). What follows is a description of successfully zooming one-dimensional (row-by-row) high-pressure compressor analysis results back to a zero-dimensional NPSS engine simulation and a discussion of the results illustrated using an advanced data visualization tool. This type of high fidelity system-level analysis, made possible by the zooming capability of the NPSS, will greatly improve the capability of the engine system simulation and increase the level of virtual testing conducted prior to committing the design to hardware.

  13. Cross-cultural comparisons among the sensory characteristics of fermented soybean using Korean and Japanese descriptive analysis panels.

    Science.gov (United States)

    Chung, L; Chung, S-J

    2007-11-01

    One of the most important initial steps in exporting a food product to another country from the R&D perspective is to describe and translate the sensory characteristics of the product appropriately into the language of the target country. The objectives of this study were to describe and compare the sensory characteristics of Korean and Japanese style fermented soybean products, and to cross-culturally compare the lexicons for identical products generated by the Korean and Japanese panelists. Four types of Korean and four types of Japanese style fermented soybean, consisting of whole-bean and paste types, were analyzed. Ten Korean and nine Japanese panelists were recruited in Korea. Two separate descriptive analyses were conducted, with the panelists differing in their country of origin. Each group was trained, developed a lexicon, and conducted descriptive analysis independently. Analysis of variance and various multivariate analyses were applied to delineate the sensory characteristics of the samples and to compare the cross-cultural differences in the usage of the lexicons. The Korean and Japanese panelists generated 48 and 36 sensory attributes, respectively. Cross-cultural consensus was shown for evaluating the whole-bean type fermented soybean and white miso, which were relatively distinctive samples. However, for the less distinctive samples, the panelists tended to rate negative attributes higher for the fermented soybeans that originated from the other country. The Japanese panelists grouped the samples by their country of origin, and soy sauce flavor was the main attribute for cross-cultural differentiation. However, the Korean panelists did not make a cross-cultural distinction among the samples.
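
    The analysis of variance mentioned in the abstract reduces, per attribute, to a one-way F statistic across samples. A stdlib sketch with invented attribute ratings for three fermented-soybean samples:

```python
from statistics import mean

def one_way_anova_F(groups):
    # F = (between-group mean square) / (within-group mean square)
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = mean([x for g in groups for x in g])
    ss_between = sum(len(g) * (mean(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Invented "soy sauce flavor" ratings (4 panelists) for three samples
ratings = [[6.0, 6.5, 7.0, 6.2],
           [4.1, 3.8, 4.5, 4.0],
           [6.8, 7.2, 6.9, 7.4]]
print(round(one_way_anova_F(ratings), 1))
```

    A large F relative to the critical value of the F distribution indicates that the samples genuinely differ on that attribute.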

  14. Spatially continuous approach to the description of incoherencies in fast reactor accident analysis

    Energy Technology Data Exchange (ETDEWEB)

    Luck, L B

    1976-12-01

    A generalized cell-type approach is developed in which individual subassemblies are represented as a unit. By appropriate characterization of the results of separate detailed investigations, spatial variations within a cell are represented as a superposition. The advantage of this approach is that costly detailed cell-type information is generated only once, or a very few times. Spatial information obtained by the cell treatment is condensed in order to drastically reduce the transient computation time. Approximate treatments of transient phenomena are developed based on the use of distributions of volume and reactivity worth with temperature and other reactor parameters. Incoherencies during a transient depend physically on the detailed variations in the initial state; therefore, stationary volumetric distributions, which contain in condensed form the detailed initial incoherency information, provide a proper basis for the transient treatment. Approximate transient volumetric distributions are generated by a suitable transformation of the stationary distribution to reflect changes in the transient temperature field. Evaluation of transient changes is based on the results of conventional uniform-channel calculations and a superposition of lateral variations derived from prior cell investigations. Specific formulations are developed for the treatment of reactivity feedback. Doppler and sodium-expansion reactivity feedback is related to condensed temperature-worth distributions. Transient evaluation of the worth distribution is based on the relation between stationary and transient volumetric distributions, which contains the condensed temperature-field information. Coolant voiding is treated similarly with the proper distribution information. Results show that the treatments developed for the transient phase up to and including sodium boiling constitute a fast and effective simulation of inter- and intra-subassembly incoherency effects.

  15. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01

    Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

  16. Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems.

    Energy Technology Data Exchange (ETDEWEB)

    Burton, David P.; Van Leeuwen, Brian P.; McDonald, Michael James; Onunkwo, Uzoma A.; Tarman, Thomas David; Urias, Vincent E.

    2009-09-01

    This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information systems' security, reliability, and resilience against cyber attack. Today's security analyses utilize real systems such as computers, network routers, and other network equipment, computer emulations (e.g., virtual machines), and simulation models separately to analyze the interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches into integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments that pass network traffic and perform, from the outside, like real networks. This provides higher-fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is the rapid production of large yet relatively low-cost multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

  17. Fluid Flow Simulation and Energetic Analysis of Anomalocarididae Locomotion

    Science.gov (United States)

    Mikel-Stites, Maxwell; Staples, Anne

    2014-11-01

    While an abundance of animal locomotion simulations have been performed modeling the motions of living arthropods and aquatic animals, little quantitative simulation and reconstruction of gait parameters has been done to model the locomotion of extinct animals, many of which bear little physical resemblance to their modern descendants. To that end, this project seeks to analyze potential swimming patterns used by the anomalocaridid family (specifically Anomalocaris canadensis, a Cambrian-era aquatic predator) and to determine the most probable modes of movement. This will serve either to verify or to cast into question the currently assumed movement patterns and properties of these animals, and to create a bridge between similar flexible-bodied swimmers and their robotic counterparts. This will be accomplished by particle-based fluid flow simulations of the flow around the fins of the animal, as well as an energy analysis of a variety of sample gaits. The energy analysis will then be compared with the extant information on speed/energy-use curves in an attempt to determine which modes of swimming were most energy efficient over a given range of speeds. These results will provide a better understanding of how these long-extinct animals moved, possibly allowing an improved understanding of their behavioral patterns, and may also lead to a novel platform for bio-inspired autonomous underwater vehicles (AUVs).

  18. Automated analysis for detecting beams in laser wakefield simulations

    Energy Technology Data Exchange (ETDEWEB)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-07-03

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates large datasets that require time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with a high density of particles in the space-time domain, based on maximum-extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram, organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
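    The first step of the pipeline, detecting high-density regions as extrema of the particle distribution, can be sketched in one dimension. This is an illustrative stand-in, not the authors' code: the function name, binning scheme, threshold, and synthetic data are all hypothetical.

```python
import random
from collections import Counter

def density_maxima(positions, bin_width=1.0, threshold=5):
    """Bin particle positions and return bin centers that are local
    density maxima above a count threshold (a 1-D stand-in for the
    maximum-extremum detection on the particle distribution)."""
    counts = Counter(int(x // bin_width) for x in positions)
    maxima = []
    for b, c in counts.items():
        # Keep bins that beat both neighbors and exceed the threshold.
        if c >= threshold and c >= counts.get(b - 1, 0) and c >= counts.get(b + 1, 0):
            maxima.append((b + 0.5) * bin_width)
    return sorted(maxima)

random.seed(1)
# Synthetic "beam": a dense clump near x = 10 on top of a sparse background.
positions = [random.gauss(10.0, 0.3) for _ in range(200)] + \
            [random.uniform(0.0, 50.0) for _ in range(50)]
peaks = density_maxima(positions, bin_width=1.0, threshold=20)
```

    On this synthetic input, the detector isolates the clump near x = 10 while the uniform background stays below threshold; the real framework then tracks such extrema over time in a minimum spanning tree.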

  19. SPATIAL CORRELATION DESCRIPTION OF DEFORMATION OBJECT BASED ON FUZZY CLUSTERING AND GEOLOGICAL ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    The methods of deformation analysis and modeling at single point are realized easily now,but available approaches do not make full use of the information from monitoring points and can not reveal integrated deformation regularity of a deformable body.This paper presents a fuzzy clusetering method to analyze the correlative relations of multiple points in space,and then the spatial model for a practical dangerous rockmass in the area of Three Gorges,Yangtze River is established,in which the correlation of six points in space is analyzed by geological investigation and fuzzy set theory.

  20. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O`Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  1. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e., nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. It is therefore of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g., temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields (e.g., temperature) as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of meta-data that is orders of magnitude smaller than the original simulation data. This meta-data is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as cumulative distribution functions (CDFs), histograms, or time series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data. We highlight the utility of this new framework for combustion science.
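    A toy, one-dimensional version of thresholded feature extraction with per-feature statistics might look as follows; the real framework uses merge trees over 3-D scalar fields, and all names and data below are hypothetical.

```python
def features_above(field, threshold):
    """Segment a 1-D scalar field into connected regions above a
    threshold and return per-feature statistics (size, mean value),
    a toy analogue of merge-tree-based feature analysis."""
    feats, current = [], []
    for v in field:
        if v > threshold:
            current.append(v)
        elif current:
            # Region ended: record its size and mean.
            feats.append((len(current), sum(current) / len(current)))
            current = []
    if current:
        feats.append((len(current), sum(current) / len(current)))
    return feats

field = [0, 1, 5, 6, 2, 0, 0, 7, 8, 9, 0]
feats = features_above(field, threshold=4)
```

    Because the threshold is a parameter, the same pass supports the "arbitrary thresholds" style of interactive analysis the abstract describes.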

  2. A descriptive analysis of potential reinforcement contingencies in the preschool classroom.

    Science.gov (United States)

    McKerchar, Paige M; Thompson, Rachel H

    2004-01-01

    In recent years, functional analysis methods have been extended to classroom settings; however, research has not evaluated the extent to which consequences presented during functional analysis are associated with problem behavior under naturalistic classroom conditions. Therefore, the purpose of this study was to determine whether the social consequences commonly manipulated in functional analyses occur in typical preschool classrooms. A total of 14 children attending preschool programs participated in the study. Data were collected on the occurrence of antecedent events (e.g., presentation of tasks), child behaviors (e.g., aggression), and teacher responses (e.g., delivery of attention). The probability of various teacher responses given child behavior was then calculated and compared to the response-independent probabilities of teacher responses. Attention was found to be the most common classroom consequence (100% of children), followed by material presentation (79% of children), and escape from instructional tasks (33% of children). Comparisons of conditional and response-independent probabilities showed that the probability of teacher attention increased given the occurrence of problem behavior for all children, suggesting that a contingency existed between these two events. Results suggest that functional analyses that test the effects of attention, escape, and access to materials on problem behavior may be appropriate for preschool settings.
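    The core comparison, the conditional probability of teacher attention given problem behavior versus its response-independent (base) probability, can be sketched as follows. The function and the interval records are hypothetical illustrations, not the study's data.

```python
def contingency(events):
    """events: list of (child_behavior, teacher_attention) booleans, one
    per observation interval. Returns (conditional, base) probabilities
    of teacher attention, mirroring the comparison of conditional vs
    response-independent probabilities."""
    n = len(events)
    base = sum(a for _, a in events) / n
    behav = [(b, a) for b, a in events if b]
    cond = sum(a for _, a in behav) / len(behav)
    return cond, base

# Hypothetical records: problem behavior occurred in 4 of 20 intervals,
# attention followed in 3 of those; attention occurred in 5 intervals overall.
obs = [(True, True)] * 3 + [(True, False)] * 1 + \
      [(False, True)] * 2 + [(False, False)] * 14
cond, base = contingency(obs)
```

    Here cond (0.75) exceeds base (0.25), the pattern the study interprets as a contingency between problem behavior and attention.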

  3. 3-D description of fracture surfaces and stress-sensitivity analysis for naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, S.Q.; Jioa, D.; Meng, Y.F.; Fan, Y.

    1997-08-01

    Three kinds of reservoir cores (limestone, sandstone, and shale with natural fractures) were used to study the effect of the morphology of fracture surfaces on stress sensitivity. The cores, obtained from reservoirs at depths of 2170 to 2300 m, have fractures that are mated on a large scale but unmated on a fine scale. A specially designed computer-controlled photoelectric scanner was used to describe the topography of the fracture surfaces. Theoretical analysis of the fracture closure was then carried out based on the fracture topography generated. The scanning results show that the asperity heights have an almost normal distribution for all three types of samples. For the tested samples, the fracture closure predicted by elastic-contact theory differs from the laboratory measurements because plastic deformation of the asperities plays an important role over the tested range of normal stresses. In this work, the traditionally used elastic-contact theory has been modified to better predict the stress sensitivity of reservoir fractures. Analysis shows that the standard deviation of the probability density function of the asperity distribution has a great effect on the fracture closure rate.
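    Under the Normal asperity-height assumption reported by the scans, the fraction of asperities in contact at a given fracture gap follows from the Normal tail probability. A minimal sketch, illustrative only (the paper goes further and adds plastic deformation to the elastic-contact theory; all parameter values are arbitrary):

```python
import math

def contact_fraction(gap, mu, sigma):
    """Fraction of Normally distributed asperity heights exceeding the
    current fracture gap, i.e., the asperities in contact. Uses the
    Normal tail probability via the complementary error function."""
    z = (gap - mu) / (sigma * math.sqrt(2.0))
    return 0.5 * math.erfc(z)

# As the fracture closes (gap shrinks), more asperities make contact.
f_open = contact_fraction(gap=3.0, mu=1.0, sigma=0.5)
f_closed = contact_fraction(gap=1.0, mu=1.0, sigma=0.5)
```

    Note how sigma, the standard deviation of the asperity distribution, controls how quickly the contact fraction (and hence stiffness) rises during closure, consistent with the abstract's final observation.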

  4. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration, and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation, and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
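    The ensemble idea, running the same rainfall through many parameter samples to expose parameter uncertainty, can be sketched with a one-parameter toy reservoir in place of HBV. All names, the model form, and the parameter range below are hypothetical.

```python
import random
import statistics

def toy_runoff(precip, k):
    """One-parameter linear-reservoir runoff model (a stand-in for HBV):
    storage drains at fraction k per time step."""
    s, flows = 0.0, []
    for p in precip:
        s += p
        q = k * s
        s -= q
        flows.append(q)
    return flows

random.seed(0)
precip = [random.uniform(0, 10) for _ in range(30)]
# Ensemble: sample the recession parameter k from its assumed range.
ensemble = [toy_runoff(precip, random.uniform(0.1, 0.5)) for _ in range(100)]
final_flows = [run[-1] for run in ensemble]
spread = statistics.stdev(final_flows)  # parameter uncertainty -> flow uncertainty
```

    The spread of the ensemble at each time step is exactly the kind of uncertainty band the toolbox lets students visualize.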

  5. Comments on "Isentropic Analysis of a Simulated Hurricane"

    CERN Document Server

    Marquet, Pascal

    2016-01-01

    This paper describes comments on the paper of Mrowiec et al. published in the J. Atmos. Sci. in May 2016 (Vol. 73, Issue 5, pages 1857-1870) and entitled "Isentropic analysis of a simulated hurricane". It is explained that the plotting of isentropic surfaces (namely, the isentropes) requires a precise definition of the specific moist-air entropy, and that most existing "equivalent potential temperatures" lead to inaccurate definitions of isentropes. It is shown that the use of the third law of thermodynamics leads to a definition of the specific moist-air entropy (and of a corresponding potential temperature) which allows the plotting of unambiguous moist-air isentropes. Numerical applications are shown using a numerical simulation of the hurricane DUMILE.

  6. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2012-06-01

    Full Text Available This paper presents a hands-on modeling toolbox, HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices, homework assignments, and assessment of students' understanding of hydrological processes. Using this model, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration, and runoff generation) are interconnected. The model includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used not only for simulating hydrological processes but also for teaching uncertainty analysis, parameter estimation, ensemble simulation, and model sensitivity.

  7. Simulation of probabilistic wind loads and building analysis

    Science.gov (United States)

    Shah, Ashwin R.; Chamis, Christos C.

    1991-01-01

    Probabilistic wind loads likely to occur on a structure during its design life are predicted. Described here is a suitable multifactor interactive equation (MFIE) model and its use in the Composite Load Spectra (CLS) computer program to simulate the wind-pressure cumulative distribution functions on the four sides of a building. The simulated probabilistic wind-pressure load was applied to a building frame, and cumulative distribution functions of sway displacements and reliability against overturning were obtained using NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), a stochastic finite element computer code. The geometry of the building and the properties of the building members were also treated as random in the NESSUS analysis. The uncertainties of wind pressure, building geometry, and member section properties were quantified in terms of their respective sensitivities on the structural response.
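    Simulating a wind-pressure cumulative distribution function can be sketched with a simple Monte Carlo stand-in; the actual CLS multifactor model is more elaborate, and the wind-speed distribution, density, and pressure coefficient below are assumed for illustration.

```python
import random

def wind_pressure_samples(n, v_mean=30.0, v_std=5.0, rho=1.225, cp=0.8):
    """Monte Carlo samples of wind pressure p = 0.5 * rho * Cp * V^2
    with a Gaussian wind-speed model (illustrative stand-in for the
    CLS multifactor interactive equation model)."""
    ps = []
    for _ in range(n):
        v = max(0.0, random.gauss(v_mean, v_std))
        ps.append(0.5 * rho * cp * v * v)
    return sorted(ps)

def empirical_cdf(samples, x):
    """Fraction of sorted samples at or below x."""
    return sum(s <= x for s in samples) / len(samples)

random.seed(42)
samples = wind_pressure_samples(10000)
median_p = samples[len(samples) // 2]
```

    The sorted samples are the empirical CDF of pressure; in the paper's workflow, such distributions feed the NESSUS stochastic finite element analysis.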

  8. Theoretical analysis and simulation of thermoelastic deformation of bimorph microbeams

    Institute of Scientific and Technical Information of China (English)

    SHANG; YuanFang; YE; XiongYing; FENG; JinYang

    2013-01-01

    In this paper, a purely mechanical model for the thermoelastic behavior of a bimorph microbeam is presented. The thermoelastic coupling problem of the microbeam is converted to a mechanical problem by simply replacing the thermal stress in the beam with a bulk force and a surface force. The thermoelastic deformation of bimorph microbeams with constraints frequently used in micro-electro-mechanical systems (MEMS) devices has been derived based on this model and is characterized by FEA simulation. The coincidence of the results from theory and simulation demonstrates the validity of the model. The analysis shows that a bimorph microbeam with a soft constraint and a uniform temperature field has a larger thermoelastic deformation than one with a hard constraint and a linear temperature field. In addition to the adoption of materials with a large CTE mismatch, the thickness ratio and length ratio of the two layers need to be optimized to obtain a large thermoelastic deformation.
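    For intuition, the classical Timoshenko bimetal result for the special case of equal layer thicknesses and equal moduli gives a curvature kappa = 3 * d_alpha * d_T / (2h). This textbook special case is not the paper's full constrained model, and the MEMS numbers below are hypothetical.

```python
def bimorph_curvature(d_alpha, d_T, h):
    """Timoshenko curvature of a bimorph with equal layer thicknesses
    and equal Young's moduli (simplified classical special case):
    kappa = 3 * d_alpha * d_T / (2 * h), h = total beam thickness."""
    return 3.0 * d_alpha * d_T / (2.0 * h)

def tip_deflection(kappa, length):
    """Small-deflection tip displacement of a cantilever bent to a
    uniform curvature: delta = kappa * L**2 / 2."""
    return kappa * length ** 2 / 2.0

# Hypothetical MEMS numbers: 10 ppm/K CTE mismatch, 50 K rise, 2 um thick.
kappa = bimorph_curvature(d_alpha=10e-6, d_T=50.0, h=2e-6)
delta = tip_deflection(kappa, length=200e-6)  # 200 um long cantilever
```

    The quadratic dependence on length and inverse dependence on thickness show why the layer geometry ratios matter as much as the CTE mismatch.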

  9. Bed capacity and surgical waiting lists: a simulation analysis

    Directory of Open Access Journals (Sweden)

    Manel Antelo

    2015-12-01

    Full Text Available Waiting time for elective surgery is a key problem in the current medical world. This paper aims to reproduce, by a Monte Carlo simulation model, the relationship between hospital capacity, inpatient activity, and surgery waiting-list size in teaching hospitals. Inpatient activity is simulated by fitting a Normal distribution to real inpatient activity data, and the effect of the number of beds on inpatient activity is modelled with a linear regression model. The analysis is performed with data from the University Multi-Hospital Complex of Santiago de Compostela (Santiago de Compostela, Spain), considering two scenarios for the elasticity of demand with respect to bed increases. If demand does not grow with an increase in bed capacity, small changes lead to drastic reductions in the waiting lists. However, if demand grows as bed capacity does, adding capacity merely makes the waiting lists worse.
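    The two demand scenarios can be sketched with a toy Monte Carlo waiting-list model. The distributions, the linear bed-activity relation, and every parameter value below are hypothetical, not the paper's fitted ones.

```python
import random

def simulate_waiting_list(beds, weeks=520, demand_growth=0.0, seed=7):
    """Toy Monte Carlo of a surgical waiting list: weekly additions are
    Normal, weekly operations scale linearly with bed capacity
    (illustrative stand-ins for the paper's fitted Normal distribution
    and linear regression)."""
    rng = random.Random(seed)
    waiting = 500.0
    demand = 100.0 + demand_growth * beds   # elastic if demand_growth > 0
    capacity = 0.9 * beds                   # hypothetical linear activity model
    for _ in range(weeks):
        additions = max(0.0, rng.gauss(demand, 10.0))
        operations = min(waiting + additions,
                         max(0.0, rng.gauss(capacity, 8.0)))
        waiting = max(0.0, waiting + additions - operations)
    return waiting

inelastic = simulate_waiting_list(beds=120, demand_growth=0.0)
elastic = simulate_waiting_list(beds=120, demand_growth=0.2)
```

    With inelastic demand the list drains toward zero; with elastic demand the same capacity is outpaced and the list grows, reproducing the qualitative contrast in the abstract.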

  10. Modelling, Simulation, and Analysis of HAL Bangalore International Airport

    Directory of Open Access Journals (Sweden)

    P. Lathasree

    2007-11-01

    Full Text Available Air traffic density in India, and in the world at large, is growing fast and posing challenging problems. These problems can be parameterized as flight delay, workload of air traffic controllers, and noise levels in and around aerodromes. Prediction and quantification of these parameters aid in developing strategies for efficient air traffic management. In this study, quantification is carried out by simulation and analysis of the selected aerodrome and airspace. This paper presents the results of a simulation of HAL Bangalore International Airport, which is used by civil as well as military aircraft. With the test flying of unscheduled military aircraft and the increase in civil air traffic, this airport is reaching the limit of acceptable delay. The workload on air traffic controllers is pushed high during peak times. The noise contour prediction, especially for the test-flying military aircraft, is sounding a wake-up call to the communities living in the vicinity of the airport.

  11. Contact Thermal Analysis and Wear Simulation of a Brake Block

    Directory of Open Access Journals (Sweden)

    Nándor Békési

    2013-01-01

    Full Text Available The present paper describes an experimental test and a coupled contact-thermal-wear analysis of a railway wheel/brake block system through the braking process. During the test, the friction, the generated heat, and the wear were evaluated. It was found that the contact between the brake block and the wheel occurs in relatively small and slowly moving hot spots, caused by the wear and the thermal effects. A coupled simulation method was developed including numerical frictional contact, transient thermal and incremental wear calculations. In the 3D simulation, the effects of the friction, the thermal expansion, the wear, and the temperature-dependent material properties were also considered. A good agreement was found between the results of the test and the calculations, both for the thermal and wear results. The proposed method is suitable for modelling the slowly oscillating wear caused by the thermal expansions in the contact area.

  12. Swiftvis: Data Analysis And Visualization For Planetary Science Simulations

    Science.gov (United States)

    Lewis, Mark C.; Levison, H. F.; Kavanagh, G.

    2007-07-01

    SwiftVis is a tool originally developed as part of a rewrite of Swift to be used for analysis and plotting of simulations performed with Swift and Swifter. The extensibility built into the design has allowed us to make SwiftVis a general-purpose analysis and plotting package customized to be usable by the planetary science community at large. SwiftVis is written in Java and has been tested on Windows, Linux, and Mac platforms. Its graphical interface allows users to do complex analysis and plotting without having to write custom code. The software package and a tutorial can be found at http://www.cs.trinity.edu/~mlewis/SwiftVis/.

  13. The Communication Link and Error ANalysis (CLEAN) simulator

    Science.gov (United States)

    Ebel, William J.; Ingels, Frank M.; Crowe, Shane

    1993-01-01

    During the period July 1, 1993 through December 30, 1993, significant developments to the Communication Link and Error ANalysis (CLEAN) simulator were completed, including: (1) soft-decision Viterbi decoding; (2) node synchronization for the soft-decision Viterbi decoder; (3) insertion/deletion error programs; (4) a convolutional encoder; (5) programs to investigate new convolutional codes; (6) a pseudo-noise sequence generator; (7) a soft-decision data generator; (8) RICE compression/decompression (integration of RICE code generated by Pen-Shu Yeh at Goddard Space Flight Center); (9) Markov chain channel modeling; (10) a percent-complete indicator shown during program execution; (11) header documentation; and (12) a help utility. The CLEAN simulation tool is now capable of simulating a very wide variety of satellite communication links, including the TDRSS downlink with RFI. The RICE compression/decompression schemes allow studies to be performed on error effects on RICE-decompressed data. The Markov chain modeling programs allow channels with memory to be simulated. Memory results from filtering, forward error correction encoding/decoding, differential encoding/decoding, channel RFI, nonlinear transponders, and many other satellite system processes. Besides the development of the simulator, a study was performed to determine whether the PCI provides a performance improvement for the TDRSS downlink. RFI with several duty cycles exists on the TDRSS downlink. We conclude that the PCI does not improve performance for any of these interferers except possibly one which occurs for the TDRS East. Therefore, the usefulness of the PCI is a function of the time spent transmitting data to the WSGT through the TDRS East transponder.
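    A standard way to give a simulated channel memory, of the kind CLEAN's Markov chain programs provide, is a two-state Gilbert-Elliott model: a "good" state with rare bit errors and a "bad" state with frequent ones. The transition and error probabilities below are illustrative, not CLEAN's.

```python
import random

def gilbert_elliott(n_bits, p_gb=0.01, p_bg=0.2, e_good=1e-4, e_bad=0.2, seed=3):
    """Two-state Markov (Gilbert-Elliott) burst-error channel: from the
    good state, enter the bad state with probability p_gb; from the bad
    state, return to good with probability p_bg. Each bit is flipped
    with the error probability of the current state."""
    rng = random.Random(seed)
    bad = False
    errors = []
    for _ in range(n_bits):
        if not bad:
            bad = rng.random() < p_gb           # good -> bad transition
        else:
            bad = rng.random() >= p_bg          # stay bad with prob 1 - p_bg
        errors.append(rng.random() < (e_bad if bad else e_good))
    return errors

errs = gilbert_elliott(100000)
error_rate = sum(errs) / len(errs)
```

    The state machine makes errors arrive in bursts rather than independently, which is exactly the channel memory that filtering, RFI, and nonlinear transponders introduce on a real link.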

  14. Portable microcomputer for the analysis of plutonium gamma-ray spectra. Volume II. Software description and listings. [IAEAPU

    Energy Technology Data Exchange (ETDEWEB)

    Ruhter, W.D.

    1984-05-01

    A portable microcomputer has been developed and programmed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra. The unit includes a 16-bit LSI-11/2 microprocessor, 32-K words of memory, a 20-character display for user prompting, a numeric keyboard for user responses, and a 20-character thermal printer for hard-copy output of results. The unit weighs 11 kg and has dimensions of 33.5 x 30.5 x 23.0 cm. This compactness allows the unit to be stored under an airline seat. Only the positions of the 148-keV ²⁴¹Pu and 208-keV ²³⁷U peaks are required for the spectral analysis that gives plutonium isotopic ratios and weight-percent abundances. Volume I of this report provides a detailed description of the data analysis methodology, operating instructions, hardware, and maintenance and troubleshooting. Volume II describes the software and provides software listings.

  15. Disability in Mexico: a comparative analysis between descriptive models and historical periods using a timeline

    Directory of Open Access Journals (Sweden)

    Hugo Sandoval

    2017-07-01

    Full Text Available Some interpretations frequently argue that the three Disability Models (DM: Charity, Medical/Rehabilitation, and Social) correspond to historical periods in terms of chronological succession. These views permeate, a priori, the major official documents on the subject in Mexico. This paper tests whether this association is plausible by applying a timeline method. A document search was made, with inclusion and exclusion criteria, in databases to select representative studies with which to depict milestones in the timelines for each period. The following is demonstrated: 1) the models should be considered categories of analysis and not historical periods, in that elements of all three models remain prevalent to date; and 2) the association between disability models and historical periods results in teleological interpretations of the history of disability in Mexico.

  16. Late onset autosomal dominant cerebellar ataxia: a family description and linkage analysis with the HLA system

    Directory of Open Access Journals (Sweden)

    Walter O. Arruda

    1991-09-01

    Full Text Available A family with an autosomal dominant form of late-onset hereditary cerebellar ataxia is described. Eight affected family members were studied personally, and data on another four were obtained through anamnesis. The mean age of onset was 37.1±5.4 years (range 27-47 years). The clinical picture consisted basically of a pure cerebellar ataxic syndrome. CT scans disclosed diffuse cerebellar atrophy with relative sparing of the brainstem and no involvement of supratentorial structures. Neurophysiological studies (nerve conduction, VEP, and BAEP) were normal. Twenty-six individuals were typed for HLA histocompatibility antigens. Lod scores were calculated with the computer program LINKMAP. Close linkage of the ataxia gene with the HLA system in this family could be excluded (θ = 0.02, z = −2.17), and the overall analysis of the lod scores suggests a chromosomal location other than chromosome 6.

  17. A descriptive analysis of 10 years of research published in the Journal of Health Communication.

    Science.gov (United States)

    Freimuth, Vicki S; Massett, Holly A; Meltzer, Wendy

    2006-01-01

    This article describes the contents of the articles from the first decade of the Journal of Health Communication (JOHC). Three hundred and twenty-one published articles were reviewed and coded to determine the characteristics of the researchers, the types of research presented, the common health topics covered, and the research designs used. The results led to the following profile of a typical article. Its primary author is a U.S. academic. It probably focuses on smoking, HIV/AIDS, or cancer. It is an empirical research study, more likely to use quantitative methods (specifically surveys) than qualitative methods. It probably is not driven by theory. It is much more likely to examine mass media communication than interpersonal communication. Its purpose is as likely to be audience analysis as message design or evaluation of a planned communication intervention. If its purpose is to evaluate a planned communication intervention, however, that intervention is almost certainly a successful one.

  18. Descriptive analysis of the inequalities of health information resources between Alberta's rural and urban health regions.

    Science.gov (United States)

    Stieda, Vivian; Colvin, Barb

    2009-01-01

    In an effort to understand the extent of the inequalities in health information resources across Alberta, SEARCH Custom, HKN (Health Knowledge Network) and IRREN (Inter-Regional Research and Evaluation Network) conducted a survey in December 2007 to determine what library resources currently existed in Alberta's seven rural health regions and the two urban health regions. Although anecdotal evidence indicated that these gaps existed, the analysis was undertaken to provide empirical evidence of the exact nature of these gaps. The results, coupled with the published literature on the impact, effectiveness and value of information on clinical practice and administrative decisions in healthcare management, will be used to build momentum among relevant stakeholders to support a vision of equitably funded health information for all healthcare practitioners across the province of Alberta.

  19. Description and analysis of rewritings and revisions of the El Cid theme aimed at reading promotion.

    Directory of Open Access Journals (Sweden)

    Aldo Daparte Jorge

    2012-11-01

    Full Text Available The spread of a kind of «second-degree literature» in the twentieth and twenty-first centuries is unequivocal proof of the importance acquired by factors of the literary system other than the product itself; the most distinctive of these factors are the market, the consumer, and interference with other systems. To the Cid material in general, and to the Cantar de Mio Cid in particular, we apply the analysis tools of paratextuality and intertextuality proposed by Genette, supplemented by a communicative and pragmatic narratological approach and by a textual tradition in which versions from different artistic and literary genres have produced works that, to a greater or lesser extent, have acquired a social, cultural and educational role.

  20. Scientometric analysis of physics (1979-2008): A quantitative description of scientific impact

    Science.gov (United States)

    Zheng, YanNing; Yuan, JunPeng; Pan, YunTao; Zhao, XiaoYuan

    2011-01-01

    Citations show how researchers build on existing research, and the citation count is an indication of the influence of a specific article; it is therefore valuable to analyze the most-cited articles. This research investigates highly cited articles in physics (1979-2008) using citation data from the ISI Web of Science. In this study, 1,544,205 articles were examined. The objective of the analysis was to identify the highly productive countries, institutions, authors, and fields in physics. The analysis shows that the USA is the world leader in physics, and that Japan has maintained the highest growth rate in physics research since 1990. The research programs at Bell Labs and IBM have also played important roles in physics. A striking fact is that the five most active authors are all Japanese, while the five most active institutions are all in the USA; only The University of Tokyo is ranked among the top 11 institutions, and only American authors have single-author articles ranked among the top 19 articles. The highest-impact articles are distributed across 25 subject categories: Physics, Multidisciplinary ranks first with 424 articles, followed by Physics, Condensed Matter. The study can provide science policy makers with a picture of innovation capability in this field and help them make better decisions. Hopefully, this study will stimulate useful discussion among scientists and research managers about future research directions.

  1. DESCRIPTIVE AND DISCRIMINATORY SIGNIFICANCE OF POD PHENOTYPIC TRAITS FOR DIVERSITY ANALYSIS OF COCOA GENOTYPES

    Directory of Open Access Journals (Sweden)

    Daniel B. Adewale

    2013-12-01

    Intra-specific genetic diversity analysis precedes crop-breeding proposals for species improvement. Sixteen parental and twenty-four hybrid cocoa genotypes were laid out in a randomized complete block design with six replications at Ibadan, Nigeria. A sampling unit of fifteen uniformly ripe pods was collected from each plot for assessment, and six quantitative pod traits were subjected to statistical analysis. Highly significant (P < 0.0001) variability existed among the 40 genotypes. The ranges of performance were: pod weight (0.43-0.86 kg), pod length (15.9-27.96 cm), pod girth (21.51-34.07 cm), pod thickness (1.26-5.71 cm), number of beans per pod (20-51) and bean weight per pod (0.017-0.41 kg). Positive and significant (P < 0.001) correlations existed between pod weight and length, and between pod girth and bean number per pod. The mean Gower genetic distance among the 40 genotypes was 0.228; the smallest (0.023) was between G25 and G30, while the largest (0.529) was between G17 and G35. The first three principal component axes explained 73% of the total variation. Three distinct groups emerged from Ward's clustering technique, and significant (P < 0.05) intra- and inter-cluster variability existed in the study. High genetic diversity lies within the studied population, and pod traits were important descriptors for classifying cocoa genotypes.
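For purely quantitative traits such as the six pod measurements above, the Gower distance reduces to the mean range-normalized absolute difference between two genotypes. A minimal sketch of that calculation; the two pod records are invented for illustration, while the trait ranges are taken from the minima and maxima reported in the abstract:

```python
import numpy as np

def gower_distance(x, y, ranges):
    """Gower distance for quantitative traits:
    mean of |x_i - y_i| / range_i over all traits (lies in [0, 1])."""
    return np.mean(np.abs(x - y) / ranges)

# Hypothetical pod records: [weight kg, length cm, girth cm, thickness cm,
#                            beans per pod, bean weight per pod kg]
g_a = np.array([0.55, 20.1, 25.3, 1.9, 35, 0.12])
g_b = np.array([0.80, 26.5, 32.0, 4.8, 48, 0.30])

# Trait ranges from the reported minima/maxima
ranges = np.array([0.86 - 0.43, 27.96 - 15.9, 34.07 - 21.51,
                   5.71 - 1.26, 51 - 20, 0.41 - 0.017])

d = gower_distance(g_a, g_b, ranges)
print(round(d, 3))  # → 0.529
```

Pairwise distances computed this way form the matrix that Ward's clustering and the principal component analysis mentioned above would operate on.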

  2. McMaster Mesonet soil moisture dataset: description and spatio-temporal variability analysis

    Directory of Open Access Journals (Sweden)

    K. C. Kornelsen

    2013-04-01

    This paper introduces and describes the hourly, high-resolution soil moisture dataset continuously recorded by the McMaster Mesonet, located in the Hamilton-Halton Watershed in Southern Ontario, Canada. The McMaster Mesonet consists of a network of time domain reflectometer (TDR) probes collecting hourly soil moisture data at six depths between 10 cm and 100 cm, at nine locations per site, spread across four sites in the 1250 km2 watershed. The soil moisture arrays are designed to further improve understanding of soil moisture dynamics in a seasonal climate and to capture soil moisture transitions in areas that differ in topography, soil and land cover. The McMaster Mesonet soil moisture data constitute a unique database in Canada because of their high spatio-temporal resolution. In order to provide some insight into the dominant processes at the McMaster Mesonet sites, spatio-temporal and temporal stability analyses were conducted to identify spatio-temporal patterns in the data and to suggest some physical interpretation of soil moisture variability. It was found that the seasonal climate of the Great Lakes Basin causes a transition in soil moisture patterns at seasonal timescales. During winter and early spring, and at the meadow sites, the soil moisture distribution is governed by topographic redistribution, whereas following efflorescence in the spring and summer, the soil moisture spatial distribution at the forested site was also controlled by the vegetation canopy. Analysis of short-term temporal stability revealed that the relative difference between sites was maintained unless there was significant rainfall (> 20 mm) or antecedent wet conditions. Following a disturbance in the spatial soil moisture distribution due to wetting, the relative soil moisture pattern re-emerged in 18 to 24 h. Access to the McMaster Mesonet data can be provided by visiting www.hydrology.mcmaster.ca/mesonet.
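A standard way to quantify the temporal stability discussed above is the mean relative difference (MRD) of each location from the spatial mean at each time step: locations with a consistently negative MRD are persistently drier than average, positive ones persistently wetter. A minimal sketch with made-up soil moisture values, not Mesonet data:

```python
import numpy as np

# Rows = time steps, columns = measurement locations (hypothetical values)
theta = np.array([
    [0.20, 0.25, 0.30],
    [0.22, 0.28, 0.33],
    [0.18, 0.24, 0.29],
])

# Relative difference of each location from the spatial mean at each time step
spatial_mean = theta.mean(axis=1, keepdims=True)
rel_diff = (theta - spatial_mean) / spatial_mean

# Mean relative difference (MRD) per location; a small standard deviation of
# the relative difference over time marks a temporally stable location
mrd = rel_diff.mean(axis=0)
std_rd = rel_diff.std(axis=0)

print(np.round(mrd, 3))
```

By construction the MRDs sum to zero across locations; a wetting event such as the >20 mm rainfall mentioned above temporarily scrambles `rel_diff`, after which the ranking re-emerges.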

  3. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberian stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the precipitation registered between 28 November and 7 December was so remarkable that the 1876 episode still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intensely negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also performed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric river carrying tropical moisture over

  4. Data Science Programs in U.S. Higher Education: An Exploratory Content Analysis of Program Description, Curriculum Structure, and Course Focus

    Science.gov (United States)

    Tang, Rong; Sae-Lim, Watinee

    2016-01-01

    In this study, an exploratory content analysis of 30 randomly selected Data Science (DS) programs from eight disciplines revealed significant gaps in current DS education in the United States. The analysis centers on linguistic patterns of program descriptions, curriculum requirements, and DS course focus as pertaining to key skills and domain…

  5. Descriptive Research

    DEFF Research Database (Denmark)

    Wigram, Anthony Lewis

    2003-01-01

    Descriptive research is described by Lathom-Radocy and Radocy (1995) to include Survey research, ex post facto research, case studies and developmental studies. Descriptive research also includes a review of the literature in order to provide both quantitative and qualitative evidence of the effect...... of music therapy with a specific population (Gold, Voracek & Wigram, Wigram, 2002). The collection of such evidence, through surveys of the literature and documentation of music therapy studies that show effect with a specified population are becoming increasingly important in order to underpin music...

  6. Comparative analysis of the influence of turbulence models on the description of the nitrogen oxides formation during the combustion of swirling pulverized coal flow

    Science.gov (United States)

    Kuznetsov, V.; Chernetskaya, N.; Chernetskiy, M.

    2016-10-01

    The paper presents the results of numerical research on the influence of the two-parametric k-ε and k-ω SST turbulence models, as well as the Reynolds stress model (RSM), on the description of nitrogen oxide formation during the combustion of pulverized coal in a swirling flow. For the numerical simulation of the turbulent flow of an incompressible fluid, we used the Reynolds-averaged equations, taking interfacial interactions into account. To solve the equation of thermal radiation transfer, the P1 approximation of the spherical harmonics method was employed. The optical properties of gases were described based on the sum-of-gray-gases model. The motion of coal particles was described with a Lagrangian particle-tracking approach, and burning of the coke residue was considered in a diffusion-kinetic approximation. Comparative analysis showed that the choice of turbulence model has a significant impact on the root mean square (RMS) values of the velocity and temperature fluctuations, which leads to significant differences in the calculated formation of nitrogen oxides during the combustion of pulverized coal.

  7. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation to aid understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed, complex software systems designed to run in parallel on large HPC systems. An important goal is to utilize the hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings and cache misses, have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, correlating performance data with the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load-imbalance issues. High-resolution climate simulations are carried out on tens to hundreds of thousands of cores, yielding a vast amount of profiling data that cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and

  8. Exosystem Modeling for Mission Simulation and Survey Analysis

    Science.gov (United States)

    Savransky, Dmitry

    In the last twenty years, the existence of exoplanets (planets orbiting stars other than our own sun) has gone from conjecture to established fact. The accelerating rate of exoplanet discovery has generated a wealth of important new knowledge, and is due mainly to the development and maturation of a large number of technologies that drive a variety of planet detection and observation methods. The overall goal of the exoplanet community is to study planets around all types of stars, and across all ranges of planetary mass and orbit size. With this capability we will be able to build confidence in planet formation and evolution theories and learn how our solar system came to exist. Achieving this goal requires creating dedicated instrumentation capable of detecting signals that are a small fraction of the magnitude of signals we can observe today. It also requires analyzing highly noisy data sets for the faint patterns that represent the presence of planets. Accurate modeling and simulation are necessary for both these tasks. With detailed planetary and observation models we can predict the type of data that will be generated when a specific instrument observes a specific planetary system. This allows us to evaluate the performance of both the instrument and the data analysis methods used to extract planet signals from observational data. The same simulations can help optimize observation scheduling and statistical analysis of data sets. The purpose of this thesis is to lay down the groundwork necessary for building simulations of this type, and to demonstrate a few of their many possible applications. First, we show how each of four different detection methods (astrometry, doppler spectroscopy, transit photometry and direct imaging) can be described using a common parameter set which also encodes sufficient information to propagate the described exosystem in time. We analyze this parameter set and derive the distribution functions of several of its elements. 

  9. Comparative analysis as a management tool for broiler breeder farms: simulated individual farm analysis (IFAS)

    NARCIS (Netherlands)

    Yassin, H.; Velthuis, A.G.J.; Giesen, G.W.J.; Oude Lansink, A.G.J.M.

    2012-01-01

    The objective of this study was to develop a management information system to evaluate the tactical management of a breeder flock using individual farm analysis with a deterministic simulation model (IFAS). Individual farm analysis is a method that evaluates the performance of individual farms by

  10. Dynamic Response of Linear Mechanical Systems Modeling, Analysis and Simulation

    CERN Document Server

    Angeles, Jorge

    2012-01-01

    Dynamic Response of Linear Mechanical Systems: Modeling, Analysis and Simulation can be utilized for a variety of courses, including junior and senior-level vibration and linear mechanical analysis courses. The author connects, by means of a rigorous, yet intuitive approach, the theory of vibration with the more general theory of systems. The book features: A seven-step modeling technique that helps structure the rather unstructured process of mechanical-system modeling A system-theoretic approach to deriving the time response of the linear mathematical models of mechanical systems The modal analysis and the time response of two-degree-of-freedom systems—the first step on the long way to the more elaborate study of multi-degree-of-freedom systems—using the Mohr circle Simple, yet powerful simulation algorithms that exploit the linearity of the system for both single- and multi-degree-of-freedom systems Examples and exercises that rely on modern computational toolboxes for both numerical and symbolic compu...

  11. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review of the availability and quality of health information data sources in countries, based on experience, observations, the literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health-care facility-based sources. In almost all countries of the Region there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  12. Monitoring and descriptive analysis of radon in relation to seismic activity of Northern Pakistan.

    Science.gov (United States)

    Jilani, Zeeshan; Mehmood, Tahir; Alam, Aftab; Awais, Muhammad; Iqbal, Talat

    2017-06-01

    Earthquakes are among the major causes of natural disasters, and their forecasting is a challenging task. Several precursory phenomena have been proposed in relation to earthquake occurrence; the emission of the radioactive gas radon before earthquakes is one potential precursor. This study aims to monitor and analyze radon in relation to seismic activity in Northern Pakistan. For this purpose an RTM-2200 instrument was used to monitor changes in radon concentration in Northern Pakistan from August 1, 2014 to January 31, 2015. Significant temporal variations were observed in radon concentration. A bivariate analysis of radon against other variables showed a positive relationship with air pressure and relative humidity and a negative relationship with temperature. A 2σ upper control limit was computed on a monthly basis to detect anomalous trends in the data, and an overall increasing trend in radon concentration was detected. Five earthquakes between August 1, 2014 and January 31, 2015 were selected from the earthquake catalogue according to their magnitude and distance from the monitoring station; of these, radon concentration could be associated with only two, both of magnitude 5.5, occurring on September 13 and October 14, 2014, respectively. Very large variations were observed in radon during the last two months of the study period; these may have been caused by other geological and environmental changes and are not related to earthquake activity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. A descriptive analysis of personality and gender at the Louisiana State University School of Veterinary Medicine.

    Science.gov (United States)

    Johnson, Stephanie W; Gill, Marjorie S; Grenier, Charles; Taboada, Joseph

    2009-01-01

    The goals of this study were to explore the Myers-Briggs Type Indicator profile and gender differences of Louisiana State University veterinary students. A 12-year composite sample (N = 935) revealed that the personality profile was different from the published US population norm, but similar to the bimodal ESTJ-ISTJ profile found in Louisiana medical students. Significant gender differences were found among six of the 16 types. A 12-year trend analysis revealed a significant shift away from the prototypical ESTJ-ISTJ profile, culminating in a discernable heterogeneous profile for both males and females in the last four years. Composite scores for the 2004-2007 cohort (N = 331) revealed that the predominant types for women were ENFP, ESFJ, ESTJ, ISFJ, and ISTJ. For men, the predominant types were ESTJ, ESTP, INTP, and ISTJ. Post hoc tests confirmed significant gender differences for ESTP, INTP, ISTP, and ESFJ types. The evidence of significant gender differences and confirmation that personality profiles have begun to vary widely across the Myers-Briggs Type Indicator spectrum in the last four years have implications at the practical and theoretical levels. This could have profound effects on pedagogical considerations for faculty involved in veterinary medical education.

  14. Dental Blogs, Podcasts, and Associated Social Media: Descriptive Mapping and Analysis.

    Science.gov (United States)

    Melkers, Julia; Hicks, Diana; Rosenblum, Simone; Isett, Kimberley R; Elliott, Jacqueline

    2017-07-26

    Studies of social media in both medicine and dentistry have largely focused on the value of social media for marketing to and communicating with patients, and for clinical education. There is limited evidence of how dental clinicians contribute to and use social media to disseminate and access information relevant to clinical care. The purpose of this study was to inventory and assess the entry, growth, sources, and content of clinically relevant social media in dentistry. We developed an inventory of blogs, podcasts, videos, and associated social media disseminating clinical information to dentists, and assessed hosts' media activity in terms of their combinations of modalities, entry and exit dates, frequency of posting, types of content posted, and size of audience. Our study showed that clinically relevant information is posted by dentists and hygienists on social media. Clinically relevant information was provided in 89 blogs and podcasts; 55% of hosts (49) were practicing dentists or hygienists, followed by consultants (27 hosts, 30%), media such as publishers and discussion board hosts (8 hosts, 9%), and professional organizations and corporations. We demonstrated the participation of, and potential for, practicing dentists and hygienists using social media to share clinical and other information with practicing colleagues. There is a clear audience for these social media sites, suggesting a changing mode of information diffusion in dentistry. This study was a first effort to fill the gap in understanding the nature and potential role of social media in clinical dentistry.

  15. Fractal description and quantitative analysis of normal brain development in infants

    Institute of Scientific and Technical Information of China (English)

    Hehong Li; Liangping Luo; Li Huang

    2011-01-01

    We examined the fractal pattern of cerebral computerized tomography images in 158 normal infants aged 0-3 years, based on quantitative analysis drawing on chaos theory. Results showed that the fractal dimension of cerebral computerized tomography images in normal infants remained stable, ranging from 1.86 to 1.91. The normal ranges for neonates, infants aged 1-2 months, children aged 1-2 years, and children aged 2-3 years were 1.88-1.90 (mean: 1.8913 ± 0.0064), 1.89-1.90 (mean: 1.8927 ± 0.0045), 1.86-1.90 (mean: 1.8863 ± 0.0085), and 1.88-1.91 (mean: 1.8958 ± 0.0083), respectively. The spectrum width of the multifractal spectrum (Δα) in normal infants was 1.4618. These data suggest that the width of the multifractal spectrum and the fractal dimension criteria may be useful as practical parameters for assessing the fractal mode of brain development in normal infants.
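A common way to estimate the fractal dimension of a segmented (binary) image such as a thresholded CT slice is box counting: count the boxes of side s that contain foreground at several sizes, then fit the slope of log N(s) against log(1/s). The sketch below uses a synthetic image, since the abstract does not specify the study's exact estimation method:

```python
import numpy as np

def box_counting_dimension(img, sizes=(2, 4, 8, 16)):
    """Estimate the fractal dimension of a binary 2-D array by box counting."""
    counts = []
    for s in sizes:
        # Tile the image into s-by-s boxes and count occupied boxes
        h, w = img.shape
        boxed = img[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(np.count_nonzero(boxed.any(axis=(1, 3))))
    # Slope of log(count) against log(1/size) gives the dimension estimate
    coeffs = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return coeffs[0]

# Sanity check: a filled square has dimension 2
img = np.ones((64, 64), dtype=bool)
d = box_counting_dimension(img)
print(round(d, 2))  # → 2.0
```

Fractal structures yield non-integer estimates between 1 and 2, the range in which the reported 1.86-1.91 values fall.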

  16. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    Science.gov (United States)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  17. Safety in ready mixed concrete industry: descriptive analysis of injuries and development of preventive measures.

    Science.gov (United States)

    Akboğa, Özge; Baradan, Selim

    2017-02-07

    The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks: employees face hazards both during the production of concrete and during its delivery to the construction site. Statistics show that the usage of and demand for RMC have been increasing, along with the number of producers and workers. Unfortunately, adequate OSH measures to match this rapid growth are not in place even in the top RMC-producing countries, such as Turkey, and the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by mining the Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analyses on 71 incidents involving RMC truck drivers. Investigations and interviews were also conducted at seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results, problem areas were identified: cleaning the truck mixer/pump is a hazardous activity in which operators are frequently injured, and being struck by falling objects is a major hazard in the RMC industry. Finally, job safety analyses were performed on these areas to suggest mitigation methods.

  18. Barrier and operational risk analysis of hydrocarbon releases (BORA-Release). Part I. Method description.

    Science.gov (United States)

    Aven, Terje; Sklet, Snorre; Vinnem, Jan Erik

    2006-09-21

    Investigations of major accidents show that technical, human, operational, and organisational factors influence accident sequences. In spite of this, quantitative risk analyses of offshore oil and gas production platforms have focused on technical safety systems. This paper presents a method (called BORA-Release) for qualitative and quantitative analysis of the platform-specific hydrocarbon release frequency. Using BORA-Release it is possible to analyse the effect of safety barriers introduced to prevent hydrocarbon releases, and how platform-specific technical, human, operational, and organisational risk influencing factors affect barrier performance. BORA-Release comprises the following main steps: (1) development of a basic risk model including release scenarios, (2) modelling the performance of safety barriers, (3) assignment of industry-average probabilities/frequencies and risk quantification based on these, (4) development of risk influence diagrams, (5) scoring of risk influencing factors, (6) weighting of risk influencing factors, (7) adjustment of industry-average probabilities/frequencies, and (8) recalculation of the risk in order to determine the platform-specific risk related to hydrocarbon release. The various steps of BORA-Release are presented and discussed. Part II of the paper presents results from a case study in which BORA-Release is applied.
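Steps 5-7 above amount to scaling an industry-average frequency by a weighted score of risk influencing factors (RIFs). The sketch below illustrates that idea only; the score range, weights, and interpolation between low and high multipliers are assumptions for illustration, not the actual BORA-Release adjustment formula:

```python
def adjusted_frequency(base_freq, scores, weights, f_low=0.1, f_high=10.0):
    """Scale an industry-average frequency by weighted RIF scores.

    Scores run from -1 (best practice) to +1 (worst practice); the resulting
    multiplier interpolates between f_low and f_high. Illustrative scheme only,
    not the published BORA-Release formula.
    """
    s = sum(w * x for w, x in zip(weights, scores))  # weighted score in [-1, 1]
    factor = f_high**s if s >= 0 else f_low**(-s)
    return base_freq * factor

# One hypothetical barrier with three RIFs: competence, procedures, maintenance
freq = adjusted_frequency(1e-3, scores=[0.5, 0.0, -0.5], weights=[0.5, 0.3, 0.2])
```

A weighted score of zero (industry-average practice) leaves the base frequency unchanged; positive scores push it toward `f_high` times the average, negative scores toward `f_low` times.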

  19. Multilocus sequence analysis of the genus Citrobacter and description of Citrobacter pasteurii sp. nov.

    Science.gov (United States)

    Clermont, Dominique; Motreff, Laurence; Passet, Virginie; Fernandez, José-Carlos; Bizet, Chantal; Brisse, Sylvain

    2015-05-01

    Strains originating from various sources and classified as members of the genus Citrobacter within the family Enterobacteriaceae were characterized by sequencing internal portions of the genes rpoB, fusA, pyrG and leuS, 16S rRNA gene sequencing, average nucleotide identity (ANI) of genomic sequences, and biochemical tests. Phylogenetic analysis based on the four housekeeping genes showed that the 11 species of the genus Citrobacter with validly published names are well demarcated. Strains CIP 55.13(T) and CIP 55.9 formed a distinct branch associated with Citrobacter youngae. The ANI between CIP 55.9 and CIP 55.13(T) was 99.19%, whereas it was 94.75% between CIP 55.13(T) and strain CIP 105016(T) of the species C. youngae, the most closely related species. Biochemical characteristics consolidated the finding that the two isolates represent a separate species, for which the name Citrobacter pasteurii sp. nov. is proposed. The type strain is CIP 55.13(T) (=DSM 28879(T) =Na 1a(T)).

  20. The Green Bank Northern Celestial Cap Pulsar Survey - I: Survey Description, Data Analysis, and Initial Results

    CERN Document Server

    Stovall, K; Ransom, S M; Archibald, A M; Banaszak, S; Biwer, C M; Boyles, J; Dartez, L P; Day, D; Ford, A J; Flanigan, J; Garcia, A; Hessels, J W T; Hinojosa, J; Jenet, F A; Kaplan, D L; Karako-Argaman, C; Kaspi, V M; Kondratiev, V I; Leake, S; Lorimer, D R; Lunsford, G; Martinez, J G; Mata, A; McLaughlin, M A; Roberts, M S E; Rohr, M D; Siemens, X; Stairs, I H; van Leeuwen, J; Walker, A N; Wells, B L

    2014-01-01

    We describe an ongoing search for pulsars and dispersed pulses of radio emission, such as those from rotating radio transients (RRATs) and fast radio bursts (FRBs), at 350 MHz using the Green Bank Telescope. With the Green Bank Ultimate Pulsar Processing Instrument, we record 100 MHz of bandwidth divided into 4,096 channels every 81.92 $\mu s$. This survey will cover the entire sky visible to the Green Bank Telescope ($\delta > -40^\circ$, or 82% of the sky) and outside of the Galactic Plane will be sensitive enough to detect slow pulsars and low dispersion measure ($<$30 $\mathrm{pc\,cm^{-3}}$) millisecond pulsars (MSPs) with a 0.08 duty cycle down to 1.1 mJy. For pulsars with a spectral index of $-$1.6, we will be 2.5 times more sensitive than previous and ongoing surveys over much of our survey region. Here we describe the survey, the data analysis pipeline, initial discovery parameters for 62 pulsars, and timing solutions for 5 new pulsars. PSR J0214$+$5222 is an MSP in a long-period (512 days) orbit a...
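
The survey's sensitivity to low-DM pulsars is governed by dispersive delay and per-channel smearing across the recorded band. A sketch using the standard cold-plasma dispersion formula, with the band parameters quoted above (350 MHz centre, 100 MHz over 4,096 channels); the function names are illustrative:

```python
# Dispersive delay for a given dispersion measure (DM), using the standard
# radio-astronomy formula dt = 4.148808e3 s * DM * (f_lo^-2 - f_hi^-2),
# f in MHz, DM in pc cm^-3. Band numbers are taken from the abstract;
# everything else is a generic sketch.

K_DM = 4.148808e3  # MHz^2 pc^-1 cm^3 s

def dispersion_delay(dm, f_lo_mhz, f_hi_mhz):
    """Arrival-time delay (s) of the low-frequency band edge vs. the high edge."""
    return K_DM * dm * (f_lo_mhz**-2 - f_hi_mhz**-2)

def channel_smearing(dm, f_center_mhz, chan_bw_mhz):
    """Dispersive smearing (s) within one frequency channel."""
    return 2 * K_DM * dm * chan_bw_mhz / f_center_mhz**3

# GBNCC-like numbers: 300-400 MHz band, 100 MHz split into 4,096 channels
delay = dispersion_delay(30.0, 300.0, 400.0)           # DM = 30 pc cm^-3
smear = channel_smearing(30.0, 350.0, 100.0 / 4096)
```

For DM = 30 pc cm⁻³ the pulse sweeps roughly 0.6 s across the band, while smearing within one 24.4 kHz channel stays near 0.1 ms, which is why fine channelization matters for MSP searches.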

  1. National Geo-Database for Biofuel Simulations and Regional Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Izaurralde, Roberto C.; Zhang, Xuesong; Sahajpal, Ritvik; Manowitz, David H.

    2012-04-01

    performance of EPIC and, when necessary, improve its parameterization. We investigated three scenarios. In the first, we simulated a historical (current) baseline scenario composed mainly of corn-, soybean-, and wheat-based rotations as grown on existing croplands east of the Rocky Mountains in 30 states. In the second scenario, we simulated a modified baseline in which we harvested corn and wheat residues to supply feedstocks to potential cellulosic ethanol biorefineries distributed within the study area. In the third scenario, we simulated the productivity of perennial cropping systems such as switchgrass or perennial mixtures grown on either marginal or Conservation Reserve Program (CRP) lands. In all cases we evaluated the environmental impacts (e.g., soil carbon changes, soil erosion, nitrate leaching) associated with the practices. In summary, we have reported on the development of a spatially explicit national geodatabase for conducting biofuel simulation studies and provided initial simulation results on the potential of annual and perennial cropping systems to serve as feedstocks for the production of cellulosic ethanol. To accomplish this, we employed sophisticated spatial analysis methods in combination with the process-based biogeochemical model EPIC. This work provided the opportunity to test the hypothesis that marginal lands can serve as sources of cellulosic feedstocks and thus help avoid potential conflicts between bioenergy and food production systems. We believe this work opens the door for further analysis of the characteristics of cellulosic feedstocks as major contributors to the development of a sustainable bioenergy economy.

  2. Wavelet analysis on paleomagnetic (and computer simulated) VGP time series

    Directory of Open Access Journals (Sweden)

    A. Siniscalchi

    2003-06-01

    We present a Continuous Wavelet Transform (CWT) analysis of Virtual Geomagnetic Pole (VGP) latitude time series. The analyzed time series are sedimentary paleomagnetic and geodynamo-simulated data. Two mother wavelets (the Morlet function and the first derivative of a Gaussian function) are used in order to detect features related to the spectral content as well as polarity excursions and reversals. By means of the Morlet wavelet, we estimate both the global spectrum and the time evolution of the spectral content of the paleomagnetic data series. Some peaks corresponding to the orbital components are revealed by the spectra, and the local analysis helped disclose their statistical significance. Although this feature could indicate an orbital influence on the geodynamo, other interpretations are possible. In particular, we note a correspondence of local spectral peaks with the appearance of the excursions in the series. The comparison between the paleomagnetic and simulated spectra shows a similarity in the high-frequency region, indicating that their degree of regularity is analogous. By means of the Gaussian first-derivative wavelet, reversals and excursions of polarity were sought. The analysis was performed first on the simulated data, to provide a guide for understanding the features present in the more complex paleomagnetic data. Various excursions and reversals have been identified, despite the prevalent normality of the series and its inherent noise. The relative chronology of reversals found in the paleomagnetic data was compared with a coeval global polarity time scale (Channel et al., 1995). The relative lengths of the polarity stability intervals are similar, but a general shift appears between the two scales, which could be due to the dating uncertainties of the Hauterivian/Barremian boundary.
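
A Morlet CWT of the kind used here for the global and local spectra can be sketched in plain NumPy as below; the scales, normalization, and test signal are illustrative assumptions, not the authors' processing chain.

```python
import numpy as np

# Minimal continuous wavelet transform with a Morlet mother wavelet.
# Illustrative sketch only; not the paper's actual pipeline.

def morlet(t, omega0=6.0):
    """Complex Morlet wavelet (admissibility correction negligible for omega0 >= 6)."""
    return np.pi**-0.25 * np.exp(1j * omega0 * t) * np.exp(-t**2 / 2.0)

def cwt_morlet(x, scales, dt=1.0, omega0=6.0):
    """Return |CWT| coefficients, one row per scale."""
    n = len(x)
    out = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        # sample the wavelet over +/- 10 envelope widths, capped by signal length
        m = int(min(10 * s / dt, (n - 1) // 2))
        t = np.arange(-m, m + 1) * dt / s
        w = np.conj(morlet(t, omega0))[::-1] * np.sqrt(dt / s)
        out[i] = np.abs(np.convolve(x, w, mode="same"))
    return out

# A sinusoid of 64-sample period should peak at the matching scale
periods = np.array([16.0, 32.0, 64.0, 128.0])
scales = periods * 6.0 / (2 * np.pi)   # scales whose centre frequency matches each period
t = np.arange(1024)
x = np.sin(2 * np.pi * t / 64.0)
power = cwt_morlet(x, scales)
```

Averaging `power` over time gives a global spectrum; inspecting columns gives the local (time-resolved) content used to date excursions.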

  3. Simulations

    CERN Document Server

    Ngada, N M

    2015-01-01

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great insight into how systems really operate. This paper helps the reader gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion summarizes the main items to keep in mind before opting for a simulation tool or before performing a simulation.

  4. A comparison of two follow-up analyses after multivariate analysis of variance: analysis of variance and descriptive discriminant analysis. A case study of the program effects on education-abroad programs

    Science.gov (United States)

    Alvin H. Yu; Garry. Chick

    2010-01-01

    This study compared the utility of two different post-hoc tests after detecting significant differences within factors on multiple dependent variables using multivariate analysis of variance (MANOVA). We compared the univariate F test (the Scheffé method) to descriptive discriminant analysis (DDA) using an educational-tour survey of university study-...

  5. Descriptive analysis of childbirth healthcare costs in an area with high levels of immigration in Spain

    Directory of Open Access Journals (Sweden)

    Sala Maria

    2011-04-01

    Background: The aim of this study was to estimate the cost of childbirth in a teaching hospital in Barcelona, Spain, including the costs of prenatal care, delivery and postnatal care (3 months). Costs were assessed by taking into account maternal origin and delivery type. Methods: We performed a cross-sectional study of all deliveries in a teaching hospital to mothers living in its catchment area between October 2006 and September 2007. A process cost analysis based on a full cost accounting system was performed. The main information sources were the primary care program for sexual and reproductive health, and hospital care and cost records. Partial and total costs were compared according to maternal origin and delivery type. A regression model was fit to explain the total cost of the childbirth process as a function of maternal age and origin, prenatal care, delivery type, maternal and neonatal severity, and multiple delivery. Results: The average cost of childbirth was 4,328€, with an average of 18.28 contacts between the mother or the newborn and the healthcare facilities. The delivery itself accounted for more than 75% of the overall cost: maternal admission accounted for 57% and neonatal admission for 20%. Prenatal care represented 18% of the overall cost and 75% of overall acts. The average overall cost was 5,815€ for cesarean sections, 4,064€ for vaginal instrumented deliveries and 3,682€ for vaginal non-instrumented deliveries (p Conclusions: Neither immigration nor prenatal care were associated with a substantial difference in costs. The most important predictors of cost were delivery type and neonatal severity. Given the impact of cesarean sections on the overall cost of childbirth, attempts should be made to take its higher cost into account in the decision to perform a cesarean section.

  6. Developing a Performance Brain Training™ approach for baseball: a process analysis with descriptive data.

    Science.gov (United States)

    Sherlin, Leslie H; Larson, Noel C; Sherlin, Rebecca M

    2013-03-01

    Neurofeedback may be useful for improving sports performance, but few studies have examined this potential. Here we present data from five development players from a major league baseball team. The aims were to evaluate the feasibility of conducting sessions within a professional organization, to assess changes in the quantitative electroencephalograph (QEEG) and the NeuroPerformance Profile™, and to report qualitative self-report data before and after brain training. The EEG was recorded with 19 electrodes for 20 min of baseline conditions and approximately 21 min of a continuous performance test. The fast Fourier transform analysis provided average cross-spectral matrices for the bands delta (1-3.5 Hz), theta (4-7.5 Hz), alpha (8-12 Hz), low beta (13-16 Hz), beta 1 (13-21 Hz), beta 2 (22-32 Hz), and gamma (32-45 Hz) from the pre- and post-intervention evaluations in the eyes-open baseline condition. The continuous performance test metrics included errors of omission, errors of commission, response time and response time variability. The 9 scales of the NeuroPerformance Profile™ were examined. The QEEG, CPT and NeuroPerformance Profile™ data were all compared between the pre and post 15 sessions of brain training using a within-subject paired t-test design corrected for multiple comparisons using the false discovery rate method. Following brain training, comparative QEEG, CPT and NeuroPerformance Profile™ analyses illustrated significant differences. The QEEG findings of all participants illustrated significant changes within the training parameters but also across other frequency bands and electrode sites. Overall, the positive findings in both objective and subjective measures suggest that further inquiry into the utility of brain training for performance enhancement, with the specific application to sport, is warranted. In particular, QEEG and CPT gains were noted in the areas that correspond to client self-report data demonstrating improvement in attention, decreased
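
Band-limited spectral power of the kind extracted here can be sketched with a plain FFT periodogram over the bands listed in the abstract; the pipeline below is a generic illustration with synthetic data, not the study's cross-spectral method.

```python
import numpy as np

# Per-band power from a plain FFT periodogram, using the frequency bands
# listed in the abstract. Generic illustration; synthetic "EEG" input.

BANDS = {"delta": (1.0, 3.5), "theta": (4.0, 7.5), "alpha": (8.0, 12.0),
         "low_beta": (13.0, 16.0), "beta1": (13.0, 21.0),
         "beta2": (22.0, 32.0), "gamma": (32.0, 45.0)}

def band_powers(x, fs):
    """Mean periodogram power per named frequency band."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    return {name: psd[(freqs >= lo) & (freqs <= hi)].mean()
            for name, (lo, hi) in BANDS.items()}

# 4 s of synthetic "eyes-open" EEG dominated by a 10 Hz alpha rhythm
fs = 256.0
t = np.arange(0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
x = 10.0 * np.sin(2 * np.pi * 10.0 * t) + rng.normal(0.0, 1.0, t.size)
powers = band_powers(x, fs)
```

With a strong 10 Hz component, the alpha band dominates the per-band means, mirroring the kind of pre/post band comparison the study reports.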

  7. Catastrophic antiphospholipid syndrome (CAPS): Descriptive analysis of 500 patients from the International CAPS Registry.

    Science.gov (United States)

    Rodríguez-Pintó, Ignasi; Moitinho, Marta; Santacreu, Irene; Shoenfeld, Yehuda; Erkan, Doruk; Espinosa, Gerard; Cervera, Ricard

    2016-12-01

    To analyze the clinical and immunologic manifestations of patients with catastrophic antiphospholipid syndrome (CAPS) from the "CAPS Registry". The demographic, clinical and serological features of 500 patients included in the website-based "CAPS Registry" were analyzed. Frequency distributions and measures of central tendency were used to describe the cohort. Comparisons between groups regarding qualitative variables were undertaken by chi-square or Fisher's exact test, while the t-test for independent samples was used to compare groups regarding continuous variables. 500 patients (female: 343 [69%]; mean age 38±17) accounting for 522 episodes of CAPS were included in the analysis. Forty percent of patients had an associated autoimmune disease, mainly systemic lupus erythematosus (SLE) (75%). The majority of CAPS episodes were triggered by a precipitating factor (65%), mostly infections (49%). Clinically, CAPS was characterized by multiorgan involvement affecting the kidneys (73%), lungs (60%), brain (56%), heart (50%), and skin (47%). Lupus anticoagulant, IgG anticardiolipin and IgG anti-β2-glycoprotein antibodies were the most often implicated antiphospholipid antibodies (83%, 81% and 78%, respectively). Mortality accounted for 37% of episodes of CAPS. Several clinical differences could be observed based on the age of presentation and the association with SLE. Cases triggered by a malignancy tended to occur in older patients, while CAPS episodes in young patients were associated with an infectious trigger and peripheral vessel involvement. Additionally, CAPS episodes associated with SLE were more likely to show severe cardiac and brain involvement, leading to a higher mortality (48%). Although the presentation of CAPS is characterized by multiorgan thrombosis and failure, clinical differences among patients exist based on age and underlying chronic diseases, e.g. malignancy and SLE.
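
The group comparisons on qualitative variables use chi-square or Fisher's exact test; for a 2x2 table the latter can be computed directly from the hypergeometric distribution. A stdlib-only sketch (the example counts are invented, not registry data):

```python
from math import comb

# Two-sided Fisher's exact test for a 2x2 table, summing the probabilities of
# all tables at least as extreme as the observed one. Pure-stdlib sketch.

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the table [[a, b], [c, d]]."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def p_table(x):  # hypergeometric probability of x in cell (1,1)
        return comb(col1, x) * comb(n - col1, row1 - x) / comb(n, row1)
    p_obs = p_table(a)
    lo, hi = max(0, row1 + col1 - n), min(row1, col1)
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1.0 + 1e-9))

# e.g. mortality in SLE-associated vs. other episodes (illustrative counts)
p = fisher_exact_2x2(24, 26, 18, 52)
```

A perfectly balanced table yields p = 1, and a maximally separated table yields a vanishing p-value, matching the intended use for sparse registry subgroups.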

  8. Descriptive analysis of medication errors reported to the Egyptian national online reporting system during six months.

    Science.gov (United States)

    Shehata, Zahraa Hassan Abdelrahman; Sabri, Nagwa Ali; Elmelegy, Ahmed Abdelsalam

    2016-03-01

    This study analyzes reports submitted to the Egyptian medication error (ME) reporting system from June to December 2014. Fifty hospital pharmacists received training on ME reporting using the national reporting system. All received reports were reviewed and analyzed. The data analyzed were patient age, gender, clinical setting, stage, type, medication(s), outcome, cause(s), and recommendation(s). Over the course of 6 months, 12,000 valid reports were gathered and included in this analysis. The majority (66%) came from inpatient settings, while 23% came from intensive care units and 11% came from outpatient departments. Prescribing errors were the most common type of ME (54%), followed by monitoring (25%) and administration errors (16%). The most frequent error was incorrect dose (20%), followed by drug interactions, incorrect drug, and incorrect frequency. Most reports were potential (25%), prevented (11%), or harmless (51%) errors; only 13% of reported errors led to patient harm. The top three medication classes involved in reported MEs were antibiotics, drugs acting on the central nervous system, and drugs acting on the cardiovascular system. Causes of MEs were mostly lack of knowledge, environmental factors, lack of drug information sources, and incomplete prescribing. Recommendations for addressing MEs were mainly staff training, local ME reporting, and improving the work environment. There are common problems among different healthcare systems, so sharing experiences at the national level is essential to enable learning from MEs. Internationally, there is a great need to standardize ME terminology in order to facilitate knowledge transfer. Underreporting, inaccurate reporting, and a lack of reporter diversity are some limitations of this study. Egypt now has a national database of MEs that allows researchers and decision makers to assess the problem, identify its root causes, and develop preventive strategies.

  9. Dental Blogs, Podcasts, and Associated Social Media: Descriptive Mapping and Analysis

    Science.gov (United States)

    Hicks, Diana; Rosenblum, Simone; Isett, Kimberley R; Elliott, Jacqueline

    2017-01-01

    Background: Studies of social media in both medicine and dentistry have largely focused on the value of social media for marketing to and communicating with patients and for clinical education. There is limited evidence of how dental clinicians contribute to and use social media to disseminate and access information relevant to clinical care. Objective: The purpose of this study was to inventory and assess the entry, growth, sources, and content of clinically relevant social media in dentistry. Methods: We developed an inventory of blogs, podcasts, videos, and associated social media disseminating clinical information to dentists. We assessed hosts’ media activity in terms of their combinations of modalities, entry and exit dates, frequency of posting, types of content posted, and size of audience. Results: Our study showed that clinically relevant information is posted by dentists and hygienists on social media. Clinically relevant information was provided in 89 blogs and podcasts, and topic analysis showed motives for blogging by host type: 55% (49 hosts) were practicing dentists or hygienists, followed by consultants (27 hosts, 30%), media including publishers and discussion board hosts (8 hosts, 9%), and professional organizations and corporations. Conclusions: We demonstrated the participation of and potential for practicing dentists and hygienists to use social media to share clinical and other information with practicing colleagues. There is a clear audience for these social media sites, suggesting a changing mode of information diffusion in dentistry. This study was a first effort to fill the gap in understanding the nature and potential role of social media in clinical dentistry. PMID:28747291

  10. Analysis and numerical simulation of the dynamics of bubbles

    OpenAIRE

    Méndez Rodríguez, Num

    2010-01-01

    This project will consist of the following tasks: - analysis of the mathematical models for oscillating bubbles (axisymmetric and non-axisymmetric cases). - numerical simulation of different phenomena related to oscillating bubbles. This work studies and simulates the dynamics of bubbles. Mathematical models of spherical bubbles are introduced first, giving way to a three-dimensional formulation based on the boundary element method. For...
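
The classical model for a spherical oscillating bubble is the Rayleigh-Plesset equation; a minimal inviscid form with a polytropic gas law, integrated with fixed-step RK4, can be sketched as follows. All parameter values are illustrative assumptions, and this is not the project's boundary-element formulation.

```python
# Bare inviscid Rayleigh-Plesset equation with polytropic gas pressure:
#   R*R'' + 1.5*R'^2 = (p_gas - p_inf)/rho,  p_gas = P0*(R0/R)^(3*gamma)
# All parameters are illustrative (water at 1 atm, 0.1 mm bubble at 2 atm).

RHO, P_INF, P0, R0, GAMMA = 1000.0, 101325.0, 2 * 101325.0, 1e-4, 1.4

def rhs(y):
    R, Rdot = y
    p_gas = P0 * (R0 / R) ** (3 * GAMMA)        # polytropic gas pressure
    Rddot = ((p_gas - P_INF) / RHO - 1.5 * Rdot**2) / R
    return [Rdot, Rddot]

def step_rk4(y, h):
    k1 = rhs(y)
    k2 = rhs([y[i] + h / 2 * k1[i] for i in range(2)])
    k3 = rhs([y[i] + h / 2 * k2[i] for i in range(2)])
    k4 = rhs([y[i] + h * k3[i] for i in range(2)])
    return [y[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(2)]

# bubble released at R0 with excess internal pressure: it expands, then oscillates
y, h = [R0, 0.0], 1e-9
radii = []
for _ in range(20000):
    y = step_rk4(y, h)
    radii.append(y[0])
```

The 2:1 internal overpressure drives a moderate expansion past the new equilibrium radius before the bubble rings down near its Minnaert frequency.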

  11. Simulation results of the grasping analysis of an underactuated finger

    Directory of Open Access Journals (Sweden)

    Niola Vincenzo

    2016-01-01

    The results of a number of simulations concerning grasping analysis are presented. The grasping device consists of an under-actuated finger driven by an inextensible tendon; it is one of the fingers of a mechanical prosthesis that was principally conceived as a human prosthesis. The results, however, are useful for any similar finger used in grasping devices for industrial and agricultural applications. Analysis maps of the grasping were obtained which show the “robustness” of the socket. The method seems to be a suitable tool for the optimum design of such under-actuated fingers for grasping devices.

  12. Theoretical Analysis and Simulation of BJFET Obstructive Characteristics

    Institute of Scientific and Technical Information of China (English)

    ZENG Yun; YAN Min; YAN Yong-hong; FAN Wei

    2005-01-01

    A new bipolar junction field-effect transistor (BJFET) is described, and a theoretical analysis and computer simulation of the BJFET obstructive (blocking) characteristic are presented. The gate bias voltage greatly affects the BJFET obstructive voltage, and the obstructive characteristic depends on the structural parameters of channel width W and channel length L. Operation at reduced bias voltage weakens the obstructive characteristic, and forward turn-on in the forward obstructive region also affects it. The BJFET has a good high-temperature obstructive characteristic and can be applied at high temperatures as a high-voltage switching device.

  13. Energy and thermal analysis of glazed office buildings using a dynamic energy simulation tool

    Energy Technology Data Exchange (ETDEWEB)

    Poirazis, H.; Blomsterberg, A. [Lund Inst. of Technology, Lund (Sweden). Div. of Energy and Building Design

    2005-07-01

    Although highly glazed buildings have more access to daylight than traditional buildings, their energy efficiency is sometimes questionable. This paper presented energy and indoor climate simulations of single-skin office buildings in Sweden using a dynamic energy simulation tool. Building alternatives with 30, 60 and 100 per cent window area were investigated. Parameters concerning the buildings' orientation, plan type, control set points and facade type were varied in the simulations. A virtual reference building was created as representative of Swedish office buildings constructed in the late 1990s. The design was determined by various Swedish agencies. Detailed performance specifications for energy and indoor climate were established and typical construction methods were determined. System descriptions and drawings were prepared. A validation of the simulated performance of the building showed that the performance specifications were accurate. A parametric study of energy use and indoor climate was conducted. Heating, ventilation and air conditioning (HVAC) systems and control systems were described in detail. Orientation, plan type, control set points, and facade elements were changed while other parameters such as the shape of the building and occupant activity levels remained the same. A sensitivity analysis was conducted regarding occupant comfort levels and the energy used for operating the building. It was concluded that the energy efficiency of a building depends on facade construction. It was suggested that highly glazed buildings will benefit from the use of advanced simulation tools during the design stage. It was also noted that the main aim when designing glazed buildings should be to avoid a high cooling demand. The impact of control set points on heating and cooling is also crucial for energy use, as is the orientation of rooms. It was suggested that an increase in glazing area does not necessarily mean higher

  14. Improved spectrum simulation for validating SEM-EDS analysis

    Science.gov (United States)

    Statham, P.; Penman, C.; Duncumb, P.

    2016-02-01

    X-ray microanalysis by SEM-EDS requires corrections for the many physical processes that affect emitted intensity for elements present in the material. These corrections will only be accurate provided a number of conditions are satisfied and it is essential that the correct elements are identified. As analysis is pushed to achieve results on smaller features and more challenging samples it becomes increasingly difficult to determine if all conditions are upheld and whether the analysis results are valid. If a theoretical simulated spectrum based on the measured analysis result is compared with the measured spectrum, any marked differences will indicate problems with the analysis and can prevent serious mistakes in interpretation. To achieve the necessary accuracy a previous theoretical model has been enhanced to incorporate new line intensity measurements, differential absorption and excitation of emission lines, including the effect of Coster-Kronig transitions and an improved treatment of bremsstrahlung for compounds. The efficiency characteristic has been measured for a large area SDD detector and data acquired from an extensive set of standard materials at both 5 kV and 20 kV. The parameterized model has been adjusted to fit measured characteristic intensities and both background shape and intensity at the same beam current. Examples are given to demonstrate how an overlay of an accurate theoretical simulation can expose some non-obvious mistakes and provide some expert guidance towards a valid analysis result. A new formula for calculating the effective mean atomic number for compounds has also been derived that is appropriate and should help improve accuracy in techniques that calculate the bremsstrahlung or use a bremsstrahlung measurement for calibration.
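
The record mentions, but does not reproduce, a new formula for the effective mean atomic number of compounds. As a point of reference only, the conventional mass-fraction-weighted estimate often used as a baseline in bremsstrahlung models can be sketched as:

```python
# Conventional mass-fraction-weighted mean atomic number for a compound.
# Illustrative baseline only; NOT the new formula derived in the paper.

def mean_z(mass_fractions):
    """Mass-fraction-weighted mean atomic number from a {Z: weight} mapping."""
    total = sum(mass_fractions.values())
    return sum(z * w for z, w in mass_fractions.items()) / total

# SiO2: Si (Z=14) at 46.7 wt%, O (Z=8) at 53.3 wt%
z_bar = mean_z({14: 0.467, 8: 0.533})
```

Such an effective Z feeds directly into bremsstrahlung background estimates, which is why the choice of averaging formula matters for the simulated continuum.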

  15. Admission to acute care hospitals for adolescent substance abuse: a national descriptive analysis

    Directory of Open Access Journals (Sweden)

    Chisolm Deena J

    2006-07-01

    Background: Use of alcohol and illicit drugs by adolescents remains a problem in the U.S. Case identification and early treatment can occur within a broad variety of healthcare and non-healthcare settings, including acute care hospitals. The objective of this study is to describe the extent and nature of adolescent admissions to the acute inpatient setting for substance abuse (SA). We use the Agency for Healthcare Research and Quality (AHRQ) 2000 Healthcare Cost and Utilization Project Kids' Inpatient Database (HCUP-KID), which includes over 2.5 million admissions for youth age 20 and under to 2,784 hospitals in 27 states in the year 2000. Specifically, this analysis estimates the national number of admissions, mean total charges, and mean lengths of stay for adolescents between the ages of 12 and 17 admitted to an acute care hospital for the following diagnostic categories from the AHRQ's Clinical Classifications Software: "alcohol-related mental disorders" and "substance-related mental disorders". Frequency and percentage of total admissions were calculated for the demographic variables of age, gender and income, and for the hospital characteristic variables of urban/rural designation and children's hospital designation. Results: SA admissions represented 1.25 percent of adolescent admissions to acute care hospitals. Nearly 90 percent of the admissions occurred in non-children's hospitals. Most were for drug dependence (38%) or non-dependent use of alcohol or drugs (35%). Costs were highest for drug dependence admissions. Nearly half of admissions had comorbid mental health diagnoses. Higher rates of admission were seen in boys, in older adolescents, and in "self-pay" patients. Alcohol and drug rehabilitation/detoxification, alone or in combination with psychological and psychiatric evaluation and therapy, was documented for 38 percent of admissions. Over 50 percent of cases had no documentation of treatment specific to substance use behavior

  16. NUMERICAL SIMULATION FOR A PROCESS ANALYSIS OF A COKE OVEN

    Institute of Scientific and Technical Information of China (English)

    Zhancheng Guo; Huiqing Tang

    2005-01-01

    A computational fluid dynamics model is established for analysing the coking process in a coke oven using the PHOENICS CFD package. The model simultaneously calculates the transient composition, the temperatures of the gas and solid phases, the velocity of the gas phase, and the porosity and density of the semi-coke phase. Numerical simulation is illustrated by predicting the evolution of volatile gases, gas flow paths, and the profiles of density and porosity of the coke oven charge and of the temperatures of the coke oven gas and the semi-coke bed. On the basis of this model, the flow of coke oven gas (COG) blown from the bottom of the coke oven into the porous semi-coke bed is simulated to reveal whether, and when, the blown COG can flow uniformly through the bed for the purpose of desulfurizing the semi-coke by recycling the COG. The simulation results show that the blown COG can flow uniformly through the semi-coke bed only after the temperature at the center of the bed has risen above 900 ℃.

  17. Axisymmetric Plume Simulations with NASA's DSMC Analysis Code

    Science.gov (United States)

    Stewart, B. D.; Lumpkin, F. E., III

    2012-01-01

    A comparison of axisymmetric Direct Simulation Monte Carlo (DSMC) Analysis Code (DAC) results to analytic and Computational Fluid Dynamics (CFD) solutions in the near continuum regime, and to 3D DAC solutions in the rarefied regime, for expansion plumes into a vacuum is performed to investigate the validity of the newest DAC axisymmetric implementation. This new implementation, based on the standard DSMC axisymmetric approach where the representative molecules are allowed to move in all three dimensions but are rotated back to the plane of symmetry by the end of the move step, has been fully integrated into the 3D-based DAC code and therefore retains all of DAC's features, such as the ability to compute flow over complex geometries and to model chemistry. Axisymmetric DAC results for a spherically symmetric isentropic expansion are in very good agreement with a source-flow analytic solution in the continuum regime and show departure from equilibrium downstream of the estimated breakdown location. Axisymmetric density contours also compare favorably against CFD results for the R1E thruster, while temperature contours depart from equilibrium very rapidly away from the estimated breakdown surface. Finally, axisymmetric and 3D DAC results are in very good agreement over the entire plume region and, as expected, the new axisymmetric implementation shows a significant reduction in the computer resources required to achieve accurate simulations of this problem relative to the 3D simulations.
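
The axisymmetric move step described above (molecules move in full 3D, then are rotated back to the symmetry plane) can be sketched as follows; this is a generic textbook version with assumed coordinate conventions, not DAC source code.

```python
import math

# Axisymmetric DSMC move step: advance a molecule in 3D starting from the
# z = 0 half-plane, then rotate position and velocity about the symmetry
# (x) axis back onto that plane. Generic sketch, not DAC source code.

def move_axisymmetric(x, r, vx, vr, vt, dt):
    """Advance one molecule; returns updated (x, r, vx, vr, vt),
    where r is the radial coordinate and vr/vt the radial/azimuthal velocity."""
    # free-molecular 3D move starting in the z = 0 plane
    x_new = x + vx * dt
    y_new = r + vr * dt
    z_new = vt * dt
    r_new = math.hypot(y_new, z_new)
    if r_new == 0.0:
        return x_new, 0.0, vx, vr, vt
    # rotation about x that maps (y_new, z_new) back onto (r_new, 0)
    cos_a, sin_a = y_new / r_new, z_new / r_new
    vr_new = cos_a * vr + sin_a * vt
    vt_new = -sin_a * vr + cos_a * vt
    return x_new, r_new, vx, vr_new, vt_new
```

The rotation preserves speed while converting part of the azimuthal motion into radial motion, which is how the scheme captures the apparent centrifugal drift without storing an azimuthal coordinate.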

  18. Dimensional analysis, similarity, analogy, and the simulation theory

    Energy Technology Data Exchange (ETDEWEB)

    Davis, A.A.

    1978-01-01

    Dimensional analysis, similarity, analogy, and cybernetics are shown to be four consecutive steps in the application of the simulation theory. The paper introduces the classes of phenomena which follow the same formal mathematical equations as models of the natural laws, and shows how the interior sphere of restraints groups phenomena for which simplified nondimensional mathematical equations can be introduced. Simulation by similarity within a single field of physics, by analogy across two or more fields of physics, and by cybernetics across two or more fields of mathematics, physics, biology, economics, politics, sociology, etc., appears as a unified theory which permits one to transfer experimental results from models, suitably selected to meet the conditions of the research, construction, and measurement in the laboratory, to the originals which are the primary objects of the research. Some unavoidable conclusions concerning the use of simplified nondimensional mathematical equations as models of natural laws are presented, and limitations on the use of simulation theory based on assumed simplifications are recognized. The paper argues that scientific research requires writing mathematical models of general laws that apply to nature in its entirety, and proposes extending the second law of thermodynamics, as a generalized law of entropy, to model life and its activities. It concludes that the physical study and philosophical interpretation of phenomena and natural laws cannot be separated in scientific work; they are interconnected, and neither can be placed above the other.
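
Simulation by similarity rests on matching nondimensional groups between the model and the original. A minimal sketch for Reynolds-number similarity (all values illustrative):

```python
# Reynolds similarity: a scale model reproduces the original's flow regime
# when the governing nondimensional group matches. Values are illustrative.

def reynolds(rho, v, length, mu):
    """Reynolds number rho*v*L/mu for density rho, speed v, length L, viscosity mu."""
    return rho * v * length / mu

# full-scale body in water: L = 1 m at 2 m/s
re_full = reynolds(1000.0, 2.0, 1.0, 1.0e-3)
# 1:10 scale model in the same fluid: velocity must scale up 10x to match Re
re_model = reynolds(1000.0, 20.0, 0.1, 1.0e-3)
```

Matching the group (here, Re = 2e6) is what licenses transporting measured results from the laboratory model back to the original.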

  19. Quantitative descriptive sensory analysis of buffalo meat from animals fed with a diet containing different amounts of vitamin E

    Directory of Open Access Journals (Sweden)

    M.P. Graziani

    2010-02-01

    The objective of our study is the sensory characterisation of buffalo meat from animals fed different diets. The sensory evaluation was carried out on frozen rump meat samples from 12 animals: a control group (N=4) received a normal diet, while a low-vitamin-E (LVE) group (N=4) and a high-vitamin-E (HVE) group (N=4) received diets with different amounts of vitamin E. The sensory profiles of the different samples were obtained by Quantitative Descriptive Analysis. Sensory evaluation was initially carried out on the raw meat. The samples were then cooked in an electric oven to an inner temperature of 80°C, and the slices were cut into squared pieces of 2x2 cm for tasting. A detailed analysis of the profiles for each attribute shows that the “tenderness”, “juiciness” and “visible fat” of the LVE group are significantly greater when compared to the Control and HVE groups, while the HVE animal rump has higher “cohesivity” and “colour uniformity” values compared to the LVE animal rump.

  20. A database of high-impact weather events in Greece: a descriptive impact analysis for the period 2001–2011

    Directory of Open Access Journals (Sweden)

    K. Papagiannaki

    2013-03-01

    Full Text Available This paper introduces the development of a database of high-impact weather events that have occurred in Greece since 2001. The selected events are related to the occurrence of floods, flash floods, hail, snow/frost, tornados, windstorms, heat waves and lightning with adverse consequences (excluding those related to agriculture). The database includes, among others, the geographical distribution of the recorded events, relevant meteorological data, a brief description of the induced impacts and references in the press. The paper further offers an extensive analysis of the temporal and spatial distribution of high-impact weather events for the period 2001–2011, taking into account the intensity of the weather conditions and their impact on society. Analysis of the monthly distribution showed that such events are most frequent during October and November. More than 80 people lost their lives, half of them due to flash floods. Regarding the spatial distribution, among the 51 prefectures of the country, Attica, Thessaloniki, Elia and Halkidiki were the most frequently affected, mainly by flash floods. The shares of tornados in Elia, of windstorms in Attica, of lightning and hail events in Halkidiki, and of snow/frost events in Thessaloniki were also significant.

  1. A data integration approach for cell cycle analysis oriented to model simulation in systems biology

    Directory of Open Access Journals (Sweden)

    Mosca Ettore

    2007-08-01

    Full Text Available Abstract Background The cell cycle is one of the biological processes most frequently investigated in systems biology studies, and it involves the knowledge of a large number of genes and networks of protein interactions. A deep knowledge of the molecular aspects of this biological process can contribute to making cancer research more accurate and innovative. In this context the mathematical modelling of the cell cycle has a relevant role in quantifying the behaviour of each component of the system. The mathematical modelling of a biological process such as the cell cycle allows a systemic description that helps to highlight features, such as emergent properties, which could be hidden when the analysis is performed only from a reductionist point of view. Moreover, in modelling complex systems, a complete annotation of all the components is equally important to understand the interaction mechanisms inside the network: for this reason data integration of the model components has high relevance in systems biology studies. Description In this work, we present a resource, the Cell Cycle Database, intended to support systems biology analysis of the cell cycle process in two organisms, yeast and mammals. The database integrates information about genes and proteins involved in the cell cycle process, stores complete models of the interaction networks and allows the mathematical simulation over time of the quantitative behaviour of each component. To accomplish this task, we developed a web interface for browsing information related to cell cycle genes, proteins and mathematical models. In this framework, we have implemented a pipeline which allows users to deal with the mathematical part of the models, in order to solve, using different variables, the ordinary differential equation systems that describe the biological process.
Conclusion This integrated system is freely available in order to support systems biology research on the cell cycle and

  2. Posterior tibial tendon dysfunction and flatfoot: analysis with simulated walking.

    Science.gov (United States)

    Watanabe, Kota; Kitaoka, Harold B; Fujii, Tadashi; Crevoisier, Xavier; Berglund, Lawrence J; Zhao, Kristin D; Kaufman, Kenton R; An, Kai-Nan

    2013-02-01

    Many biomechanical studies have investigated the pathology of flatfoot and the effects of operations on it, but the majority of cadaveric studies are limited to the quasistatic response to static joint loads. This study examined the unconstrained joint motion of the foot and ankle during the stance phase, utilizing a dynamic foot-ankle simulator to reproduce stage 2 posterior tibial tendon dysfunction (PTTD). Muscle forces were applied to the extrinsic tendons of the foot using six servo-pneumatic cylinders to simulate their action. Vertical and fore-aft shear forces were applied, and tibial advancement was performed with servomotors. Three-dimensional movements of multiple bones of the foot were monitored with a magnetic tracking system. Twenty-two fresh-frozen lower extremities were studied in the intact condition, and then after sectioning the peritalar constraints and unloading the posterior tibial muscle force to create a flatfoot. Kinematics in the intact condition were consistent with gait analysis data for normal subjects. Kinematics were altered in the flatfoot condition, particularly in the coronal and transverse planes. Calcaneal eversion relative to the tibia averaged 11.1±2.8°, compared with 5.8±2.3° in the normal condition. Calcaneal-tibial external rotation was significantly increased in flatfeet, from a mean of 2.3±1.7° to 8.1±4.0°. There were also significant changes in metatarsal-tibial eversion and external rotation in the flatfoot condition. The simulated PTTD with flatfoot was consistent with previous data obtained in patients with PTTD. The use of this flatfoot model will enable more detailed study of the flatfoot condition and the effects of surgical treatment.

  3. Dynamic simulation tools for the analysis and optimization of novel collection, filtration and sample preparation systems

    Energy Technology Data Exchange (ETDEWEB)

    Clague, D; Weisgraber, T; Rockway, J; McBride, K

    2006-02-12

    The focus of the research effort described here is to develop novel simulation tools to address design and optimization needs in the general class of problems that involve species and fluid (liquid and gas phase) transport through sieving media. This was primarily motivated by the heightened attention on Chem/Bio early-detection systems, which, among other needs, require high-efficiency filtration, collection and sample preparation systems. Hence, the stated goal was to develop the computational analysis tools necessary to optimize these critical operations. This new capability is designed to characterize system efficiencies based on the details of the microstructure and environmental effects. To accomplish this, new lattice Boltzmann simulation capabilities were developed to include detailed microstructure descriptions, the relevant surface forces that mediate species capture and release, and temperature effects for both liquid- and gas-phase systems. While developing the capability, actual demonstration and model systems (and subsystems) of national and programmatic interest were targeted to demonstrate it. As a result, where possible, experimental verification of the computational capability was performed, either directly using Digital Particle Image Velocimetry or against published results.
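The lattice Boltzmann approach mentioned above can be illustrated with a deliberately minimal one-dimensional sketch. This is a toy diffusion model, not the transport code described in the abstract; the grid size, relaxation parameter and initial pulse are all invented for illustration.

```python
# Minimal one-dimensional lattice Boltzmann sketch (D1Q2) for pure diffusion.
# Two particle populations stream in opposite directions and relax toward a
# local equilibrium; the macroscopic density is their sum.
N = 64
omega = 1.0                              # relaxation parameter (0 < omega < 2)
f_plus = [0.0] * N                       # population streaming right
f_minus = [0.0] * N                      # population streaming left
f_plus[N // 2] = f_minus[N // 2] = 0.5   # unit-mass pulse in the middle

for _ in range(200):
    # Collision: relax both populations toward the local equilibrium rho/2.
    for i in range(N):
        rho = f_plus[i] + f_minus[i]
        f_plus[i] += omega * (rho / 2 - f_plus[i])
        f_minus[i] += omega * (rho / 2 - f_minus[i])
    # Streaming with periodic boundaries.
    f_plus = [f_plus[(i - 1) % N] for i in range(N)]
    f_minus = [f_minus[(i + 1) % N] for i in range(N)]

density = [f_plus[i] + f_minus[i] for i in range(N)]
print(f"total mass {sum(density):.6f}")  # mass is conserved (up to rounding)
```

The collision/streaming split shown here is the structural core that production lattice Boltzmann codes extend with realistic geometry, surface forces and thermal effects.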

  4. Probability theory versus simulation of petroleum potential in play analysis

    Science.gov (United States)

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. One objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed-form solution for all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
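The analytic-versus-simulation comparison can be sketched with a toy play model (all numbers below are hypothetical, not from the paper): the closed-form unconditional mean, E[R] = P(presence) · E[size | presence], is checked against a Monte Carlo estimate of the same quantity.

```python
import math
import random

# Hypothetical play parameters: probability that hydrocarbons are present,
# and a lognormal size distribution given presence (arbitrary units).
p_presence = 0.3
mu, sigma = 2.0, 0.8          # parameters of ln(size)

# Analytic (closed-form) mean of the unconditional resource.
analytic_mean = p_presence * math.exp(mu + sigma**2 / 2)

# Monte Carlo estimate of the same quantity.
random.seed(42)
n = 200_000
total = 0.0
for _ in range(n):
    if random.random() < p_presence:
        total += random.lognormvariate(mu, sigma)
mc_mean = total / n

print(f"analytic {analytic_mean:.3f}  monte-carlo {mc_mean:.3f}")
```

The closed form is exact and instantaneous, while the simulation only converges at the usual 1/√n rate, which is the efficiency argument the abstract makes for the analytic method.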

  5. Dynamic Simulation and Analysis of Human Walking Mechanism

    Science.gov (United States)

    Azahari, Athirah; Siswanto, W. A.; Ngali, M. Z.; Salleh, S. Md.; Yusup, Eliza M.

    2017-01-01

    Behaviour such as gait or posture may reflect a person's physiological condition during daily activities. The characteristics of the human gait cycle phases are among the important parameters used to describe human movement, whether the gait is normal or abnormal. This research investigates four types of crouch walking (upright, interpolated, crouched and severe) by a simulation approach. The assessment is conducted by examining the parameters of the hamstring muscle, knee joint and ankle joint. The analysis results show that, from a gait analysis perspective, crouch walking exhibits a weak pattern of walking and posture. A short hamstring and the knee joint are the factors that most influence crouch walking, owing to the excessive hip flexion that typically accompanies knee flexion.

  6. Structural Simulations and Conservation Analysis - Historic Building Information Model (HBIM)

    Directory of Open Access Journals (Sweden)

    C. Dore

    2015-02-01

    Full Text Available In this paper the findings to date of the Historic Building Information Model (HBIM) of the Four Courts in Dublin are presented. The HBIM forms the basis for both structural and conservation analysis to measure the impact of war damage that still affects the building. A laser scan survey of the internal and external structure was carried out in the summer of 2014. After registration and processing of the laser scan survey, the HBIM of the damaged section of the building was created, and it is presented as two separate workflows in this paper. The first is the model created from historic data; the second is a procedural and segmented model developed from the laser scan survey of the war-damaged drum and dome. From both models, structural damage and decay simulations will be developed for documentation and conservation analysis.

  7. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    Science.gov (United States)

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com.
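The kind of per-frame trajectory analysis the abstract describes can be illustrated without the library itself: below, a plain-Python RMSD computation over two hypothetical coordinate frames stands in for what MDAnalysis would expose as NumPy arrays read from a trajectory (all coordinates are invented).

```python
import math

# Two hypothetical frames of 3 atoms each (x, y, z), standing in for the
# coordinate arrays an MDAnalysis atom selection would provide per frame.
frame_a = [(0.0, 0.0, 0.0), (1.5, 0.0, 0.0), (0.0, 1.5, 0.0)]
frame_b = [(0.1, 0.0, 0.0), (1.4, 0.1, 0.0), (0.0, 1.5, 0.2)]

def rmsd(a, b):
    """Root-mean-square deviation between two equal-length coordinate sets."""
    n = len(a)
    sq = sum((ax - bx) ** 2 + (ay - by) ** 2 + (az - bz) ** 2
             for (ax, ay, az), (bx, by, bz) in zip(a, b))
    return math.sqrt(sq / n)

print(f"RMSD = {rmsd(frame_a, frame_b):.3f} Å")
```

In MDAnalysis itself the frames would come from iterating over a `Universe` trajectory and the atom subset from a selection string, with the heavy numerics delegated to NumPy.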

  8. The PandaRoot framework for simulation, reconstruction and analysis

    Science.gov (United States)

    Spataro, Stefano; PANDA Collaboration

    2011-12-01

    The PANDA experiment at the future facility FAIR will study antiproton-proton and antiproton-nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the ROOT and Virtual Monte Carlo packages with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker, a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman filter (the genfit package), using GEANE as the track follower. The track is then correlated with the PID detectors (e.g. Cherenkov detectors, EM calorimeter or muon chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are the Rho analysis tools framework, the kinematic fitter package for vertex- and mass-constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot has also been tested on an Alien-based GRID infrastructure. This contribution reports on the status of PandaRoot and shows some example results for the analysis of physics benchmark channels.

  9. HEAT TRANSFER EXPERIMENTS AND ANALYSIS OF A SIMULATED HTS CABLE

    Energy Technology Data Exchange (ETDEWEB)

    Demko, J. A. [Oak Ridge National Laboratory (ORNL); Duckworth, R. C. [Oak Ridge National Laboratory (ORNL); Gouge, M. J. [Oak Ridge National Laboratory (ORNL); Knoll, D. [Oak Ridge National Laboratory (ORNL)

    2010-01-01

    Long-length high temperature superconducting (HTS) cable projects, over 1 km, are being designed that are cooled by flowing liquid nitrogen. The compact counter-flow cooling arrangement, which has the supply and return streams in a single cryostat, offers several advantages, including the smallest space requirement, the least heat load, and reduced cost, since a return cryostat is not required. One issue in long-length HTS cable systems is the magnitude of the heat transfer radially through the cable. It is extremely difficult to instrument an HTS cable in service on the grid with the needed thermometry because of the issues associated with installing thermometers on high-voltage components. A 5-meter long test system has been built that simulates a counter-flow cooled HTS cable using a heated tube to simulate the cable. Measurements of the temperatures in the flow stream and on the tube wall can be made and compared to analysis. These data can be used to benchmark different HTS cable heat transfer and fluid flow analysis approaches.

  10. Heat Transfer Experiments and Analysis of a Simulated HTS Cable

    Energy Technology Data Exchange (ETDEWEB)

    Demko, Jonathan A [ORNL; Duckworth, Robert C [ORNL; Gouge, Michael J [ORNL; Knoll, David [Ultera – A Southwire / nkt cables Joint Venture

    2010-01-01

    Long-length high temperature superconducting (HTS) cable projects, over 1 km, are being designed that are cooled by flowing liquid nitrogen. The compact counter-flow cooling arrangement which has the supply and return stream in a single cryostat offers several advantages including smallest space requirement, least heat load, and reduced cost since a return cryostat is not required. One issue in long length HTS cable systems is the magnitude of the heat transfer radially through the cable. It is extremely difficult to instrument an HTS cable in service on the grid with the needed thermometry because of the issues associated with installing thermometers on high voltage components. A 5-meter long test system has been built that simulates a counter-flow cooled, HTS cable using a heated tube to simulate the cable. Measurements of the temperatures in the flow stream and on the tube wall are presented and compared to analysis. These data can be used to benchmark different HTS cable heat transfer and fluid flow analysis approaches.
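The scale of the temperature rise from a radial heat load can be illustrated with a back-of-the-envelope energy balance along the stream. Apart from the 5 m length, every number below (heat load, flow rate, heat capacity) is an assumption for illustration, not a value from the paper.

```python
# Energy balance for a heated tube standing in for the HTS cable: a liquid
# nitrogen stream picks up a uniform radial heat load along its length.
length = 5.0          # m, as in the test rig
q_per_m = 2.0         # W/m, assumed radial heat leak into the stream
m_dot = 0.05          # kg/s, assumed LN2 mass flow
cp = 2040.0           # J/(kg K), approximate for liquid nitrogen near 77 K
t_in = 77.0           # K, inlet temperature

# March along the tube; each step adds q_per_m * dx of heat to the stream.
n_steps = 100
dx = length / n_steps
t = t_in
profile = [t]
for _ in range(n_steps):
    t += q_per_m * dx / (m_dot * cp)
    profile.append(t)

delta_t = profile[-1] - t_in
print(f"outlet temperature rise: {delta_t * 1000:.1f} mK")
```

For a uniform load the marched result reduces to the closed form ΔT = qL/(ṁ·cp); the stepwise form is only useful once the load or properties vary along the cable, which is where benchmark data like those in the paper come in.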

  11. LOOS: an extensible platform for the structural analysis of simulations.

    Science.gov (United States)

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library) as an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++, makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and MacOS X, but is written to be portable and should work on most Unix-based platforms.

  12. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Full Text Available Abstract Background The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to any system built using Gromacs. We also provide an R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we also previously revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.
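The pairwise decomposition at the heart of force distribution analysis can be illustrated with a toy harmonic bond network: instead of only the net force on each atom, one records the scalar force carried by each interacting pair. The coordinates, spring constants and rest lengths below are invented for illustration.

```python
import math

# A small 2D network of three "atoms" connected by harmonic bonds.
atoms = {0: (0.0, 0.0), 1: (1.2, 0.0), 2: (0.0, 0.9)}
bonds = {(0, 1): (100.0, 1.0),   # (spring constant k, rest length r0)
         (0, 2): (100.0, 1.0),
         (1, 2): (50.0, 1.5)}

# Scalar pairwise force for each bond: -k * (r - r0).
# Negative = attractive (bond stretched), positive = repulsive (compressed).
pair_forces = {}
for (i, j), (k, r0) in bonds.items():
    (xi, yi), (xj, yj) = atoms[i], atoms[j]
    r = math.hypot(xj - xi, yj - yi)
    pair_forces[(i, j)] = -k * (r - r0)

for pair, f in sorted(pair_forces.items()):
    print(pair, f"{f:+.2f}")
```

Tracking how these pairwise values change between two states (e.g. with and without a bound ligand) is, in spirit, how FDA maps strain-propagation pathways; the real method averages pairwise forces over MD trajectories of the full force field.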

  13. GRAFTED - GRAphical Fault Tree EDitor: A Fault Tree Description Program For Target Vulnerability/Survivability Analysis. User Manual

    Science.gov (United States)

    1993-12-01

    9: Fault Tree Description of Generic Missile With Power Module. 3.4 Summary The remainder of the Generic Missile model will not be described in full...if the changes should be saved, and the Fault Tree Description files re-compiled. The next time GRAFTED is run from this directory, the Generic Missile model will

  14. The frequency of, and adherence to, single maintenance and reliever therapy instructions in asthma: a descriptive analysis.

    Science.gov (United States)

    DiSantostefano, Rachael L; Boudiaf, Nada; Stempel, David A; Barnes, Neil C; Greening, Andrew P

    2016-07-21

    Inhaled corticosteroid/long-acting β2-agonist (ICS/LABA) fixed-dose combinations are recommended regular maintenance options for asthma. ICS/LABAs containing formoterol may also be indicated for single maintenance and reliever therapy (SMART). This analysis evaluated the frequency of SMART dosing of budesonide/formoterol fixed-dose combination (BFC) in the United Kingdom. Secondary objectives were to assess adherence and use of short-acting β2-agonists (SABAs). This was a descriptive analysis of treatment patterns using the UK Clinical Practice Research Datalink-GP OnLine Database data (2009-2013). SMART dosing was determined when prescription instructions contained guidance for daily dosing plus 'and when required'. Treatment and prescription refill patterns of BFC and SABA were described in the year following the index date to identify adherence and SMART dosing instructions versus other dosing regimens. Of 14,818 patients identified, 173 (1.2%) had evidence of prescriptions for SMART dosing at their index BFC prescription. Despite being prescribed SMART dosing, 91 of 173 patients (53%) were additionally dispensed SABA in the year following the index date. The mean number of BFC inhalers used was less than required for daily treatment for SMART and non-SMART dosing groups (4.7 and 4.8, respectively). This analysis suggests that SMART dosing is infrequent when examining dosing instructions. Therefore, results of randomised clinical trials using SMART dosing may not translate to clinical practice in the United Kingdom because of the low level of SMART prescription, concurrent use of SABA, and inadequate refill persistence observed. Further research is needed to understand SMART dosing in real-world clinical practice.

  15. A new version of the CNRM Chemistry-Climate Model, CNRM-CCM: description and improvements from the CCMVal-2 simulations

    Directory of Open Access Journals (Sweden)

    M. Michou

    2011-10-01

    Full Text Available This paper presents a new version of the Météo-France CNRM Chemistry-Climate Model, referred to as CNRM-CCM. It includes some fundamental changes from the previous version (CNRM-ACM), which was extensively evaluated in the context of the CCMVal-2 validation activity. The most notable changes concern the radiative code of the GCM and the on-line inclusion within the GCM of the detailed stratospheric chemistry of our chemistry-transport model MOCAGE. A 47-yr transient simulation (1960–2006) is the basis of our analysis. CNRM-CCM generates satisfactory dynamical and chemical fields in the stratosphere. Several shortcomings of the CNRM-ACM simulations for CCMVal-2, which resulted from an erroneous representation of the impact of volcanic aerosols as well as from transport deficiencies, have been eliminated.

    Remaining problems concern the upper stratosphere (5 to 1 hPa), where temperatures are too high and where there are biases in the NO2, N2O5 and O3 mixing ratios. In contrast, temperatures at the tropical tropopause are too cold. These issues are addressed through the implementation of a more accurate radiation scheme at short wavelengths. Despite these problems, we show that this new CNRM CCM is a useful tool for studying chemistry-climate applications.

  16. Simulation and Flexibility Analysis of Milk Production Process

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    In this work, a process simulation method is used to simulate a pasteurised market milk production line. A commercial process simulation tool, Pro/II from Simulation Science Inc., is used in the simulation work, together with a new model to calculate the thermal properties of milk. A simulator is obtained for the milk production line. Using the simulator, different milk processing situations can be quantitatively investigated, such as the production of different products, capacity changes, fat content changes in raw milk, and energy costs at different operating conditions. As the pasteurised market milk production line involves typical milk processing steps, such as pasteurisation, centrifugal separation and standardisation, the simulator can be modified to simulate similar milk processing lines. In many cases, a rapidly changing market requires a flexible milk production line...

  17. Simulation and Flexibility Analysis of Milk Production Process

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    In this work, a process simulation method is used to simulate a pasteurised market milk production line. A commercial process simulation tool, Pro/II from Simulation Science Inc., is used in the simulation work, together with a new model to calculate the thermal properties of milk. A simulator is obtained for the milk production line. Using the simulator, different milk processing situations can be quantitatively investigated, such as the production of different products, capacity changes, fat content changes in raw milk, and energy costs at different operating conditions. Different processing conditions are investigated through the simulator, and a flexible operation range, or ‘operation window’, is obtained from the simulation for the milk production line. The study gives both the operation feasibility and the detailed operation cost.

  18. Descriptive analysis of context evaluation instrument for technical oral presentation skills evaluation: A case study in English technical communication course

    Science.gov (United States)

    Mohamed, Abdullah-Adnan; Asmawi, Adelina; Hamid, Mohd Rashid Ab; Mustafa, Zainol bin

    2015-02-01

    This paper reports a pilot study of Context Evaluation using a self-developed questionnaire distributed among engineering undergraduates at the university under study. The study aims to validate the self-developed questionnaire used in the Context evaluation, a component of the CIPP Model. The Context evaluation assesses background information on the needs, assets, problems and opportunities relevant to the beneficiaries of the study in a defined environment. Through the questionnaire, background information for the assessment of needs, assets and problems related to the engineering undergraduates' perceptions of the teaching and learning of technical oral presentation skills was collected and analysed. The questionnaire was developed using a 5-point Likert scale to measure the constructs under study. It was distributed to 100 respondents, of whom 79 returned it. The respondents were engineering undergraduates from various faculties at one technical university in Malaysia. The descriptive analysis found high scores for each item making up the Context evaluation construct. This implies that the engineering undergraduates showed high interest in the teaching and learning of technical oral presentation skills, so their needs are met; they also agreed that the assets and facilities are conducive to their learning. In conclusion, the needs and assets factors in the Context evaluation are both considerably important: the students' needs are met, and the assets and facilities support their technical oral presentation skills learning experience.

  19. Sensory descriptive quantitative analysis of unpasteurized and pasteurized juçara pulp (Euterpe edulis) during long-term storage.

    Science.gov (United States)

    da Silva, Paula Porrelli Moreira; Casemiro, Renata Cristina; Zillo, Rafaela Rebessi; de Camargo, Adriano Costa; Prospero, Evanilda Teresinha Perissinotto; Spoto, Marta Helena Fillet

    2014-07-01

    This study evaluated the effect of pasteurization followed by storage under different conditions on the sensory attributes of frozen juçara pulp, using quantitative descriptive analysis (QDA). Pasteurization of the packed frozen pulp was performed by immersion in a stainless steel tank containing water at 80°C for 5 min, followed by storage under refrigerated or frozen conditions. A trained sensory panel evaluated the samples (at 6°C) on days 1, 15, 30, 45, 60, 75, and 90. Sensory attributes were grouped as follows: appearance (foamy, heterogeneous, purple, brown, oily, and creamy), aroma (sweet and fermented), taste (astringent, bitter, and sweet), and texture (oily and consistent), and compared to a reference material. In general, unpasteurized frozen pulp showed the highest score for foamy appearance, and pasteurized samples showed the highest scores for creamy appearance. Pasteurized samples remained stable with regard to brown color development, while their unpasteurized counterparts showed an increase; color is an important attribute related to product identity. All attributes related to taste and texture remained constant during storage for all samples. Pasteurization followed by storage under frozen conditions proved to be the best conservation method, as samples submitted to this process received the best sensory evaluation, being described as foamy, slightly heterogeneous, slightly bitter, and slightly astringent.

  20. Student Teachers’ Beliefs about Teaching and Their Sense of Self-Efficacy: A Descriptive and Comparative Analysis

    Directory of Open Access Journals (Sweden)

    Oğuz GÜRBÜZTÜRK

    2009-02-01

    Full Text Available This study investigates student teachers' traditional versus constructivist educational beliefs and their sense of self-efficacy by several variables (gender, grade, and department), and examines the association between the two. The population of the study comprised 3,817 student teachers (1,822 female, 1,955 male) in the Faculty of Education at İnönü University during the first semester of the 2007-2008 academic year. The sample of the study comprised 411 students chosen using a proportional stratified sampling technique. Participants were given the “Teachers Belief Survey” and the “Teachers' Sense of Efficacy Scale”. The data obtained were analyzed using descriptive statistical techniques, t-test, ANOVA, Kruskal-Wallis, LSD, Mann-Whitney U and Pearson correlation. The analysis revealed that participants' professional self-efficacy levels were moderately above average and that they held both constructivist and traditional beliefs, the former being moderately more dominant. The comparisons between independent groups (gender, grade, and department) gave some results partly consistent with the relevant literature. A positive correlation was also found between constructivist teacher beliefs and self-efficacy beliefs about student engagement, and between traditional teacher beliefs and self-efficacy beliefs about class management, instruction, and overall self-efficacy.

  1. Exploring the Concern about Food Allergies among Secondary School and University Students in Ontario, Canada: A Descriptive Analysis

    Directory of Open Access Journals (Sweden)

    Shannon E. Majowicz

    2017-01-01

    Full Text Available Our objective was to explore the perceived risk of food allergies among students in Ontario, Canada. We analyzed blinding questions (“I am concerned about food allergies”; “food allergies are currently a big threat to my health”) from three existing food safety surveys given to high school and university undergraduate students (n=3,451) circa February 2015, using descriptive analysis, and explored how concern related to demographics and self-reported cooking ability using linear regression. Overall, high school students were neutral in their concern, although Food and Nutrition students specifically were significantly less concerned (p=0.002) than high school students overall. University undergraduates were moderately unconcerned about food allergies. Concern was highest in younger students, decreasing between 13 and 18 years of age and plateauing between 19 and 23 years. Among students aged 13–18 years, concern was higher among those who worked or volunteered in a daycare and those who had previously taken a food preparation course. Among students aged 19–23 years, concern was higher among females and those with less advanced cooking abilities. Concern was significantly correlated with perceiving food allergies as a personal threat. This study offers a first exploration of the perceived risk of food allergies in this demographic and can guide future, more rigorous assessments.
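A linear-regression step of the kind the abstract describes (relating concern to age) can be sketched with ordinary least squares; the (age, concern-score) pairs below are invented for illustration and are not the survey data.

```python
# Hypothetical (age, mean concern score) pairs on a 5-point scale.
data = [(13, 3.4), (14, 3.2), (15, 3.1), (16, 2.8), (17, 2.7), (18, 2.5)]

# Ordinary least squares: slope = S_xy / S_xx, intercept from the means.
n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in data)
         / sum((x - mean_x) ** 2 for x, _ in data))
intercept = mean_y - slope * mean_x

print(f"concern ≈ {intercept:.2f} {slope:+.3f} * age")
```

A negative slope here corresponds to the reported pattern of concern decreasing with age through the high school years; a real analysis would of course add the other demographic covariates.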

  2. Nurses' attitudes and knowledge regarding organ and tissue donation and transplantation in a provincial hospital: A descriptive and multivariate analysis.

    Science.gov (United States)

    Lomero, Maria Del Mar; Jiménez-Herrera, María F; Rasero, Maria José; Sandiumenge, Alberto

    2017-09-01

    The attitudes and knowledge of nursing personnel regarding organ and tissue donation can influence the decision to donate. This study aimed to determine these two factors among nurses at a district hospital in Barcelona, Spain. A survey was carried out using a 35-item questionnaire. Results were subjected to descriptive and comparative statistical analyses, using bivariate and multivariate analyses to examine the relation between demographic data and attitudes toward donation. The completion rate was 68.2%, with 98.6% of those responding stating that they were in favor of organ donation. The respondents were unsure as to whether the criteria for inclusion in transplant waiting lists were appropriate (57.5%), whereas 72.2% agreed that brain death is equivalent to death. The bivariate analysis revealed a significant association between a positive attitude toward donation and working on the permanent night shift and having no religious beliefs. Although attitudes toward donation among the nurses participating in the study were generally positive, when a negative attitude does exist it affects significant aspects such as belief in the diagnosis of brain death or the criteria for inclusion on the waiting list, amongst others, which reflects that specific training in donation focused on nurses continues to be needed. © 2017 John Wiley & Sons Australia, Ltd.

  3. Effect of preservative addition on sensory and dynamic profile of Lucanian dry-sausages as assessed by quantitative descriptive analysis and temporal dominance of sensations.

    Science.gov (United States)

    Braghieri, Ada; Piazzolla, Nicoletta; Galgano, Fernanda; Condelli, Nicola; De Rosa, Giuseppe; Napolitano, Fabio

    2016-12-01

    The quantitative descriptive analysis (QDA) was combined with temporal dominance of sensations (TDS) to assess the sensory properties of Lucanian dry-sausages either added with nitrate, nitrite and l-ascorbic acid (NS), or not (NNS). Both QDA and TDS differentiated the two groups of sausages. NNS products were perceived with higher intensity of hardness (P […] description and differentiation of Lucanian sausages.

  4. Analysis of errors occurring in large eddy simulation.

    Science.gov (United States)

    Geurts, Bernard J

    2009-07-28

    We analyse the effect of second- and fourth-order accurate central finite-volume discretizations on the outcome of large eddy simulations of homogeneous, isotropic, decaying turbulence at an initial Taylor-Reynolds number Re(lambda)=100. We determine the implicit filter that is induced by the spatial discretization and show that a higher order discretization also induces a higher order filter, i.e. a low-pass filter that keeps a wider range of flow scales virtually unchanged. The effectiveness of the implicit filtering is correlated with the optimal refinement strategy as observed in an error-landscape analysis based on Smagorinsky's subfilter model. As a point of reference, a finite-volume method that is second-order accurate for both the convective and the viscous fluxes in the Navier-Stokes equations is used. We observe that changing to a fourth-order accurate convective discretization leads to a higher value of the Smagorinsky coefficient C(S) required to achieve minimal total error at given resolution. Conversely, changing only the viscous flux discretization to fourth-order accuracy implies that optimal simulation results are obtained at lower values of C(S). Finally, a fully fourth-order discretization yields an optimal C(S) that is slightly lower than the reference fully second-order method.
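
    The "implicit filter" idea above can be illustrated numerically with the standard modified-wavenumber formulas for central difference schemes (textbook results, not the paper's own error-landscape analysis): a higher-order scheme leaves a wider band of scales essentially untouched.

```python
import numpy as np

# Modified wavenumbers of 2nd- and 4th-order central differences on a
# uniform grid; the exact derivative of exp(i k x) corresponds to k itself.
kd = np.linspace(0.01, np.pi, 200)            # wavenumber * grid spacing
mod2 = np.sin(kd)                             # 2nd-order central scheme
mod4 = (8 * np.sin(kd) - np.sin(2 * kd)) / 6  # 4th-order central scheme

err2 = np.abs(mod2 - kd) / kd                 # relative error vs exact
err4 = np.abs(mod4 - kd) / kd
# widest range of scales resolved to within 1% relative error
res2 = kd[err2 < 0.01].max()
res4 = kd[err4 < 0.01].max()
print(f"2nd order: k*dx < {res2:.3f};  4th order: k*dx < {res4:.3f}")
```

    The fourth-order scheme's 1% band extends roughly three times further in k·dx, which is one concrete sense in which it "induces a higher order filter".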

  5. Proteomic analysis of zebrafish embryos exposed to simulated-microgravity

    Science.gov (United States)

    Hang, Xiaoming; Ma, Wenwen; Wang, Wei; Liu, Cong; Sun, Yeqing

    Microgravity can induce a series of physiological and pathological changes in the human body, such as cardiovascular functional disorder, bone loss, muscular atrophy and impaired immune system function. In this research, we focus on the influence of microgravity on vertebrate embryo development. As a powerful model for studying vertebrate development, zebrafish embryos at 8 hpf (hours post-fertilization) and 24 hpf were placed into a NASA-developed bioreactor (RCCS) to simulate microgravity for 64 and 48 hours, respectively. The same number of control embryos from the same parents were placed in a tissue culture dish at the same temperature of 28 °C. Each experiment was repeated 3 times and analyzed by two-dimensional (2-D) gel electrophoresis. Image analysis of silver-stained 2-D gels revealed that 64 of 292 total protein spots showed quantitative and qualitative variations that were significant (P […] protein spots with significant expression alteration (P […] proteins, 3 down-regulated proteins were identified as bectin 2, centrosomal protein of 135 kDa and tropomyosin 4, while the up-regulated protein was identified as creatine kinase muscle B. Other protein spots showing significant expression alteration will be identified successively, and the expression of the corresponding genes will also be measured by Q-PCR at different development stages. The data presented in this study illustrate that microgravity can significantly alter the expression of proteins involved in bone and muscle formation in zebrafish embryos. Key Words: Danio rerio; Simulated-microgravity; Proteomics

  6. Simulation of springback and microstructural analysis of dual phase steels

    Science.gov (United States)

    Kalyan, T. Sri.; Wei, Xing; Mendiguren, Joseba; Rolfe, Bernard

    2013-12-01

    With increasing demand for weight reduction and better crashworthiness in car development, advanced high strength Dual Phase (DP) steels have been progressively used in automotive parts. These higher strength steels exhibit greater springback and lower dimensional accuracy after stamping, which has necessitated simulating each stamped component prior to production to estimate the part's dimensional accuracy. Understanding the micro-mechanical behaviour of AHSS sheet may add accuracy to stamping simulations. This work is divided into two parts: first, modelling a standard channel forming process; second, modelling the microstructure of the process. The standard top hat channel forming process, benchmark NUMISHEET'93, is used for investigating the springback of WISCO Dual Phase steels. The second part of this work comprises finite element analysis of microstructures to understand the behaviour of the multi-phase steel at a more fundamental level. The outcomes of this work will help in the dimensional control of steels during the manufacturing stage based on the material's microstructure.

  7. Direct Numerical Simulation of Combustion Using Principal Component Analysis

    Science.gov (United States)

    Owoyele, Opeoluwa; Echekki, Tarek

    2016-11-01

    We investigate the potential of accelerating chemistry integration during the direct numerical simulation (DNS) of complex fuels based on the transport equations of representative scalars that span the desired composition space using principal component analysis (PCA). The transported principal components (PCs) offer significant potential to reduce the computational cost of DNS through a reduction in the number of transported scalars, as well as in the spatial and temporal resolution requirements. The strategy is demonstrated using DNS of a premixed methane-air flame in a 2D vortical flow and is extended to a 3D geometry to further demonstrate the computational efficiency of PC transport. The PCs are derived from a priori PCA of a subset of the full thermo-chemical scalars' vector. The PCs' chemical source terms and transport properties are constructed and tabulated in terms of the PCs using artificial neural networks (ANN). Comparison of DNS based on the full thermo-chemical state with DNS based on transport of 6 PCs shows excellent agreement, even for species that are not included in the PCA reduction. The transported PCs reproduce some of the salient features of strongly curved and strongly strained flames. The 2D DNS results also show a significant reduction, of two orders of magnitude, in the computational cost of the simulations, which enables an extension of the PCA approach to 3D DNS under similar computational requirements. This work was supported by the National Science Foundation Grant DMS-1217200.
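
    The a priori PCA step described above — finding a few directions that capture most of the variance of a larger scalar set — can be sketched with a plain SVD. The synthetic data below stand in for sampled thermo-chemical states; they are not DNS output.

```python
import numpy as np

# Synthetic "composition" samples whose variance lives in 3 latent directions,
# mimicking a scalar set that is well represented by a few PCs.
rng = np.random.default_rng(1)
n_samples, n_scalars = 1000, 20
latent = rng.normal(size=(n_samples, 3))
mixing = rng.normal(size=(3, n_scalars))
Y = latent @ mixing + 0.01 * rng.normal(size=(n_samples, n_scalars))

Yc = Y - Y.mean(axis=0)                     # center each scalar
U, s, Vt = np.linalg.svd(Yc, full_matrices=False)
var_explained = s**2 / np.sum(s**2)

n_pc = 3
pcs = Yc @ Vt[:n_pc].T                      # PC scores (the transported scalars)
print(f"variance captured by {n_pc} PCs: {var_explained[:n_pc].sum():.4f}")
```

    In the PC-transport approach, it is these few score variables (here 6 in the paper, 3 in the toy example) that carry the simulation, with source terms tabulated against them.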

  8. Information system analysis of an e-learning system used for dental restorations simulation.

    Science.gov (United States)

    Bogdan, Crenguţa M; Popovici, Dorin M

    2012-09-01

    The goal of using virtual and augmented reality technologies in therapeutic interventions simulation, in the fixed prosthodontics (VirDenT) project, is to increase the quality of the educational process in dental faculties, by assisting students in learning how to prepare teeth for all-ceramic restorations. Its main component is an e-learning virtual reality-based software system that will be used for the developing skills in grinding teeth, needed in all-ceramic restorations. The complexity of the domain problem that the software system dealt with made the analysis of the information system supported by VirDenT necessary. The analysis contains the following activities: identification and classification of the system stakeholders, description of the business processes, formulation of the business rules, and modelling of business objects. During this stage, we constructed the context diagram, the business use case diagram, the activity diagrams and the class diagram of the domain model. These models are useful for the further development of the software system that implements the VirDenT information system. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Simulated annealing with probabilistic analysis for solving traveling salesman problems

    Science.gov (United States)

    Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan

    2013-09-01

    Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process of recrystallization of metals; the efficiency of SA is therefore highly affected by the annealing schedule. In this paper, we present an empirical study to identify a suitable annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and we thus propose the best-found annealing schedule based on the post hoc test. SA was tested on seven selected benchmark problems of symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically alongside benchmark solutions and simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides good quality solutions.
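
    The dependence on the annealing schedule is easy to see in a minimal SA implementation for a symmetric TSP. The instance, the 2-opt neighbourhood and the geometric cooling constants below are illustrative choices, not the schedule selected in the study.

```python
import math, random

# Random symmetric TSP instance in the unit square.
random.seed(42)
cities = [(random.random(), random.random()) for _ in range(30)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal(t0=1.0, alpha=0.995, steps=15000):
    """SA with 2-opt moves and a geometric cooling schedule t <- alpha * t."""
    tour = list(range(len(cities)))
    cur = best = tour_length(tour)
    best_tour = tour[:]
    t = t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(cities)), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        delta = tour_length(cand) - cur
        # Metropolis acceptance: always take improvements, sometimes worse moves
        if delta < 0 or random.random() < math.exp(-delta / t):
            tour, cur = cand, cur + delta
            if cur < best:
                best, best_tour = cur, tour[:]
        t *= alpha
    return best_tour, best

tour, length = anneal()
print(f"best tour length: {length:.3f}")
```

    Varying `t0`, `alpha` and `steps` and comparing the resulting tour lengths across repeated runs is exactly the kind of schedule comparison the study formalises with a randomized complete block design.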

  10. Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations

    DEFF Research Database (Denmark)

    Kamran, Faisal; Andersen, Peter E.

    2015-01-01

    Oblique incidence reflectometry has developed into an effective, noncontact, and noninvasive measurement technology for the quantification of both the reduced scattering and absorption coefficients of a sample. The optical properties are deduced by analyzing only the shape of the reflectance profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish the small changes between different levels of scattering. We present various regions of the dynamic range of optical properties in which system demands vary to be able to detect subtle changes in the structure of the medium, translated as measured optical properties. Effects of variation in anisotropy are discussed and results presented. Finally, experimental data of milk products with different fat content are considered…
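
    A toy Monte Carlo photon-transport model shows why the reflectance profile shape encodes scattering. The sketch below uses normal incidence, isotropic scattering and simple weight absorption in a semi-infinite medium — a deliberately simplified stand-in for the oblique-incidence, anisotropic model used in the study; all coefficients are illustrative.

```python
import math, random

def mean_exit_radius(mu_s, mu_a=0.5, n_photons=4000, seed=3):
    """Mean radial distance from the entry point at which photons re-emerge."""
    rnd = random.Random(seed)
    mu_t = mu_s + mu_a
    total_r = total_w = 0.0
    for _ in range(n_photons):
        x = y = z = 0.0
        ux, uy, uz = 0.0, 0.0, 1.0           # launched straight down
        w = 1.0
        while w > 1e-3:                      # crude cutoff (no Russian roulette)
            s = -math.log(1.0 - rnd.random()) / mu_t   # free path length
            x, y, z = x + ux * s, y + uy * s, z + uz * s
            if z < 0:                        # photon crossed back out the surface
                total_r += w * math.hypot(x, y)
                total_w += w
                break
            w *= mu_s / mu_t                 # deposit the absorbed fraction
            cos_t = 2 * rnd.random() - 1     # isotropic new direction
            sin_t = math.sqrt(1 - cos_t * cos_t)
            phi = 2 * math.pi * rnd.random()
            ux, uy, uz = sin_t * math.cos(phi), sin_t * math.sin(phi), cos_t
    return total_r / total_w

r_low, r_high = mean_exit_radius(mu_s=5.0), mean_exit_radius(mu_s=20.0)
print(f"mean exit radius: mu_s=5 -> {r_low:.3f}, mu_s=20 -> {r_high:.3f}")
```

    Stronger scattering pulls the re-emission profile in toward the entry point, which is the basic sensitivity the article quantifies much more carefully.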

  11. Analysis of Boundary Conditions for Crystal Defect Atomistic Simulations

    Science.gov (United States)

    Ehrlacher, V.; Ortner, C.; Shapeev, A. V.

    2016-12-01

    Numerical simulations of crystal defects are necessarily restricted to finite computational domains, supplying artificial boundary conditions that emulate the effect of embedding the defect in an effectively infinite crystalline environment. This work develops a rigorous framework within which the accuracy of different types of boundary conditions can be precisely assessed. We formulate the equilibration of crystal defects as variational problems in a discrete energy space and establish qualitatively sharp regularity estimates for minimisers. Using this foundation we then present rigorous error estimates for (i) a truncation method (Dirichlet boundary conditions), (ii) periodic boundary conditions, (iii) boundary conditions from linear elasticity, and (iv) boundary conditions from nonlinear elasticity. Numerical results confirm the sharpness of the analysis.

  12. Integrated simulation and data envelopment analysis models in emergency department

    Science.gov (United States)

    Aminuddin, Wan Malissa Wan Mohd; Ismail, Wan Rosmanira

    2016-11-01

    This study aims to determine the best resource allocation and to increase the service efficiency of an emergency department in a public hospital in Kuala Lumpur. We integrate Discrete Event Simulation (DES) and three models of Data Envelopment Analysis (DEA): the input-oriented CCR model, the input-oriented BCC model and the Super-Efficiency model. Based on a comparison of the results from the DEA models, the combination of DES, the input-oriented BCC model and the Super-Efficiency BCC model is seen to be the best resource allocation technique for enhancing hospital efficiency. The combination reduced patients' waiting time while improving the average utilization rate of hospital resources compared to the current situation.
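
    The input-oriented CCR model mentioned above is a small linear program per decision-making unit. The sketch below solves its envelopment form with `scipy.optimize.linprog` on a textbook-style toy data set (two inputs, one output); the numbers are invented, not the hospital's.

```python
import numpy as np
from scipy.optimize import linprog

# Toy DMUs: rows are units, columns are inputs (e.g. staff, beds); one output.
X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0], [2.0, 4.0]])
Y = np.array([[1.0], [1.0], [1.0], [1.0], [1.0]])

def ccr_input_oriented(o):
    """Efficiency theta of unit o: min theta s.t. the lambda-composite
    uses at most theta * inputs of o and at least the outputs of o."""
    n, m = X.shape
    s = Y.shape[1]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # variables: [theta, lambdas]
    A_in = np.hstack([-X[o].reshape(m, 1), X.T])    # sum lam*x - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])     # -sum lam*y <= -y_o
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[o]]),
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

scores = [ccr_input_oriented(o) for o in range(len(X))]
print([round(t, 3) for t in scores])
```

    The BCC variant used in the study adds the convexity constraint sum(lambda) = 1, and super-efficiency simply drops the evaluated unit from the reference set, allowing scores above 1 for efficient units.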

  13. Motion Simulation Analysis of Rail Weld CNC Fine Milling Machine

    Science.gov (United States)

    Mao, Huajie; Shu, Min; Li, Chao; Zhang, Baojun

    The CNC fine milling machine is a new advanced piece of equipment for rail weld precision machining, with high precision, high efficiency, low environmental pollution and other technical advantages. The motion performance of this machine directly affects its machining accuracy and stability, which makes it an important consideration in its design. Based on the design drawings, this article completed 3D modeling of a 60 kg/m rail weld CNC fine milling machine using Solidworks. The geometry was then imported into Adams to carry out the motion simulation analysis. The displacement, velocity, angular velocity and other kinematical parameter curves of the main components were obtained in post-processing; these provide a scientific basis for the design and development of this machine.

  14. Irrigation water policy analysis using a business simulation game

    Science.gov (United States)

    Buchholz, M.; Holst, G.; Musshoff, O.

    2016-10-01

    Despite numerous studies on farmers' responses to changing irrigation water policies, uncertainties remain about the potential of water pricing schemes and water quotas to reduce irrigation. Thus far, policy impact analysis has predominantly been based upon rational choice models that rest on strong behavioral assumptions, such as a perfectly rational profit-maximizing decision maker. Also, econometric techniques are applied which can lack internal validity due to uncontrolled field data. Furthermore, such techniques are not capable of identifying ill-designed policies prior to their implementation. With this in mind, we apply a business simulation game for ex ante policy impact analysis of irrigation water policies at the farm level. Our approach has the potential to reveal the policy-induced behavioral change of the participants in a controlled environment. To do so, we investigate how real farmers from Germany, in an economic experiment, respond to a water pricing scheme and a water quota intended to reduce irrigation. In the business simulation game, the participants manage a "virtual" cash-crop farm for which they make crop allocation and irrigation decisions over several production periods, while facing uncertain product prices and weather conditions. The results reveal that a water quota is able to reduce mean irrigation applications, while a water pricing scheme does not have an impact, even though both policies exhibit equal income effects for the farmers. However, both policies appear to increase the variation of irrigation applications. Compared to a perfectly rational profit-maximizing decision maker, the participants apply less irrigation on average, both when irrigation is not restricted and when a water pricing scheme applies. Moreover, the participants' risk attitude affects the irrigation decisions.

  15. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    Science.gov (United States)

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and software.
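
    The del factor at the heart of such sterilisation design is the integral of the Arrhenius death-rate constant over the batch temperature history, del = ln(N0/N) = ∫ k(T(t)) dt. The sketch below integrates a piecewise-linear heat–hold–cool profile by the midpoint rule; the profile and the kinetic constants are illustrative values of typical spore-death magnitude, not the paper's.

```python
import math

A = 3e37          # Arrhenius frequency factor, 1/min (illustrative)
E = 2.83e5        # activation energy, J/mol (illustrative)
R = 8.314         # gas constant, J/(mol K)

def k(T_celsius):
    """First-order death-rate constant at the given temperature."""
    return A * math.exp(-E / (R * (T_celsius + 273.15)))

# piecewise-linear batch profile: heat 30 -> 121 C, hold, cool back to 30 C
profile = [(0, 30), (20, 121), (35, 121), (55, 30)]   # (minute, deg C)

def del_factor(profile, dt=0.01):
    total, (t0, T0) = 0.0, profile[0]
    for t1, T1 in profile[1:]:
        n = round((t1 - t0) / dt)
        for i in range(n):
            T = T0 + (T1 - T0) * (i + 0.5) / n   # midpoint temperature
            total += k(T) * dt
        t0, T0 = t1, T1
    return total

print(f"del factor: {del_factor(profile):.2f}")
```

    Because k(T) rises roughly tenfold per ~10 C near 121 C, the hold phase dominates the integral while the heating and cooling ramps each contribute only about one unit of del — which is why the hold time is the main design lever.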

  16. Hamiltonian replica exchange combined with elastic network analysis to enhance global domain motions in atomistic molecular dynamics simulations.

    Science.gov (United States)

    Ostermeir, Katja; Zacharias, Martin

    2014-12-01

    Coarse-grained elastic network models (ENM) of proteins offer a low-resolution representation of protein dynamics and directions of global mobility. A Hamiltonian-replica exchange molecular dynamics (H-REMD) approach has been developed that combines information extracted from an ENM analysis with atomistic explicit solvent MD simulations. Based on a set of centers representing rigid segments (centroids) of a protein, a distance-dependent biasing potential is constructed by means of an ENM analysis to promote and guide centroid/domain rearrangements. The biasing potentials are added with different magnitude to the force field description of the MD simulation along the replicas with one reference replica under the control of the original force field. The magnitude and the form of the biasing potentials are adapted during the simulation based on the average sampled conformation to reach a near constant biasing in each replica after equilibration. This allows for canonical sampling of conformational states in each replica. The application of the methodology to a two-domain segment of the glycoprotein 130 and to the protein cyanovirin-N indicates significantly enhanced global domain motions and improved conformational sampling compared with conventional MD simulations.
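
    The core ingredient above — a distance-dependent harmonic bias between domain centroids whose magnitude is scaled along the replica ladder, with the reference replica unbiased — can be sketched in a few lines. The geometry, force constant and ladder values below are invented for illustration, not those of the study.

```python
import numpy as np

def bias_energy_and_force(r_a, r_b, r0, k_bias, scale):
    """Harmonic centroid-distance bias U = 0.5*scale*k*(d - r0)^2 and the
    resulting force on centroid A (B feels the opposite force)."""
    d_vec = r_b - r_a
    d = np.linalg.norm(d_vec)
    energy = 0.5 * scale * k_bias * (d - r0) ** 2
    f_on_a = scale * k_bias * (d - r0) * d_vec / d   # restores d toward r0
    return energy, f_on_a

r_a = np.array([0.0, 0.0, 0.0])          # centroid of domain A
r_b = np.array([3.0, 0.0, 0.0])          # centroid of domain B
scales = [0.0, 0.25, 0.5, 1.0]           # replica ladder; replica 0 unbiased
for s in scales:
    e, f = bias_energy_and_force(r_a, r_b, r0=4.0, k_bias=2.0, scale=s)
    print(f"scale {s}: E = {e:.3f}, Fx on A = {f[0]:.3f}")
```

    With the centroids 3 units apart and a target separation of 4, the biased replicas push domain A away from B (negative x-force), and the push grows linearly with the replica's scale factor — the mechanism by which higher replicas promote domain rearrangements.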

  17. Analysis of ground-motion simulation big data

    Science.gov (United States)

    Maeda, T.; Fujiwara, H.

    2016-12-01

    We developed a parallel distributed processing system which applies big data analysis to large-scale ground motion simulation data. The system uses ground-motion index values and earthquake scenario parameters as input. We used the peak ground velocity value and velocity response spectra as the ground-motion index. The ground-motion index values are calculated from our simulation data. We used simulated long-period ground motion waveforms at about 80,000 meshes, calculated by a three-dimensional finite difference method based on 369 earthquake scenarios of a great earthquake in the Nankai Trough. These scenarios were constructed by considering the uncertainty of source model parameters such as source area, rupture starting point, asperity location, rupture velocity, fmax and slip function. We used these parameters as the earthquake scenario parameters. The system first carries out the clustering of the earthquake scenarios in each mesh by the k-means method. The number of clusters is determined in advance using hierarchical clustering by Ward's method. The scenario clustering results are converted to a 1-D feature vector whose dimension is the number of scenario combinations. If two scenarios belong to the same cluster, the component of the feature vector is 1; otherwise the component is 0. The feature vector shows the `response' of a mesh to the assumed earthquake scenario group. Next, the system performs the clustering of the meshes by the k-means method using the feature vector of each mesh previously obtained. Here the number of clusters is arbitrarily given. The clustering of scenarios and meshes is performed by parallel distributed processing with Hadoop and Spark, respectively. In this study, we divided the meshes into 20 clusters. The meshes in each cluster are geometrically concentrated. Thus this system can extract regions in which the meshes have a similar `response' as clusters. For each cluster, it is possible to determine…
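
    The two-stage clustering above can be sketched at toy scale in plain numpy: cluster the scenarios within each mesh, encode each mesh as a binary pairwise co-membership vector, then cluster the meshes on those vectors. The sizes and ground-motion values are invented stand-ins for the study's 80,000 meshes and 369 scenarios, and a deterministic two-means replaces the distributed Hadoop/Spark k-means.

```python
import numpy as np

rng = np.random.default_rng(7)

def kmeans2(X, iters=20):
    """Two-means with deterministic farthest-point initialisation."""
    centers = np.stack([X[0], X[np.argmax(((X - X[0]) ** 2).sum(-1))]])
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in (0, 1):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Toy peak-ground-velocity table: 12 meshes x 8 scenarios, two response types.
resp_a = np.array([3, 3, 3, 3, 1, 1, 1, 1], float)   # scenarios 0-3 strong
resp_b = np.array([3, 1, 3, 1, 3, 1, 3, 1], float)   # alternating response
pgv = np.vstack([resp_a + rng.normal(0, 0.05, 8) for _ in range(6)] +
                [resp_b + rng.normal(0, 0.05, 8) for _ in range(6)])

features = []
for mesh in pgv:                           # stage 1: cluster scenarios per mesh
    lab = kmeans2(mesh.reshape(-1, 1))
    same = lab[:, None] == lab[None, :]    # pairwise co-membership matrix
    features.append(same[np.triu_indices(8, k=1)].astype(float))

mesh_labels = kmeans2(np.array(features))  # stage 2: cluster the meshes
print(mesh_labels)
```

    Meshes with the same scenario co-membership pattern end up in the same mesh cluster, mirroring how the full system extracts regions whose meshes "respond" alike to the scenario group.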

  18. Space Debris Attitude Simulation - IOTA (In-Orbit Tumbling Analysis)

    Science.gov (United States)

    Kanzler, R.; Schildknecht, T.; Lips, T.; Fritsche, B.; Silha, J.; Krag, H.

    Today, there is little knowledge on the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to an accurate estimate of motion vector orientations and magnitude. Especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative or contingency scenarios for ESA spacecraft like ENVISAT, such knowledge is needed. The In-Orbit Tumbling Analysis tool (IOTA) is a prototype software, currently in development within the framework of ESA's “Debris Attitude Motion Measurements and Modelling” project (ESA Contract No. 40000112447), which is led by the Astronomical Institute of the University of Bern (AIUB). The project goal is to achieve a good understanding of the attitude evolution and the considerable internal and external effects which occur. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF) and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). Developed by Hyperschall Technologie Göttingen GmbH (HTG), IOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees-of-freedom) of spacecraft in Earth orbit. The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micro meteoroid impact, as well as the optional definition of particular spacecraft specific influences like tank sloshing, reaction wheel behaviour

  19. A new computational approach to cracks quantification from 2D image analysis: Application to micro-cracks description in rocks

    Science.gov (United States)

    Arena, Alessio; Delle Piane, Claudio; Sarout, Joel

    2014-05-01

    In this paper we propose a crack quantification method based on 2D image analysis. The technique is applied to gray-level Scanning Electron Microscope (SEM) images, which are segmented and converted into black-and-white (B/W) images using the Trainable Segmentation plugin of Fiji. The resulting images are processed using a novel Matlab script composed of three algorithms: the separation algorithm, the filtering and quantification algorithm, and the orientation algorithm. Initially the input image is enhanced via 5 morphological processes. The resulting lattice is “cut” into single cracks using 1-pixel-wide bisector lines originating from every node. Cracks are labeled using the connected-component method, then the script computes geometrical parameters such as width, length, area, aspect ratio and orientation. Filtering is performed using a user-defined value of aspect ratio, followed by a statistical analysis of the remaining cracks. In the last part of this paper we discuss the efficiency of this script, introducing an example analysis of two datasets of different size and resolution; these analyses are performed using both a notebook and a high-end professional desktop solution, in order to simulate different working environments.
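
    The label-measure-filter core of such a pipeline can be sketched in Python with `scipy.ndimage` standing in for the Matlab script: label connected crack pixels in a binary image, measure simple geometry per feature, and keep only elongated ones. The test image and the aspect-ratio cutoff are invented for illustration.

```python
import numpy as np
from scipy import ndimage

# Toy binary (B/W) image: one long thin crack and one compact blob.
img = np.zeros((20, 20), dtype=int)
img[5, 2:18] = 1          # 16-pixel horizontal crack
img[10:13, 10:13] = 1     # 3x3 blob (not crack-like)

labels, n = ndimage.label(img)             # connected-component labelling
kept = []
for lab, obj_slice in enumerate(ndimage.find_objects(labels), start=1):
    h = obj_slice[0].stop - obj_slice[0].start     # bounding-box height
    w = obj_slice[1].stop - obj_slice[1].start     # bounding-box width
    length, width = max(h, w), min(h, w)
    area = int((labels[obj_slice] == lab).sum())
    if length / width >= 4.0:              # user-defined aspect-ratio filter
        kept.append({"label": lab, "length": length,
                     "width": width, "area": area})
print(kept)
```

    Here the bounding box is a crude proxy for length and width; the paper's script measures these per crack after the bisector "cut", which matters for branched or oblique cracks, but the filter-by-aspect-ratio logic is the same.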

  20. Microstructure-based analysis and simulation of flow and mass transfer in chromatographic stationary phases

    Science.gov (United States)

    Koku, Harun

    […]-dimensional pore size distributions obtained for the CIM disk using image processing algorithms were found to deviate significantly from the manufacturer-reported experimental mercury intrusion results, the difference being attributed to the local nature of the image-based methods or to assumptions and limitations inherent in the experimental mercury intrusion method. A probe placement algorithm was introduced to estimate solute capacity from the explicit geometry of the monolith. To enable a precise description of both the flow and the geometry for a rigorous analysis of dispersion, a three-dimensional sample block of the CIM disk was reconstructed using serial imaging and sectioning, and the flow and mass transfer were simulated using a lattice-Boltzmann method and a particle-based random-walk method, respectively. Flow simulations hinted at the partitioning of flow into high- and low-velocity regions, and the dispersion simulations obtained on top of the velocity field revealed artifacts in particle trajectories due to the symmetry of the lateral flow with respect to the periodic boundaries. Constraining the simulation length to reduce the effect of this symmetry yielded dispersion behavior suggestive of channeling, hinting that the sample geometry might not be representative of the macroscopic structure. Simulations of the local behavior of finite particles predicted the experimentally observed entrapment behavior, as well as the increase of the entrapment level with flow rate. Analysis of trajectories provided support for a previously hypothesized mechanism for entrapment.