WorldWideScience

Sample records for description analysis simulation

  1. An activity theory perspective of how scenario-based simulations support learning: a descriptive analysis.

    Science.gov (United States)

    Battista, Alexis

    2017-01-01

The dominant frameworks for describing how simulations support learning emphasize increasing access to structured practice and the provision of feedback, which are commonly associated with skills-based simulations. By contrast, studies examining student participants' experiences during scenario-based simulations suggest that learning may also occur through participation. However, studies directly examining student participation during scenario-based simulations are limited. This study examined the types of activities student participants engaged in during scenario-based simulations and then analyzed their patterns of activity to consider how participation may support learning. Drawing from Engeström's first-, second-, and third-generation activity systems analysis, an in-depth descriptive analysis was conducted. The study drew from multiple qualitative methods, namely narrative, video, and activity systems analysis, to examine student participants' activities and interaction patterns across four video-recorded simulations depicting common motivations for using scenario-based simulations (e.g., communication, critical patient management). The activity systems analysis revealed that student participants' activities encompassed three clinically relevant categories: (a) use of physical clinical tools and artifacts, (b) social interactions, and (c) performance of structured interventions. Role assignment influenced participants' activities and the complexity of their engagement. Importantly, participants made sense of the clinical situation presented in the scenario by reflexively linking these three activities together. Specifically, student participants performed structured interventions, relying upon physical tools, clinical artifacts, and social interactions among students, standardized patients, and other simulated participants to achieve their goals. When multiple student participants were present, such as in a

  2. Multi-Level Simulated Fault Injection for Data Dependent Reliability Analysis of RTL Circuit Descriptions

    Directory of Open Access Journals (Sweden)

    NIMARA, S.

    2016-02-01

This paper proposes a data-dependent reliability evaluation methodology for digital systems described at Register Transfer Level (RTL). It uses a hybrid hierarchical approach, combining the accuracy provided by Gate Level (GL) Simulated Fault Injection (SFI) and the low simulation overhead required by RTL fault injection. The methodology comprises the following steps: correct simulation of the RTL system according to a set of input vectors; hierarchical decomposition of the system into basic RTL blocks; logic synthesis of the basic RTL blocks; data-dependent SFI for the GL netlists; and RTL SFI. The proposed methodology has been validated in terms of accuracy on a medium-sized circuit, the parallel comparator used in the Check Node Unit (CNU) of Low-Density Parity-Check (LDPC) decoders. The methodology has also been applied to the reliability analysis of a 128-bit Advanced Encryption Standard (AES) crypto-core, for which GL simulation was prohibitive in terms of required computational resources.
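
The fault-injection idea in this record can be illustrated with a toy example. The sketch below is not the paper's methodology or tool chain; it treats a 4-bit equality comparator as a tiny gate-level netlist, injects a hypothetical stuck-at-0 fault on one internal XNOR output, and estimates how often random input data makes the fault visible at the output.

```python
# Toy data-dependent simulated fault injection: compare a golden and a
# faulty run of a 4-bit equality comparator over random input vectors.
import random

random.seed(7)
WIDTH, VECTORS = 4, 10_000

def comparator(a_bits, b_bits, stuck_at_zero_bit=None):
    # Per-bit XNOR, then AND-reduce; optionally force one XNOR output to 0
    # to model a stuck-at-0 fault on that internal net.
    xnors = [1 - (ai ^ bi) for ai, bi in zip(a_bits, b_bits)]
    if stuck_at_zero_bit is not None:
        xnors[stuck_at_zero_bit] = 0
    out = 1
    for v in xnors:
        out &= v
    return out

mismatches = 0
for _ in range(VECTORS):
    a = [random.randint(0, 1) for _ in range(WIDTH)]
    b = [random.randint(0, 1) for _ in range(WIDTH)]
    if comparator(a, b) != comparator(a, b, stuck_at_zero_bit=0):
        mismatches += 1

rate = mismatches / VECTORS
print(f"fault visible on {rate:.1%} of random vectors")
```

Because the fault only matters when the golden output is 1 (i.e., the inputs are equal), the observed rate depends entirely on the input data distribution, which is the point of data-dependent SFI.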

  3. Operationalizing Healthcare Simulation Psychological Safety: A Descriptive Analysis of an Intervention.

    Science.gov (United States)

    Henricksen, Jared W; Altenburg, Catherine; Reeder, Ron W

    2017-10-01

Despite efforts to prepare a psychologically safe environment, simulation participants are occasionally psychologically distressed. Instructing simulation educators about participant psychological risks and having a participant psychological distress action plan available may assist educators as they seek to keep all participants psychologically safe. A Simulation Participant Psychological Safety Algorithm was designed to aid simulation educators as they debrief simulation participants perceived to have psychological distress, and to categorize these events as mild (level 1), moderate (level 2), or severe (level 3). A prebrief dedicated to creating a psychologically safe learning environment was held constant. The algorithm was used for 18 months in an active pediatric simulation program. Data collected included the level of participant psychological distress as perceived and categorized by the simulation team using the algorithm, the type of simulation that participants went through, who debriefed, and the timing of when psychological distress was perceived to occur during the simulation session. The Kruskal-Wallis test was used to evaluate the relationship between events and simulation type, events and the simulation educator team who debriefed, and the timing of events during the simulation session. A total of 3900 participants went through 399 simulation sessions between August 1, 2014, and January 26, 2016. Thirty-four simulation participants from 27 sessions (7%) were perceived to have an event. One participant was perceived to have a severe (level 3) psychological distress event. Events occurred more commonly in high-intensity simulations, with novice learners, and with specific educator teams. Simulation type and simulation educator team were associated with the occurrence of events. Severe psychological distress as perceived by simulation personnel using the Simulation Participant Psychological Safety Algorithm is rare, with mild and moderate events being more common. The algorithm was used to teach

  4. Descriptive data analysis.

    Science.gov (United States)

    Thompson, Cheryl Bagley

    2009-01-01

This 13th article of the Basics of Research series is the first in a short series on statistical analysis. These articles will discuss creating your statistical analysis plan, levels of measurement, descriptive statistics, probability theory, inferential statistics, and general considerations for interpretation of the results of a statistical analysis.
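
As a minimal illustration of the descriptive statistics the series introduces, the following sketch summarizes a small hypothetical sample using only the Python standard library (the data values are invented).

```python
# Descriptive statistics for a small hypothetical sample of heart rates.
import statistics

heart_rates = [72, 75, 71, 80, 69, 74, 78, 73, 70, 76]  # beats per minute

n = len(heart_rates)
mean = statistics.mean(heart_rates)
median = statistics.median(heart_rates)
stdev = statistics.stdev(heart_rates)            # sample SD (n - 1 in the denominator)
data_range = max(heart_rates) - min(heart_rates)

print(f"n={n} mean={mean:.1f} median={median:.1f} sd={stdev:.2f} range={data_range}")
```

Note the mean and median are close here; a large gap between them is a quick visual cue that the sample is skewed.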

  5. Multidimensional nonlinear descriptive analysis

    CERN Document Server

    Nishisato, Shizuhiko

    2006-01-01

    Quantification of categorical, or non-numerical, data is a problem that scientists face across a wide range of disciplines. Exploring data analysis in various areas of research, such as the social sciences and biology, Multidimensional Nonlinear Descriptive Analysis presents methods for analyzing categorical data that are not necessarily sampled randomly from a normal population and often involve nonlinear relations. This reference not only provides an overview of multidimensional nonlinear descriptive analysis (MUNDA) of discrete data, it also offers new results in a variety of fields. The first part of the book covers conceptual and technical preliminaries needed to understand the data analysis in subsequent chapters. The next two parts contain applications of MUNDA to diverse data types, with each chapter devoted to one type of categorical data, a brief historical comment, and basic skills peculiar to the data types. The final part examines several problems and then concludes with suggestions for futu...

  6. The Total In-Flight Simulator (TIFS) aerodynamics and systems: Description and analysis. [maneuver control and gust alleviators

    Science.gov (United States)

    Andrisani, D., II; Daughaday, H.; Dittenhauser, J.; Rynaski, E.

    1978-01-01

The aerodynamics, control system, instrumentation complement, and recording system of the USAF Total In-Flight Simulator (TIFS) airplane are described. Emphasis is placed on a control system that would allow the ailerons to be operated collectively as well as differentially, to enhance the ability of the vehicle to perform the dual function of maneuver load control and gust alleviation. Mathematical predictions of the rigid-body and flexible equations of longitudinal motion using the Level 2.01 FLEXSTAB program are included, along with a definition of the vehicle geometry, the mass and stiffness distributions, the calculated mode frequencies and mode shapes, and the resulting aerodynamic equations of motion of the flexible vehicle. A complete description of the control and instrumentation system of the aircraft is presented, including analysis, ground test, and flight data comparisons of the performance and bandwidth of the aerodynamic surface servos. Proposed modifications for improved servo performance are also presented.

  7. The College of Anaesthetists of Ireland Simulation Training programme: a descriptive report and analysis of course participants' feedback.

    Science.gov (United States)

    Cafferkey, Aine; Coyle, Elizabeth; Greaney, David; Harte, Sinead; Hayes, Niamh; Langdon, Miriam; Straub, Birgitt; Burlacu, Crina

    2018-03-20

Simulation-based education is a modern training modality that allows healthcare professionals to develop knowledge and practice skills in a safe learning environment. The College of Anaesthetists of Ireland (CAI) was the first Irish postgraduate medical training body to introduce mandatory simulation training into its curriculum. Extensive quality assurance and improvement data have been collected on all simulation courses to date. The aim was to describe The College of Anaesthetists of Ireland Simulation Training (CAST) programme and report an analysis of course participants' feedback. A retrospective review of feedback forms from four simulation courses from March 2010 to August 2016 took place. Qualitative and quantitative data from 1069 participants who attended 112 courses were analysed. Feedback was very positive overall. Course content and delivery were deemed appropriate. Participants agreed that course participation would influence their future practice. A statistically significant difference between courses was found. The findings support the role of simulation training in specialist anaesthesia training in Ireland.

  8. System description and analysis. Part 1: Feasibility study for helicopter/VTOL wide-angle simulation image generation display system

    Science.gov (United States)

    1977-01-01

A preliminary design for a helicopter/VTOL wide-angle simulator image generation display system is studied. The visual system is to become part of a simulator capability supporting Army aviation systems research and development in the near term. Because the Army requires the simulator to represent a wide range of aircraft characteristics, versatility and ease of changing cockpit configurations were primary considerations of the study. Owing to the Army's interest in low-altitude flight and descents into and landings in constrained areas, particular emphasis is given to wide field of view, resolution, brightness, contrast, and color. The visual display study includes a preliminary design, demonstrated feasibility of advanced concepts, and a plan for subsequent detailed design and development. Analysis and tradeoff considerations for the various visual system elements are outlined and discussed.

  9. Simulation framework and XML detector description for the CMS experiment

    CERN Document Server

    Arce, P; Boccali, T; Case, M; de Roeck, A; Lara, V; Liendl, M; Nikitenko, A N; Schröder, M; Strässner, A; Wellisch, H P; Wenzel, H

    2003-01-01

Currently, CMS event simulation is based on GEANT3, while the detector description is built from different sources for simulation and reconstruction. A new simulation framework based on GEANT4 is under development. A full description of the detector is available, and the tuning of GEANT4 performance and the checking of the ability of the physics processes to describe the detector response are ongoing. Its integration into the CMS mass production system and GRID is also currently under development. The Detector Description Database (DDD) project aims at providing a common source of information for Simulation, Reconstruction, Analysis, and Visualisation, while allowing for different representations as well as specific information for each application. A functional prototype, based on XML, has already been released. Examples of the integration of DDD into the GEANT4 simulation and into the reconstruction applications are also provided.

  10. SIMON. A computer program for reliability and statistical analysis using Monte Carlo simulation. Program description and manual

    International Nuclear Information System (INIS)

    Kongsoe, H.E.; Lauridsen, K.

    1993-09-01

SIMON is a program for reliability calculation and statistical analysis. The program is of the Monte Carlo type; it is designed with high flexibility and has a large potential for application to complex problems, such as reliability analyses of very large systems and of systems where complex modelling or knowledge of special details is required. Examples of application of the program, including input and output, for reliability and statistical analysis are presented. (au) (3 tabs., 3 ills., 5 refs.)
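
SIMON itself is not detailed in this record, but the Monte Carlo approach it describes can be sketched on a toy problem: estimating the mission reliability of an assumed 2-out-of-3 redundant system with exponentially distributed component lifetimes, then checking the estimate against the closed-form answer.

```python
# Monte Carlo reliability sketch (not SIMON itself): probability that a
# 2-out-of-3 system of identical components survives a fixed mission time.
import random

random.seed(1)

FAILURE_RATE = 1e-3   # failures per hour, assumed identical components
MISSION_TIME = 500.0  # hours
TRIALS = 100_000

survived = 0
for _ in range(TRIALS):
    lifetimes = [random.expovariate(FAILURE_RATE) for _ in range(3)]
    working = sum(1 for t in lifetimes if t > MISSION_TIME)
    if working >= 2:          # 2-out-of-3 voting: system survives
        survived += 1

estimate = survived / TRIALS
# Analytic check: p = exp(-rate * t); R = 3 p^2 (1 - p) + p^3
print(f"Monte Carlo reliability estimate: {estimate:.4f}")
```

The value of the Monte Carlo formulation, as the abstract notes, is that the same loop still works when the system structure is far too complex for a closed-form expression.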

  11. Description of the grout system dynamic simulation

    International Nuclear Information System (INIS)

    Zimmerman, B.D.

    1993-07-01

The grout system dynamic computer simulation was created to allow investigation of the ability of the grouting system to meet established milestones for various assumed system configurations and parameters. The simulation models the movement of tank waste through the system over time, from the initial storage tanks, through the feed tanks and the grout plant, and finally to a grout vault. The simulation properly accounts for: (1) the time required to perform various actions or processes, (2) delays involved in gaining regulatory approval, (3) random system component failures, (4) limitations on equipment capacities, (5) available parallel components, and (6) different possible strategies for vault filling. The user can set a variety of system parameters for each simulation run. Currently, the output of a run consists primarily of a plot of projected grouting campaigns completed versus time, for comparison with milestones. Other outputs involving any model component can also be quickly created or deleted as desired. In particular, sensitivity runs, in which the effect of varying a model parameter (flow rates, delay times, number of feed tanks available, etc.) on the ability of the system to meet milestones is examined, can be made easily. The grout system simulation was implemented using the ITHINK* simulation language for Macintosh** computers

  12. Physics detector simulation facility system software description

    International Nuclear Information System (INIS)

    Allen, J.; Chang, C.; Estep, P.; Huang, J.; Liu, J.; Marquez, M.; Mestad, S.; Pan, J.; Traversat, B.

    1991-12-01

Large and costly detectors will be constructed during the next few years to study the interactions produced by the SSC. Efficient, cost-effective designs for these detectors will require careful thought and planning. Because it is not possible to fully test a proposed design in a scaled-down version, the adequacy of a proposed design will be determined by a detailed computer model of the detectors. Physics and detector simulations will be performed on the computer model using the high-powered computing system at the Physics Detector Simulation Facility (PDSF). The SSCL has particular computing requirements for high-energy physics (HEP) Monte Carlo calculations for the simulation of SSCL physics and detectors. The numerical calculations to be performed in each simulation are lengthy and detailed; they could require many months per run on a VAX 11/780 computer and may produce several gigabytes of data per run. Consequently, a distributed computing environment of several networked high-speed computing engines is envisioned to meet these needs. These networked computers will form the basis of a centralized facility for SSCL physics and detector simulation work. Our computer planning groups have determined that the most efficient, cost-effective way to provide these high-performance computing resources at this time is with RISC-based UNIX workstations. The modeling and simulation application software that will run on the computing system is usually written by physicists in FORTRAN and may need thousands of hours of supercomputing time. The system software is the "glue" which integrates the distributed workstations and allows them to be managed as a single entity. This report addresses the computing strategy for the SSC

  13. Simulator training analysis

    International Nuclear Information System (INIS)

    Hollnagel, E.; Rasmussen, J.

    1981-08-01

    This paper presents a suggestion for systematic collection of data during the normal use of training simulators, with the double purpose of supporting trainee debriefing and providing data for further theoretical studies of operator performance. The method is based on previously described models of operator performance and decision-making, and is a specific instance of the general method for analysis of operator performance data. The method combines a detailed transient-specific description of the expected performance with transient-independent tools for observation of critical activities. (author)

  14. Production Logistics Simulation Supported by Process Description Languages

    Directory of Open Access Journals (Sweden)

    Bohács Gábor

    2016-03-01

Process description languages used in business may also be useful in the optimization of logistics processes. They are an obvious candidate for process control: handling the main sources of faults and giving a correct list of what to do during the logistics process. The paper first presents the main features of the most common process description languages. The following section describes the process modelling languages currently most used in production and construction logistics. In addition, the paper gives some examples of logistics simulation, another very important field of logistics system modelling. The main contribution of the paper is logistics simulation supported by process description languages: a comparison of a Petri net formal representation and a Simul8 model, carried out on a construction logistics model.

  15. BWR Full Integral Simulation Test (FIST) program: facility description report

    International Nuclear Information System (INIS)

    Stephens, A.G.

    1984-09-01

    A new boiling water reactor safety test facility (FIST, Full Integral Simulation Test) is described. It will be used to investigate small breaks and operational transients and to tie results from such tests to earlier large-break test results determined in the TLTA. The new facility's full height and prototypical components constitute a major scaling improvement over earlier test facilities. A heated feedwater system, permitting steady-state operation, and a large increase in the number of measurements are other significant improvements. The program background is outlined and program objectives defined. The design basis is presented together with a detailed, complete description of the facility and measurements to be made. An extensive component scaling analysis and prediction of performance are presented

  16. Reproducible computational biology experiments with SED-ML--the Simulation Experiment Description Markup Language.

    Science.gov (United States)

    Waltemath, Dagmar; Adams, Richard; Bergmann, Frank T; Hucka, Michael; Kolpakov, Fedor; Miller, Andrew K; Moraru, Ion I; Nickerson, David; Sahle, Sven; Snoep, Jacky L; Le Novère, Nicolas

    2011-12-15

    The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from different fields of research
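
The structure of a SED-ML document as summarized above (which models to use, which simulations to run, and which tasks bind them together) can be sketched with the standard library. Element and attribute names below follow our reading of the SED-ML Level 1 specification; a real document should be validated against the official XML schema.

```python
# Build a minimal SED-ML-like document with the standard library.
# The model file name "oscillator.xml" is a hypothetical placeholder.
import xml.etree.ElementTree as ET

sed = ET.Element("sedML", level="1", version="1")

models = ET.SubElement(sed, "listOfModels")
ET.SubElement(models, "model", id="model1",
              language="urn:sedml:language:sbml", source="oscillator.xml")

sims = ET.SubElement(sed, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

tasks = ET.SubElement(sed, "listOfTasks")
ET.SubElement(tasks, "task", id="task1",
              modelReference="model1", simulationReference="sim1")

document = ET.tostring(sed, encoding="unicode")
print(document)
```

The task element is what makes the description tool-independent: it pairs a model reference with a simulation reference without saying anything about the software that will execute the pair.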

  17. Reproducible computational biology experiments with SED-ML - The Simulation Experiment Description Markup Language

    Science.gov (United States)

    2011-01-01

    Background The increasing use of computational simulation experiments to inform modern biological research creates new challenges to annotate, archive, share and reproduce such experiments. The recently published Minimum Information About a Simulation Experiment (MIASE) proposes a minimal set of information that should be provided to allow the reproduction of simulation experiments among users and software tools. Results In this article, we present the Simulation Experiment Description Markup Language (SED-ML). SED-ML encodes in a computer-readable exchange format the information required by MIASE to enable reproduction of simulation experiments. It has been developed as a community project and it is defined in a detailed technical specification and additionally provides an XML schema. The version of SED-ML described in this publication is Level 1 Version 1. It covers the description of the most frequent type of simulation experiments in the area, namely time course simulations. SED-ML documents specify which models to use in an experiment, modifications to apply on the models before using them, which simulation procedures to run on each model, what analysis results to output, and how the results should be presented. These descriptions are independent of the underlying model implementation. SED-ML is a software-independent format for encoding the description of simulation experiments; it is not specific to particular simulation tools. Here, we demonstrate that with the growing software support for SED-ML we can effectively exchange executable simulation descriptions. Conclusions With SED-ML, software can exchange simulation experiment descriptions, enabling the validation and reuse of simulation experiments in different tools. Authors of papers reporting simulation experiments can make their simulation protocols available for other scientists to reproduce the results. 
Because SED-ML is agnostic about exact modeling language(s) used, experiments covering models from

  18. Eportfolios: From description to analysis

    Directory of Open Access Journals (Sweden)

    Gabriella Minnes Brandes

    2008-06-01

In recent years, different professional and academic settings have increasingly utilized ePortfolios to serve multiple purposes, from recruitment to evaluation. This paper analyzes ePortfolios created by graduate students at a Canadian university. It demonstrates how students' constructions can, and should, be more than a simple compilation of artifacts. It examines an online learning environment in which we shared knowledge, supported one another in knowledge construction, developed collective expertise, and engaged in progressive discourse. In our analysis of the portfolios, we focused on reflection and deepening understanding of learning. We discussed students' use of metaphors and hypertexts as means of making cognitive connections. We found that when students understood technological tools and how to use them to substantiate their thinking processes and to engage the readers/viewers, their ePortfolios were richer and more complex in their illustrations of learning. With more experience and further analysis of exemplars of existing portfolios, students became more nuanced in the organization of their ePortfolios, reflecting the messages they conveyed. Metaphors and hypertexts became useful vehicles to move away from linearity and chronology toward new organizational modes that better illustrated students' cognitive processes. In such a community of inquiry, developed within an online learning space, the instructor and peers had an important role in enhancing reflection through scaffolding. We conclude the paper with a call to explore the interactions between the viewer/reader and the materials presented in portfolios as part of learning occasions.

  19. VAL language: description and analysis

    International Nuclear Information System (INIS)

    McGraw, J.R.

    1982-01-01

VAL is a high-level, function-based language designed for use on data flow computers. A data flow computer has many small processors organized to cooperate in the execution of a single computation. A computation is represented by its data flow graph; each operator in a graph is scheduled for execution on one of the processors after all of its operands' values are known. VAL promotes the identification of concurrency in algorithms and simplifies the mapping into data flow graphs. This paper presents a detailed introduction to VAL and analyzes its usefulness for programming in a highly concurrent environment. VAL provides implicit concurrency (operations that can execute simultaneously are evident without the need for any explicit language notation). The language uses function- and expression-based features that prohibit all side effects, which simplifies translation to graphs. The salient language features are described and illustrated through examples taken from a complete VAL program for adaptive quadrature. Analysis of the language shows that VAL meets the critical needs of a data flow environment. The language encourages programmers to think in terms of general concurrency, enhances readability (due to the absence of side effects), and possesses a structure amenable to verification techniques. However, VAL is still evolving. The language definition needs refining, and more support tools for programmer use need to be developed. Also, some new kinds of optimization problems should be addressed.

  20. Simulation Use in Paramedic Education Research (SUPER): A Descriptive Study.

    Science.gov (United States)

    McKenna, Kim D; Carhart, Elliot; Bercher, Daniel; Spain, Andrew; Todaro, John; Freel, Joann

    2015-01-01

The purpose of this research was to characterize the use of simulation in initial paramedic education programs in order to assist stakeholders' efforts to target educational initiatives and resources. This group sought to provide a snapshot of what simulation resources programs have or have access to and how they are used; faculty perceptions about simulation; whether program characteristics, resources, or faculty training influence simulation use; and whether simulation resources are uniform for patients of all ages. This was a cross-sectional census survey of paramedic programs that were accredited or had a Letter of Review from the Committee on Accreditation of Educational Programs for the EMS Professions at the time of the study. The data were analyzed using descriptive statistics and chi-square analyses. Of the 638 surveys sent, 389 valid responses (61%) were analyzed. Paramedic programs reported they have or have access to a wide range of simulation resources (task trainers [100%], simple manikins [100%], intermediate manikins [99%], advanced/fully programmable manikins [91%], live simulated patients [83%], computer-based [71%], and virtual reality [19%]); however, they do not consistently use them, particularly advanced manikins (71%), live simulated patients (66%), computer-based (games, scenarios) (31%), and virtual reality (4%). Simulation equipment (of any type) reportedly sits idle and unused in 31% of programs. Lack of training was cited as the most common reason. Personnel support specific to simulation was available in 44% of programs. Programs reported using simulation to replace skills more frequently than to replace field or clinical hours. Simulation goals included assessment, critical thinking, and problem-solving most frequently, and patient and crew safety least often. Programs using advanced manikins report manufacturers as their primary means of training (87%), and 19% of faculty had no training specific to those manikins.
Many (78%) respondents felt
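
The chi-square analyses mentioned in this record can be illustrated on a hypothetical 2x2 table; the counts below are invented for illustration and are not the study's data.

```python
# Chi-square test of independence on a hypothetical 2x2 table relating
# faculty simulation training (rows) to advanced-manikin use (columns),
# computed by hand from observed and expected counts.

table = [
    [60, 20],   # trained faculty:   uses manikins / does not
    [30, 40],   # untrained faculty: uses manikins / does not
]

row_totals = [sum(row) for row in table]
col_totals = [sum(col) for col in zip(*table)]
grand = sum(row_totals)

chi_sq = 0.0
for i, row in enumerate(table):
    for j, observed in enumerate(row):
        expected = row_totals[i] * col_totals[j] / grand
        chi_sq += (observed - expected) ** 2 / expected

# A 2x2 table has 1 degree of freedom; the critical value at alpha = 0.05
# is 3.841, so values above it indicate a significant association.
print(f"chi-square = {chi_sq:.2f}, significant: {chi_sq > 3.841}")
```

In practice one would use a library routine (e.g., a chi-square contingency test) rather than hand-rolling the loop, but the expected-count arithmetic is exactly what such routines compute.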

  1. Description of textures by a structural analysis.

    Science.gov (United States)

    Tomita, F; Shirai, Y; Tsuji, S

    1982-02-01

A structural analysis system for describing natural textures is introduced. The analyzer automatically extracts the texture elements in an input image, measures their properties, classifies them into some distinctive classes (one "ground" class and some "figure" classes), and computes the distributions of the gray level, the shape, and the placement of the texture elements in each class. These descriptions are used for classification of texture images. An analysis-by-synthesis method for evaluating texture analyzers is also presented. We propose a synthesizer which generates a texture image based on the descriptions. By comparing the reconstructed image with the original one, we can see what information is preserved and what is lost in the descriptions.
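
The element-extraction step described above can be approximated, very loosely, by connected-component labeling on a binary image. The sketch below is a generic stand-in, not the authors' analyzer: it labels 4-connected regions and reports one basic property (area) per extracted element.

```python
# Label 4-connected regions of a small binary image and measure the area
# of each, roughly analogous to extracting texture elements before
# measuring and classifying them. The image is an invented toy example.
from collections import deque

image = [
    [1, 1, 0, 0, 1],
    [1, 0, 0, 1, 1],
    [0, 0, 0, 1, 0],
    [0, 1, 0, 0, 0],
]

def extract_elements(img):
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    areas = []
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not seen[r][c]:
                area, queue = 0, deque([(r, c)])
                seen[r][c] = True
                while queue:                      # breadth-first flood fill
                    y, x = queue.popleft()
                    area += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and img[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
                areas.append(area)
    return areas

print(extract_elements(image))  # one area per extracted element
```

A real analyzer would go on to measure shape and placement per element and cluster the elements into "ground" and "figure" classes; area alone is just the simplest such property.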

  2. Descriptions of positron defect analysis capabilities

    International Nuclear Information System (INIS)

    Howell, R.H.

    1994-10-01

    A series of descriptive papers and graphics appropriate for distribution to potential collaborators has been assembled. These describe the capabilities for defect analysis using positron annihilation spectroscopy. The application of positrons to problems in the polymer and semiconductor industries is addressed

  3. A computational description of simple mediation analysis

    Directory of Open Access Journals (Sweden)

    Caron, Pier-Olivier

    2018-04-01

Simple mediation analysis is an increasingly popular statistical analysis in psychology and other social sciences. However, there are very few detailed accounts of the computations within the model. Articles more often focus on explaining mediation analysis conceptually rather than mathematically. The purpose of the current paper is therefore to introduce the computations within simple mediation analysis, accompanied by examples in R. First, mediation analysis is described. Then, a method to simulate data in R (with standardized coefficients) is presented. Finally, the bootstrap method, the Sobel test, and the Baron and Kenny test, all used to evaluate mediation (i.e., the indirect effect), are developed. The R code to implement the computations is provided, as well as a script to carry out a power analysis and a complete example.
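
The computations the paper describes can be sketched outside R as well. The following example (all coefficients and sample sizes are illustrative) simulates data with known paths, estimates the indirect effect a*b by least squares, and attaches a rough percentile bootstrap interval.

```python
# Simple mediation in Python rather than R (which the paper uses):
# X -> M (a path) and M -> Y controlling for X (b path); the indirect
# effect is the product a*b. All numbers here are illustrative.
import random

random.seed(42)
N = 2000
A, B, C = 0.5, 0.4, 0.2            # true paths: X->M, M->Y, X->Y (direct)

x = [random.gauss(0, 1) for _ in range(N)]
m = [A * xi + random.gauss(0, 1) for xi in x]
y = [B * mi + C * xi + random.gauss(0, 1) for xi, mi in zip(x, m)]

def cov(u, v):
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / (len(u) - 1)

def indirect(xs, ms, ys):
    a = cov(xs, ms) / cov(xs, xs)                       # slope of M on X
    vx, vm = cov(xs, xs), cov(ms, ms)
    cxm, cmy, cxy = cov(xs, ms), cov(ms, ys), cov(xs, ys)
    b = (cmy * vx - cxy * cxm) / (vm * vx - cxm ** 2)   # Y on M, given X
    return a * b

point = indirect(x, m, y)

boots = sorted(
    indirect(*zip(*[(x[i], m[i], y[i])
                    for i in (random.randrange(N) for _ in range(N))]))
    for _ in range(200)
)
lo, hi = boots[4], boots[194]      # rough 95% percentile interval
print(f"indirect effect ~ {point:.3f}, 95% CI ({lo:.3f}, {hi:.3f})")
```

With these true paths the indirect effect is A*B = 0.2, and the bootstrap interval excluding zero is the usual evidence for mediation; the Sobel test replaces the resampling with a normal-theory standard error for the product.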

  4. Water Quality Analysis Simulation

    Science.gov (United States)

    The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural phenomena and man-made pollution in support of various pollution management decisions.

  5. Water Quality Analysis Simulation

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Water Quality Analysis Simulation Program, an enhancement of the original WASP. This model helps users interpret and predict water quality responses to natural...

  6. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2.

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; Le Novère, Nicolas; Nickerson, David; Waltemath, Dagmar

    2015-09-04

    The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines. This document presents Level 1 Version 2 of the Simulation Experiment Description Markup Language (SED-ML), a computer-readable format for encoding simulation and analysis experiments to apply to computational models. SED-ML files are encoded in the Extensible Markup Language (XML) and can be used in conjunction with any XML-based model encoding format, such as CellML or SBML. A SED-ML file includes details of which models to use, how to modify them prior to executing a simulation, which simulation and analysis procedures to apply, which results to extract and how to present them. Level 1 Version 2 extends the format by allowing the encoding of repeated and chained procedures.
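As a rough illustration of the kind of information a SED-ML file carries (which model to use, which simulation to run, which task binds them), the following sketch assembles a minimal SED-ML-like skeleton with Python's standard library. The element and attribute names follow the SED-ML concepts named above, but the fragment is illustrative and not guaranteed to validate against the official schema.

```python
import xml.etree.ElementTree as ET

# Root of a minimal SED-ML-like document (namespace omitted for brevity)
root = ET.Element("sedML", level="1", version="2")

# (1) which models to use
models = ET.SubElement(root, "listOfModels")
ET.SubElement(models, "model", id="model1",
              language="urn:sedml:language:sbml", source="oscillator.xml")

# (2) which simulation procedure to apply
sims = ET.SubElement(root, "listOfSimulations")
ET.SubElement(sims, "uniformTimeCourse", id="sim1",
              initialTime="0", outputStartTime="0",
              outputEndTime="100", numberOfPoints="1000")

# (3) a task tying the model to the simulation
tasks = ET.SubElement(root, "listOfTasks")
ET.SubElement(tasks, "task", id="task1",
              modelReference="model1", simulationReference="sim1")

print(ET.tostring(root, encoding="unicode"))
```

A real SED-ML file would additionally carry data generators and output sections describing how results are post-processed and plotted.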

  7. Research reactor job analysis - A project description

    International Nuclear Information System (INIS)

    Yoder, John; Bessler, Nancy J.

    1988-01-01

    Addressing the need for improved training in the nuclear industry, nuclear utilities established training program guidelines based on Performance-Based Training (PBT) concepts. An independent review of personnel selection, training, and qualification requirements for reactors owned by the U.S. Department of Energy (DOE) compared commercial nuclear power facilities with DOE research and test reactors, and pointed out that the complexity of the most critical tasks in research reactors is less than that in power reactors. DOE therefore started a project by commissioning Oak Ridge Associated Universities (ORAU) to conduct a job analysis survey of representative research reactor facilities. The output of the project consists of two publications: Volume 1 - Research Reactor Job Analysis: Overview, which contains an Introduction, Project Description, Project Methodology, and An Overview of Performance-Based Training (PBT); and Volume 2 - Research Reactor Job Analysis: Implementation, which contains Guidelines for Application of Preliminary Task Lists and Preliminary Task Lists for Reactor Operators and Supervisory Reactor Operators.

  8. High-Alpha Research Vehicle Lateral-Directional Control Law Description, Analyses, and Simulation Results

    Science.gov (United States)

    Davidson, John B.; Murphy, Patrick C.; Lallman, Frederick J.; Hoffler, Keith D.; Bacon, Barton J.

    1998-01-01

    This report contains a description of a lateral-directional control law designed for the NASA High-Alpha Research Vehicle (HARV). The HARV is an F/A-18 aircraft modified to include a research flight computer, spin chute, and thrust-vectoring in the pitch and yaw axes. Two separate design tools, CRAFT and Pseudo Controls, were integrated to synthesize the lateral-directional control law. The report presents the control law description, analyses, and nonlinear simulation (batch and piloted) results. Linear analysis results include closed-loop eigenvalues, stability margins, robustness to changes in various plant parameters, and servo-elastic frequency responses. Step time responses from nonlinear batch simulation are presented and compared to design guidelines. Piloted simulation task scenarios, task guidelines, and pilot subjective ratings for the various maneuvers are discussed. Linear analysis shows that the control law meets the stability margin guidelines and is robust to stability and control parameter changes. Nonlinear batch simulation analysis shows the control law exhibits good performance and meets most of the design guidelines over the entire angle-of-attack range. This control law (designated NASA-1A) was flight tested during the summer of 1994 at NASA Dryden Flight Research Center.

  9. Manifest domains:analysis and description

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2017-01-01

    _static_attribute, is_dynamic_attribute, is_inert_attribute, is_reactive_attribute, is_active_attribute, is_autonomous_attribute, is_biddable_attribute and is_programmable_attribute. The twist suggests ways of modeling “access” to the values of these kinds of attributes: the static attributes by simply “copying” them...... processes. C.A.R. Hoare series in computer science. Prentice-Hall International, London, 2004). We show how to model essential aspects of perdurants in terms of their signatures based on the concepts of endurants. And we show how one can “compile” descriptions of endurant parts into descriptions...

  10. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We...

  11. Simulation of containment atmosphere stratification experiment using local instantaneous description

    International Nuclear Information System (INIS)

    Babic, M.; Kljenak, I.

    2004-01-01

    An experiment on mixing and stratification in the atmosphere of a nuclear power plant containment under accident conditions was simulated with the CFD code CFX4.4. The original experiment was performed in the TOSQAN experimental facility. Simulated nonhomogeneous temperature, species concentration and velocity fields are compared to experimental results. (author)

  12. System Design Description Salt Well Liquid Pumping Dynamic Simulation

    International Nuclear Information System (INIS)

    HARMSEN, R.W.

    1999-01-01

    The Salt Well Liquid (SWL) Pumping Dynamic Simulation used by the single-shell tank (SST) Interim Stabilization Project is described. A graphical dynamic simulation predicts SWL removal from 29 SSTs using an exponential function and unique time constant for each SST. Increasing quarterly efficiencies are applied to adjust the pumping rates during fiscal year 2000
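The exponential removal function mentioned above can be sketched as follows. The functional form V(t) = V0·exp(-t/τ), with a distinct time constant per tank, is the one named in the abstract, but all tank identifiers, volumes, and time constants below are hypothetical.

```python
import math

# Hypothetical tanks: tank id -> (initial pumpable liquid in kgal, time constant in days)
tanks = {
    "SX-104": (50.0, 120.0),
    "S-102":  (35.0, 200.0),
    "U-103":  (20.0, 90.0),
}

def remaining(v0, tau, t_days):
    """Liquid remaining after t_days of pumping: V(t) = V0 * exp(-t / tau)."""
    return v0 * math.exp(-t_days / tau)

for tank, (v0, tau) in tanks.items():
    pumped = v0 - remaining(v0, tau, 365.0)
    print(f"{tank}: {pumped:.1f} kgal removed in the first year")
```

A per-tank time constant lets tanks with fast-draining waste dominate early removal while slow tanks contribute over longer horizons, which is why quarterly efficiency factors can then be layered on top to adjust the rates.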

  13. Risk Characterization: description of associated uncertainties, sensitivity analysis

    International Nuclear Information System (INIS)

    Carrillo, M.; Tovar, M.; Alvarez, J.; Arraez, M.; Hordziejewicz, I.; Loreto, I.

    2013-01-01

    The PowerPoint presentation covers risks at the estimated levels of exposure; uncertainty and variability in the analysis; sensitivity analysis; risks from exposure to multiple substances; the formulation of guidelines for carcinogenic and genotoxic compounds; and risks to subpopulations.

  14. Simulation program description for the fusion by inertial confinement

    International Nuclear Information System (INIS)

    Ferro Fontan, C.; Mancini, R.C.

    1982-01-01

    The physical model and the numerical technique used to describe the evolution of a target with plane, cylindrical or spherical symmetry, irradiated with a laser pulse, are presented. As a simulation example, the results obtained for the irradiation of an aluminium plane target illuminated with a short pulse are shown. (L.C.) [pt

  15. Description of waste pretreatment and interfacing systems dynamic simulation model

    International Nuclear Information System (INIS)

    Garbrick, D.J.; Zimmerman, B.D.

    1995-05-01

    The Waste Pretreatment and Interfacing Systems Dynamic Simulation Model was created to investigate the required pretreatment facility processing rates for both high-level and low-level waste so that the vitrification of tank waste can be completed according to the milestones defined in the Tri-Party Agreement (TPA). In order to achieve this objective, the processes upstream and downstream of the pretreatment facilities must also be included. The simulation model starts with retrieval of tank waste and ends with vitrification for both low-level and high-level wastes. This report describes the results of three simulation cases: one based on suggested average facility processing rates, one with facility rates determined so that approximately six new DSTs are required, and one with facility rates determined so that approximately no new DSTs are required. It appears, based on the simulation results, that reasonable facility processing rates can be selected so that no new DSTs are required by the TWRS program. However, this conclusion must be viewed with respect to the modeling assumptions, described in detail in the report. Also included in the report, in an appendix, are results of two sensitivity cases: one with glass plant water recycle streams recycled versus not recycled, and one employing the TPA SST retrieval schedule versus a more uniform SST retrieval schedule. Both recycling and retrieval schedule appear to have a significant impact on overall tank usage.

  16. A uniform geometry description for simulation, reconstruction and visualization in the BESIII experiment

    Energy Technology Data Exchange (ETDEWEB)

    Liang Yutie [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China)], E-mail: liangyt@hep.pku.edu.cn; Zhu Bo; You Zhengyun; Liu Kun; Ye Hongxue; Xu Guangming; Wang Siguang [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China); Li Weidong; Liu Huaimin; Mao Zepu [Institute of High Energy Physics, CAS, Beijing 100049 (China); Mao Yajun [School of Physics and State Key Laboratory of Nuclear Physics and Technology, Peking University, Beijing 100871 (China)

    2009-05-21

    In the BESIII experiment, the simulation, reconstruction and visualization were designed to use the same geometry description in order to ensure the consistency of the geometry for different applications. Geometry Description Markup Language (GDML), an application-independent persistent format for describing the geometries of detectors, was chosen and met our requirements. The detector of BESIII was described with GDML and then used in Geant4-based simulation and ROOT-based reconstruction and visualization.

  17. A uniform geometry description for simulation, reconstruction and visualization in the BESIII experiment

    International Nuclear Information System (INIS)

    Liang Yutie; Zhu Bo; You Zhengyun; Liu Kun; Ye Hongxue; Xu Guangming; Wang Siguang; Li Weidong; Liu Huaimin; Mao Zepu; Mao Yajun

    2009-01-01

    In the BESIII experiment, the simulation, reconstruction and visualization were designed to use the same geometry description in order to ensure the consistency of the geometry for different applications. Geometry Description Markup Language (GDML), an application-independent persistent format for describing the geometries of detectors, was chosen and met our requirements. The detector of BESIII was described with GDML and then used in Geant4-based simulation and ROOT-based reconstruction and visualization.

  18. COPD phenotype description using principal components analysis

    DEFF Research Database (Denmark)

    Roy, Kay; Smith, Jacky; Kolsum, Umme

    2009-01-01

    BACKGROUND: Airway inflammation in COPD can be measured using biomarkers such as induced sputum and Fe(NO). This study set out to explore the heterogeneity of COPD using biomarkers of airway and systemic inflammation and pulmonary function by principal components analysis (PCA). SUBJECTS...... AND METHODS: In 127 COPD patients (mean FEV1 61%), pulmonary function, Fe(NO), plasma CRP and TNF-alpha, sputum differential cell counts and sputum IL8 (pg/ml) were measured. Principal components analysis as well as multivariate analysis was performed. RESULTS: PCA identified four main components (% variance...... associations between the variables within components 1 and 2. CONCLUSION: COPD is a multi dimensional disease. Unrelated components of disease were identified, including neutrophilic airway inflammation which was associated with systemic inflammation, and sputum eosinophils which were related to increased Fe...
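As a hedged illustration of the PCA technique the study applies (not its actual data), the following pure-Python sketch extracts the first principal component of a synthetic biomarker matrix by power iteration on the correlation matrix. All variable names and effect sizes are invented.

```python
import random

random.seed(1)
# rows = patients, cols = [fev1, sputum_neutrophils, crp]: correlated synthetic data
data = []
for _ in range(100):
    inflam = random.gauss(0, 1)                  # latent "inflammation" factor
    data.append([
        -0.6 * inflam + random.gauss(0, 0.5),    # FEV1 falls with inflammation
        0.8 * inflam + random.gauss(0, 0.5),     # sputum neutrophils rise
        0.7 * inflam + random.gauss(0, 0.5),     # plasma CRP rises
    ])

def standardize(col):
    mean = sum(col) / len(col)
    sd = (sum((v - mean) ** 2 for v in col) / (len(col) - 1)) ** 0.5
    return [(v - mean) / sd for v in col]

cols = [standardize([row[j] for row in data]) for j in range(3)]
z = list(zip(*cols))                             # standardized observations

# 3x3 correlation matrix of the standardized variables
C = [[sum(obs[i] * obs[j] for obs in z) / (len(z) - 1) for j in range(3)]
     for i in range(3)]

# Power iteration converges to the dominant eigenvector of C, i.e. PC1
v = [1.0, 1.0, 1.0]
for _ in range(200):
    w = [sum(C[i][j] * v[j] for j in range(3)) for i in range(3)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]

eigval = sum(v[i] * sum(C[i][j] * v[j] for j in range(3)) for i in range(3))
print(f"PC1 explains {eigval / 3:.0%} of total variance; loadings = "
      f"{[round(x, 2) for x in v]}")
```

The loadings show lung function moving opposite to the inflammatory markers on the first component, the same kind of grouping of related variables into components that PCA reveals in the study.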

  19. A Description of the Ship Combat System Simulation

    Science.gov (United States)

    1984-09-01

    developed by NWC, NOSC, NSWC, and CACI, Inc. SCSS is supported by naval and industrial laboratories throughout the country (NSWC TR 84-182). Any user (such as a contractor), Navy or private industry, can obtain the simulation by becoming a member of the SCSS Users' Group.

  20. Description and preliminary evaluation of a diabetes technology simulation course.

    Science.gov (United States)

    Wilson, Rebecca D; Bailey, Marilyn; Boyle, Mary E; Seifert, Karen M; Cortez, Karla Y; Baker, Leslie J; Hovan, Michael J; Stepanek, Jan; Cook, Curtiss B

    2013-11-01

    We aim to provide data on a diabetes technology simulation course (DTSC) that instructs internal medicine residents in the use of continuous subcutaneous insulin infusion (CSII) and continuous glucose monitoring system (CGMS) devices. The DTSC was implemented during calendar year 2012 and conducted in the institution's simulation center. It consisted of a set of prerequisites, a practicum, and completion of a web-based inpatient CSII-ordering simulation. DTSC participants included only those residents in the outpatient endocrinology rotation. Questionnaires were used to determine whether course objectives were met and to assess the satisfaction of residents with the course. Questionnaires were also administered before and after the endocrine rotation to gauge improvement in familiarity with CSII and CGMS technologies. During the first year, 12 of 12 residents in the outpatient endocrinology rotation completed the DTSC. Residents reported that the course objectives were fully met. The mean satisfaction score with the course ranged from 4.0 to 4.9 (maximum, 5), with most variables rated above 4.5. Self-reported familiarity with the operation of CSII and CGMS devices increased significantly from the prerotation to the postrotation survey for both technologies. In light of these preliminary findings, the course will continue to be offered, with further data accrual. Future work will involve piloting the DTSC approach among other types of providers, such as residents in other specialties or inpatient nursing staff. © 2013 Diabetes Technology Society.

  1. Descriptive Topology in Selected Topics of Functional Analysis

    CERN Document Server

    Kakol, J; Pellicer, Manuel Lopez

    2011-01-01

    "Descriptive Topology in Selected Topics of Functional Analysis" is a collection of recent developments in the field of descriptive topology, specifically focused on the classes of infinite-dimensional topological vector spaces that appear in functional analysis. Such spaces include Frechet spaces, (LF)-spaces and their duals, and the space of continuous real-valued functions C(X) on a completely regular Hausdorff space X, to name a few. These vector spaces appear in functional analysis in distribution theory, differential equations, complex analysis, and various other analytical set

  2. Description of Simulated Small Satellite Operation Data Sets

    Science.gov (United States)

    Kulkarni, Chetan S.; Guarneros Luna, Ali

    2018-01-01

    A set of two BP930 batteries (identified as PK31 and PK35) was operated continuously to complete a simulated satellite operation profile for a single cycle. The battery packs were charged to an initial voltage of around 8.35 V for 100% SOC before the experiment was started. This document explains the structure of the battery data sets. Please cite this paper when using this dataset: Z. Cameron, C. Kulkarni, A. Guarneros, K. Goebel, S. Poll, "A Battery Certification Testbed for Small Satellite Missions", IEEE AUTOTESTCON 2015, Nov 2-5, 2015, National Harbor, MD.

  3. HANFORD TANK WASTE OPERATIONS SIMULATOR VERSION DESCRIPTION DOCUMENT

    International Nuclear Information System (INIS)

    ALLEN, G.K.

    2003-01-01

    This document describes the software version controls established for the Hanford Tank Waste Operations Simulator (HTWOS). It defines the methods employed to control the configuration of HTWOS; the version of each of the 26 separate modules in version 1.0 of HTWOS; the numbering rules for incrementing the version number of each module; and a requirement to include module version numbers in the results documentation for each case. Version 1.0 of HTWOS is the first version under formal software version control. HTWOS carries a separate revision number for each of its 26 modules; individual module version numbers do not reflect the major-release HTWOS configured version number.

  4. Physics Detector Simulation Facility Phase II system software description

    International Nuclear Information System (INIS)

    Scipioni, B.; Allen, J.; Chang, C.; Huang, J.; Liu, J.; Mestad, S.; Pan, J.; Marquez, M.; Estep, P.

    1993-05-01

    This paper presents the Physics Detector Simulation Facility (PDSF) Phase II system software. A key element in the design of a distributed computing environment for the PDSF has been the separation and distribution of the major functions. The facility has been designed to support batch and interactive processing, and to incorporate the file and tape storage systems. By distributing these functions, it is often possible to provide higher throughput and resource availability. Similarly, the design is intended to exploit event-level parallelism in an open distributed environment

  5. Description and Analysis Pattern for Theses and Dissertations

    Directory of Open Access Journals (Sweden)

    Sirous Alidousti

    2009-07-01

    Full Text Available Dissertations and theses that are generated in the course of research at the PhD and master's levels are considered important scientific documents in every country. Description and analysis of the data in such documents, collected together, could automatically provide new information that is very valuable, especially when compared with data from other resources. Nevertheless, no comprehensive, integrated pattern exists for such description and analysis. The present paper offers the findings of a research project conducted to devise an information analysis pattern for dissertations and theses. It also puts forward information categories derived from such documents that could be described and analyzed.

  6. Summary description of the RELAP5 Koeberg-1 simulation model

    International Nuclear Information System (INIS)

    D'Arcy, A.J.

    1990-11-01

    The main features of the RELAP5 code and the model are summarized. The model has been quality-assured in accordance with a QA programme used in the Reactor Theory Group of the Atomic Energy Corporation of SA Ltd. The RELAP5 code is based on a non-homogeneous, non-equilibrium model for the two-phase system that is solved by a fast, partially implicit numerical scheme. The objective of the development effort from the outset has been to produce a code that includes important first-order effects necessary for accurate prediction of system transients, but is sufficiently simple and cost-effective so that parametric or sensitivity studies are possible. The code includes many generic component models from which general systems can be simulated. Special process models are included for effects such as form losses, flow at an abrupt area change, branching, choked flow, boron tracking and a non-condensible gas. The RELAP5 modelling and computational aspects covered are: hydrodynamic models, constitutive package, special process models, and user conveniences. 25 tabs., 8 figs., 17 refs

  7. The electricity portfolio simulation model (EPSim) technical description.

    Energy Technology Data Exchange (ETDEWEB)

    Drennen, Thomas E.; Klotz, Richard (Hobart and William Smith Colleges, Geneva, NY)

    2005-09-01

    Stakeholders often have competing interests when selecting or planning new power plants. The purpose of developing this preliminary Electricity Portfolio Simulation Model (EPSim) is to provide a first-cut, dynamic methodology and approach to this problem, one that can subsequently be refined and validated and that may help energy planners, policy makers, and energy students better understand the tradeoffs associated with competing electricity portfolios. EPSim allows the user to explore competing electricity portfolios annually from 2002 to 2025 in terms of five different criteria: cost, environmental impacts, energy dependence, health and safety, and sustainability. Four additional criteria (infrastructure vulnerability, service limitations, policy needs, and science and technology needs) may be added in future versions of the model. Using an analytic hierarchy process (AHP) approach, users or groups of users apply weights to each of the criteria. The default energy assumptions of the model mimic the Department of Energy's (DOE) electricity portfolio to 2025 (EIA, 2005). At any time, the user can compare alternative portfolios to this reference case portfolio.
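The weighted-criteria idea behind EPSim's AHP approach can be sketched as a simple weighted sum. The criteria names come from the abstract, but the weights and portfolio ratings below are hypothetical, and EPSim's actual scoring is certainly more elaborate.

```python
# The five criteria named in the abstract; weights are hypothetical and
# would come from an AHP elicitation in the real model.
criteria = ["cost", "environment", "energy_dependence", "health_safety",
            "sustainability"]
weights = {"cost": 0.30, "environment": 0.25, "energy_dependence": 0.15,
           "health_safety": 0.15, "sustainability": 0.15}  # sums to 1

# Hypothetical 0-10 ratings (10 = best) for two illustrative portfolios
portfolios = {
    "reference (DOE 2025 mix)": {"cost": 8, "environment": 4,
                                 "energy_dependence": 5, "health_safety": 6,
                                 "sustainability": 4},
    "high-renewables":          {"cost": 5, "environment": 9,
                                 "energy_dependence": 8, "health_safety": 8,
                                 "sustainability": 9},
}

def score(ratings):
    """Weighted-sum score of a portfolio across all criteria."""
    return sum(weights[c] * ratings[c] for c in criteria)

for name, ratings in sorted(portfolios.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: {score(ratings):.2f}")
```

Changing the weights, as different stakeholder groups would, can reorder the portfolios, which is exactly the tradeoff exploration the model is built for.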

  8. Microscopic description and simulation of ultracold atoms in optical resonators

    International Nuclear Information System (INIS)

    Niedenzu, W.

    2012-01-01

    Ultracold atoms in optical resonators are an ideal system to investigate the full quantum regime of light-matter interaction. Microscopic insight into the underlying processes can nowadays easily be obtained from numerical calculations, e.g. with Monte Carlo wave function simulations. In the first part we discuss cold atoms in ring resonators, where the modified boundary conditions significantly alter the dynamics as compared to the standing-wave case. Quantum jumps induce momentum correlations and entanglement between the particles. We observe strong non-classical motional correlations, cooling and entanglement heralded by single photon measurements. For deeply trapped particles the complex system Hamiltonian can be mapped onto a generic optomechanical model, allowing for analytical microscopic insight into the dynamics. The rates of cavity-mediated correlated heating and cooling processes are obtained by adiabatically eliminating the cavity field from the dynamics and can be directly related to the steady-state momentum correlation coefficient. The second part is devoted to cooling and self-organisation of a cold gas in a transversally pumped standing-wave resonator, in which the atoms are directly illuminated by a laser beam. Above a certain critical laser intensity the atoms order in a specific pattern, maximising light scattering into the cavity. The particles thus create and sustain their own trap. We derive a nonlinear Fokker-Planck equation for the one-particle distribution function describing the gas dynamics below and above threshold. This kinetic theory predicts dissipation-induced self-organisation and q-Gaussian velocity distributions in steady state. (author)

  9. Efficient generation of connectivity in neuronal networks from simulator-independent descriptions

    Directory of Open Access Journals (Sweden)

    Mikael eDjurfeldt

    2014-04-01

    Full Text Available Simulator-independent descriptions of connectivity in neuronal networks promise greater ease of model sharing, improved reproducibility of simulation results, and reduced programming effort for computational neuroscientists. However, until now, enabling the use of such descriptions in a given simulator in a computationally efficient way has entailed considerable work for simulator developers, which must be repeated for each new connectivity-generating library that is developed. We have developed a generic connection generator interface that provides a standard way to connect a connectivity-generating library to a simulator, such that one library can easily be replaced by another according to the modeller's needs. We have used the connection generator interface to connect C++ and Python implementations of the connection-set algebra to the NEST simulator. We also demonstrate how the simulator-independent modelling framework PyNN can transparently take advantage of this, passing a connection description through to the simulator layer for rapid processing in C++ where a simulator supports the connection generator interface, and falling back to slower iteration in Python otherwise. A set of benchmarks demonstrates the good performance of the interface.
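The core idea of a connection generator interface can be sketched in a few lines of Python: connectivity rules produce (source, target) pairs through a common iterator protocol, and the simulator side consumes them without knowing how they were generated. This simplified protocol is hypothetical and much coarser than the actual C++/Python interface described in the paper.

```python
import itertools
import random

def all_to_all(pre, post):
    """Connectivity rule: every pre neuron contacts every post neuron."""
    yield from itertools.product(pre, post)

def fixed_probability(pre, post, p, seed=0):
    """Connectivity rule: each possible connection exists with probability p."""
    rng = random.Random(seed)
    for s, t in itertools.product(pre, post):
        if rng.random() < p:
            yield (s, t)

def build_network(connection_generator):
    """Stand-in for the simulator side: consume any generator and store
    the resulting connection list, independent of how it was produced."""
    return list(connection_generator)

pre, post = range(5), range(5, 10)
dense = build_network(all_to_all(pre, post))
sparse = build_network(fixed_probability(pre, post, p=0.2))
print(f"all-to-all: {len(dense)} connections; p=0.2: {len(sparse)} connections")
```

Because `build_network` only depends on the iterator, one rule library can be swapped for another without touching the simulator code, which is the replaceability the paper's interface provides.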

  10. Uncertainty analysis of a one-dimensional constitutive model for shape memory alloy thermomechanical description

    DEFF Research Database (Denmark)

    Oliveira, Sergio A.; Savi, Marcelo A.; Santos, Ilmar F.

    2014-01-01

    The use of shape memory alloys (SMAs) in engineering applications has increased interest in the accuracy of their thermomechanical description. This work presents an uncertainty analysis of experimental tensile tests conducted with shape memory alloy wires. Experimental data are compared with numerical simulations obtained from a constitutive model with internal constraints employed to describe the thermomechanical behavior of SMAs. The idea is to evaluate whether the numerical simulations are within the uncertainty range of the experimental data. Parametric analysis is also developed...

  11. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3)

    Directory of Open Access Journals (Sweden)

    Bergmann Frank T.

    2018-03-01

    Full Text Available The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. SED-ML is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  12. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 3 (L1V3).

    Science.gov (United States)

    Bergmann, Frank T; Cooper, Jonathan; König, Matthias; Moraru, Ion; Nickerson, David; Le Novère, Nicolas; Olivier, Brett G; Sahle, Sven; Smith, Lucian; Waltemath, Dagmar

    2018-03-19

    The creation of computational simulation experiments to inform modern biological research poses challenges to reproduce, annotate, archive, and share such experiments. Efforts such as SBML or CellML standardize the formal representation of computational models in various areas of biology. The Simulation Experiment Description Markup Language (SED-ML) describes what procedures the models are subjected to, and the details of those procedures. These standards, together with further COMBINE standards, describe models sufficiently well for the reproduction of simulation studies among users and software tools. The Simulation Experiment Description Markup Language (SED-ML) is an XML-based format that encodes, for a given simulation experiment, (i) which models to use; (ii) which modifications to apply to models before simulation; (iii) which simulation procedures to run on each model; (iv) how to post-process the data; and (v) how these results should be plotted and reported. SED-ML Level 1 Version 1 (L1V1) implemented support for the encoding of basic time course simulations. SED-ML L1V2 added support for more complex types of simulations, specifically repeated tasks and chained simulation procedures. SED-ML L1V3 extends L1V2 by means to describe which datasets and subsets thereof to use within a simulation experiment.

  13. Application of descriptive statistics in analysis of experimental data

    OpenAIRE

    Mirilović Milorad; Pejin Ivana

    2008-01-01

    Statistics today represents a group of scientific methods for the quantitative and qualitative investigation of variations in mass phenomena. In fact, statistics comprises a group of methods used for the accumulation, analysis, presentation and interpretation of data necessary for reaching certain conclusions. Statistical analysis is divided into descriptive statistical analysis and inferential statistics. The values which represent the results of an experiment, and which are the subj...

  14. Stochastic modeling analysis and simulation

    CERN Document Server

    Nelson, Barry L

    1995-01-01

    A coherent introduction to the techniques for modeling dynamic stochastic systems, this volume also offers a guide to the mathematical, numerical, and simulation tools of systems analysis. Suitable for advanced undergraduates and graduate-level industrial engineers and management science majors, it proposes modeling systems in terms of their simulation, regardless of whether simulation is employed for analysis. Beginning with a view of the conditions that permit a mathematical-numerical analysis, the text explores Poisson and renewal processes, Markov chains in discrete and continuous time, se
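As a small example of the kind of system the book covers, the following sketch simulates a discrete-time Markov chain (a hypothetical three-state weather model) and estimates long-run state occupancy from a sample path.

```python
import random
from collections import Counter

states = ["sunny", "cloudy", "rainy"]
P = {  # transition probabilities: P[current][next]; each row sums to 1
    "sunny":  {"sunny": 0.7, "cloudy": 0.2, "rainy": 0.1},
    "cloudy": {"sunny": 0.3, "cloudy": 0.4, "rainy": 0.3},
    "rainy":  {"sunny": 0.2, "cloudy": 0.4, "rainy": 0.4},
}

def step(state, rng):
    """Draw the next state from the transition row of the current state."""
    r, cum = rng.random(), 0.0
    for nxt, p in P[state].items():
        cum += p
        if r < cum:
            return nxt
    return nxt  # guard against floating-point rounding at cum ~ 1.0

rng = random.Random(7)
state, visits = "sunny", Counter()
for _ in range(100_000):
    state = step(state, rng)
    visits[state] += 1

total = sum(visits.values())
occupancy = {s: visits[s] / total for s in states}
print({s: round(f, 3) for s, f in occupancy.items()})
```

The empirical occupancy fractions approximate the chain's stationary distribution, illustrating the book's theme of modeling a system directly in terms of its simulation.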

  15. Job Analysis, Job Descriptions, and Performance Appraisal Systems.

    Science.gov (United States)

    Sims, Johnnie M.; Foxley, Cecelia H.

    1980-01-01

    Job analysis, job descriptions, and performance appraisal can benefit student services administration in many ways. Involving staff members in the development and implementation of these techniques can increase commitment to and understanding of the overall objectives of the office, as well as communication and cooperation among colleagues.…

  16. A Descriptive Analysis of Instructional Coaches' Data Use in Science

    Science.gov (United States)

    Snodgrass Rangel, Virginia; Bell, Elizabeth R.; Monroy, Carlos

    2017-01-01

    A key assumption of accountability policies is that educators will use data to improve their instruction. In practice, however, data use is quite hard, and more districts are looking to instructional coaches to support their teachers. The purpose of this descriptive analysis is to examine how instructional coaches in elementary and middle school…

  17. Simulation Experiment Description Markup Language (SED-ML) Level 1 Version 2

    Directory of Open Access Journals (Sweden)

    Bergmann Frank T.

    2015-06-01

    Full Text Available The number, size and complexity of computational models of biological systems are growing at an ever increasing pace. It is imperative to build on existing studies by reusing and adapting existing models and parts thereof. The description of the structure of models is not sufficient to enable the reproduction of simulation results. One also needs to describe the procedures the models are subjected to, as recommended by the Minimum Information About a Simulation Experiment (MIASE) guidelines.

  18. Evaluating best educational practices, student satisfaction, and self-confidence in simulation: A descriptive study.

    Science.gov (United States)

    Zapko, Karen A; Ferranto, Mary Lou Gemma; Blasiman, Rachael; Shelestak, Debra

    2018-01-01

    The National League for Nursing (NLN) has endorsed simulation as a necessary teaching approach to prepare students for the demanding role of professional nursing. Questions arise about the suitability of simulation experiences to educate students. Empirical support for the effect of simulation on patient outcomes is sparse. Most studies on simulation report only anecdotal results rather than data obtained using evaluative tools. The aim of this study was to examine student perception of best educational practices in simulation and to evaluate their satisfaction and self-confidence in simulation. This study was a descriptive study designed to explore students' perceptions of the simulation experience over a two-year period. Using the Jeffries framework, a Simulation Day was designed consisting of serial patient simulations using high and medium fidelity simulators and live patient actors. The setting for the study was a regional campus of a large Midwestern Research 2 university. The convenience sample consisted of 199 participants and included sophomore, junior, and senior nursing students enrolled in the baccalaureate nursing program. The Simulation Days consisted of serial patient simulations using high and medium fidelity simulators and live patient actors. Participants rotated through four scenarios that corresponded to their level in the nursing program. Data was collected in two consecutive years. Participants completed both the Educational Practices Questionnaire (Student Version) and the Student Satisfaction and Self-Confidence in Learning Scale. Results provide strong support for using serial simulation as a learning tool. Students were satisfied with the experience, felt confident in their performance, and felt the simulations were based on sound educational practices and were important for learning. Serial simulations and having students experience simulations more than once in consecutive years is a valuable method of clinical instruction. When

  19. Intensive care nursing students' perceptions of simulation for learning confirming communication skills: A descriptive qualitative study.

    Science.gov (United States)

    Karlsen, Marte-Marie Wallander; Gabrielsen, Anita Kristin; Falch, Anne Lise; Stubberud, Dag-Gunnar

    2017-10-01

    The aim of this study was to explore intensive care nursing students' experiences with confirming communication skills training in a simulation-based environment. The study has a qualitative, exploratory and descriptive design. The participants were students in a post-graduate program in intensive care nursing who had attended a one-day confirming communication course. Three focus group interviews lasting between 60 and 80 min were conducted with 14 participants. The interviews were transcribed verbatim. Thematic analysis was performed using Braun & Clarke's seven steps. The analysis resulted in three main themes: "awareness", "ice-breaker" and "challenging learning environment". The participants felt that it was a challenge to see themselves on the video recordings afterwards; however, receiving feedback resulted in better self-confidence in mastering complex communication. The main finding of the study is that the students reported improved communication skills after the confirming communication course. However, it is uncertain how these skills can be transferred to clinical practice to improve patient outcomes. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Lone ranger decision making versus consensus decision making: Descriptive analysis

    OpenAIRE

    Maite Sara Mashego

    2015-01-01

    In consensus decision making, group members make decisions together, with the requirement that a consensus be reached, that is, that all members abide by the decision outcome. Lone-ranger decision making worked for some time in autocratic environments. Researchers are now pointing to consensus decision making as paying dividends in many organizations. This article used a descriptive analysis to compare the merits of consensus decision making with those of lone-ranger decision making. This art...

  1. Nucleonica. Web-based software tools for simulation and analysis

    International Nuclear Information System (INIS)

    Magill, J.; Dreher, R.; Soti, Z.

    2014-01-01

    The authors present a description of the Nucleonica web-based portal for simulation and analysis for a wide range of commonly encountered nuclear science applications. Advantages of a web-based approach include availability wherever there is internet access, an intuitive user-friendly interface, remote access to high-performance computing resources, and continual maintenance, improvement, and addition of tools and techniques common to the nuclear science industry. A description of the nuclear data resources and of some applications is given.

  2. Analysis of laparoscopic port site complications: A descriptive study.

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-04-01

    The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Prospective descriptive study. In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  3. Cyclotron resonant scattering feature simulations. II. Description of the CRSF simulation process

    Science.gov (United States)

    Schwarm, F.-W.; Ballhausen, R.; Falkner, S.; Schönherr, G.; Pottschmidt, K.; Wolff, M. T.; Becker, P. A.; Fürst, F.; Marcu-Cheatham, D. M.; Hemphill, P. B.; Sokolova-Lapa, E.; Dauser, T.; Klochkov, D.; Ferrigno, C.; Wilms, J.

    2017-05-01

    Context: Cyclotron resonant scattering features (CRSFs) are formed by scattering of X-ray photons off quantized plasma electrons in the strong magnetic field (of the order 10^12 G) close to the surface of an accreting X-ray pulsar. Due to the complex scattering cross-sections, the line profiles of CRSFs cannot be described by an analytic expression. Numerical methods, such as Monte Carlo (MC) simulations of the scattering processes, are required in order to predict precise line shapes for a given physical setup, which can be compared to observations to gain information about the underlying physics in these systems. Aims: A versatile simulation code is needed for the generation of synthetic cyclotron lines. The code should make the simulation of sophisticated geometries possible for the first time, allowing such geometries to be investigated. Methods: The simulation utilizes the mean free path tables described in the first paper of this series for the fast interpolation of propagation lengths. The code is parallelized to make the very time-consuming simulations possible on convenient time scales. Furthermore, it can generate responses to monoenergetic photon injections, producing Green's functions, which can be used later to generate spectra for arbitrary continua. Results: We develop a new simulation code to generate synthetic cyclotron lines for complex scenarios, allowing for unprecedented physical interpretation of the observed data. An associated XSPEC model implementation is used to fit synthetic line profiles to NuSTAR data of Cep X-4. The code has been developed with the main goal of overcoming previous geometrical constraints in MC simulations of CRSFs. By applying this code also to more simple, classic geometries used in previous works, we furthermore address issues of code verification and cross-comparison of various models. The XSPEC model and the Green's function tables are available online (see link in footnote, page 1).
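
The core sampling step of such a Monte Carlo photon-propagation code can be sketched generically. The snippet below is illustrative only: it uses a constant, made-up mean free path, whereas the actual code interpolates energy- and angle-dependent mean free paths from the precomputed tables of the first paper in this series:

```python
import math
import random

def sample_path_length(mean_free_path, rng):
    """Sample the distance to the next scattering event by inverting the
    exponential attenuation law P(l > x) = exp(-x / mfp)."""
    # 1 - rng.random() lies in (0, 1], so the logarithm is always defined
    return -mean_free_path * math.log(1.0 - rng.random())

def mean_sampled_length(mfp, n=200_000, seed=42):
    """Monte Carlo estimate of the average propagation length."""
    rng = random.Random(seed)
    return sum(sample_path_length(mfp, rng) for _ in range(n)) / n

# For an exponential law the sample mean converges to the mean free path
print(mean_sampled_length(2.5))
```

In the real simulation this step is repeated per photon between scattering events, with the mean free path re-evaluated after each change of photon energy and direction.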

  4. Simulation model for wind energy storage systems. Volume III. Program descriptions. [SIMWEST CODE

    Energy Technology Data Exchange (ETDEWEB)

    Warren, A.W.; Edsinger, R.W.; Burroughs, J.D.

    1977-08-01

    This effort developed a comprehensive computer program for the modeling of wind energy/storage systems utilizing any combination of five types of storage (pumped hydro, battery, thermal, flywheel and pneumatic). An acronym for the program is SIMWEST (Simulation Model for Wind Energy Storage). The level of detail of SIMWEST is consistent with a role of evaluating the economic feasibility as well as the general performance of wind energy systems. The software package consists of two basic programs and a library of system, environmental, and load components. Volume III, the SIMWEST program description, contains program descriptions, flow charts and program listings for the SIMWEST Model Generation Program, the Simulation program, the File Maintenance program and the Printer Plotter program. Volume III generally would not be required by the SIMWEST user.

  5. Description of occupant behaviour in building energy simulation: state-of-art and concepts for improvements

    DEFF Research Database (Denmark)

    Fabi, Valentina; Andersen, Rune Vinther; Corgnati, Stefano Paolo

    2011-01-01

    of basic assumptions that affect the results. Therefore, the calculated energy performance may differ significantly from the real energy consumption. One of the key reasons is the current inability to properly model occupant behaviour and to quantify the associated uncertainties in building performance...... predictions. By consequence, a better description of parameters related to occupant behaviour is highly required. In this paper, the state of art in occupant behaviour modelling within energy simulation tools is analysed and some concepts related to possible improvements of simulation tools are proposed...

  6. The experiences of last-year student midwives with High-Fidelity Perinatal Simulation training: A qualitative descriptive study.

    Science.gov (United States)

    Vermeulen, Joeri; Beeckman, Katrien; Turcksin, Rivka; Van Winkel, Lies; Gucciardo, Léonardo; Laubach, Monika; Peersman, Wim; Swinnen, Eva

    2017-06-01

    Simulation training is a powerful and evidence-based teaching method in healthcare. It allows students to develop essential competences that are often difficult to achieve during internships. High-Fidelity Perinatal Simulation exposes them to real-life scenarios in a safe environment. Although student midwives' experiences need to be considered to make the simulation training work, these have been overlooked so far. To explore the experiences of last-year student midwives with High-Fidelity Perinatal Simulation training. A qualitative descriptive study, using three focus group conversations with last-year student midwives (n=24). Audio tapes were transcribed and a thematic content analysis was performed. The entire data set was coded according to recurrent or common themes. To achieve investigator triangulation and confirm themes, discussions among the researchers was incorporated in the analysis. Students found High-Fidelity Perinatal Simulation training to be a positive learning method that increased both their competence and confidence. Their experiences varied over the different phases of the High-Fidelity Perinatal Simulation training. Although uncertainty, tension, confusion and disappointment were experienced throughout the simulation trajectory, they reported that this did not affect their learning and confidence-building. As High-Fidelity Perinatal Simulation training constitutes a helpful learning experience in midwifery education, it could have a positive influence on maternal and neonatal outcomes. In the long term, it could therefore enhance the midwifery profession in several ways. The present study is an important first step in opening up the debate about the pedagogical use of High-Fidelity Perinatal Simulation training within midwifery education. Copyright © 2017 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  7. Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2009-01-01

    This contribution presents an overview of sensitivity analysis of simulation models, including the estimation of gradients. It covers classic designs and their corresponding (meta)models; namely, resolution-III designs including fractional-factorial two-level designs for first-order polynomial
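
One of the design classes named above can be illustrated concretely. The sketch below builds a 2^(3-1) resolution-III fractional-factorial design with the standard generator C = AB (a textbook construction, not an example from this contribution) and estimates the coefficients of a first-order polynomial metamodel from a made-up simulation response:

```python
# Hypothetical simulation response: first-order in the three factors,
# so a resolution-III design suffices to estimate all main effects
def simulate(a, b, c):
    return 10 + 3*a - 2*b + 5*c

# Four runs instead of the full 2^3 = 8; the third column is C = A*B,
# so the main effect of C is aliased with the AB interaction
design = [(a, b, a * b) for a in (-1, 1) for b in (-1, 1)]
y = [simulate(*run) for run in design]

# For a +/-1 orthogonal design, the least-squares estimates reduce to
# simple averages of (column value * response)
n = len(design)
beta0 = sum(y) / n
betas = [sum(run[j] * yi for run, yi in zip(design, y)) / n for j in range(3)]

print(beta0, betas)  # recovers 10 and [3, -2, 5] for this first-order model
```

The halving of the number of runs is exactly what a resolution-III design buys: main effects remain estimable as long as interactions are negligible, which is the working assumption behind first-order polynomial metamodels.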

  8. Analysis of laparoscopic port site complications: A descriptive study

    Directory of Open Access Journals (Sweden)

    Somu Karthik

    2013-01-01

    Full Text Available Context: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. Aims: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. Settings and Design: Prospective descriptive study. Materials and Methods: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. Statistical Analysis Used: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. Results: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). Conclusions: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit.

  9. Analysis of laparoscopic port site complications: A descriptive study

    Science.gov (United States)

    Karthik, Somu; Augustine, Alfred Joseph; Shibumon, Mundunadackal Madhavan; Pai, Manohar Varadaraya

    2013-01-01

    CONTEXT: The rate of port site complications following conventional laparoscopic surgery is about 21 per 100,000 cases. It has shown a proportional rise with increase in the size of the port site incision and trocar. Although rare, complications that occur at the port site include infection, bleeding, and port site hernia. AIMS: To determine the morbidity associated with ports at the site of their insertion in laparoscopic surgery and to identify risk factors for complications. SETTINGS AND DESIGN: Prospective descriptive study. MATERIALS AND METHODS: In the present descriptive study, a total of 570 patients who underwent laparoscopic surgeries for various ailments between August 2009 and July 2011 at our institute were observed for port site complications prospectively and the complications were reviewed. STATISTICAL ANALYSIS USED: Descriptive statistical analysis was carried out in the present study. The statistical software, namely, SPSS 15.0 was used for the analysis of the data. RESULTS: Of the 570 patients undergoing laparoscopic surgery, 17 (3%) had developed complications specifically related to the port site during a minimum follow-up of three months; port site infection (PSI) was the most frequent (n = 10, 1.8%), followed by port site bleeding (n = 4, 0.7%), omentum-related complications (n = 2; 0.35%), and port site metastasis (n = 1, 0.175%). CONCLUSIONS: Laparoscopic surgeries are associated with minimal port site complications. Complications are related to the increased number of ports. Umbilical port involvement is the commonest. Most complications are manageable with minimal morbidity, and can be further minimized with meticulous surgical technique during entry and exit. PMID:23741110

  10. Predicate Argument Structure Analysis for Use Case Description Modeling

    Science.gov (United States)

    Takeuchi, Hironori; Nakamura, Taiga; Yamaguchi, Takahira

    In a large software system development project, many documents are prepared and updated frequently. In such a situation, support is needed for looking through these documents easily to identify inconsistencies and to maintain traceability. In this research, we focus on the requirements documents such as use cases and consider how to create models from the use case descriptions in unformatted text. In the model construction, we propose a few semantic constraints based on the features of the use cases and use them for a predicate argument structure analysis to assign semantic labels to actors and actions. With this approach, we show that we can assign semantic labels without enhancing any existing general lexical resources such as case frame dictionaries and design a less language-dependent model construction architecture. By using the constructed model, we consider a system for quality analysis of the use cases and automated test case generation to keep the traceability between document sets. We evaluated the reuse of the existing use cases and generated test case steps automatically with the proposed prototype system from real-world use cases in the development of a system using a packaged application. Based on the evaluation, we show how to construct models with high precision from English and Japanese use case data. Also, we could generate good test cases for about 90% of the real use cases through the manual improvement of the descriptions based on the feedback from the quality analysis system.

  11. ASPEN: A fully kinetic, reduced-description particle-in-cell model for simulating parametric instabilities

    International Nuclear Information System (INIS)

    Vu, H.X.; Bezzerides, B.; DuBois, D.F.

    1999-01-01

    A fully kinetic, reduced-description particle-in-cell (RPIC) model is presented in which deviations from quasineutrality, electron and ion kinetic effects, and nonlinear interactions between low-frequency and high-frequency parametric instabilities are modeled correctly. The model is based on a reduced description where the electromagnetic field is represented by three separate temporal envelopes in order to model parametric instabilities with low-frequency and high-frequency daughter waves. Because temporal envelope approximations are invoked, the simulation can be performed on the electron time scale instead of the time scale of the light waves. The electrons and ions are represented by discrete finite-size particles, permitting electron and ion kinetic effects to be modeled properly. The Poisson equation is utilized to ensure that space-charge effects are included. The RPIC model is fully three dimensional and has been implemented in two dimensions on the Accelerated Strategic Computing Initiative (ASCI) parallel computer at Los Alamos National Laboratory, and the resulting simulation code has been named ASPEN. The authors believe this code is the first particle-in-cell code capable of simulating the interaction between low-frequency and high-frequency parametric instabilities in multiple dimensions. Test simulations of stimulated Raman scattering, stimulated Brillouin scattering, and Langmuir decay instability are presented

  12. High Performance Descriptive Semantic Analysis of Semantic Graph Databases

    Energy Technology Data Exchange (ETDEWEB)

    Joslyn, Cliff A.; Adolf, Robert D.; al-Saffar, Sinan; Feo, John T.; Haglin, David J.; Mackey, Greg E.; Mizell, David W.

    2011-06-02

    As semantic graph database technology grows to address components ranging from extant large triple stores to SPARQL endpoints over SQL-structured relational databases, it will become increasingly important to be able to understand their inherent semantic structure, whether codified in explicit ontologies or not. Our group is researching novel methods for what we call descriptive semantic analysis of RDF triplestores, to serve purposes of analysis, interpretation, visualization, and optimization. But data size and computational complexity make it increasingly necessary to bring high performance computational resources to bear on this task. Our research group built a novel high performance hybrid system comprising computational capability for semantic graph database processing utilizing the large multi-threaded architecture of the Cray XMT platform, conventional servers, and large data stores. In this paper we describe that architecture and our methods, and present the results of our analyses of basic properties, connected components, namespace interaction, and typed paths for the Billion Triple Challenge 2010 dataset.
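
One of the analyses listed, connected components, can be sketched on toy RDF-style triples. The data and predicate names below are invented for illustration, and none of the Cray XMT-specific machinery is shown; this is only the graph abstraction underneath:

```python
from collections import defaultdict

# Invented RDF-style (subject, predicate, object) triples
triples = [
    ("ex:alice", "ex:knows",   "ex:bob"),
    ("ex:bob",   "ex:worksAt", "ex:acme"),
    ("ex:carol", "ex:knows",   "ex:dave"),
]

# Treat subjects and objects as nodes of an undirected graph
adj = defaultdict(set)
for s, _, o in triples:
    adj[s].add(o)
    adj[o].add(s)

def components(adj):
    """Return the connected components via iterative depth-first search."""
    seen, comps = set(), []
    for node in adj:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            comp.add(n)
            stack.extend(adj[n])
        comps.append(comp)
    return comps

print(sorted(len(c) for c in components(adj)))  # → [2, 3]
```

At billion-triple scale the same computation requires out-of-core or massively multi-threaded implementations, which is precisely the motivation for the hybrid architecture the paper describes.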

  13. Descriptive analysis of YouTube music therapy videos.

    Science.gov (United States)

    Gooding, Lori F; Gregory, Dianne

    2011-01-01

    The purpose of this study was to conduct a descriptive analysis of music therapy-related videos on YouTube. Preliminary searches using the keywords music therapy, music therapy session, and "music therapy session" resulted in listings of 5000, 767, and 59 videos respectively. The narrowed down listing of 59 videos was divided between two investigators and reviewed in order to determine their relationship to actual music therapy practice. A total of 32 videos were determined to be depictions of music therapy sessions. These videos were analyzed using a 16-item investigator-created rubric that examined both video specific information and therapy specific information. Results of the analysis indicated that audio and visual quality was adequate, while narrative descriptions and identification information were ineffective in the majority of the videos. The top 5 videos (based on the highest number of viewings in the sample) were selected for further analysis in order to investigate demonstration of the Professional Level of Practice Competencies set forth in the American Music Therapy Association (AMTA) Professional Competencies (AMTA, 2008). Four of the five videos met basic competency criteria, with the quality of the fifth video precluding evaluation of content. Of particular interest is the fact that none of the videos included credentialing information. Results of this study suggest the need to consider ways to ensure accurate dissemination of music therapy-related information in the YouTube environment, ethical standards when posting music therapy session videos, and the possibility of creating AMTA standards for posting music therapy related video.

  14. Learning motivation and student achievement : description analysis and relationships both

    Directory of Open Access Journals (Sweden)

    Ari Riswanto

    2017-03-01

    Full Text Available Education is very important for humans; through education, people throughout the world can increasingly flourish. Within the learning process, however, more than a few students have low motivation for learning activities. This results in less effective learning and, in turn, affects student achievement. This study focuses on matters relating to learning motivation and student achievement, with the aim of reinforcing the importance of motivation in the learning process and clarifying its relationship with student achievement. The method used is descriptive analysis and simple correlation, applied to 97 students taking the courses Introduction to Microeconomics and Indonesian. The conclusion from this research is that students achieve good results when they are well motivated, and the study finds a difference between the motivation-achievement relationship in the two courses.
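
The simple-correlation computation described can be sketched as follows; the motivation and achievement scores are hypothetical, not the study's data for its 97 students:

```python
from math import sqrt

# Hypothetical motivation scores and course grades (illustrative only)
motivation  = [60, 72, 75, 80, 85, 90, 94, 98]
achievement = [55, 70, 68, 78, 82, 88, 90, 95]

def pearson(xs, ys):
    """Pearson product-moment correlation coefficient r."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

print(round(pearson(motivation, achievement), 3))
```

An r close to +1, as produced by the monotone sample above, is the pattern the study's conclusion describes: students with higher motivation tend to show higher achievement.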

  15. DESCRIPTIVE ANALYSIS OF CORPORATE CULTURE FOLLOWING THE CHANGES

    Directory of Open Access Journals (Sweden)

    Elenko Zahariev

    2016-09-01

    Full Text Available Corporate culture meaningfully supplements economic knowledge and accompanies strategy and tactics in management. It is felt in the manners and overall activity of the organization - it is empathy and tolerance, respect and responsibility. The new corporate culture transforms each participant, changing his or her mindset about collaboration and working habits. The new corporate culture requires an improved management style. It is no longer enough for the leader only to rule, administer and control; he or she must lead and inspire. The leader sets challenging targets, optimizes the performance of the teams, fuels an optimistic mood and faith, builds agreement among workers, and monitors and evaluates the work in a fair way. The current study raises the problem of interpreting cultural profiles in modern organizations and analyzes corporate culture after the changes of the transition period in Bulgaria. The descriptive analysis of corporate culture allows the relatively precise identification of its various types based on the accepted classification criteria.

  16. Analysis of individual brain activation maps using hierarchical description and multiscale detection

    International Nuclear Information System (INIS)

    Poline, J.B.; Mazoyer, B.M.

    1994-01-01

    The authors propose a new method for the analysis of brain activation images that aims at detecting activated volumes rather than pixels. The method is based on Poisson process modeling, hierarchical description, and multiscale detection (MSD). Its performances have been assessed using both Monte Carlo simulated images and experimental PET brain activation data. As compared to other methods, the MSD approach shows enhanced sensitivity with a controlled overall type I error, and has the ability to provide an estimate of the spatial limits of the detected signals. It is applicable to any kind of difference image for which the spatial autocorrelation function can be approximated by a stationary Gaussian function

  17. Supplemental description of ROSA-IV/LSTF with No.1 simulated fuel-rod assembly

    International Nuclear Information System (INIS)

    1989-09-01

    Forty-two integral simulation tests of PWR small break LOCA (loss-of-coolant accident) and transient were conducted at the ROSA-IV Large-Scale Test Facility (LSTF) with the No.1 simulated fuel-rod assembly between March 1985 and August 1988. Described in the report are supplemental information on modifications of the system hardware and measuring systems, results of system characteristics tests including the initial fluid mass inventory and heat loss distribution for the primary system, and thermal properties for the heater rod materials. These are necessary to establish the correct boundary conditions of each LSTF experiment with the No.1 core assembly in addition to the system data given in the system description report (JAERI-M 84-237). (author)

  18. Description and Simulation of a Fast Packet Switch Architecture for Communication Satellites

    Science.gov (United States)

    Quintana, Jorge A.; Lizanich, Paul J.

    1995-01-01

    The NASA Lewis Research Center has been developing the architecture for a multichannel communications signal processing satellite (MCSPS) as part of a flexible, low-cost meshed-VSAT (very small aperture terminal) network. The MCSPS architecture is based on a multifrequency, time-division-multiple-access (MF-TDMA) uplink and a time-division multiplex (TDM) downlink. There are eight uplink MF-TDMA beams, and eight downlink TDM beams, with eight downlink dwells per beam. The information-switching processor, which decodes, stores, and transmits each packet of user data to the appropriate downlink dwell onboard the satellite, has been fully described by using VHSIC (Very High Speed Integrated-Circuit) Hardware Description Language (VHDL). This VHDL code, which was developed in-house to simulate the information switching processor, showed that the architecture is both feasible and viable. This paper describes a shared-memory-per-beam architecture, its VHDL implementation, and the simulation efforts.

  19. Baseline process description for simulating plutonium oxide production for precalc project

    Energy Technology Data Exchange (ETDEWEB)

    Pike, J. A. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-10-26

    Savannah River National Laboratory (SRNL) started a multi-year project, the PreCalc Project, to develop a computational simulation of a plutonium oxide (PuO2) production facility with the objective to study the fundamental relationships between morphological and physicochemical properties. This report provides a detailed baseline process description to be used by SRNL personnel and collaborators to facilitate the initial design and construction of the simulation. The PreCalc Project team selected the HB-Line Plutonium Finishing Facility as the basis for a nominal baseline process since the facility is operational and significant model validation data can be obtained. The process boundary as well as process and facility design details necessary for multi-scale, multi-physics models are provided.

  20. Overview description of the base scenario derived from FEP analysis

    International Nuclear Information System (INIS)

    Locke, J.; Bailey, L.

    1998-01-01

    , subsequent evolution and the processes affecting radionuclide transport for the groundwater and gas pathways. This report uses the conceptual models developed from the FEP analysis to present a description of the base scenario, in terms of the processes to be represented in detailed models. This report does not present an assessment of the base scenario, but rather seeks to provide a summary of those features, events and processes that should be represented, at an appropriate level of detail, within numerical models. The requirements for the development of appropriate models for representing the base scenario are described in an underlying report within the model development document suite. (author)

  1. Aircraft/Air Traffic Management Functional Analysis Model: Technical Description. 2.0

    Science.gov (United States)

    Etheridge, Melvin; Plugge, Joana; Retina, Nusrat

    1998-01-01

    The Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 (FAM 2.0), is a discrete event simulation model designed to support analysis of alternative concepts in air traffic management and control. FAM 2.0 was developed by the Logistics Management Institute (LMI) under a National Aeronautics and Space Administration (NASA) contract. This document provides a technical description of FAM 2.0 and its computer files to enable the modeler and programmer to make enhancements or modifications to the model. Those interested in a guide for using the model in analysis should consult the companion document, Aircraft/Air Traffic Management Functional Analysis Model, Version 2.0 Users Manual.

  2. Hanford Site Composite Analysis Technical Approach Description: Groundwater

    Energy Technology Data Exchange (ETDEWEB)

    Budge, T. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-10-02

    The groundwater facet of the revised CA is responsible for generating predicted contaminant concentration values over the entire analysis spatial and temporal domain. These estimates will be used as part of the groundwater pathway dose calculation facet to estimate dose for exposure scenarios. Based on the analysis of existing models and available information, the P2R Model was selected as the numerical simulator to provide these estimates over the 10,000-year temporal domain of the CA. The P2R Model will use inputs from initial plume distributions, updated for a start date of 1/1/2017, and inputs from the vadose zone facet, created by a tool under development as part of the ICF, to produce estimates of hydraulic head, transmissivity, and contaminant concentration over time. A recommendation of acquiring 12 computer processors and 2 TB of hard drive space is made to ensure that the work can be completed within the anticipated schedule of the revised CA.

  3. Evaluating current trends in psychiatric music therapy: a descriptive analysis.

    Science.gov (United States)

    Silverman, Michael J

    2007-01-01

    ... improvisation, songwriting, lyric analysis, and music and movement to address consumer objectives. Participants indicated they used therapeutic verbal skills and techniques such as humor, redirection, reinforcement, empathy, and affirmation in their clinical practice. Additionally, the results of this survey were compared to the psychiatric portion of a music therapy descriptive study published in 1979. Similarities and differences are discussed.

  4. Design and Analysis of simulation experiments : Tutorial

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2017-01-01

    This tutorial reviews the design and analysis of simulation experiments. These experiments may have various goals: validation, prediction, sensitivity analysis, optimization (possibly robust), and risk or uncertainty analysis. These goals may be realized through metamodels. Two types of metamodels ...

  5. Genetic analysis of bulimia nervosa: methods and sample description.

    Science.gov (United States)

    Kaye, Walter H; Devlin, Bernie; Barbarich, Nicole; Bulik, Cynthia M; Thornton, Laura; Bacanu, Silviu-Alin; Fichter, Manfred M; Halmi, Katherine A; Kaplan, Allan S; Strober, Michael; Woodside, D Blake; Bergen, Andrew W; Crow, Scott; Mitchell, James; Rotondo, Alessandro; Mauri, Mauro; Cassano, Giovanni; Keel, Pamela; Plotnicov, Katherine; Pollice, Christine; Klump, Kelly L; Lilenfeld, Lisa R; Ganjei, J Kelly; Quadflieg, Norbert; Berrettini, Wade H

    2004-05-01

    Twin and family studies suggest that genetic variants contribute to the pathogenesis of bulimia nervosa (BN) and anorexia nervosa (AN). The Price Foundation has supported an international, multisite study of families with these disorders to identify these genetic variations. The current study presents the clinical characteristics of this sample as well as a description of the study methodology. All probands met modified criteria for BN or bulimia nervosa with a history of AN (BAN) as defined in the 4th ed. of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV; American Psychiatric Association, 1994). All affected relatives met DSM-IV criteria for BN, AN, BAN, or eating disorders not otherwise specified (EDNOS). Probands and affected relatives were assessed diagnostically using both trained-rater and self-report assessments. DNA samples were collected from probands, affected relatives, and available biologic parents. Assessments were obtained from 163 BN probands and 165 BAN probands. Overall, there were 365 relative pairs available for linkage analysis. Of the affected relatives of BN probands, 62 were diagnosed as BN (34.8%), 49 as BAN (27.5%), 35 as AN (19.7%), and 32 as EDNOS (18.0%). For the relatives of BAN probands, 42 were diagnosed as BN (22.5%), 67 as BAN (35.8%), 48 as AN (25.7%), and 30 as EDNOS (16.0%). This study represents the largest genetic study of eating disorders to date. Clinical data indicate that although there are a large number of individuals with BN disorders, a range of eating pathology is represented in the sample, allowing for the examination of several different phenotypes in molecular genetic analyses. Copyright 2004 by Wiley Periodicals, Inc. Int J Eat Disord 35: 556-570, 2004.

  6. Modeling and simulation of equivalent circuits in description of biological systems - a fractional calculus approach

    Directory of Open Access Journals (Sweden)

    José Francisco Gómez Aguilar

    2012-07-01

    Using the fractional calculus approach, we present the Laplace analysis of an equivalent electrical circuit for a multilayered system, which includes distributed elements of the Cole model type. The Bode graphs are obtained from the numerical simulation of the corresponding transfer functions using arbitrary electrical parameters in order to illustrate the methodology. A numerical Laplace transform is used with respect to the simulation of the fractional differential equations. From the results shown in the analysis, we obtain the formula for the equivalent electrical circuit of a simple spectrum, such as that generated by a real sample of blood tissue, and the corresponding Nyquist diagrams. In addition to maintaining consistency in adjusted electrical parameters, the advantage of using fractional differential equations in the study of the impedance spectra is made clear in the analysis used to determine a compact formula for the equivalent electrical circuit, which includes the Cole model and a simple RC model as special cases.
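
    The Cole-type element at the heart of such equivalent circuits is easy to evaluate numerically; a minimal sketch with arbitrary parameter values (as in the paper's illustrative Bode plots, not its actual fit):

```python
import cmath
import math

def cole_impedance(omega, r_inf, r0, tau, alpha):
    """Cole element: Z(w) = R_inf + (R0 - R_inf) / (1 + (j*w*tau)**alpha).
    alpha = 1 recovers the ordinary (integer-order) RC relaxation."""
    return r_inf + (r0 - r_inf) / (1.0 + (1j * omega * tau) ** alpha)

# Bode-style sweep: |Z| falls from R0 at low frequency to R_inf at high.
for k in range(-3, 4):
    w = 10.0 ** k
    z = cole_impedance(w, r_inf=10.0, r0=100.0, tau=1.0, alpha=0.8)
    print(f"w = 1e{k:+d}: |Z| = {abs(z):7.2f}, "
          f"phase = {math.degrees(cmath.phase(z)):6.1f} deg")
```

    Plotting the real part against the imaginary part over the same sweep gives the depressed-semicircle Nyquist diagram characteristic of the Cole model.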

  7. Solar Pilot Plant, Phase I. Preliminary design report. Volume II. System description and system analysis. CDRL item 2

    Energy Technology Data Exchange (ETDEWEB)

    None

    1977-05-01

    Honeywell conducted a parametric analysis of the 10-MW(e) solar pilot plant requirements and expected performance and established an optimum system design. The main analytical simulation tools were the optical (ray trace) and the dynamic simulation models. These are described in detail in Books 2 and 3 of this volume under separate cover. In making design decisions, available performance and cost data were used to provide a design reflecting the overall requirements and economics of a commercial-scale plant. This volume contains a description of this analysis/design process and resultant system/subsystem design and performance.

  8. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, R V; Kristensen, D; Nielsen, Jacob Holm

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation. Electron spin resonance spectroscopy should accordingly be further explored as a routine ...

  9. Comparison of descriptive sensory analysis and chemical analysis for oxidative changes in milk

    DEFF Research Database (Denmark)

    Hedegaard, Rikke Susanne Vingborg; Kristensen, D.; Nielsen, J. H.

    2006-01-01

    Oxidation in 3 types of bovine milk with different fatty acid profiles obtained through manipulation of feed was evaluated by analytical methods quantifying the content of potential antioxidants, the tendency of formation of free radicals, and the accumulation of primary and secondary oxidation products. The milk samples were evaluated in parallel by descriptive sensory analysis by a trained panel, and the correlation between the chemical analysis and the descriptive sensory analysis was evaluated. The fatty acid composition of the 3 types of milk was found to influence the oxidative and lipolytic changes occurring in the milk during chill storage for 4 d. Sensory analysis and chemical analysis showed high correlation between the typical descriptors for oxidation such as cardboard, metallic taste, and boiled milk and specific chemical markers for oxidation such as hexanal. Notably, primary oxidation products (i.e., lipid hydroperoxides) and even the tendency of formation of radicals as measured by electron spin resonance spectroscopy were also highly correlated to the sensory descriptors for oxidation. Electron spin resonance spectroscopy should accordingly be further explored as a routine ...

  10. Conception of a PWR simulator as a tool for safety analysis

    International Nuclear Information System (INIS)

    Lanore, J.M.; Bernard, P.; Romeyer Dherbey, J.; Bonnet, C.; Quilchini, P.

    1982-09-01

    A simulator can be a very useful tool for safety analysis to study accident sequences involving malfunctions of the systems and operator interventions. The main characteristics of the simulator SALAMANDRE (description of the systems, physical models, programming organization, control desk) have then been selected according to the objectives of safety analysis.

  11. Water Quality Analysis Simulation Program (WASP)

    Science.gov (United States)

    The Water Quality Analysis Simulation Program (WASP) model helps users interpret and predict water quality responses to natural phenomena and manmade pollution for various pollution management decisions.

  12. Reliability analysis of neutron transport simulation using Monte Carlo method

    International Nuclear Information System (INIS)

    Souza, Bismarck A. de; Borges, Jose C.

    1995-01-01

    This work presents a statistical and reliability analysis covering data obtained by computer simulation of the neutron transport process, using the Monte Carlo method. A general description of the method and its applications is presented. Several simulations, corresponding to slowing-down and shielding problems, have been accomplished. The influence of the physical dimensions of the materials and of the sample size on the reliability level of results was investigated. The objective was to optimize the sample size in order to obtain reliable results while optimizing computation time. (author). 5 refs, 8 figs
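
    The flavor of such a study can be reproduced with a toy Monte Carlo slab-shielding model; a sketch assuming isotropic scattering and a 50/50 absorption/scattering split per collision (all parameters illustrative, not the authors' setup):

```python
import math
import random

def transmission_fraction(thickness_mfp, n_particles, seed=1):
    """Monte Carlo estimate of the fraction of neutrons penetrating a
    1-D slab of given thickness (in mean free paths)."""
    rng = random.Random(seed)
    transmitted = 0
    for _ in range(n_particles):
        x, mu = 0.0, 1.0                   # depth and direction cosine
        while True:
            # Sample an exponential free path along the current direction.
            x += mu * -math.log(1.0 - rng.random())
            if x >= thickness_mfp:
                transmitted += 1           # leaked through the far face
                break
            if x < 0.0:
                break                      # reflected back out the near face
            if rng.random() < 0.5:
                break                      # absorbed
            mu = 2.0 * rng.random() - 1.0  # isotropic re-emission
    return transmitted / n_particles

print(transmission_fraction(2.0, 2000))
```

    Repeating the run with increasing n_particles is the sample-size optimization the abstract refers to: the binomial standard error of the estimate falls off like 1/sqrt(n).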

  13. Website Sharing in Online Health Communities: A Descriptive Analysis.

    Science.gov (United States)

    Nath, Chinmoy; Huh, Jina; Adupa, Abhishek Kalyan; Jonnalagadda, Siddhartha R

    2016-01-13

    An increasing number of people visit online health communities to seek health information. In these communities, people share experiences and information with others, often complemented with links to different websites. Understanding how people share websites can help us understand patients' needs in online health communities and improve how peer patients share health information online. Our goal was to understand (1) what kinds of websites are shared, (2) information quality of the shared websites, (3) who shares websites, (4) community differences in website-sharing behavior, and (5) the contexts in which patients share websites. We aimed to find practical applications and implications of website-sharing practices in online health communities. We used regular expressions to extract URLs from 10 WebMD online health communities. We then categorized the URLs based on their top-level domains. We counted the number of trust codes (eg, accredited agencies' formal evaluation and PubMed authors' institutions) for each website to assess information quality. We used descriptive statistics to determine website-sharing activities. To understand the context of the URL being discussed, we conducted a simple random selection of 5 threads that contained at least one post with URLs from each community. Gathering all other posts in these threads resulted in 387 posts for open coding analysis with the goal of understanding motivations and situations in which website sharing occurred. We extracted a total of 25,448 websites. The majority of the shared websites were .com (59.16%, 15,056/25,448) and WebMD internal (23.2%, 5905/25,448) websites; the least shared websites were social media websites (0.15%, 39/25,448). High-posting community members and moderators posted more websites with trust codes than low-posting community members did. The heart disease community had the highest percentage of websites containing trust codes compared to other communities. Members used websites to ...
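
    The URL-extraction and top-level-domain categorization step can be sketched with regular expressions (the pattern and sample posts here are invented for illustration, not the authors' actual expressions):

```python
import re
from collections import Counter

URL_RE = re.compile(r"https?://[^\s<>\"']+", re.IGNORECASE)

def tally_by_tld(posts):
    """Extract URLs from free-text posts and count them by top-level domain."""
    counts = Counter()
    for post in posts:
        for url in URL_RE.findall(post):
            host = url.split("://", 1)[1].split("/", 1)[0]
            counts[host.rsplit(".", 1)[-1].lower()] += 1
    return counts

posts = [
    "See http://www.webmd.com/heart for details.",
    "This helped me: https://example.org/article and http://news.example.com/item",
]
print(tally_by_tld(posts))  # Counter({'com': 2, 'org': 1})
```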

  14. Descriptive Analysis Of Nigerian Children Human Figure Drawings ...

    African Journals Online (AJOL)

    Art is a symbolic means of communication through which man can communicate his inner needs, desires and worries. Children find art a good means of communicating what ordinarily, they may not be able to describe orally. This study was on description of Nigerian children human figure art using Lowenfeld and Brittain ...

  15. Simulation modeling and analysis with Arena

    CERN Document Server

    Altiok, Tayfur

    2007-01-01

    Simulation Modeling and Analysis with Arena is a highly readable textbook which treats the essentials of the Monte Carlo discrete-event simulation methodology, and does so in the context of a popular Arena simulation environment. It treats simulation modeling as an in-vitro laboratory that facilitates the understanding of complex systems and experimentation with what-if scenarios in order to estimate their performance metrics. The book contains chapters on the simulation modeling methodology and the underpinnings of discrete-event systems, as well as the relevant underlying probability, statistics, stochastic processes, input analysis, model validation and output analysis. All simulation-related concepts are illustrated in numerous Arena examples, encompassing production lines, manufacturing and inventory systems, transportation systems, and computer information systems in networked settings. Introduces the concept of discrete event Monte Carlo simulation, the most commonly used methodology for modeli...
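
    The essentials of discrete-event queueing that such a text builds on can be shown without any simulation package; a minimal single-server (M/M/1) queue via Lindley's recursion, offered as a sketch rather than Arena syntax:

```python
import random

def mean_wait_mm1(lam, mu, n_customers, seed=7):
    """Mean queueing delay of an M/M/1 queue estimated with Lindley's
    recursion: W[k+1] = max(0, W[k] + S[k] - A[k+1])."""
    rng = random.Random(seed)
    wait = total = 0.0
    for _ in range(n_customers):
        total += wait
        service = rng.expovariate(mu)        # S[k]: service time
        interarrival = rng.expovariate(lam)  # A[k+1]: next interarrival gap
        wait = max(0.0, wait + service - interarrival)
    return total / n_customers

# Queueing theory for lam = 0.5, mu = 1.0: Wq = lam / (mu * (mu - lam)) = 1.0
print(mean_wait_mm1(0.5, 1.0, 50000))
```

    Output analysis in the textbook sense would then attach confidence intervals to this point estimate across replications.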

  16. Integrated Data Collection Analysis (IDCA) Program: FY2011 Project Descriptions

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC-IHD), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC-IHD), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Shelley, Timothy J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Reyes, Jose A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2012-02-03

    This document provides brief descriptions of research topics for consideration by the IDCA for potential funding in FY 2011. The topics include the utilization of the results from the Proficiency Test developed during FY 2010 to start populating the small-scale safety and thermal testing (SSST) Testing Compendium and revising results from methods modifications. Other research topics were also developed for FY 2011 from issues that arose in the Proficiency Test.

  17. Fractional-Order Nonlinear Systems Modeling, Analysis and Simulation

    CERN Document Server

    Petráš, Ivo

    2011-01-01

    "Fractional-Order Nonlinear Systems: Modeling, Analysis and Simulation" presents a study of fractional-order chaotic systems accompanied by Matlab programs for simulating their state space trajectories, which are shown in the illustrations in the book. Description of the chaotic systems is clearly presented and their analysis and numerical solution are done in an easy-to-follow manner. Simulink models for the selected fractional-order systems are also presented. The readers will understand the fundamentals of the fractional calculus, how real dynamical systems can be described using fractional derivatives and fractional differential equations, how such equations can be solved, and how to simulate and explore chaotic systems of fractional order. The book addresses to mathematicians, physicists, engineers, and other scientists interested in chaos phenomena or in fractional-order systems. It can be used in courses on dynamical systems, control theory, and applied mathematics at graduate or postgraduate level. ...

  18. A reliability simulation language for reliability analysis

    International Nuclear Information System (INIS)

    Deans, N.D.; Miller, A.J.; Mann, D.P.

    1986-01-01

    The results of work being undertaken to develop a Reliability Description Language (RDL) which will enable reliability analysts to describe complex reliability problems in a simple, clear and unambiguous way are described. Component and system features can be stated in a formal manner and subsequently used, along with control statements to form a structured program. The program can be compiled and executed on a general-purpose computer or special-purpose simulator. (DG)

  19. An analysis of simulated and observed storm characteristics

    Science.gov (United States)

    Benestad, R. E.

    2010-09-01

    A calculus-based cyclone identification (CCI) method has been applied to the most recent re-analysis (ERAINT) from the European Centre for Medium-range Weather Forecasts and results from regional climate model (RCM) simulations. The storm frequency for events with central pressure below threshold values of 960–990 hPa was examined, and the gradient wind from the simulated storm systems was compared with corresponding estimates from the re-analysis. The analysis also yielded estimates of the spatial extent of the storm systems, which were also included in the regional climate model cyclone evaluation. A comparison is presented between a number of RCMs and the ERAINT re-analysis in terms of their description of the gradient winds, number of cyclones, and spatial extent. Furthermore, a comparison between geostrophic winds estimated through triangles of interpolated or station measurements of SLP is presented. Wind still represents one of the more challenging variables to model realistically.
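
    The triangle-based geostrophic wind estimate mentioned above amounts to fitting a plane through three SLP observations and applying the geostrophic relation; a rough sketch in local Cartesian coordinates, with the air density and Coriolis parameter fixed at assumed mid-latitude values:

```python
def geostrophic_wind(stations, rho=1.25, f=1.2e-4):
    """Geostrophic wind (u, v) from sea-level pressure at three stations.
    Each station is (x, y, p) in metres and pascals; a plane
    p = p0 + (dp/dx) x + (dp/dy) y is fitted through the triangle."""
    (x1, y1, p1), (x2, y2, p2), (x3, y3, p3) = stations
    det = (x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)
    dpdx = ((p2 - p1) * (y3 - y1) - (p3 - p1) * (y2 - y1)) / det
    dpdy = ((p3 - p1) * (x2 - x1) - (p2 - p1) * (x3 - x1)) / det
    # Geostrophic balance: u = -(1/(rho*f)) dp/dy, v = (1/(rho*f)) dp/dx
    return -dpdy / (rho * f), dpdx / (rho * f)

# Pressure rising 1 hPa per 100 km toward the east -> southerly wind ~6.7 m/s
print(geostrophic_wind([(0.0, 0.0, 101000.0),
                        (100000.0, 0.0, 101100.0),
                        (0.0, 100000.0, 101000.0)]))
```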

  20. Development of BWR [boiling water reactor] and PWR [pressurized water reactor] event descriptions for nuclear facility simulator training

    International Nuclear Information System (INIS)

    Carter, R.J.; Bovell, C.R.

    1987-01-01

    A number of tools that can aid nuclear facility training developers in designing realistic simulator scenarios have been developed. This paper describes each of the tools, i.e., event lists, events-by-competencies matrices, and event descriptions, and illustrates how the tools can be used to construct scenarios

  1. Robotics/Automated Systems Task Analysis and Description of Required Job Competencies Report. Task Analysis and Description of Required Job Competencies of Robotics/Automated Systems Technicians.

    Science.gov (United States)

    Hull, Daniel M.; Lovett, James E.

    This task analysis report for the Robotics/Automated Systems Technician (RAST) curriculum project first provides a RAST job description. It then discusses the task analysis, including the identification of tasks, the grouping of tasks according to major areas of specialty, and the comparison of the competencies to existing or new courses to…

  2. Parity simulation for nuclear plant analysis

    International Nuclear Information System (INIS)

    Hansen, K.F.; Depiente, E.

    1986-01-01

    The analysis of the transient performance of nuclear plants is sufficiently complex that simulation tools are needed for design and safety studies. The simulation tools are normally digital because of the speed, flexibility, generality, and repeatability of digital computers. However, communication with digital computers is an awkward matter, requiring special skill or training. The designer wishing to gain insight into system behavior must expend considerable effort in learning to use computer codes, or else have an intermediary communicate with the machine. There has been a recent development in analog simulation that simplifies the user interface with the simulator, while at the same time improving the performance of analog computers. This development is termed parity simulation and is now in routine use in analyzing power electronic network transients. The authors describe the concept of parity simulation and present some results of using the approach to simulate neutron kinetics problems.

  3. AN ANALYSIS REGARDING DESCRIPTIVE DIMENSIONS OF BRAND EQUITY

    Directory of Open Access Journals (Sweden)

    Ioan MOISESCU

    2007-01-01

    The competitive potential of any company is significantly influenced by the brands held in the company’s portfolio. Brands are definitely valuable marketing assets. As the brand is a central element of any marketing strategy it is essential to be aware of the descriptive dimensions of its equity. This paper tries to outline these dimensions as follows: brand loyalty, brand awareness, brand perceived quality, brand personality, brand image, brand identity and brand associations, as analyzed in the specialized literature. Identifying and comparing different approaches regarding each brand equity dimension and revealing interdependencies between these dimensions, focusing on the importance of scientifically determining their role in generating a long-term increase in marketing efforts efficiency, are among the main objectives of this paper.

  4. Can Raters with Reduced Job Descriptive Information Provide Accurate Position Analysis Questionnaire (PAQ) Ratings?

    Science.gov (United States)

    Friedman, Lee; Harvey, Robert J.

    1986-01-01

    Job-naive raters provided with job descriptive information made Position Analysis Questionnaire (PAQ) ratings which were validated against ratings of job analysts who were also job content experts. None of the reduced job descriptive information conditions enabled job-naive raters to obtain either acceptable levels of convergent validity with…

  5. Holistic Nursing Simulation: A Concept Analysis.

    Science.gov (United States)

    Cohen, Bonni S; Boni, Rebecca

    2018-03-01

    Simulation as a technology and holistic nursing care as a philosophy are two components within nursing programs that have merged during the process of knowledge and skill acquisition in the care of the patients as whole beings. Simulation provides opportunities to apply knowledge and skill through the use of simulators, standardized patients, and virtual settings. Concerns with simulation have been raised regarding the integration of the nursing process and recognizing the totality of the human being. Though simulation is useful as a technology, the nursing profession places importance on patient care, drawing on knowledge, theories, and expertise to administer patient care. There is a need to promptly and comprehensively define the concept of holistic nursing simulation to provide consistency and a basis for quality application within nursing curricula. This concept analysis uses Walker and Avant's approach to define holistic nursing simulation by defining antecedents, consequences, and empirical referents. The concept of holism and the practice of holistic nursing incorporated into simulation require an analysis of the concept of holistic nursing simulation by developing a language and model to provide direction for educators in design and development of holistic nursing simulation.

  6. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Virotta, Francesco

    2012-01-01

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation where we find that our estimate of the exponential auto-correlation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law but find results compatible with the behavior that we find in the pure gauge theory. The discussion is supported by a large set of ensembles both in pure gauge and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes in the error analysis of the expectation value of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is supposed to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10) τ_exp. This is the typical case when simulating close to the continuum limit where the computational costs for producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the Kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
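
    The integrated autocorrelation time central to such an error analysis can be estimated in a few lines; a simplified sketch that truncates the sum at the first non-positive autocorrelation, rather than the automatic windowing used in production analyses:

```python
import random

def autocorrelation(series, max_lag):
    """Normalized autocorrelation function rho(t) of a 1-D series."""
    n = len(series)
    mean = sum(series) / n
    var = sum((s - mean) ** 2 for s in series) / n
    return [sum((series[i] - mean) * (series[i + t] - mean)
                for i in range(n - t)) / ((n - t) * var)
            for t in range(max_lag + 1)]

def tau_int(series, max_lag=60):
    """Integrated autocorrelation time tau = 0.5 + sum_t rho(t),
    truncating the sum at the first non-positive rho (a crude window)."""
    tau = 0.5
    for r in autocorrelation(series, max_lag)[1:]:
        if r <= 0.0:
            break
        tau += r
    return tau

# Demo on a synthetic Monte Carlo history: an AR(1) chain with a = 0.8,
# whose exact integrated autocorrelation time is 0.5 + a/(1 - a) = 4.5.
rng = random.Random(3)
a, x = 0.8, [0.0]
for _ in range(20000):
    x.append(a * x[-1] + rng.gauss(0.0, 1.0))
t_est = tau_int(x)
print(t_est)
```

    The statistical error of a mean over such a chain is then inflated by a factor of roughly sqrt(2 * tau_int) relative to the naive independent-sample estimate.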

  7. Tolerance analysis through computational imaging simulations

    Science.gov (United States)

    Birch, Gabriel C.; LaCasse, Charles F.; Stubbs, Jaclynn J.; Dagel, Amber L.; Bradley, Jon

    2017-11-01

    The modeling and simulation of non-traditional imaging systems require holistic consideration of the end-to-end system. We demonstrate this approach through a tolerance analysis of a random scattering lensless imaging system.

  8. Thermodynamic description of polymorphism in Q- and N-rich peptide aggregates revealed by atomistic simulation.

    Science.gov (United States)

    Berryman, Joshua T; Radford, Sheena E; Harris, Sarah A

    2009-07-08

    Amyloid fibrils are long, helically symmetric protein aggregates that can display substantial variation (polymorphism), including alterations in twist and structure at the beta-strand and protofilament levels, even when grown under the same experimental conditions. The structural and thermodynamic origins of this behavior are not yet understood. We performed molecular-dynamics simulations to determine the thermodynamic properties of different polymorphs of the peptide GNNQQNY, modeling fibrils containing different numbers of protofilaments based on the structure of amyloid-like cross-beta crystals of this peptide. We also modeled fibrils with new orientations of the side chains, as well as a de novo designed structure based on antiparallel beta-strands. The simulations show that these polymorphs are approximately isoenergetic under a range of conditions. Structural analysis reveals a dynamic reorganization of electrostatics and hydrogen bonding in the main and side chains of the Gln and Asn residues that characterize this peptide sequence. Q/N-rich stretches are found in several amyloidogenic proteins and peptides, including the yeast prions Sup35-N and Ure2p, as well as in the human poly-Q disease proteins, including the ataxins and huntingtin. Based on our results, we propose that these residues imbue a unique structural plasticity to the amyloid fibrils that they comprise, rationalizing the ability of proteins enriched in these amino acids to form prion strains with heritable and different phenotypic traits.

  9. A Qualitative Descriptive Analysis of Collaboration Technology in the Navy

    Directory of Open Access Journals (Sweden)

    Ryan Wark

    2015-10-01

    Collaboration technologies enable people to communicate and use information to make organizational decisions. The United States Navy refers to this concept as information dominance. Various collaboration technologies are used by the Navy to achieve this mission. This qualitative descriptive study objectively examined how a matrix-oriented Navy activity perceived an implemented collaboration technology. These insights were used to determine whether a specific collaboration technology achieved a mission of information dominance. The study used six collaboration themes as a foundation to include: (a) Cultural intelligence, (b) Communication, (c) Capability, (d) Coordination, (e) Cooperation, and (f) Convergence. It was concluded that collaboration technology was mostly perceived well and helped to achieve some levels of information dominance. Collaboration technology improvement areas included bringing greater awareness to the collaboration technology, revamping the look and feel of the user interface, centrally paying for user and storage fees, incorporating more process management tools, strategically considering a Continuity of Operations, and incorporating additional industry best practices for data structures. Emerging themes of collaboration were collected to examine common patterns identified in the collected data. Emerging themes included acceptance, awareness, search, scope, content, value, tools, system performance, implementation, training, support, usage, structure, complexity, approach, governance/configuration management/policy, and resourcing.

  10. Detector Simulation: Data Treatment and Analysis Methods

    CERN Document Server

    Apostolakis, J

    2011-01-01

    Detector Simulation in 'Data Treatment and Analysis Methods', part of 'Landolt-Börnstein - Group I Elementary Particles, Nuclei and Atoms: Numerical Data and Functional Relationships in Science and Technology, Volume 21B1: Detectors for Particles and Radiation. Part 1: Principles and Methods'. This document is part of Part 1 'Principles and Methods' of Subvolume B 'Detectors for Particles and Radiation' of Volume 21 'Elementary Particles' of Landolt-Börnstein - Group I 'Elementary Particles, Nuclei and Atoms'. It contains the Section '4.1 Detector Simulation' of Chapter '4 Data Treatment and Analysis Methods' with the content: 4.1 Detector Simulation 4.1.1 Overview of simulation 4.1.1.1 Uses of detector simulation 4.1.2 Stages and types of simulation 4.1.2.1 Tools for event generation and detector simulation 4.1.2.2 Level of simulation and computation time 4.1.2.3 Radiation effects and background studies 4.1.3 Components of detector simulation 4.1.3.1 Geometry modeling 4.1.3.2 External fields 4.1.3.3 Intro...

  11. Nuclear reactor descriptions for space power systems analysis

    Science.gov (United States)

    Mccauley, E. W.; Brown, N. J.

    1972-01-01

    For the small, high-performance reactors required for space electric applications, adequate neutronic analysis is of crucial importance, but in terms of computational time consumed, nuclear calculations probably yield the least amount of detail for mission analysis study. It has been found possible, after generating only a few designs of a reactor family in elaborate thermomechanical and nuclear detail, to use simple curve-fitting techniques to assure desired neutronic performance while still performing the thermomechanical analysis in explicit detail. The resulting speed-up in computation time permits a broad, detailed examination of constraints by the mission analyst.
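The curve-fitting shortcut described here can be sketched in a few lines. The design points below are purely illustrative (not from the paper), and numpy's `polyfit` stands in for whichever fitting technique the authors used:

```python
import numpy as np

# Hypothetical "expensive" design points: reactor thermal power (MWt) vs.
# the core mass (kg) obtained from a full thermomechanical/neutronic analysis.
power = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
core_mass = np.array([120.0, 150.0, 200.0, 280.0, 420.0])

# Fit a low-order polynomial surrogate through the detailed results.
surrogate = np.poly1d(np.polyfit(power, core_mass, deg=2))

# Intermediate designs can now be screened almost instantly instead of
# rerunning the full neutronic calculation for each candidate.
mass_at_3mwt = surrogate(3.0)
```

Once the surrogate is fitted, each additional design evaluation costs microseconds rather than a full neutronics run, which is the speed-up the abstract refers to.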

  12. Descriptive Sensory Analysis of Omashikwa, a traditional fermented ...

    African Journals Online (AJOL)

    UNAM002

    2014-04-02

    Apr 2, 2014 ... treatment of milk prior to fermentation and Hazard Analysis Critical ... spoilage microorganisms, reduce lactose intolerance to sensitive .... to good manufacturing practices on unit operations, particularly heat treatment on k-.

  13. Subject description form of crime prevention (morphological analysis)

    OpenAIRE

    Валерій Федорович Оболенцев

    2016-01-01

    National crime prevention activity is a system object. Therefore, it should be improved on the basis of systems analysis techniques. The practice of a systematic approach was realized in the works of N. F. Kuznetsova, A. I. Dolgova, D. O. Li, V. M. Dryomin, O. Y. Manokha and O. G. Frolova. Crime models were developed by C. Y. Vitsin, Y. D. Bluvshteyn and N. V. Smetanina. We previously disclosed the basic principles of a systems analysis of the crime prevention system and its genetic and prognostic aspec...

  14. Students’ views on the block evaluation process: A descriptive analysis

    Directory of Open Access Journals (Sweden)

    Ntefeleng E. Pakkies

    2016-03-01

    Full Text Available Background: Higher education institutions have executed policies and practices intended to determine and promote good teaching. Students’ evaluation of the teaching and learning process is seen as one measure of evaluating the quality and effectiveness of instruction and courses. Policies and procedures guiding this process are discernible in universities, but this is often not the case for nursing colleges. Objective: To analyse and describe the views of nursing students on block evaluation, and how feedback obtained from this process was managed. Method: A quantitative descriptive study was conducted amongst nursing students (n = 177) in their second to fourth year of training from one nursing college in KwaZulu-Natal. A questionnaire was administered by the researcher and data were analysed using the Statistical Package for the Social Sciences Version 19.0. Results: The response rate was 145 (81.9%). The participants perceived the aim of block evaluation as improving the quality of teaching and enhancing their experiences as students. They questioned the significance of their input as stakeholders, given that they had never been consulted about the development or review of the evaluation tool or the administration process, and they often did not receive feedback from the evaluations they participated in. Conclusion: The college management should develop a clear organisational structure with supporting policies and operational guidelines for administering the evaluation process. The administration, implementation procedures, reporting of results and follow-up mechanisms should be made transparent and communicated to all concerned. Reports and actions related to these evaluations should provide feedback into relevant courses or programmes. Keywords: Student evaluation of teaching; perceptions; undergraduate nursing students; evaluation process

  15. SIMONE: Tool for Data Analysis and Simulation

    International Nuclear Information System (INIS)

    Chudoba, V.; Hnatio, B.; Sharov, P.; Papka, Paul

    2013-06-01

    SIMONE is a software tool based on the ROOT Data Analysis Framework, developed in a collaboration between FLNR JINR and iThemba LABS. It is intended for physicists planning experiments and analysing experimental data. The goal of the SIMONE framework is to provide a flexible, user-friendly, efficient and well-documented system for the simulation of a wide range of nuclear physics experiments. The most significant conditions and physical processes can be taken into account during simulation of the experiment. The user can create his own experimental setup through access to predefined detector geometries. Simulated data are made available in the same format as for the real experiment, allowing identical analysis of both experimental and simulated data. A significant reduction in time is expected during experiment planning and data analysis. (authors)

  16. Multiple predictor smoothing methods for sensitivity analysis: Description of techniques

    International Nuclear Information System (INIS)

    Storlie, Curtis B.; Helton, Jon C.

    2008-01-01

    The use of multiple predictor smoothing methods in sampling-based sensitivity analyses of complex models is investigated. Specifically, sensitivity analysis procedures based on smoothing methods employing the stepwise application of the following nonparametric regression techniques are described: (i) locally weighted regression (LOESS), (ii) additive models, (iii) projection pursuit regression, and (iv) recursive partitioning regression. Then, in the second and concluding part of this presentation, the indicated procedures are illustrated with both simple test problems and results from a performance assessment for a radioactive waste disposal facility (i.e., the Waste Isolation Pilot Plant). As shown by the example illustrations, the use of smoothing procedures based on nonparametric regression techniques can yield more informative sensitivity analysis results than can be obtained with more traditional sensitivity analysis procedures based on linear regression, rank regression or quadratic regression when nonlinear relationships between model inputs and model predictions are present
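As a toy illustration of the first of the techniques listed, here is a minimal locally weighted regression (LOESS-style) smoother in numpy. The data are synthetic, and the smoother omits the robustness iterations of full LOESS; it is a sketch of the idea, not the procedure used in the paper:

```python
import numpy as np

def loess_point(x, y, x0, frac=0.2):
    """Locally weighted linear fit evaluated at x0 (tricube weights)."""
    k = max(2, int(np.ceil(frac * len(x))))    # neighbourhood size
    d = np.abs(x - x0)
    idx = np.argsort(d)[:k]                    # k nearest neighbours of x0
    w = (1.0 - (d[idx] / d[idx].max()) ** 3) ** 3   # tricube weights
    sw = np.sqrt(w)                            # weighted least squares via scaling
    A = np.column_stack([np.ones(k), x[idx]]) * sw[:, None]
    beta, *_ = np.linalg.lstsq(A, y[idx] * sw, rcond=None)
    return beta[0] + beta[1] * x0

rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 1.0, 200))
y = np.sin(2 * np.pi * x) + rng.normal(0.0, 0.1, 200)   # nonlinear response
smooth = np.array([loess_point(x, y, xi) for xi in x])
```

Plotting `smooth` against `x` recovers the sinusoidal input-output relationship that a linear or rank regression would largely miss, which is the point the abstract makes about nonlinear relationships.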

  17. A description of the BNL active surface analysis facility

    International Nuclear Information System (INIS)

    Tyler, J.W.

    1989-11-01

    Berkeley Nuclear Laboratories has a responsibility for the assessment of radioactive specimens arising both from post-irradiation examination of power reactor components and structures and from experimental programmes concerned with fission and activation product transport. Existing analytical facilities have been extended with the commissioning of an active surface analysis instrument (XSAM 800pci, Kratos Analytical). Surface analysis involves the characterisation of the outer few atomic layers of a solid surface/interface, whose chemical composition and electronic structure will probably be different from the bulk. The new instrument consists of three interconnected chambers positioned in series: a high-vacuum sample introduction chamber, an ultra-high-vacuum sample treatment/fracture chamber and an ultra-high-vacuum sample analysis chamber. The sample analysis chamber contains the electron, X-ray and ion guns and the electron and ion detectors necessary for performing X-ray photoelectron spectroscopy, scanning Auger microscopy and secondary-ion mass spectroscopy. The chamber also contains a high-stability manipulator to enable sub-micron imaging of specimens to be achieved and to provide sample heating and cooling between -180 and 600 °C. (author)

  18. Bridging the gap: linking molecular simulations and systemic descriptions of cellular compartments.

    Directory of Open Access Journals (Sweden)

    Tihamér Geyer

    Full Text Available Metabolic processes in biological cells are commonly either characterized at the level of individual enzymes and metabolites or at the network level. Often these two paradigms are considered as mutually exclusive because concepts from neither side are suited to describe the complete range of scales. Additionally, when modeling metabolic or regulatory cellular systems, often a large fraction of the required kinetic parameters are unknown. This even applies to such simple and extensively studied systems like the photosynthetic apparatus of purple bacteria. Using the chromatophore vesicles of Rhodobacter sphaeroides as a model system, we show that a consistent kinetic model emerges when fitting the dynamics of a molecular stochastic simulation to a set of time dependent experiments even though about two thirds of the kinetic parameters in this system are not known from experiment. Those kinetic parameters that were previously known all came out in the expected range. The simulation model was built from independent protein units composed of elementary reactions processing single metabolites. This pools-and-proteins approach naturally compiles the wealth of available molecular biological data into a systemic model and can easily be extended to describe other systems by adding new protein or nucleic acid types. The automated parameter optimization, performed with an evolutionary algorithm, reveals the sensitivity of the model to the value of each parameter and the relative importances of the experiments used. Such an analysis identifies the crucial system parameters and guides the setup of new experiments that would add most knowledge for a systemic understanding of cellular compartments. The successful combination of the molecular model and the systemic parametrization presented here on the example of the simple machinery for bacterial photosynthesis shows that it is actually possible to combine molecular and systemic modeling. 
This framework can now
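A heavily simplified sketch of the parameter-fitting idea: an evolutionary algorithm adjusts the unknown rate constants of a toy kinetic chain (A -> B -> C) until the model reproduces "measured" time series. Everything here (the model, the rates, the noise level, the names) is invented for illustration and is far simpler than the chromatophore model described above:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(k1, k2, t):
    """Analytic concentrations for the chain A -> B -> C (A(0)=1, B(0)=0)."""
    a = np.exp(-k1 * t)
    b = k1 / (k2 - k1) * (np.exp(-k1 * t) - np.exp(-k2 * t))
    return a, b

# Synthetic "experimental" data generated with hidden rate constants plus noise
t = np.linspace(0.0, 5.0, 30)
a_true, b_true = simulate(0.8, 2.0, t)
a_obs = a_true + rng.normal(0.0, 0.01, t.size)
b_obs = b_true + rng.normal(0.0, 0.01, t.size)

def misfit(params):
    """Sum-of-squares distance between model output and the observations."""
    a, b = simulate(params[0], params[1], t)
    return float(np.sum((a - a_obs) ** 2 + (b - b_obs) ** 2))

# (1+20) evolution strategy: mutate the current best candidate, keep the fittest
best = np.array([0.1, 0.2])
for _ in range(300):
    children = best + rng.normal(0.0, 0.05, (20, 2))
    children = np.clip(children, 1e-3, 10.0)      # keep rate constants positive
    best = min([best, *children], key=misfit)

k1_fit, k2_fit = best
```

The recovered `k1_fit` and `k2_fit` land near the hidden values 0.8 and 2.0; in the same spirit, the sensitivity of `misfit` to each parameter indicates which rate constants the data actually constrain.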

  19. User's guide of DETRAS system-3. Description of the simulated reactor plant

    International Nuclear Information System (INIS)

    Yamaguchi, Yukichi

    2006-12-01

    The DETRAS system is a PWR reactor simulator for operation training; its distinguishing feature is that it can be operated remotely from the simulator site. This document, the third of the three volumes of the DETRAS user's guide, first describes an outline of the simulated reactor system, then the user interface needed to operate the simulator, and finally the procedures for starting up the simulated reactor and shutting it down from its rated operation state. (author)

  20. Arabian Oryx (Oryx leucoryx) Trachea: a Descriptive and Morphometric Analysis

    OpenAIRE

    Al-Zhgoul, Mohammad Borhan; Dalab, Abdul Hafeed S; Abdulhakeem, ElJarah; Ismail, Zuhair Bani; Thanain, Al Thanain

    2013-01-01

    The objective of this study was to report the distinctive anatomical and histological features of the trachea of the Arabian Oryx (Oryx leucoryx). The number of tracheal rings and tracheal length were measured. The diameter, thickness and cross sectional area of tracheal ring were determined at four tracheal regions (cranial cervical (CCR), middle cervical, thoracic inlets and intra thoracic). Tracheal rings were also collected for histological analysis. The mean length of the trachea was 54....

  1. Hanford Site Composite Analysis Technical Approach Description: Integrated Computational Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Smith, K. J. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  2. Hanford Site Composite Analysis Technical Approach Description: Waste Form Release.

    Energy Technology Data Exchange (ETDEWEB)

    Hardie, S. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Paris, B. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Apted, M. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-09-14

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions or assessment needs, if potential problems are identified.

  3. Subject description form of crime prevention (morphological analysis

    Directory of Open Access Journals (Sweden)

    Валерій Федорович Оболенцев

    2016-06-01

    Full Text Available National crime prevention activity is a system object; therefore, it should be improved on the basis of systems analysis techniques. The practice of a systematic approach was realized in the works of N. F. Kuznetsova, A. I. Dolgova, D. O. Li, V. M. Dryomin, O. Y. Manokha and O. G. Frolova. Crime models were developed by C. Y. Vitsin, Y. D. Bluvshteyn and N. V. Smetanina. We previously disclosed the basic principles of a systems analysis of the crime prevention system and its genetic and prognostic aspects, classification features, systemic factors of the latency of criminogenic factors, the object of protective activity, and the scope of protected public relations. In order to investigate the systemic properties of the crime prevention system in Ukraine, we defined the objective of the study as its morphological analysis. The elements of the specialized crime prevention system are the prosecution, the Interior, the Security Service, the Military Service of the Armed Forces of Ukraine, the National Anti-Corruption Bureau of Ukraine, the state border protection bodies, the agencies for revenues and fees, enforcement and penal institutions, remand centres, public financial control, fisheries, and the state forest protection. We carried out an in-depth analysis of the system's functions at the level of law enforcement agencies. The internal communication of the crime prevention system consists of information links between the elements of the system (transfer of information on crimes and criminals, current activity). The external relations of the system provide processes of interaction with the environment; for the crime prevention system, these are the relations between its elements (law enforcement and society). The crime prevention system implements the following coordination links: (1) departmental coordination meetings of law enforcement agencies; (2) inter-agency coordination meetings of law enforcement agencies (the Prosecutor General of Ukraine, the State Border Service of Ukraine and others);
(3) mutual exchange of information; (4) orders of the prosecution and SBU to other agencies

  4. Hanford Site Composite Analysis Technical Approach Description: Atmospheric Transport Modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Sun, B. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Lehman, L. L. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-10-02

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions or assessment needs, if potential problems are identified.

  5. Stochastic simulation algorithms and analysis

    CERN Document Server

    Asmussen, Soren

    2007-01-01

    Sampling-based computational methods have become a fundamental part of the numerical toolset of practitioners and researchers across an enormous number of applied domains and academic disciplines. This book provides a broad treatment of such sampling-based methods, as well as accompanying mathematical analysis of the convergence properties of the methods discussed. The reach of the ideas is illustrated by discussing a wide range of applications and the models that have found wide usage. The first half of the book focuses on general methods, whereas the second half discusses model-specific algorithms. Given the wide range of examples, exercises and applications, students, practitioners and researchers in probability, statistics, operations research, economics, finance and engineering, as well as biology, chemistry and physics, will find the book of value.

  6. ROSA-IV Large Scale Test Facility (LSTF) system description for second simulated fuel assembly

    International Nuclear Information System (INIS)

    1990-10-01

    The ROSA-IV Program's Large Scale Test Facility (LSTF) is a test facility for integral simulation of thermal-hydraulic response of a pressurized water reactor (PWR) during small break loss-of-coolant accidents (LOCAs) and transients. In this facility, the PWR core nuclear fuel rods are simulated using electric heater rods. The simulated fuel assembly which was installed during the facility construction was replaced with a new one in 1988. The first test with this second simulated fuel assembly was conducted in December 1988. This report describes the facility configuration and characteristics as of this date (December 1988) including the new simulated fuel assembly design and the facility changes which were made during the testing with the first assembly as well as during the renewal of the simulated fuel assembly. (author)

  7. Simulation data analysis by virtual reality system

    International Nuclear Information System (INIS)

    Ohtani, Hiroaki; Mizuguchi, Naoki; Shoji, Mamoru; Ishiguro, Seiji; Ohno, Nobuaki

    2010-01-01

    We introduce new software for the analysis of time-varying simulation data and a new approach for linking simulation to experiment using virtual reality (VR) technology. In the new software, objects of a time-varying field are visualized in VR space, and particle trajectories in the time-varying electromagnetic field are also traced. In the new approach, both simulation results and experimental device data are visualized simultaneously in VR space. These developments enhance the study of phenomena in plasma physics and fusion plasmas. (author)

  8. Hanford Site Composite Analysis Technical Approach Description: Vadose Zone

    Energy Technology Data Exchange (ETDEWEB)

    Williams, M. D. [INTERA Inc., Austin, TX (United States); Nichols, W. E. [CH2M, Richland, WA (United States); Ali, A. [INTERA Inc., Austin, TX (United States); Allena, P. [INTERA Inc., Austin, TX (United States); Teague, G. [INTERA Inc., Austin, TX (United States); Hammond, T. B. [INTERA Inc., Austin, TX (United States)

    2017-10-30

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, and DOE M 435.1 Chg 1, Radioactive Waste Management Manual, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs, if potential problems are identified.

  9. Thermodynamic approach to rheological modeling and simulations at the configuration space level of description

    NARCIS (Netherlands)

    Jongschaap, R.J.J.; Denneman, A.I.M.; Denneman, A.I.M.; Conrads, W.

    1997-01-01

    The so-called matrix model is a general thermodynamic framework for microrheological modeling. This model has already been proven to be applicable for a wide class of systems, in particular to models formulated at the configuration tensor level of description. For models formulated at the

  10. The PHMC algorithm for simulations of dynamical fermions; 1, description and properties

    CERN Document Server

    Frezzotti, R

    1999-01-01

    We give a detailed description of the so-called Polynomial Hybrid Monte Carlo (PHMC) algorithm. The effects of the correction factor, which is introduced to render the algorithm exact, are discussed, stressing their relevance for the statistical fluctuations and (almost) zero mode contributions to physical observables. We also investigate rounding-error effects and propose several ways to reduce memory requirements.

  11. The Balanced Scorecard in Portuguese Private Organizations: A Descriptive Analysis

    Directory of Open Access Journals (Sweden)

    Patrícia Rodrigues Quesado

    2012-06-01

    Full Text Available The understanding that the existing models applied to business performance evaluation were obsolete and could induce companies to take erroneous decisions led to the development of information and management control systems that reflect the evolution of non-financial or qualitative key success factors, such as the Balanced Scorecard (BSC). Nowadays, the BSC is one of the most popular management accounting tools. However, reports of high rates of failure require a closer analysis of the challenges and key issues with which various private organizations were confronted in its implementation. Consequently, more than two decades after the creation of the model, and despite the introduction of some improvements, investment in BSC projects within the corporate and academic environments can still be confirmed. Thus, in order to find out whether Portuguese organizations know and implement the BSC, we sent a questionnaire to 549 private organizations, obtaining a response rate of 28.2%. The results suggest that although the majority of respondents know the BSC, its implementation in these organizations is limited and very recent.

  12. Leadership communication styles: a descriptive analysis of health care professionals

    Directory of Open Access Journals (Sweden)

    Rogers R

    2012-06-01

    Full Text Available Rebekah Rogers, School of Communication, East Carolina University, NC, USA. Abstract: The study of leadership in health care is important to examine for many reasons. Health care leaders will inevitably have an impact on the lives of many people, as individuals rely on physicians and nurses during some of the most critical moments in their lives. This paper presents a broad overview of a research study conducted over the past year and highlights its general conclusions. In this study, I examined the leadership styles of health care administrators and those of physicians and nurses who chair departments. Thorough analysis yielded three clear themes: viewpoints on leadership, decision making, and relationships. Physicians' viewpoints on leadership varied; however, it was assumed that they knew they were leaders. Nurses seemed to be in a category of their own, in which it was common for them to use the term “servant leadership.” Results from the hospital administrators suggested that they were always thinking “big picture leadership.” Leadership is a working component of every job and it is important for people to become as educated as possible about their own communication style. Keywords: leadership, communication, health care

  13. Seismic analysis of structures by simulation

    International Nuclear Information System (INIS)

    Sundararajan, C.; Gangadharan, A.C.

    1977-01-01

    The paper presents a state-of-the-art survey, and recommendations for future work in the area of stochastic seismic analysis by Monte Carlo simulation. First the Monte Carlo simulation procedure is described, with special emphasis on a 'unified approach' for the digital generation of artificial earthquake motions. Next, the advantages and disadvantages of the method over the power spectral method are discussed; and finally, an efficient 'Hybrid Monte Carlo-Power Spectral Method' is developed. The Monte Carlo simulation procedure consists of the following tasks: (1) Digital generation of artificial earthquake motions, (2) Response analysis of the structure to a number of sample motions, and (3) statistical analysis of the structural responses
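The three tasks can be illustrated with a deliberately small example: enveloped white noise stands in for the artificial accelerograms, and a linear single-degree-of-freedom oscillator stands in for the structure. All parameters and function names are invented for illustration, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)
dt, n_steps, n_samples = 0.01, 2000, 50           # 20-s records, 50 sample motions
omega, zeta = 2.0 * np.pi * 2.0, 0.05             # 2 Hz oscillator, 5% damping

def artificial_motion():
    """Task 1: enveloped white noise as a crude artificial accelerogram."""
    t = np.arange(n_steps) * dt
    envelope = (t / 4.0) * np.exp(1.0 - t / 4.0)  # builds up, then decays
    return envelope * rng.normal(0.0, 1.0, n_steps)

def peak_response(ag):
    """Task 2: peak displacement of the SDOF oscillator (explicit time stepping)."""
    u = np.zeros(n_steps)
    v = 0.0
    for i in range(n_steps - 1):
        acc = -ag[i] - 2.0 * zeta * omega * v - omega ** 2 * u[i]
        v += acc * dt
        u[i + 1] = u[i] + v * dt
    return np.abs(u).max()

# Task 3: statistics of the peak response over the sample of motions
peaks = np.array([peak_response(artificial_motion()) for _ in range(n_samples)])
mean_peak, std_peak = peaks.mean(), peaks.std()
```

The spread of `peaks` is what the statistical analysis step quantifies; a real study would use physically calibrated motions and a full structural model in place of these stand-ins.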

  14. Seismic analysis of structures by simulation

    International Nuclear Information System (INIS)

    Sundararajan, C.; Gangadharan, A.C.

    1977-01-01

    The paper presents a state-of-the-art survey, and recommendations for future work, in the area of stochastic seismic analysis by Monte Carlo simulation. First the Monte Carlo simulation procedure is described, with special emphasis on a 'unified approach' for the digital generation of artificial earthquake motions. Next, the advantages and disadvantages of the method over the power spectral method are discussed; and finally, an efficient 'Hybrid Monte Carlo-Power Spectral Method' is developed. The Monte Carlo simulation procedure consists of the following tasks: (1) digital generation of artificial earthquake motions, (2) response analysis of the structure to a number of sample motions, and (3) statistical analysis of the structural responses. (Auth.)

  15. Regional hydrogeological simulations for Forsmark - numerical modelling using DarcyTools. Preliminary site description Forsmark area version 1.2

    International Nuclear Information System (INIS)

    Follin, Sven; Stigsson, Martin; Svensson, Urban

    2005-12-01

    A numerical model is developed on a regional-scale (hundreds of square kilometres) to study the zone of influence for variable-density groundwater flow that affects the Forsmark area. Transport calculations are performed by particle tracking from a local-scale release area (a few square kilometres) to test the sensitivity to different hydrogeological uncertainties and the need for far-field realism. The main objectives of the regional flow modelling were to achieve the following: I. Palaeo-hydrogeological understanding: An improved understanding of the palaeohydrogeological conditions is necessary in order to gain credibility for the site descriptive model in general and the hydrogeological description in particular. This requires modelling of the groundwater flow from the last glaciation up to present-day with comparisons against measured TDS and other hydro-geochemical measures. II. Simulation of flow paths: The simulation and visualisation of flow paths from a tentative repository area is a means for describing the role of the current understanding of the modelled hydrogeological conditions in the target volume, i.e. the conditions of primary interest for Safety Assessment. Of particular interest here is demonstration of the need for detailed far-field realism in the numerical simulations. The motivation for a particular model size (and resolution) and set of boundary conditions for a realistic description of the recharge and discharge connected to the flow at repository depth is an essential part of the groundwater flow path simulations. The numerical modelling was performed by two separate modelling teams, the ConnectFlow Team and the DarcyTools Team. The work presented in this report was based on the computer code DarcyTools developed by Computer-aided Fluid Engineering. DarcyTools is a kind of equivalent porous media (EPM) flow code specifically designed to treat flow and salt transport in sparsely fractured crystalline rock intersected by transmissive

  16. Regional hydrogeological simulations for Forsmark - numerical modelling using DarcyTools. Preliminary site description Forsmark area version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2005-12-15

    A numerical model is developed on a regional-scale (hundreds of square kilometres) to study the zone of influence for variable-density groundwater flow that affects the Forsmark area. Transport calculations are performed by particle tracking from a local-scale release area (a few square kilometres) to test the sensitivity to different hydrogeological uncertainties and the need for far-field realism. The main objectives of the regional flow modelling were to achieve the following: I. Palaeo-hydrogeological understanding: An improved understanding of the palaeohydrogeological conditions is necessary in order to gain credibility for the site descriptive model in general and the hydrogeological description in particular. This requires modelling of the groundwater flow from the last glaciation up to present-day with comparisons against measured TDS and other hydro-geochemical measures. II. Simulation of flow paths: The simulation and visualisation of flow paths from a tentative repository area is a means for describing the role of the current understanding of the modelled hydrogeological conditions in the target volume, i.e. the conditions of primary interest for Safety Assessment. Of particular interest here is demonstration of the need for detailed far-field realism in the numerical simulations. The motivation for a particular model size (and resolution) and set of boundary conditions for a realistic description of the recharge and discharge connected to the flow at repository depth is an essential part of the groundwater flow path simulations. The numerical modelling was performed by two separate modelling teams, the ConnectFlow Team and the DarcyTools Team. The work presented in this report was based on the computer code DarcyTools developed by Computer-aided Fluid Engineering. DarcyTools is a kind of equivalent porous media (EPM) flow code specifically designed to treat flow and salt transport in sparsely fractured crystalline rock intersected by transmissive

  17. Concentration in the Greek private hospital sector: a descriptive analysis.

    Science.gov (United States)

    Boutsioli, Zoe

    2007-07-01

    Over the last 20 years, governments all around the world have attempted to boost the role of market and competition in health care industries in order to increase efficiency and reduce costs. The increased competition and the significant implications for the costs and prices of health care services resulted in health care industries being transformed. Large firms are merging with and acquiring other firms. If this trend continues, few firms will dominate the health care markets. In this study, I use the simple concentration ratio (CR) for the largest 4, 8 and 20 companies to measure the concentration of Greek private hospitals during the period 1997-2004. The Gini coefficient is also used to measure inequality. For the two categories of hospitals examined, (a) general and neuropsychiatric and (b) obstetric/gynaecological, the top four firms of the first category accounted for 43% of sales in 1997 and 52% in 2004, while the four largest firms of the second category accounted for almost 83% in 1997 and 81% in 2004. The Gini coefficient also increases over the 8-year period examined, from 0.69 in 1997 to 0.82 in 2004, meaning that the market for private health care services has become less equal: fewer private hospitals and clinics hold more and more of the total sales. A cross-industry analysis makes clear that the private hospital sector has the highest concentration rate. Finally, it appears that the market structure of private hospitals in Greece more closely resembles an oligopoly than monopolistic competition, since very few firms dominate the market.
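Both measures used in the study are easy to state precisely. A sketch with invented sales figures (the paper's data are not reproduced here):

```python
import numpy as np

# Hypothetical annual sales of ten private hospitals (monetary units)
sales = np.array([260.0, 180.0, 120.0, 90.0, 60.0, 40.0, 30.0, 20.0, 10.0, 5.0])

def concentration_ratio(s, k):
    """CR_k: share of total sales held by the k largest firms."""
    top = np.sort(s)[::-1][:k]
    return top.sum() / s.sum()

def gini(s):
    """Gini coefficient: 0 = perfectly equal shares, -> 1 = one firm holds all."""
    s = np.sort(s)                         # ascending
    n = len(s)
    lorenz = np.cumsum(s) / s.sum()        # cumulative share of total sales
    return (n + 1 - 2.0 * lorenz.sum()) / n

cr4 = concentration_ratio(sales, 4)        # analogous to the study's CR4
g = gini(sales)
```

With these figures CR4 is about 0.80 (the four largest firms hold roughly four fifths of sales), and a rising Gini over time corresponds to the "less equal" market the abstract describes.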

  18. Description and analysis of hospital pharmacies in Madagascar.

    Science.gov (United States)

    Ratsimbazafimahefa, H R; Sadeghipour, F; Trouiller, P; Pannatier, A; Allenet, B

    2018-05-01

    Madagascar's health care system has operated without formal hospital pharmacies for more than two decades. The gradual integration of pharmacists in public hospitals since 2012 will allow the structuring of this field. This study was conducted to characterize the current situation regarding all aspects relating to the general functioning of hospital pharmacies and the services provided. This qualitative research used semi-structured interviews. Interviewees' perceptions about the general organization and functioning of hospital pharmacies and details on services provided were collected. The 16 interviewees were Ministry of Health staff members involved in hospital pharmacy, hospital directors, medical staff members and hospital pharmacy managers. Interviews were recorded, translated into French if conducted in Malagasy, and fully transcribed. Verbatim transcripts were coded according to the themes of hospital pharmacy and topical content analysis was performed. The principal issue perceived by interviewees was the heterogeneity of the system in terms of technical and financing management, with a main impact on the restocking of pharmaceutical products. The drug supply chain is not under control: no internal procedure has been established for the selection of pharmaceutical products, the quantification of needs is complex, stock management is difficult to supervise, a standard prescription protocol is lacking, dispensing is performed by unqualified staff, no pharmaceutical preparation is manufactured in the hospitals and administration occurs without pharmaceutical support. Progressive structuring of efficient hospital pharmacy services using the Basel statements for the future of hospital pharmacy is urgently needed to improve health care in Madagascar. Copyright © 2017 Académie Nationale de Pharmacie. Published by Elsevier Masson SAS. All rights reserved.

  19. Credibility assessment in child sexual abuse investigations: A descriptive analysis.

    Science.gov (United States)

    Melkman, Eran P; Hershkowitz, Irit; Zur, Ronit

    2017-05-01

    A major challenge in cases of child sexual abuse (CSA) is determining the credibility of children's reports. Consequently, cases may be misclassified as false or deemed 'no judgment possible'. Based on a large national sample of reports of CSA made in Israel in 2014, the study examines child and event characteristics contributing to the probability that reports of abuse would be judged credible. National data files of all children aged 3-14 who were referred for investigation following suspected victimization of sexual abuse, and who had disclosed sexual abuse, were analyzed. Cases were classified as either 'credible' or 'no judgment possible'. The probability of reaching a 'credible' judgment was examined in relation to characteristics of the child (age, gender, cognitive delay, marital status of the parents) and of the abusive event (abuse severity, frequency, perpetrator-victim relationship, perpetrator's use of grooming, and perpetrator's use of coercion), controlling for investigator identity at the cluster level of the analysis. Of 1563 cases analyzed, 57.9% were assessed as credible. The most powerful predictors of a credible judgment were older age and absence of a cognitive delay. Reports by children of married parents who experienced a single abusive event involving the perpetrator's use of grooming were also more likely to be judged credible. The rates of credible judgments found are lower than expected, suggesting under-identification of truthful reports of CSA. In particular, cases of severe and multiple abuse involving younger and cognitively delayed children have the lowest chances of being assessed as credible. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Generator dynamics in aeroelastic analysis and simulations

    Energy Technology Data Exchange (ETDEWEB)

    Larsen, T.J.; Hansen, M.H.; Iov, F.

    2003-05-01

    This report contains a description of a dynamic model for a doubly-fed induction generator implemented in the aeroelastic code HAWC. The model has physical input parameters (resistance, reactance etc.) and input variables (stator and rotor voltage and rotor speed). The model can be used to simulate the generator torque as well as the rotor and stator currents, and active and reactive power. A perturbation method has been used to reduce the original generator model equations to a set of equations which can be solved with the same time steps as a typical aeroelastic code. The method is used to separate the fast transients of the model from the slow variations and deduce a reduced-order expression for the slow part. Dynamic effects of the first-order terms in the model, as well as the influence on drive train eigenfrequencies and damping, have been investigated. Load response during time simulations of wind turbine response has been compared to simulations with the linear static generator model originally implemented in HAWC. A 2 MW turbine has been modelled in the aeroelastic code HAWC. When using the new dynamic generator model there is an interesting coupling between the generator dynamics and a global turbine vibration mode at 4.5 Hz, which only occurs when a dynamic formulation of the generator equations is applied. This frequency can especially be seen in the electrical power of the generator and the rotational speed of the generator, but also as torque variations in the drive train. (au)

  1. Quantitative Image Simulation and Analysis of Nanoparticles

    DEFF Research Database (Denmark)

    Madsen, Jacob; Hansen, Thomas Willum

    High-resolution transmission electron microscopy (HRTEM) has become a routine analysis tool for structural characterization at atomic resolution, and with the recent development of in-situ TEMs, it is now possible to study catalytic nanoparticles under reaction conditions. However, the connection between an experimental image and the underlying...... physical phenomena or structure is not always straightforward. The aim of this thesis is to use image simulation to better understand observations from HRTEM images. Surface strain is known to be important for the performance of nanoparticles. Using simulation, we estimate the precision and accuracy...... of strain measurements from TEM images, and investigate the sensitivity of these measurements to microscope parameters. This is followed by our efforts toward simulating metal nanoparticles on a metal-oxide support using the Charge Optimized Many Body (COMB) interatomic potential. The simulated interface...

  2. NASA's GMAO Atmospheric Motion Vectors Simulator: Description and Application to the MISTiC Winds Concept

    Science.gov (United States)

    Carvalho, David; McCarty, Will; Errico, Ron; Prive, Nikki

    2018-01-01

    An atmospheric motion vector (AMV) simulator was developed by NASA's GMAO to simulate observations from future satellite constellation concepts. The synthetic AMVs can then be used in OSSEs to estimate and quantify the potential added value of new observations to the present Earth observing system and, ultimately, the expected impact on current weather forecasting skill. The GMAO AMV simulator is a tunable and flexible computer code that is able to simulate the AMVs expected to be derived from different instruments and satellite orbit configurations. As a case study and example of the usefulness of this tool, the GMAO AMV simulator was used to simulate the AMVs envisioned to be provided by MISTiC Winds, a NASA mission concept consisting of a constellation of satellites equipped with infrared midwave spectrometers, expected to provide high spatial and temporal resolution temperature and humidity soundings of the troposphere that can be used to derive AMVs from the tracking of clouds and water vapor features. The GMAO AMV simulator identifies trackable clouds and water vapor features in the G5NR and employs a probabilistic function to draw a subset of the identified trackable features. Before the simulator was applied to the MISTiC Winds concept, it was calibrated to yield realistic observation counts and spatial distributions, and validated using the Himawari-8 Advanced Imager (AHI) as a proxy instrument for MISTiC Winds. The simulated AHI AMVs showed a close match with the real AHI AMVs in terms of observation counts and spatial distributions, showing that the GMAO AMV simulator synthesizes AMV observations with enough quality and realism to produce a response from the DAS equivalent to the one produced with real observations.
When applied to the MISTiC Winds scanning points, it can be expected that the MISTiC Winds will be able to collect approximately 60,000 wind observations every 6 hours, if considering a constellation composed of
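    The probabilistic drawing of trackable features mentioned above can be sketched as a simple acceptance test. The acceptance function and feature records here are hypothetical, not the simulator's actual criteria:

```python
import random

def draw_trackable(features, p_accept, seed=42):
    """Thin a list of identified trackable features with a probabilistic
    acceptance function, as a stand-in for the simulator's drawing step."""
    rng = random.Random(seed)
    return [f for f in features if rng.random() < p_accept(f)]

# Hypothetical acceptance probability favouring low-level features (pressure in hPa)
p = lambda f: 0.8 if f["level_hpa"] > 700 else 0.3
feats = [{"id": i, "level_hpa": 300 + 50 * (i % 12)} for i in range(1000)]
subset = draw_trackable(feats, p)
print(len(subset))
```

Tuning `p_accept` against a real instrument's observation counts is, in spirit, the calibration step described for the AHI proxy.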

  3. Generator dynamics in aeroelastic analysis and simulations

    DEFF Research Database (Denmark)

    Larsen, Torben J.; Hansen, Morten Hartvig; Iov, F.

    2003-01-01

    This report contains a description of a dynamic model for a doubly-fed induction generator. The model has physical input parameters (voltage, resistance, reactance etc.) and can be used to calculate rotor and stator currents, hence active and reactive power. A perturbation method has been used...... to reduce the original generator model equations to a set of equations which can be solved with the same time steps as a typical aeroelastic code. The method is used to separate the fast transients of the model from the slow variations and deduce a reduced order expression for the slow part. Dynamic effects...... of the first order terms in the model, as well as the influence on drive train eigenfrequencies and damping, have been investigated. Load response during time simulations of wind turbine response has been compared to simulations with a traditional static generator model based entirely on the slip angle. A 2 MW...

  4. Sequentially linear analysis for simulating brittle failure

    NARCIS (Netherlands)

    van de Graaf, A.V.

    2017-01-01

    The numerical simulation of brittle failure at structural level with nonlinear finite element analysis (NLFEA) remains a challenge due to robustness issues. We attribute these problems to the dimensions of real-world structures combined with softening behavior and negative tangent stiffness at

  5. A regional climate model for northern Europe: model description and results from the downscaling of two GCM control simulations

    Science.gov (United States)

    Rummukainen, M.; Räisänen, J.; Bringfelt, B.; Ullerstig, A.; Omstedt, A.; Willén, U.; Hansson, U.; Jones, C.

    This work presents a regional climate model, the Rossby Centre regional Atmospheric model (RCA1), recently developed from the High Resolution Limited Area Model (HIRLAM). The changes in the HIRLAM parametrizations, necessary for climate-length integrations, are described. A regional Baltic Sea ocean model and a modeling system for the Nordic inland lake systems have been coupled with RCA1. The coupled system has been used to downscale 10-year time slices from two different general circulation model (GCM) simulations to provide high-resolution regional interpretation of large-scale modeling. A selection of the results from the control runs, i.e. the present-day climate simulations, are presented: large-scale free atmospheric fields, the surface temperature and precipitation results and results for the on-line simulated regional ocean and lake surface climates. The regional model modifies the surface climate description compared to the GCM simulations, but it is also substantially affected by the biases in the GCM simulations. The regional model also improves the representation of the regional ocean and the inland lakes, compared to the GCM results.

  6. A regional climate model for northern Europe: model description and results from the downscaling of two GCM control simulations

    Energy Technology Data Exchange (ETDEWEB)

    Rummukainen, M.; Raeisaenen, J.; Bringfelt, B.; Ullerstig, A.; Omstedt, A.; Willen, U.; Hansson, U.; Jones, C. [Rossby Centre, Swedish Meteorological and Hydrological Inst., Norrkoeping (Sweden)

    2001-03-01

    This work presents a regional climate model, the Rossby Centre regional Atmospheric model (RCA1), recently developed from the High Resolution Limited Area Model (HIRLAM). The changes in the HIRLAM parametrizations, necessary for climate-length integrations, are described. A regional Baltic Sea ocean model and a modeling system for the Nordic inland lake systems have been coupled with RCA1. The coupled system has been used to downscale 10-year time slices from two different general circulation model (GCM) simulations to provide high-resolution regional interpretation of large-scale modeling. A selection of the results from the control runs, i.e. the present-day climate simulations, are presented: large-scale free atmospheric fields, the surface temperature and precipitation results and results for the on-line simulated regional ocean and lake surface climates. The regional model modifies the surface climate description compared to the GCM simulations, but it is also substantially affected by the biases in the GCM simulations. The regional model also improves the representation of the regional ocean and the inland lakes, compared to the GCM results. (orig.)

  7. Regional hydrogeological simulations using CONNECTFLOW. Preliminary site description. Laxemar sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hunter, Fiona; Jackson, Peter; McCarthy, Rachel [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2006-04-15

    The main objective of this study is to support the development of a preliminary Site Description of the Laxemar subarea on a regional-scale based on the available data of November 2004 (Data Freeze L1.2). A more specific objective of this study is to assess the role of both known and less quantified hydrogeological conditions in determining the present-day distribution of saline groundwater in the Laxemar subarea on a regional-scale. An improved understanding of the palaeo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale, as well as predictions of future hydrogeological conditions. Another objective is to assess the flow-paths from the local-scale model domain, based on the present-day flow conditions, to assess the distribution of discharge and recharge areas connected to the flow at the approximate repository depth to inform the Preliminary Safety Evaluation. Significant new features incorporated in the modelling include: a depth variation in hydraulic properties within the deformation zones; a dependence on rock domain and depth in the rock mass properties in regional-scale models; a more detailed model of the overburden in terms of a layered system of spatially variable thickness made up of several different types of Quaternary deposits has been implemented; and several variants on the position of the watertable have been tried. The motivation for introducing a dependence on rock domain was guided by the hydrogeological interpretation with the aim of honouring the observed differences in hydraulic properties measured at the boreholes.

  8. Regional hydrogeological simulations using CONNECTFLOW. Preliminary site description. Laxemar sub area - version 1.2

    International Nuclear Information System (INIS)

    Hartley, Lee; Hunter, Fiona; Jackson, Peter; McCarthy, Rachel; Gylling, Bjoern; Marsic, Niko

    2006-04-01

    The main objective of this study is to support the development of a preliminary Site Description of the Laxemar subarea on a regional-scale based on the available data of November 2004 (Data Freeze L1.2). A more specific objective of this study is to assess the role of both known and less quantified hydrogeological conditions in determining the present-day distribution of saline groundwater in the Laxemar subarea on a regional-scale. An improved understanding of the palaeo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local-scale, as well as predictions of future hydrogeological conditions. Another objective is to assess the flow-paths from the local-scale model domain, based on the present-day flow conditions, to assess the distribution of discharge and recharge areas connected to the flow at the approximate repository depth to inform the Preliminary Safety Evaluation. Significant new features incorporated in the modelling include: a depth variation in hydraulic properties within the deformation zones; a dependence on rock domain and depth in the rock mass properties in regional-scale models; a more detailed model of the overburden in terms of a layered system of spatially variable thickness made up of several different types of Quaternary deposits has been implemented; and several variants on the position of the watertable have been tried. The motivation for introducing a dependence on rock domain was guided by the hydrogeological interpretation with the aim of honouring the observed differences in hydraulic properties measured at the boreholes

  9. Application of three-dimensional simulation at lecturing on descriptive geometry

    Directory of Open Access Journals (Sweden)

    Tel'noy Viktor Ivanovich

    2014-05-01

    Full Text Available Teaching descriptive geometry has its own characteristics. It is necessary not only to convey to students a certain body of knowledge on the subject, but also to develop their spatial imagination and skills of logical thinking. Practice in teaching the discipline has shown that students face serious difficulties in studying it. This is due to their relatively low level of school preparation in geometry and technical drawing, and to a lack of well-developed spatial imagination: they find it difficult to picture the geometrical image of the object of study and mentally transform it onto the plane. Hence the need to find effective ways of teaching the discipline «Descriptive Geometry» at university. In the context of global informatization and computerization of the educational process, the use of graphics programs for developing design documentation and 3D modeling is one of the most promising applications of information technology for solving these problems. Three-dimensional models provide the best visibility in the classroom. When delivering lectures on descriptive geometry, three-dimensional modeling should be used not only as a didactic aid (a means of demonstration), but also as a method of teaching (a learning tool for dealing with various graphics tasks). With this in mind, the essence of 3D modeling is revealed with the aim of a better understanding of the algorithms for solving both positional and metric problems using spatial representations of graphic constructions. The possibility of viewing the constructed model from different angles is shown to be of particular importance, as is the use of transparency properties for illustrating the results of solving geometric problems. Using 3D models together with their projections onto the plane, as well as text information, promotes better assimilation and more lasting memorization of the

  10. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Virotta, Francesco

    2012-02-21

    In this work we investigate the critical slowing down of lattice QCD simulations. We perform a preliminary study in the quenched approximation, where we find that our estimate of the exponential autocorrelation time scales as τ_exp(a) ∝ a^(-5), where a is the lattice spacing. In unquenched simulations with O(a) improved Wilson fermions we do not obtain a scaling law, but find results compatible with the behavior found in the pure gauge theory. The discussion is supported by a large set of ensembles, both in pure gauge theory and in the theory with two degenerate sea quarks. We have moreover investigated the effect of slow algorithmic modes on the error analysis of the expectation values of typical lattice QCD observables (hadronic matrix elements and masses). In the context of simulations affected by slow modes we propose and test a method to obtain reliable estimates of statistical errors. The method is intended to help in the typical algorithmic setup of lattice QCD, namely when the total statistics collected is of O(10) τ_exp. This is the typical case when simulating close to the continuum limit, where the computational cost of producing two independent data points can be extremely large. We finally discuss the scale setting in N_f = 2 simulations using the kaon decay constant f_K as physical input. The method is explained together with a thorough discussion of the error analysis employed. A description of the publicly available code used for the error analysis is included.
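    The integrated autocorrelation time that underlies this kind of error analysis can be sketched as follows, tested on a synthetic AR(1) chain (not lattice data) and using a naive fixed summation window rather than an automatic windowing procedure:

```python
import random

def tau_int(x, window=50):
    """Integrated autocorrelation time, truncated at a fixed summation window."""
    n = len(x)
    mu = sum(x) / n
    den = sum((v - mu) ** 2 for v in x)
    tau = 0.5
    for t in range(1, window):
        tau += sum((x[i] - mu) * (x[i + t] - mu) for i in range(n - t)) / den
    return tau

# AR(1) chain with rho = 0.9; the exact value is 0.5 * (1 + rho) / (1 - rho) = 9.5
random.seed(0)
rho, v, x = 0.9, 0.0, []
for _ in range(50000):
    v = rho * v + random.gauss(0.0, 1.0)
    x.append(v)
print(tau_int(x))
```

With a fixed window the estimator is biased when the window is short relative to τ_exp and noisy when it is long; production analyses choose the window automatically, which is the kind of refinement the abstract's method addresses.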

  11. Maintenance Personnel Performance Simulation (MAPPS) model: description of model content, structure, and sensitivity testing. Volume 2

    International Nuclear Information System (INIS)

    Siegel, A.I.; Bartter, W.D.; Wolf, J.J.; Knee, H.E.

    1984-12-01

    This volume of NUREG/CR-3626 presents details of the content, structure, and sensitivity testing of the Maintenance Personnel Performance Simulation (MAPPS) model that was described in summary in volume one of this report. The MAPPS model is a generalized stochastic computer simulation model developed to simulate the performance of maintenance personnel in nuclear power plants. The MAPPS model considers workplace, maintenance technician, motivation, human factors, and task oriented variables to yield predictive information about the effects of these variables on successful maintenance task performance. All major model variables are discussed in detail and their implementation and interactive effects are outlined. The model was examined for disqualifying defects from a number of viewpoints, including sensitivity testing. This examination led to the identification of some minor recalibration efforts which were carried out. These positive results indicate that MAPPS is ready for initial and controlled applications which are in conformity with its purposes

  12. Single-phase multi-dimensional thermohydraulics direct numerical simulation code DINUS-3. Input data description

    Energy Technology Data Exchange (ETDEWEB)

    Muramatsu, Toshiharu [Power Reactor and Nuclear Fuel Development Corp., Oarai, Ibaraki (Japan). Oarai Engineering Center

    1998-08-01

    This report explains the numerical methods of, and the set-up of input data for, DINUS-3 (Direct Numerical Simulation using a 3rd-order upwind scheme), a single-phase multi-dimensional thermohydraulics direct numerical simulation code developed at the Power Reactor and Nuclear Fuel Development Corporation (PNC) to simulate the non-stationary temperature fluctuations associated with thermal striping phenomena. The DINUS-3 code is characterized by the use of a third-order upwind scheme for the convection terms in the instantaneous Navier-Stokes and energy equations, and by an adaptive control system based on fuzzy theory to control time step sizes. The author expects this report to be useful for applying the DINUS-3 code to the evaluation of various non-stationary thermohydraulic phenomena in reactor applications. (author)
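    A third-order upwind scheme of the kind named in the record can be sketched on 1D linear advection with periodic boundaries. This is an illustration of the scheme family, not the DINUS-3 discretization itself:

```python
import math

def deriv3up(f, dx):
    """Third-order upwind-biased first derivative for positive advection speed
    (periodic boundaries)."""
    n = len(f)
    return [(2 * f[(i + 1) % n] + 3 * f[i] - 6 * f[(i - 1) % n] + f[(i - 2) % n])
            / (6 * dx) for i in range(n)]

def advect(f, c, dx, dt, steps):
    """Solve df/dt + c df/dx = 0 with Heun (2nd-order Runge-Kutta) stepping."""
    n = len(f)
    for _ in range(steps):
        d1 = deriv3up(f, dx)
        fp = [f[i] - c * dt * d1[i] for i in range(n)]   # predictor
        d2 = deriv3up(fp, dx)
        f = [f[i] - 0.5 * c * dt * (d1[i] + d2[i]) for i in range(n)]
    return f

n, c = 100, 1.0
dx = 1.0 / n
dt = 0.2 * dx / c                                   # CFL number 0.2
f0 = [math.sin(2 * math.pi * i * dx) for i in range(n)]
f1 = advect(f0, c, dx, dt, round(1.0 / (c * dt)))   # advect one full period
err = max(abs(a - b) for a, b in zip(f0, f1))
print(err)
```

The upwind bias contributes a dissipative truncation term that damps grid-scale oscillations, which is why such schemes are attractive for direct simulation of temperature fluctuations.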

  13. Description and validation of the Simple, Efficient, Dynamic, Global, Ecological Simulator (SEDGES v.1.0)

    Science.gov (United States)

    Paiewonsky, Pablo; Elison Timm, Oliver

    2018-03-01

    In this paper, we present a simple dynamic global vegetation model whose primary intended use is auxiliary to the land-atmosphere coupling scheme of a climate model, particularly one of intermediate complexity. The model simulates and provides important ecological-only variables but also some hydrological and surface energy variables that are typically either simulated by land surface schemes or else used as boundary data input for these schemes. The model formulations and their derivations are presented here, in detail. The model includes some realistic and useful features for its level of complexity, including a photosynthetic dependency on light, full coupling of photosynthesis and transpiration through an interactive canopy resistance, and a soil organic carbon dependence for bare-soil albedo. We evaluate the model's performance by running it as part of a simple land surface scheme that is driven by reanalysis data. The evaluation against observational data includes net primary productivity, leaf area index, surface albedo, and diagnosed variables relevant for the closure of the hydrological cycle. In this setup, we find that the model gives an adequate to good simulation of basic large-scale ecological and hydrological variables. Of the variables analyzed in this paper, gross primary productivity is particularly well simulated. The results also reveal the current limitations of the model. The most significant deficiency is the excessive simulation of evapotranspiration in mid- to high northern latitudes during their winter to spring transition. The model has a relative advantage in situations that require some combination of computational efficiency, model transparency and tractability, and the simulation of the large-scale vegetation and land surface characteristics under non-present-day conditions.

  14. The X-Ray Pebble Recirculation Experiment (X-PREX): Facility Description, Preliminary Discrete Element Method Simulation Validation Studies, and Future Test Program

    International Nuclear Information System (INIS)

    Laufer, Michael R.; Bickel, Jeffrey E.; Buster, Grant C.; Krumwiede, David L.; Peterson, Per F.

    2014-01-01

    This paper presents a facility description, preliminary results, and future test program of the new X-Ray Pebble Recirculation Experiment (X-PREX), which is now operational and being used to collect data on the behavior of slow dense granular flows relevant to pebble bed reactor core designs. The X-PREX facility uses digital x-ray tomography methods to track both the translational and rotational motion of spherical pebbles, which provides unique experimental results that can be used to validate discrete element method (DEM) simulations of pebble motion. The validation effort supported by the X-PREX facility provides a means to build confidence in analysis of pebble bed configuration and residence time distributions that impact the neutronics, thermal hydraulics, and safety analysis of pebble bed reactor cores. Preliminary experimental and DEM simulation results are reported for silo drainage, a classical problem in the granular flow literature, at several hopper angles. These studies include conventional converging and novel diverging geometries that provide additional flexibility in the design of pebble bed reactor cores. Excellent agreement is found between the X-PREX experimental and DEM simulation results. Finally, this paper discusses additional studies in progress relevant to the design and analysis of pebble bed reactor cores including pebble recirculation in cylindrical core geometries and evaluation of forces on shut down blades inserted directly into a packed pebble bed. (author)

  15. Xanthogranulomatous Pyelonephritis Can Simulate a Complex Cyst: Case Description and Review of Literature

    Directory of Open Access Journals (Sweden)

    Salvatore Butticè

    2014-05-01

    Full Text Available Xanthogranulomatous pyelonephritis is a rare and peculiar form of chronic pyelonephritis, generally associated with renal lithiasis; its incidence is higher in females. The peculiarity of this disease is that it requires a differential diagnosis, because it can often simulate dramatic pathologic conditions. Indeed, cases in association with squamous cell carcinoma of the kidney have also been described in the literature. The radiologic and clinical findings simulate renal masses, sometimes in association with caval thrombus. We describe a case of xanthogranulomatous pyelonephritis with the radiologic aspects of a Bosniak class III complex cyst in a 40-year-old man.

  16. Groundwater flow simulations in support of the Local Scale Hydrogeological Description developed within the Laxemar Methodology Test Project

    International Nuclear Information System (INIS)

    Follin, Sven; Svensson, Urban

    2002-05-01

    The deduced Site Descriptive Model of the Laxemar area has been parameterised from a hydraulic point of view and subsequently put into practice in terms of a numerical flow model. The intention of the subproject has been to explore the adaptation of a numerical flow model to site-specific surface and borehole data, and to identify potential needs for development and improvement in the planned modelling methodology and tools. The experiences made during this process and the outcome of the simulations have been presented to the methodology test project group in the course of the project. The discussion and conclusions made in this particular report concern two issues mainly: (i) the use of numerical simulations as a means of gaining credibility, e.g. discrimination between alternative geological models, and (ii) calibration and conditioning of probabilistic (Monte Carlo) realisations

  17. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code

    International Nuclear Information System (INIS)

    Rinkel, J.; Dinten, J.M.; Tabary, J.

    2004-01-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest for optimizing the grid configuration for a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time consuming, since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method which couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional matrix of probabilities depending on the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines and the other the energy of the photon. This matrix of probabilities is then used by the Monte Carlo simulation software in order to provide the final scattered flux image. To evaluate the gain in CPU time, we define the increasing factor as the factor by which the simulation's CPU time increases when the grid is included. Increasing factors were calculated with the new model and with the classical method, which represents the grid by its CAD model as part of the object. With the new method, increasing factors are one to two orders of magnitude smaller than with the classical one. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
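    The precomputed angle-energy probability matrix can be sketched as a simple lookup table. The transmission function below is a toy model chosen for illustration, not the physics of a real anti-scatter grid:

```python
import math

def transmission(angle_deg, energy_kev):
    """Toy transmission probability: falls off with angle from the grid normal,
    rises with photon energy. Illustrative only."""
    return math.exp(-angle_deg / 15.0) * min(1.0, energy_kev / 150.0)

ANGLE_BINS = [a + 2.5 for a in range(0, 90, 5)]       # 5-degree bin centres
ENERGY_BINS = [e + 5.0 for e in range(10, 150, 10)]   # 10 keV bin centres

# The 2-D probability matrix is built once, offline
TABLE = [[transmission(a, e) for e in ENERGY_BINS] for a in ANGLE_BINS]

def lookup(angle_deg, energy_kev):
    """Nearest-bin lookup used during the Monte Carlo run instead of a full
    ray-trace through the grid geometry."""
    i = min(range(len(ANGLE_BINS)), key=lambda k: abs(ANGLE_BINS[k] - angle_deg))
    j = min(range(len(ENERGY_BINS)), key=lambda k: abs(ENERGY_BINS[k] - energy_kev))
    return TABLE[i][j]

print(lookup(3.0, 80.0))
```

Replacing a per-photon geometric ray-trace with a table lookup is exactly where the one-to-two-order-of-magnitude CPU saving comes from.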

  18. Description of the artificial parameters in EGS4-Monte Carlo simulation and their influence on the absorbed depth dose from electrons in water

    International Nuclear Information System (INIS)

    Sandborg, M.; Alm Carlsson, G.

    1990-01-01

    This report describes the background of the EGS4 Monte Carlo code. It gives a short description of the interactions between electrons and matter and of the artificial parameters used in EGS4 Monte Carlo simulations. It also gives advice on choosing the right artificial parameters. (K.A.E)

  19. Stochastic analysis for finance with simulations

    CERN Document Server

    Choe, Geon Ho

    2016-01-01

    This book is an introduction to stochastic analysis and quantitative finance; it includes both theoretical and computational methods. Topics covered are stochastic calculus, option pricing, optimal portfolio investment, and interest rate models. Also included are simulations of stochastic phenomena, numerical solutions of the Black–Scholes–Merton equation, Monte Carlo methods, and time series. Basic measure theory is used as a tool to describe probabilistic phenomena. The level of familiarity with computer programming is kept to a minimum. To make the book accessible to a wider audience, some background mathematical facts are included in the first part of the book and also in the appendices. This work attempts to bridge the gap between mathematics and finance by using diagrams, graphs and simulations in addition to rigorous theoretical exposition. Simulations are not only used as the computational method in quantitative finance, but they can also facilitate an intuitive and deeper understanding of theoret...
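
    The book's pairing of simulation with closed-form theory can be illustrated by pricing a European call two ways: Monte Carlo simulation of geometric Brownian motion checked against the Black-Scholes-Merton formula. This is a generic textbook sketch, not code from the book; the parameter values (S0, K, r, sigma, T) are arbitrary examples.

```python
import math
import random

S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0   # example parameters

def bs_call(S0, K, r, sigma, T):
    """Closed-form Black-Scholes-Merton price of a European call."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

def mc_call(S0, K, r, sigma, T, n=200_000, seed=1):
    """Monte Carlo price: simulate the terminal price, discount the payoff."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        z = rng.gauss(0.0, 1.0)
        ST = S0 * math.exp((r - 0.5 * sigma**2) * T + sigma * math.sqrt(T) * z)
        total += max(ST - K, 0.0)
    return math.exp(-r * T) * total / n

exact = bs_call(S0, K, r, sigma, T)      # about 8.02 for these parameters
estimate = mc_call(S0, K, r, sigma, T)   # converges to exact as n grows
```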

  20. Accident simulator development for probabilistic safety analysis

    International Nuclear Information System (INIS)

    Cacciabue, P.C.; Amendola, A.; Mancini, G.

    1985-01-01

    This paper describes the basic features of a new concept of incident simulator, the Response System Analyzer (RSA), which is being developed within the CEC JRC Research Program on Reactor Safety. Focusing on somewhat different aims than conventional simulators, the RSA development extends the field of application of simulators to the area of risk and reliability analysis, and in particular to the identification of relevant sequences, the modeling of human behavior and the validation of operating procedures. The fundamental components of the project, i.e. the deterministic transient model of the plant, the automatic probabilistic driver and the modeling of possible human interventions, are discussed in connection with the problem of their dynamic interaction. The analyses so far performed by separately testing RSA on significant study cases have shown encouraging results and have proven the feasibility of the overall program.

  1. Subset simulation for structural reliability sensitivity analysis

    International Nuclear Information System (INIS)

    Song Shufang; Lu Zhenzhou; Qiao Hongwei

    2009-01-01

    Based on two procedures for efficiently generating conditional samples, i.e. Markov chain Monte Carlo (MCMC) simulation and importance sampling (IS), two reliability sensitivity (RS) algorithms are presented. On the basis of the reliability analysis of Subset simulation (Subsim), the RS of the failure probability with respect to the distribution parameter of a basic variable is transformed into a set of RS of conditional failure probabilities with respect to that parameter. Using the conditional samples generated by MCMC simulation and IS, procedures are established to estimate the RS of the conditional failure probabilities. The formulae for the RS estimator, its variance and its coefficient of variation are derived in detail. The results of the illustrations show the high efficiency and high precision of the presented algorithms, which are suitable for highly nonlinear limit state equations and structural systems with single and multiple failure modes.
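
    The underlying idea, differentiating a failure probability with respect to a distribution parameter, can be sketched far more simply than with the paper's Subset-simulation machinery, using the score-function identity dPf/dmu = E[I(failure) * d ln f(X; mu)/dmu] and plain Monte Carlo. The limit state and parameters below are a textbook toy, not the paper's examples.

```python
import math
import random

# Toy reliability-sensitivity sketch: X ~ N(mu, sigma), failure when
# X > b. The score of the normal density w.r.t. its mean is
# (x - mu) / sigma^2, so one sample loop estimates both the failure
# probability and its sensitivity to mu.

mu, sigma, b = 0.0, 1.0, 2.0
rng = random.Random(42)
n = 400_000

pf_sum, sens_sum = 0.0, 0.0
for _ in range(n):
    x = rng.gauss(mu, sigma)
    fail = 1.0 if x > b else 0.0
    pf_sum += fail
    sens_sum += fail * (x - mu) / sigma**2

pf = pf_sum / n            # estimate of Pf = P(X > b)
dpf_dmu = sens_sum / n     # estimate of dPf/dmu

# For this toy case both quantities are known in closed form:
pf_exact = 0.5 * (1.0 - math.erf((b - mu) / (sigma * math.sqrt(2))))
dpf_dmu_exact = math.exp(-0.5 * ((b - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))
```

    The paper's contribution is to make this kind of estimate efficient for very small failure probabilities by conditioning on intermediate failure levels, which plain Monte Carlo (as above) cannot do economically.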

  2. Combining a building simulation with energy systems analysis to assess the benefits of natural ventilation

    DEFF Research Database (Denmark)

    Oropeza-Perez, Ivan; Østergaard, Poul Alberg; Remmen, Arne

    2013-01-01

    a thermal air flow simulation program - into the energy systems analysis model. Descriptions of the energy systems in two geographical locations, i.e. Mexico and Denmark, are set up as inputs. Then, the assessment is done by calculating the energy impacts as well as environmental benefits in the energy...

  3. Preliminary site description: Groundwater flow simulations. Simpevarp area (version 1.1) modelled with CONNECTFLOW

    International Nuclear Information System (INIS)

    Hartley, Lee; Worth, David; Gylling, Bjoern; Marsic, Niko; Holmen, Johan

    2004-08-01

    The main objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater at the Simpevarp and Laxemar sites. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Descriptive Model in general and the Site Hydrogeological Description in particular. This is to serve as a basis for describing the present hydrogeological conditions as well as predicting future hydrogeological conditions. This objective implies testing of: geometrical alternatives in the structural geology and bedrock fracturing, variants in the initial and boundary conditions, and parameter uncertainties (i.e. uncertainties in the hydraulic property assignment). This testing is necessary in order to evaluate the impact of the specified components on the groundwater flow field and to promote proposals for further investigations of the hydrogeological conditions at the site. The general methodology for modelling transient salt transport and groundwater flow using CONNECTFLOW that was developed for Forsmark has also been applied successfully for Simpevarp. Because of time constraints, only a key set of variants was performed, focussing on the influences of the DFN model parameters, the kinematic porosity, and the initial condition. Salinity data in deep boreholes available at the time of the project were too limited to allow a good calibration exercise. However, the model predictions are compared with the available data from KLX01 and KLX02 below. Once more salinity data are available it may be possible to draw more definite conclusions based on the differences between variants. At the moment, though, the differences should just be used to understand the sensitivity of the models to various input parameters.

  4. Intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care: a descriptive qualitative study.

    Science.gov (United States)

    Ballangrud, Randi; Hall-Lord, Marie Louise; Persenius, Mona; Hedelin, Birgitta

    2014-08-01

    To describe intensive care nurses' perceptions of simulation-based team training for building patient safety in intensive care. Failures in team processes are found to be contributory factors to incidents in an intensive care environment. Simulation-based training is recommended as a method to make health-care personnel aware of the importance of team working and to improve their competencies. The study uses a qualitative descriptive design. Individual qualitative interviews were conducted with 18 intensive care nurses from May to December 2009, all of whom had attended a simulation-based team training programme. The interviews were analysed by qualitative content analysis. One main category emerged to illuminate the intensive care nurses' perception: "training increases awareness of clinical practice and acknowledges the importance of structured work in teams". Three generic categories were found: "realistic training contributes to safe care", "reflection and openness motivates learning" and "finding a common understanding of team performance". Simulation-based team training makes intensive care nurses more prepared to care for severely ill patients. Team training creates a common understanding of how to work in teams with regard to patient safety. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Sensitivity Analysis of Multidisciplinary Rotorcraft Simulations

    Science.gov (United States)

    Wang, Li; Diskin, Boris; Biedron, Robert T.; Nielsen, Eric J.; Bauchau, Olivier A.

    2017-01-01

    A multidisciplinary sensitivity analysis of rotorcraft simulations involving tightly coupled high-fidelity computational fluid dynamics and comprehensive analysis solvers is presented and evaluated. An unstructured sensitivity-enabled Navier-Stokes solver, FUN3D, and a nonlinear flexible multibody dynamics solver, DYMORE, are coupled to predict the aerodynamic loads and structural responses of helicopter rotor blades. A discretely-consistent adjoint-based sensitivity analysis available in FUN3D provides sensitivities arising from unsteady turbulent flows and unstructured dynamic overset meshes, while a complex-variable approach is used to compute DYMORE structural sensitivities with respect to aerodynamic loads. The multidisciplinary sensitivity analysis is conducted through integrating the sensitivity components from each discipline of the coupled system. Numerical results verify accuracy of the FUN3D/DYMORE system by conducting simulations for a benchmark rotorcraft test model and comparing solutions with established analyses and experimental data. Complex-variable implementation of sensitivity analysis of DYMORE and the coupled FUN3D/DYMORE system is verified by comparing with real-valued analysis and sensitivities. Correctness of adjoint formulations for FUN3D/DYMORE interfaces is verified by comparing adjoint-based and complex-variable sensitivities. Finally, sensitivities of the lift and drag functions obtained by complex-variable FUN3D/DYMORE simulations are compared with sensitivities computed by the multidisciplinary sensitivity analysis, which couples adjoint-based flow and grid sensitivities of FUN3D and FUN3D/DYMORE interfaces with complex-variable sensitivities of DYMORE structural responses.
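
    The "complex-variable approach" used for the DYMORE structural sensitivities is the complex-step derivative, f'(x) ≈ Im(f(x + ih))/h, which avoids the subtractive cancellation of finite differences and stays accurate even for extremely small steps. The toy response function below merely stands in for a structural solver; the function and step size are illustrative, not from FUN3D/DYMORE.

```python
import cmath
import math

def response(x):
    """Toy nonlinear 'structural response'; accepts real or complex input."""
    return cmath.exp(x) * cmath.sin(x) / (1 + x * x)

def complex_step(f, x, h=1e-30):
    """Derivative of f at real x from a single complex evaluation."""
    return (f(x + 1j * h)).imag / h

x0 = 0.7
deriv = complex_step(response, x0)

# Analytic derivative of e^x sin(x) / (1 + x^2) for comparison:
exact = (math.exp(x0) * (math.sin(x0) + math.cos(x0)) / (1 + x0**2)
         - math.exp(x0) * math.sin(x0) * 2 * x0 / (1 + x0**2) ** 2)
```

    Because no subtraction of nearly equal numbers occurs, h can be taken absurdly small (here 1e-30) and the result is still accurate to machine precision.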

  6. Statistical analysis and interpretation of prenatal diagnostic imaging studies, Part 2: descriptive and inferential statistical methods.

    Science.gov (United States)

    Tuuli, Methodius G; Odibo, Anthony O

    2011-08-01

    The objective of this article is to discuss the rationale for common statistical tests used for the analysis and interpretation of prenatal diagnostic imaging studies. Examples from the literature are used to illustrate descriptive and inferential statistics. The uses and limitations of linear and logistic regression analyses are discussed in detail.
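
    As an illustration of the descriptive and inferential methods the article reviews, the sketch below computes a mean and standard deviation and fits a simple least-squares regression line. The data are invented for the example (a hypothetical fetal biometric measurement against gestational age) and do not come from any study the article cites.

```python
import statistics

# Made-up illustration data: gestational age (weeks) and a hypothetical
# biometric measurement (cm).
ga_weeks = [20, 22, 24, 26, 28, 30, 32, 34]
measure = [4.8, 5.5, 6.1, 6.9, 7.4, 8.2, 8.8, 9.5]

# Descriptive statistics: central tendency and spread.
mean_y = statistics.mean(measure)
sd_y = statistics.stdev(measure)

# Inferential: least-squares slope and intercept for measure ~ ga_weeks.
mean_x = statistics.mean(ga_weeks)
sxx = sum((x - mean_x) ** 2 for x in ga_weeks)
sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(ga_weeks, measure))
slope = sxy / sxx                     # cm per week of gestation
intercept = mean_y - slope * mean_x
```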

  7. Computer image analysis of seed shape and seed color for flax cultivar description

    Czech Academy of Sciences Publication Activity Database

    Wiesnerová, Dana; Wiesner, Ivo

    2008-01-01

    Roč. 61, č. 2 (2008), s. 126-135 ISSN 0168-1699 R&D Projects: GA ČR GA521/03/0019 Institutional research plan: CEZ:AV0Z50510513 Keywords : image analysis * cultivar description * flax Subject RIV: EA - Cell Biology Impact factor: 1.273, year: 2008

  8. A descriptive analysis of studies on behavioural treatment of drooling (1970-2005).

    NARCIS (Netherlands)

    Burg, J.J.W. van der; Didden, R.; Jongerius, P.H.; Rotteveel, J.J.

    2007-01-01

    A descriptive analysis was conducted on studies on the behavioural treatment of drooling (published between 1970 and 2005). The 17 articles that met the inclusion criteria described 53 participants (mean age 14y 7mo, [SD 4y 9mo]; range 6-28y). Sex of 87% of the participants was reported: 28 male, 18

  9. A descriptive analysis of studies on behavioural treatment of drooling (1970-2005)

    NARCIS (Netherlands)

    Burg, J.J.W. van der; Didden, H.C.M.; Jongerius, P.H.; Rotteveel, J.J.

    2007-01-01

    A descriptive analysis was conducted on studies on the behavioural treatment of drooling (published between 1970 and 2005). The 17 articles that met the inclusion criteria described 53 participants (mean age 14y 7mo, [SD 4y 9mo]; range 6-28y). Sex of 87% of the participants was reported: 28 male, 18

  10. Description language for the modelling and analysis of temporal change of instrumentation and control system structures

    International Nuclear Information System (INIS)

    Goering, Markus Heinrich

    2013-01-01

    The utilisation of computer-based I and C, as a result of the technological advancements in the computer industry, represents an up-to-date challenge for I and C engineers in nuclear power plants throughout the world. In comparison with the time-proven, hard-wired I and C, the engineering must consider the novel characteristics of computer-based technology during the implementation, these are primarily constituted by higher performance and the utilisation of software. On one hand, this allows for implementing more complex I and C functions and integrating several I and C functions on to single components, although on the other hand, the minimisation of the CCF probability is of high priority to the engineering. Furthermore, the engineering must take the implementation of the deterministic safety concept for the I and C design into consideration. This includes engineering the redundancy, diversity, physical separation, and independence design features, and is complemented by the analysis of the I and C design with respect to the superposition of pre-defined event sequences and postulated failure combinations, so as to secure the safe operation of the nuclear power plant. The focus of this thesis is on the basic principles of engineering, i.e. description languages and methods, which the engineering relies on for a highly qualitative and efficient computer-based I and C implementation. The analysis of the deterministic safety concept and computer-based I and C characteristics yields the relevant technical requirements for the engineering, these are combined with the general structuring principles of standard IEC 81346 and the extended description language evaluation criteria, which are based on the guideline VDI/VDE-3681, resulting in target criteria for evaluating description languages. The analysis and comparison of existing description languages reveals that no description language satisfactorily fulfils all target criteria, which is constituted in the

  11. Description language for the modelling and analysis of temporal change of instrumentation and control system structures

    Energy Technology Data Exchange (ETDEWEB)

    Goering, Markus Heinrich

    2013-10-25

    The utilisation of computer-based I and C, as a result of the technological advancements in the computer industry, represents an up-to-date challenge for I and C engineers in nuclear power plants throughout the world. In comparison with the time-proven, hard-wired I and C, the engineering must consider the novel characteristics of computer-based technology during the implementation, these are primarily constituted by higher performance and the utilisation of software. On one hand, this allows for implementing more complex I and C functions and integrating several I and C functions on to single components, although on the other hand, the minimisation of the CCF probability is of high priority to the engineering. Furthermore, the engineering must take the implementation of the deterministic safety concept for the I and C design into consideration. This includes engineering the redundancy, diversity, physical separation, and independence design features, and is complemented by the analysis of the I and C design with respect to the superposition of pre-defined event sequences and postulated failure combinations, so as to secure the safe operation of the nuclear power plant. The focus of this thesis is on the basic principles of engineering, i.e. description languages and methods, which the engineering relies on for a highly qualitative and efficient computer-based I and C implementation. The analysis of the deterministic safety concept and computer-based I and C characteristics yields the relevant technical requirements for the engineering, these are combined with the general structuring principles of standard IEC 81346 and the extended description language evaluation criteria, which are based on the guideline VDI/VDE-3681, resulting in target criteria for evaluating description languages. The analysis and comparison of existing description languages reveals that no description language satisfactorily fulfils all target criteria, which is constituted in the

  12. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.

  13. Description and evaluation of GMXe: a new aerosol submodel for global simulations (v1)

    Directory of Open Access Journals (Sweden)

    K. J. Pringle

    2010-09-01

    We present a new aerosol microphysics and gas-aerosol partitioning submodel (Global Modal-aerosol eXtension, GMXe) implemented within the ECHAM/MESSy Atmospheric Chemistry model (EMAC, version 1.8). The submodel is computationally efficient and is suitable for medium- to long-term simulations with global and regional models. The aerosol size distribution is treated using 7 log-normal modes and has the same microphysical core as the M7 submodel (Vignati et al., 2004).

    The main developments in this work are: (i) the extension of the aerosol emission routines and the M7 microphysics, so that an increased (and variable) number of aerosol species can be treated (new species include sodium and chloride, and potentially magnesium, calcium, and potassium), (ii) the coupling of the aerosol microphysics to a choice of treatments of gas/aerosol partitioning to allow the treatment of semi-volatile aerosol, and (iii) the implementation and evaluation of the developed submodel within the EMAC model of atmospheric chemistry.

    Simulated concentrations of black carbon, particulate organic matter, dust, sea spray, sulfate and ammonium aerosol are shown to be in good agreement with observations (for all species at least 40% of modeled values are within a factor of 2 of the observations). The distribution of nitrate aerosol is compared to observations in both clean and polluted regions. Concentrations in polluted continental regions are simulated quite well, but there is a general tendency to overestimate nitrate, particularly in coastal regions (geometric mean of modelled values/geometric mean of observed data ≈ 2). In all regions considered more than 40% of nitrate concentrations are within a factor of two of the observations. Marine nitrate concentrations are well captured, with 96% of modeled values within a factor of 2 of the observations.

  14. Development of a compartment model based on CFD simulations for description of mixing in bioreactors

    Directory of Open Access Journals (Sweden)

    Crine, M.

    2010-01-01

    Understanding and modeling the complex interactions between biological reactions and hydrodynamics is a key problem when dealing with bioprocesses. It is fundamental to be able to accurately predict the hydrodynamic behavior of bioreactors of different sizes and its interaction with the biological reaction. CFD can provide detailed modeling of hydrodynamics and mixing. However, it is computationally intensive, especially when reactions are taken into account. Another way to predict hydrodynamics is the use of "compartment" or "multi-zone" models, which are much less demanding in computation time than CFD. However, compartments and the fluxes between them are often defined by considering global quantities not representative of the flow. To overcome the limitations of these two methods, a solution is to combine compartment modeling and CFD simulations. Therefore, the aim of this study is to develop a methodology for building a compartment model based on CFD simulations of a bioreactor. The flow rate between two compartments can be easily computed from the velocity fields obtained by CFD. The difficulty lies in defining the zones in such a way that they can be considered perfectly mixed. The creation of the model compartments from CFD cells can be achieved manually or automatically. Manual zoning consists in aggregating CFD cells according to the user's wish. Automatic zoning defines compartments as regions within which the values of one or several properties are uniform with respect to a given tolerance. Both manual and automatic zoning methods have been developed and compared by simulating the mixing of an inert scalar. For the automatic zoning, several algorithms and different flow properties have been tested as criteria for the compartment creation.
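
    The automatic zoning criterion, compartments as regions where a flow property stays uniform within a tolerance, can be sketched in one dimension. The merging rule and tolerance below are a simplified stand-in for the algorithms the study tested; real zoning operates on 3-D CFD meshes.

```python
# Toy "automatic zoning": walk along a row of CFD cells and merge each
# cell into the current compartment while its property value stays
# within a relative tolerance of the running compartment mean.

def automatic_zoning(cell_values, rel_tol=0.10):
    """Group consecutive cells into compartments; return lists of indices."""
    compartments = [[0]]
    running_mean = cell_values[0]
    for i in range(1, len(cell_values)):
        v = cell_values[i]
        if abs(v - running_mean) <= rel_tol * abs(running_mean):
            compartments[-1].append(i)
            members = compartments[-1]
            running_mean = sum(cell_values[j] for j in members) / len(members)
        else:
            compartments.append([i])   # uniformity broken: start a new zone
            running_mean = v
    return compartments

# Axial velocity (m/s) sampled along a line of cells: two nearly uniform
# regions separated by a jump, which should yield two compartments.
velocities = [1.00, 1.02, 0.98, 1.01, 2.50, 2.45, 2.55, 2.48]
zones = automatic_zoning(velocities)
```

    Each resulting zone can then be treated as perfectly mixed, with inter-compartment flow rates taken from the CFD velocity field.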

  15. Pandora - a simulation tool for safety assessments. Technical description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Ekstroem, Per-Anders (Facilia AB (Sweden))

    2010-12-15

    This report documents a flexible simulation tool, Pandora, used in several post-closure safety assessments in both Sweden and Finland to assess the radiological dose to man due to releases from radioactive waste repositories. Pandora allows the user to build compartment models to represent the migration and fate of radionuclides in the environment. The tool simplifies the implementation and simulation of radioecological biosphere models in which there is a large set of radionuclides and input variables. Based on the well-known technical computing software MATLAB and especially its interactive graphical environment Simulink, Pandora receives many benefits. MATLAB/Simulink is a highly flexible tool used for simulations of practically any type of dynamic system; it is widely used, continuously maintained, and often upgraded. By basing the tool on this commercial software package, we gain both the graphical interface provided by Simulink and the ability to access the advanced numerical equation-solving routines in MATLAB. Since these numerical methods are well established and quality assured in their MATLAB implementation, the solution methods used in Pandora can be considered to have a high level of quality assurance. The structure of Pandora provides clarity in the model format, which means the model assists its own documentation, since the model can be understood by inspecting its structure. With the introduction of the external tool Pandas (Pandora assessment tool), version handling and an integrated way of performing the entire calculation chain have been added. Instead of being dependent on other commercial statistical software such as @Risk for performing probabilistic assessments, they can now be performed within the tool.
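
    A minimal example of the kind of compartment model Pandora is designed to run: two biosphere compartments exchanging a decaying radionuclide, integrated here with a simple explicit Euler scheme. The transfer rates and half-life are invented for illustration; Pandora itself delegates the integration to MATLAB/Simulink solvers.

```python
import math

# Two-compartment radionuclide model (e.g. soil <-> water), with
# first-order transfer in both directions and radioactive decay in both
# compartments. All rate constants below are illustrative only.

half_life_y = 30.0                  # roughly Cs-137, for illustration
lam = math.log(2) / half_life_y     # decay constant (1/y)
k12, k21 = 0.05, 0.01               # transfer rates soil->water, water->soil (1/y)

def step(n1, n2, dt):
    """One explicit Euler step of the two-compartment system."""
    dn1 = (-k12 * n1 + k21 * n2 - lam * n1) * dt
    dn2 = (k12 * n1 - k21 * n2 - lam * n2) * dt
    return n1 + dn1, n2 + dn2

n1, n2 = 1.0, 0.0                   # initial inventory entirely in soil
dt, t_end = 0.01, 30.0
for _ in range(int(t_end / dt)):
    n1, n2 = step(n1, n2, dt)

# Transfers cancel in the sum, so the total inventory follows pure
# decay: after one half-life (30 y) it should be close to 0.5.
total = n1 + n2
```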

  16. Final Report: Model interacting particle systems for simulation and macroscopic description of particulate suspensions

    Energy Technology Data Exchange (ETDEWEB)

    Peter J. Mucha

    2007-08-30

    Suspensions of solid particles in liquids appear in numerous applications, from environmental settings like river silt, to industrial systems of solids transport and water treatment, and biological flows such as blood flow. Despite their importance, much remains unexplained about these complicated systems. Mucha's research aims to improve understanding of basic properties of suspensions through a program of simulating model interacting particle systems with critical evaluation of proposed continuum equations, in close collaboration with experimentalists. Natural to this approach, the original proposal centered around collaboration with studies already conducted in various experimental groups. However, as was detailed in the 2004 progress report, following the first year of this award, a number of the questions from the original proposal were necessarily redirected towards other specific goals because of changes in the research programs of the proposed experimental collaborators. Nevertheless, the modified project goals and the results that followed from those goals maintain close alignment with the main themes of the original proposal, improving efficient simulation and macroscopic modeling of sedimenting and colloidal suspensions. In particular, the main investigations covered under this award have included: (1) Sedimentation instabilities, including the sedimentation analogue of the Rayleigh-Taylor instability (for heavy, particle-laden fluid over lighter, clear fluid). (2) Ageing dynamics of colloidal suspensions at concentrations above the glass transition, using simplified interactions. (3) Stochastic reconstruction of velocity-field dependence for particle image velocimetry (PIV). (4) Stochastic modeling of the near-wall bias in 'nano-PIV'. (5) Distributed Lagrange multiplier simulation of the 'internal splash' of a particle falling through a stably stratified interface. (6) Fundamental study of velocity fluctuations in sedimentation.

  17. Pandora - a simulation tool for safety assessments. Technical description and user's guide

    International Nuclear Information System (INIS)

    Ekstroem, Per-Anders

    2010-12-01

    This report documents a flexible simulation tool, Pandora, used in several post-closure safety assessments in both Sweden and Finland to assess the radiological dose to man due to releases from radioactive waste repositories. Pandora allows the user to build compartment models to represent the migration and fate of radionuclides in the environment. The tool simplifies the implementation and simulation of radioecological biosphere models in which there is a large set of radionuclides and input variables. Based on the well-known technical computing software MATLAB and especially its interactive graphical environment Simulink, Pandora receives many benefits. MATLAB/Simulink is a highly flexible tool used for simulations of practically any type of dynamic system; it is widely used, continuously maintained, and often upgraded. By basing the tool on this commercial software package, we gain both the graphical interface provided by Simulink and the ability to access the advanced numerical equation-solving routines in MATLAB. Since these numerical methods are well established and quality assured in their MATLAB implementation, the solution methods used in Pandora can be considered to have a high level of quality assurance. The structure of Pandora provides clarity in the model format, which means the model assists its own documentation, since the model can be understood by inspecting its structure. With the introduction of the external tool Pandas (Pandora assessment tool), version handling and an integrated way of performing the entire calculation chain have been added. Instead of being dependent on other commercial statistical software such as @Risk for performing probabilistic assessments, they can now be performed within the tool.

  18. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2015-02-01

    HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee) affect the results within a 0.1-58% variation for most cases. The simulated results show that about 4% of paraffin (C10 and C11) is present in the top stream, which may cause a problem in the LAB production plant. The major variations were noticed for the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, because the results correspond with the real plant operation data.

  19. Simulation based analysis of laser beam brazing

    Science.gov (United States)

    Dobler, Michael; Wiethop, Philipp; Schmid, Daniel; Schmidt, Michael

    2016-03-01

    Laser beam brazing is a well-established joining technology in car body manufacturing with main applications in the joining of divided tailgates and the joining of roof and side panels. A key advantage of laser brazed joints is the seam's visual quality which satisfies highest requirements. However, the laser beam brazing process is very complex and process dynamics are only partially understood. In order to gain deeper knowledge of the laser beam brazing process, to determine optimal process parameters and to test process variants, a transient three-dimensional simulation model of laser beam brazing is developed. This model takes into account energy input, heat transfer as well as fluid and wetting dynamics that lead to the formation of the brazing seam. A validation of the simulation model is performed by metallographic analysis and thermocouple measurements for different parameter sets of the brazing process. These results show that the multi-physical simulation model not only can be used to gain insight into the laser brazing process but also offers the possibility of process optimization in industrial applications. The model's capabilities in determining optimal process parameters are exemplarily shown for the laser power. Small deviations in the energy input can affect the brazing results significantly. Therefore, the simulation model is used to analyze the effect of the lateral laser beam position on the energy input and the resulting brazing seam.

  20. Description of an aircraft lightning and simulated nuclear electromagnetic pulse (NEMP) threat based on experimental data

    Science.gov (United States)

    Rustan, Pedro L., Jr.

    1987-01-01

    Lightning data obtained by measuring the surface electromagnetic fields on a CV-580 research aircraft during 48 lightning strikes between 1500 and 18,000 feet in central Florida during the summers of 1984 and 1985, and nuclear electromagnetic pulse (NEMP) data obtained by surface electromagnetic field measurements using a 1:74 CV-580 scale model, are presented. From one lightning event, maximum values of 3750 T/s for the time rate of change of the surface magnetic flux density, and 4.7 kA for the peak current, were obtained. From the simulated NEMP test, maximum values of 40,000 T/s for the time rate of change of the surface magnetic flux density, and 90 A/sq m for the total normal current density, were found. The data have application to the development of a military aircraft lightning/NEMP standard.

  1. Description of web-enhanced virtual character simulation system to standardize patient hand-offs.

    Science.gov (United States)

    Filichia, Lori; Halan, Shivashankar; Blackwelder, Ethan; Rossen, Brent; Lok, Benjamin; Korndorffer, James; Cendan, Juan

    2011-04-01

    The 80-h work week has increased discontinuity of patient care, resulting in reports of increased medication errors and preventable adverse events. Graduate medical programs are addressing these shortcomings in a number of ways. We have developed a computer simulation platform called the Virtual People Factory (VPF), which allows us to capture and simulate the dialogue between a real user and a virtual character. We have converted the system to reflect a physician in the process of "checking out" a patient to a covering physician. The responses are tracked and matched to educator-defined information termed "discoveries." Our proof of concept represented a typical postoperative patient with tachycardia. The system is web enabled. So far, 26 resident users at two institutions have completed the module. The critical discovery of tachycardia was identified by 62% of users. Residents spend 85% of the time asking intraoperative, postoperative, and past-medical-history questions. The system improves over time, such that there is a near-doubling of questions that yield appropriate answers between users 13 and 22. Users who identified the virtual patient's underlying tachycardia expressed more concern and were more likely to order further testing for the patient in a post-module questionnaire (P = 0.13 and 0.08, respectively, NS). The VPF system can capture unique details about the hand-off interchange. The system improves with sequential users, such that better matching of questions and answers occurs within the initial 25 users, allowing rapid development of new modules. A catalog of hand-off modules could be easily developed. Wide-scale web-based deployment was uncomplicated. Identification of the critical findings appropriately translated to user concern for the patient, though our series was too small to reach significance. Performance metrics based on the identification of critical discoveries could be used to assess readiness of the user to carry off a successful hand

  2. Isentropic Analysis of a Simulated Hurricane

    Science.gov (United States)

    Mrowiec, Agnieszka A.; Pauluis, Olivier; Zhang, Fuqing

    2016-01-01

    Hurricanes, like many other atmospheric flows, are associated with turbulent motions over a wide range of scales. Here the authors adapt a new technique based on the isentropic analysis of convective motions to study the thermodynamic structure of the overturning circulation in hurricane simulations. This approach separates the vertical mass transport in terms of the equivalent potential temperature of air parcels. In doing so, one separates the rising air parcels at high entropy from the subsiding air at low entropy. This technique filters out oscillatory motions associated with gravity waves and separates convective overturning from the secondary circulation. This approach is applied here to study the flow of an idealized hurricane simulation with the Weather Research and Forecasting (WRF) Model. The isentropic circulation for a hurricane exhibits similar characteristics to those of moist convection, with a maximum mass transport near the surface associated with shallow convection and entrainment. There are also important differences. For instance, ascent in the eyewall can be readily identified in the isentropic analysis as an upward mass flux of air with unusually high equivalent potential temperature. The isentropic circulation is further compared here to the Eulerian secondary circulation of the simulated hurricane to show that the mass transport in the isentropic circulation is much larger than that in the secondary circulation. This difference can be directly attributed to the mass transport by convection in the outer rainband and confirms that, even for a strongly organized flow like a hurricane, most of the atmospheric overturning is tied to the smaller scales.
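The core of the isentropic technique, partitioning vertical mass transport by equivalent potential temperature, can be sketched as a simple binning operation. This is a hedged illustration: the function name, units, and synthetic parcel values are assumptions, not the authors' implementation.

```python
import numpy as np

def isentropic_mass_flux(theta_e, rho_w, bin_edges):
    """Bin vertical mass flux (rho * w) by equivalent potential temperature.

    theta_e   : 1-D array of parcel theta_e values (K)
    rho_w     : 1-D array of corresponding vertical mass flux (kg m^-2 s^-1)
    bin_edges : 1-D array of theta_e bin edges (K)
    Returns the summed mass flux in each theta_e bin, so rising high-entropy
    air and subsiding low-entropy air land in different bins.
    """
    idx = np.digitize(theta_e, bin_edges) - 1
    flux = np.zeros(len(bin_edges) - 1)
    for i, f in zip(idx, rho_w):
        if 0 <= i < len(flux):
            flux[i] += f
    return flux

# Synthetic parcels: high-theta_e parcels rise, low-theta_e parcels sink.
theta_e = np.array([330.0, 345.0, 355.0, 338.0])
rho_w = np.array([-0.2, 0.1, 0.4, -0.1])
flux = isentropic_mass_flux(theta_e, rho_w, np.array([320.0, 340.0, 360.0]))
```

Because each parcel is classified by its theta_e rather than its height, reversible up-and-down motion (gravity waves) cancels within a bin, which is precisely the filtering property the abstract describes.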

  3. Descriptive Research

    DEFF Research Database (Denmark)

    Wigram, Anthony Lewis

    2003-01-01

    Descriptive research is described by Lathom-Radocy and Radocy (1995) to include Survey research, ex post facto research, case studies and developmental studies. Descriptive research also includes a review of the literature in order to provide both quantitative and qualitative evidence of the effect...... starts will allow effect size calculations to be made in order to evaluate effect over time. Given the difficulties in undertaking controlled experimental studies in the creative arts therapies, descriptive research methods offer a way of quantifying effect through descriptive statistical analysis...

  4. Optoelectronic Devices Advanced Simulation and Analysis

    CERN Document Server

    Piprek, Joachim

    2005-01-01

    Optoelectronic devices transform electrical signals into optical signals and vice versa by utilizing the sophisticated interaction of electrons and light within micro- and nano-scale semiconductor structures. Advanced software tools for design and analysis of such devices have been developed in recent years. However, the large variety of materials, devices, physical mechanisms, and modeling approaches often makes it difficult to select appropriate theoretical models or software packages. This book presents a review of devices and advanced simulation approaches written by leading researchers and software developers. It is intended for scientists and device engineers in optoelectronics, who are interested in using advanced software tools. Each chapter includes the theoretical background as well as practical simulation results that help to better understand internal device physics. The software packages used in the book are available to the public, on a commercial or noncommercial basis, so that the interested r...

  5. Simulating snow maps for Norway: description and statistical evaluation of the seNorge snow model

    Directory of Open Access Journals (Sweden)

    T. M. Saloranta

    2012-11-01

    Daily maps of snow conditions have been produced in Norway with the seNorge snow model since 2004. The seNorge snow model operates with 1 × 1 km resolution, uses gridded observations of daily temperature and precipitation as its input forcing, and simulates, among other variables, snow water equivalent (SWE), snow depth (SD), and snow bulk density (ρ). In this paper the set of equations contained in the seNorge model code is described and a thorough spatiotemporal statistical evaluation of the model performance from 1957–2011 is made using the two major sets of extensive in situ snow measurements that exist for Norway. The evaluation results show that the seNorge model generally overestimates both SWE and ρ, and that the overestimation of SWE increases with elevation throughout the snow season. However, the R2-values for model fit are 0.60 for (log-transformed) SWE and 0.45 for ρ, indicating that after removal of the detected systematic model biases (e.g. by recalibrating the model or expressing snow conditions in relative units) the model performs rather well. The seNorge model provides a relatively simple, not very data-demanding, yet nonetheless process-based method to construct snow maps of high spatiotemporal resolution. It is an especially well suited alternative for operational snow mapping in regions with rugged topography and large spatiotemporal variability in snow conditions, as is the case in mountainous Norway.
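As a hedged illustration of the kind of simple, low-data-demand snow modelling the abstract describes, here is a toy degree-day SWE scheme. It is not the seNorge equation set, and the threshold temperature and melt factor are invented placeholder values.

```python
def simulate_swe(temp_c, precip_mm, t_thresh=0.5, ddf=3.0):
    """Toy degree-day snow model (illustrative, not the seNorge equations).

    temp_c, precip_mm : daily air temperature (deg C) and precipitation (mm)
    t_thresh : rain/snow threshold temperature (deg C)   -- assumed value
    ddf      : degree-day melt factor (mm per deg C day) -- assumed value
    Returns the daily snow water equivalent (SWE, mm) series.
    """
    swe, series = 0.0, []
    for t, p in zip(temp_c, precip_mm):
        if t < t_thresh:
            swe += p                                     # precipitation falls as snow
        else:
            swe = max(0.0, swe - ddf * (t - t_thresh))   # degree-day melt; rain ignored
        series.append(swe)
    return series
```

Driven by gridded daily temperature and precipitation, a per-cell scheme of this family (plus a density model for SD and ρ) is all a seNorge-style product needs, which is why such models suit operational mapping over rugged terrain.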

  6. A job analysis design for the rail industry : description and model analysis of the job of freight conductor.

    Science.gov (United States)

    2013-10-01

    This document provides a step-by-step description of the design and execution of a strategic job analysis, using the position of Freight Conductor as an example. This document was created to be useful for many different needs, and can be used as an e...

  7. Empirical evidence from an inter-industry descriptive analysis of overall materiality measures

    OpenAIRE

    N. Pecchiari; C. Emby; G. Pogliani

    2013-01-01

    This study presents an empirical cross-industry descriptive analysis of overall quantitative materiality measures. We examine the behaviour of four commonly used quantitative materiality measures within and across industries with respect to their size, relative size and stability, over ten years. The sample consists of large- and medium-sized European companies, representing 24 different industry categories for the years 1998 through 2007 (a total sample of over 36,000 data points). Our resul...

  8. Correlation of Descriptive Analysis and Instrumental Puncture Testing of Watermelon Cultivars.

    Science.gov (United States)

    Shiu, J W; Slaughter, D C; Boyden, L E; Barrett, D M

    2016-06-01

    The textural properties of 5 seedless watermelon cultivars were assessed by descriptive analysis and the standard puncture test using a hollow probe with increased shearing properties. The use of descriptive analysis methodology was an effective means of quantifying watermelon sensory texture profiles for characterizing specific cultivars' characteristics. Of the 10 cultivars screened, 71% of the variation in the sensory attributes was measured using the first two principal components. Pairwise correlation of the hollow puncture probe and sensory parameters determined that initial slope, maximum force, and work after maximum force measurements all correlated well with the sensory attributes crisp and firm. These findings confirm that maximum force correlates well not only with firmness in watermelon, but with crispness as well. The initial slope parameter also captures the sensory crispness of watermelon, but is not as practical to measure in the field as maximum force. The work after maximum force parameter is thought to reflect cellular arrangement and membrane integrity, which in turn impact sensory firmness and crispness. Watermelon cultivar types were correctly predicted by puncture test measurements in heart tissue 87% of the time, whereas descriptive analysis was correct 54% of the time. © 2016 Institute of Food Technologists®
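The pairwise correlation of instrumental and sensory parameters reported above can be reproduced in miniature with a Pearson coefficient. The sample values below are invented for illustration, not the study's data.

```python
import numpy as np

# Hypothetical paired measurements for five watermelon samples:
# instrumental maximum puncture force (N) vs. mean panel "firmness" score.
max_force = np.array([12.1, 15.3, 9.8, 14.0, 11.2])
firmness = np.array([5.0, 6.8, 4.1, 6.2, 4.9])

# Pearson r between the instrumental and sensory variables.
r = np.corrcoef(max_force, firmness)[0, 1]
```

A strong positive r between maximum force and panel firmness is exactly the kind of evidence the study uses to argue that the cheap field measurement (maximum force) can stand in for trained-panel crispness and firmness ratings.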

  9. Regional hydrogeological simulations. Numerical modelling using ConnectFlow. Preliminary site description Simpevarp sub area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Hoch, Andrew; Hunter, Fiona; Jackson, Peter [Serco Assurance, Risley (United Kingdom); Marsic, Niko [Kemakta Konsult, Stockholm (Sweden)

    2005-02-01

    The objective of this study is to support the development of a preliminary Site Description of the Simpevarp area on a regional scale based on the data available in August 2004 (Data Freeze S1.2) and the previous Site Description. A more specific objective of this study is to assess the role of known and unknown hydrogeological conditions for the present-day distribution of saline groundwater in the Simpevarp area on a regional scale. An improved understanding of the paleo-hydrogeology is necessary in order to gain credibility for the Site Description in general and the hydrogeological description in particular. This is to serve as a basis for describing the present hydrogeological conditions on a local scale as well as predictions of future hydrogeological conditions. Other key objectives were to identify the model domain required to simulate regional flow and solute transport at the Simpevarp area and to incorporate a new geological model of the deformation zones produced for Version S1.2. Another difference from Version S1.1 is the increased effort invested in conditioning the hydrogeological property models to the fracture boremap and hydraulic data. A new methodology was developed for interpreting the discrete fracture network (DFN) by integrating the geological description of the DFN (GeoDFN) with the hydraulic test data from Posiva Flow-Log and Pipe-String System double-packer techniques to produce a conditioned Hydro-DFN model. This was done in a systematic way that addressed uncertainties associated with the assumptions made in interpreting the data, such as the relationship between fracture transmissivity and length. Consistent hydraulic data were only available for three boreholes, and therefore only relatively simplistic models were proposed, as there is insufficient data to justify extrapolating the DFN away from the boreholes based on rock domain, for example.
    Significantly, a far greater quantity of hydro-geochemical data was available for calibration in the

  10. Simulation and Non-Simulation Based Human Reliability Analysis Approaches

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States); Shirley, Rachel Elizabeth [Idaho National Lab. (INL), Idaho Falls, ID (United States); Joe, Jeffrey Clark [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-12-01

    Part of the U.S. Department of Energy’s Light Water Reactor Sustainability (LWRS) Program, the Risk-Informed Safety Margin Characterization (RISMC) Pathway develops approaches to estimating and managing safety margins. RISMC simulations pair deterministic plant physics models with probabilistic risk models. As human interactions are an essential element of plant risk, it is necessary to integrate human actions into the RISMC risk model. In this report, we review simulation-based and non-simulation-based human reliability assessment (HRA) methods. Chapter 2 surveys non-simulation-based HRA methods. Conventional HRA methods target static Probabilistic Risk Assessments for Level 1 events. These methods would require significant modification for use in dynamic simulation of Level 2 and Level 3 events. Chapter 3 is a review of human performance models. A variety of methods and models simulate dynamic human performance; however, most of these human performance models were developed outside the risk domain and have not been used for HRA. The exception is the ADS-IDAC model, which can be thought of as a virtual operator program. This model is resource-intensive but provides a detailed model of every operator action in a given scenario, along with models of numerous factors that can influence operator performance. Finally, Chapter 4 reviews the treatment of timing of operator actions in HRA methods. This chapter is an example of one of the critical gaps between existing HRA methods and the needs of dynamic HRA. This report summarizes the foundational information needed to develop a feasible approach to modeling human interactions in the RISMC simulations.

  11. MPEG-7-based description infrastructure for an audiovisual content analysis and retrieval system

    Science.gov (United States)

    Bailer, Werner; Schallauer, Peter; Hausenblas, Michael; Thallinger, Georg

    2005-01-01

    We present a case study of establishing a description infrastructure for an audiovisual content-analysis and retrieval system. The description infrastructure consists of an internal metadata model and access tools for using it. Based on an analysis of requirements, we have selected, out of a set of candidates, MPEG-7 as the basis of our metadata model. The openness and generality of MPEG-7 allow using it in a broad range of applications, but increase complexity and hinder interoperability. Profiling has been proposed as a solution, with the focus on selecting and constraining description tools. Semantic constraints are currently only described in textual form. Conformance in terms of semantics can thus not be evaluated automatically, and mappings between different profiles can only be defined manually. As a solution, we propose an approach to formalize the semantic constraints of an MPEG-7 profile using a formal vocabulary expressed in OWL, which allows automated processing of semantic constraints. We have defined the Detailed Audiovisual Profile as the profile to be used in our metadata model and we show how some of the semantic constraints of this profile can be formulated using ontologies. To work practically with the metadata model, we have implemented an MPEG-7 library and a client/server document access infrastructure.

  12. Job Analysis and the Preparation of Job Descriptions. Mendip Papers MP 037.

    Science.gov (United States)

    Saunders, Bob

    This document provides guidelines for conducting job analyses and writing job descriptions. It covers the following topics: the rationale for job descriptions, the terminology of job descriptions, who should write job descriptions, getting the information to write job descriptions, preparing for staff interviews, conducting interviews, writing the…

  13. Analysis and simulation of straw fuel logistics

    Energy Technology Data Exchange (ETDEWEB)

    Nilsson, Daniel [Swedish Univ. of Agricultural Sciences, Uppsala (Sweden). Dept. of Agricultural Engineering

    1998-12-31

    Straw is a renewable biomass that has a considerable potential to be used as fuel in rural districts. This bulky fuel is, however, produced over large areas and must be collected during a limited amount of days and taken to the storages before being ultimately transported to heating plants. Thus, a well thought-out and cost-effective harvesting and handling system is necessary to provide a satisfactory fuel at competitive costs. Moreover, high-quality non-renewable fuels are used in these operations. To be sustainable, the energy content of these fuels should not exceed the energy extracted from the straw. The objective of this study is to analyze straw as fuel in district heating plants with respect to environmental and energy aspects, and to improve the performance and reduce the costs of straw handling. Energy, exergy and emergy analyses were used to assess straw as fuel from an energy point of view. The energy analysis showed that the energy balance is 12:1 when direct and indirect energy requirements are considered. The exergy analysis demonstrated that the conversion step is ineffective, whereas the emergy analysis indicated that large amounts of energy have been used in the past to form the straw fuel (the net emergy yield ratio is 1.1). A dynamic simulation model, called SHAM (Straw HAndling Model), has also been developed to investigate handling of straw from the fields to the plant. The primary aim is to analyze the performance of various machinery chains and management strategies in order to reduce the handling costs and energy needs. The model, which is based on discrete event simulation, takes both weather and geographical conditions into account. The model has been applied to three regions in Sweden (Svaloev, Vara and Enkoeping) in order to investigate the prerequisites for straw harvest at these locations. The simulations showed that straw has the best chances to become a competitive fuel in south Sweden. It was also demonstrated that costs can be
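Discrete event simulation, the technique underlying the SHAM model, can be sketched minimally with a timed event queue. The function, field areas, and baler rate below are invented for illustration; the sketch omits the weather, storage, transport, and geography that SHAM actually models.

```python
import heapq

def bale_schedule(field_areas_ha, baler_rate_ha_per_h):
    """Minimal discrete-event sketch of a straw harvesting chain
    (illustrative only -- not the SHAM model).

    Fields are baled one after another; each completion is pushed as a
    timed event and the log is processed in time order.
    Returns (finish_time_h, chronological event log).
    """
    events, t = [], 0.0
    for i, area in enumerate(field_areas_ha):
        t += area / baler_rate_ha_per_h               # baling time for this field
        heapq.heappush(events, (t, f"field {i} baled"))
    log = [heapq.heappop(events) for _ in range(len(events))]
    return log[-1][0], log

finish, log = bale_schedule([10.0, 5.0], 5.0)
```

A full model of this kind would interleave further event types (rain delays, truck departures, storage transfers) on the same queue, which is what lets it compare machinery chains and management strategies under realistic weather sequences.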

  14. Metabolic control analysis of biochemical pathways based on a thermokinetic description of reaction rates

    DEFF Research Database (Denmark)

    Nielsen, Jens Bredal

    1997-01-01

    Metabolic control analysis is a powerful technique for the evaluation of flux control within biochemical pathways. Its foundation is the elasticity coefficients and the flux control coefficients (FCCs). On the basis of a thermokinetic description of reaction rates it is here shown...... that the elasticity coefficients can be calculated directly from the pool levels of metabolites at steady state. The only requirement is that one thermodynamic parameter be known, namely the reaction affinity at the intercept of the tangent in the inflection point of the curve of reaction rate against reaction...... of the thermokinetic description of reaction rates to include the influence of effectors. Here the reaction rate is written as a linear function of the logarithm of the metabolite concentrations. With this type of rate function it is shown that the approach of Delgado and Liao [Biochem. J. (1992) 282, 919-927] can...
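For a concrete, if generic, example of an elasticity coefficient, the scaled sensitivity eps = d ln v / d ln S on which metabolic control analysis rests, one can verify numerically that a Michaelis-Menten rate law has eps = Km / (Km + S). This is a textbook illustration with assumed parameter values, not the paper's thermokinetic formalism.

```python
import math

def mm_rate(s, vmax=1.0, km=2.0):
    """Michaelis-Menten rate law (illustrative parameter values)."""
    return vmax * s / (km + s)

def elasticity(rate, s, ds=1e-6):
    """Scaled sensitivity eps = d ln v / d ln S by central difference."""
    return (math.log(rate(s + ds)) - math.log(rate(s - ds))) / (
        math.log(s + ds) - math.log(s - ds)
    )

s = 1.0
eps_numeric = elasticity(mm_rate, s)
eps_analytic = 2.0 / (2.0 + s)  # Km / (Km + S) for Michaelis-Menten
```

The paper's point is stronger than this sketch: with a thermokinetic rate description, such elasticities follow directly from steady-state metabolite pool levels, without needing the full rate law.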

  15. Biomimicry: Descriptive analysis of biodiversity strategy adoption for business sustainable performance

    Directory of Open Access Journals (Sweden)

    Sivave Mashingaidze

    2014-06-01

    Biomimicry is a novel interdisciplinary field that mimics nature’s best ideas and processes to solve human problems. The objective of this article was to conduct a descriptive documentary analysis of the biodiversity literature and to recommend biodiversity strategy adoption to business as a sustainable performance strategy. The research was based on nine (9) Life’s Principles inspired by nature. The research findings indicated that most business theories and strategies can mimic nature in order to achieve sustainable performance. The research was conceptual and therefore does not offer a direct practical proposition; its value lies in describing ideas and strategies from nature and outlining their fundamental principles, since biodiversity has a track record of sustainability without human interference that humanity can likewise mimic.

  16. Scientific tourism communication in Brazil: Descriptive analysis of national journals from 1990 to 2012

    Directory of Open Access Journals (Sweden)

    Glauber Eduardo de Oliveira Santos

    2013-04-01

    This paper provides a descriptive analysis of 2,126 articles published in 20 Brazilian tourism journals from 1990 to 2012. It offers a comprehensive and objective picture of these journals, contributing to the debate about editorial policies as well as to a broader understanding of the Brazilian academic research developed in this period. The study analyses the evolution of the number of published papers and presents descriptive statistics on the length of articles, titles and abstracts. Authors with the largest number of publications and the most recurrent keywords are identified. The integration level among the journals is analyzed, pointing out which publications are closest to the center of the Brazilian tourism scientific publishing network.

  17. Analogue circuits simulation

    Energy Technology Data Exchange (ETDEWEB)

    Mendo, C

    1988-09-01

    Most analogue simulators have evolved from SPICE. The history and a description of SPICE-like simulators are given. From a mathematical formulation of the electronic circuit, the following analyses are possible: DC, AC, transient, noise, distortion, worst-case and statistical.
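The DC analysis that SPICE-like simulators perform amounts to assembling a nodal conductance matrix G and solving G·v = I for the node voltages. Below is a minimal sketch under stated assumptions: resistors and current sources only, no nonlinear devices, no voltage sources or modified nodal analysis; the function name and example circuit are invented.

```python
import numpy as np

def dc_solve(num_nodes, resistors, current_sources):
    """Nodal DC analysis: solve G v = I for node voltages (node 0 = ground).

    resistors       : list of (node_a, node_b, ohms)
    current_sources : list of (from_node, to_node, amps)
    Returns voltages of nodes 1..num_nodes-1.
    """
    n = num_nodes - 1               # unknowns exclude the ground node
    G = np.zeros((n, n))
    I = np.zeros(n)
    for a, b, r in resistors:       # stamp each conductance into G
        g = 1.0 / r
        if a:
            G[a - 1, a - 1] += g
        if b:
            G[b - 1, b - 1] += g
        if a and b:
            G[a - 1, b - 1] -= g
            G[b - 1, a - 1] -= g
    for a, b, amps in current_sources:  # stamp source currents into I
        if a:
            I[a - 1] -= amps
        if b:
            I[b - 1] += amps
    return np.linalg.solve(G, I)

# 1 A injected into node 1 through two 1 kOhm resistors in series to ground.
v = dc_solve(3, [(1, 2, 1000.0), (2, 0, 1000.0)], [(0, 1, 1.0)])
```

Transient and AC analyses build on the same matrix machinery: capacitors and inductors are stamped as time-discretized or complex-valued conductances, and nonlinear devices are handled by Newton iteration around repeated linear solves like this one.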

  18. The Science of Transportation Analysis and Simulation

    Science.gov (United States)

    Gleibe, John

    2010-03-01

    Transportation Science focuses on methods developed to model and analyze the interaction between human behavior and transportation systems. From the human behavioral, or demand, perspective, we are interested in how persons and households organize their activities across space and time, with travel viewed as an enabling activity. We have a particular interest in how to model the range of responses to public policy and transportation system changes, which leads to the consideration of both short- and long-term decision-making, interpersonal dependencies, and non-transportation-related opportunities and constraints, including household budgets, land use systems and economic systems. This has led to the development of complex structural econometric modeling systems as well as agent-based simulations. From the transportation systems, or supply, perspective we are interested in the level of service provided by transportation facilities, be they auto, transit or multi-modal systems. This has led to the development of network models and equilibrium concepts as well as hybrid simulation systems based on concepts borrowed from physics, such as fluid-flow models and cellular-automata-type models. In this presentation, we review a representative sample of these methods and their use in transportation planning and public policy analysis.
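The cellular-automata-type traffic models mentioned above are typified by the Nagel-Schreckenberg model; here is a compact sketch of its update rule (acceleration, gap-limited braking, random slowdown, movement) on a circular road. Parameter values are the conventional illustrative ones, not tied to the presentation being abstracted.

```python
import random

def nasch_step(cells, vmax=5, p_slow=0.3, rng=None):
    """One update of the Nagel-Schreckenberg cellular-automaton traffic model.

    cells : list where -1 is an empty cell and v >= 0 is a car with speed v
    Returns the road state after one parallel update on a circular road.
    """
    rng = rng or random.Random(0)
    n = len(cells)
    new = [-1] * n
    for i, v in enumerate(cells):
        if v < 0:
            continue
        # Distance to the next occupied cell ahead (wraps around the ring).
        gap = next(d for d in range(1, n + 1) if cells[(i + d) % n] >= 0) - 1
        v = min(v + 1, vmax, gap)            # accelerate, but never collide
        if v > 0 and rng.random() < p_slow:  # random slowdown (driver noise)
            v -= 1
        new[(i + v) % n] = v                 # move forward v cells
    return new
```

Despite its simplicity, this rule set reproduces spontaneous jam formation and the flow-density ("fundamental") diagram, which is why cellular automata are a standard supply-side complement to network equilibrium models.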

  19. An overview of the design and analysis of simulation experiments for sensitivity analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys 'classic' and 'modern' designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs
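A classic design from this literature is the full 2^k factorial, from which main effects are estimated by contrasts over coded factor levels. A minimal sketch follows; the `sim` argument is a stand-in for any simulation model, and the linear test model is invented for illustration.

```python
from itertools import product

def main_effects(sim, k):
    """Estimate main effects from a full 2^k factorial design.

    sim : function mapping a tuple of k coded factor levels (-1 / +1)
          to a scalar simulation output
    Returns the k main effects: the average output change when a factor
    moves from its low (-1) to its high (+1) level.
    """
    runs = list(product([-1, 1], repeat=k))   # all 2^k factor combinations
    y = [sim(x) for x in runs]
    return [
        sum(yi * x[j] for x, yi in zip(runs, y)) / (len(runs) / 2)
        for j in range(k)
    ]

# Invented linear simulation model: output = 3*x1 + 5*x2.
effects = main_effects(lambda x: 3 * x[0] + 5 * x[1], 2)
```

For the large k typical of simulation models, 'classic' practice replaces the full factorial with fractional factorials or screening designs, trading interaction information for far fewer runs; 'modern' designs (e.g. Latin hypercube sampling) drop the two-level restriction altogether.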

  20. Optimization and Simulation in Drug Development - Review and Analysis

    DEFF Research Database (Denmark)

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical develo...... development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context of the pharmaceutical industry....

  1. Data Entry Skills in a Computer-based Spread Sheet Amongst Postgraduate Medical Students: A Simulation Based Descriptive Assessment.

    Science.gov (United States)

    Khan, Amir Maroof; Shah, Dheeraj; Chatterjee, Pranab

    2014-07-01

    In India, research work in the form of a thesis is a mandatory requirement for postgraduate (PG) medical students. Data entry in a computer-based spreadsheet is one of the important basic skills for research, which has not yet been studied. This study was conducted to assess the data entry skills of the 2nd-year PG medical students of a medical college of North India. A cross-sectional, descriptive study was conducted among 111 second-year PG students by using four simulated filled case record forms and a computer-based spreadsheet in which data entry was to be carried out. On a scale of 0-10, only 17.1% of the students scored more than seven. The specific sub-skills that were found to be lacking in more than half of the respondents were as follows: inappropriate coding (93.7%), long variable names (51.4%), coding not being done for all the variables (76.6%), missing values entered in a non-uniform manner (84.7%) and two variables entered in the same column in the case of blood pressure readings (80.2%). PG medical students were not found to be proficient in data entry skills, and this can act as a barrier to doing research. As this is the first study of its kind in India, more research is needed to understand this issue and then include this as yet neglected aspect in teaching research methodology to medical students.

  2. Descriptive analysis of bacon smoked with Brazilian woods from reforestation: methodological aspects, statistical analysis, and study of sensory characteristics.

    Science.gov (United States)

    Saldaña, Erick; Castillo, Luiz Saldarriaga; Sánchez, Jorge Cabrera; Siche, Raúl; de Almeida, Marcio Aurélio; Behrens, Jorge H; Selani, Miriam Mabel; Contreras-Castillo, Carmen J

    2018-06-01

    The aim of this study was to perform a descriptive analysis (DA) of bacons smoked with woods from reforestation and liquid smokes in order to investigate their sensory profile. Six samples of bacon were selected: three smoked bacons with different wood species (Eucalyptus citriodora, Acacia mearnsii, and Bambusa vulgaris), two artificially smoked bacon samples (liquid smoke) and one negative control (unsmoked bacon). Additionally, a commercial bacon sample was also evaluated. DA was developed successfully, presenting a good performance in terms of discrimination, consensus and repeatability. The study revealed that the smoking process modified the sensory profile by intensifying the "saltiness" and differentiating the unsmoked from the smoked samples. The results from the current research represent the first methodological development of descriptive analysis of bacon and may be used by food companies and other stakeholders to understand the changes in sensory characteristics of bacon due to traditional smoking process. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Psychological analysis of primary school pupils self-description in a computer game

    Directory of Open Access Journals (Sweden)

    I. D. Spirina

    2017-06-01

    Objective. The aim of this study was to reveal the specific impact of computer games on the consciousness of primary school children. Materials and Methods. Thirty children aged 6 to 11 years were examined. Qualitative methods for describing children's computer-game experience, following the main stages of structured phenomenological research, were used. A questionnaire for children's self-description in a computer game was developed and a qualitative analysis of these descriptions was conducted. Results. The analysis of the descriptions revealed difficulty in separating "true" from "false", idiosyncratic use of personal pronouns, the absence of a proper distinction between the "Self" as a game character and the child's own "Self", attribution of the properties of living creatures to virtual "opponents" or "partners", and confusion of temporal and spatial terms when the children described the game. The children described only the outer plane of the game, such as plot, "events", "actions" and the difficulties occurring in the game, but reflected no emotions at all. While describing the "events" occurring in the game, the children were not able to focus on themselves, either during the game or afterwards. Conclusions. The involvement of a child in a computer game causes, first of all, a disorder in the functioning of the emotional sphere, whereby emotions are not understood by the child. Discrepancies in how the children described themselves, the nature of their favourite games and their tendencies were exposed, indicating disorders in the forming of the child's self-attitude and self-esteem. While playing a computer game, a special "operation mode" of the child's mind emerges in which the impact of unreal images can distort the natural flow of the cognitive and emotional reflection of reality.

  4. The SocioEconomic Analysis of Repository Siting (SEARS): Technical description: Final draft

    International Nuclear Information System (INIS)

    1984-11-01

    Socioeconomic impacts must be assessed both for the near term and for the future. One means of addressing the need for the assessment of such impacts has been through the development of the computerized socioeconomic assessment model called the SocioEconomic Analysis of Repository Siting (SEARS) model. The SEARS model was developed for the Battelle Project Management Division. It was refined and adapted from state-of-the-art computerized projection models and thoroughly validated and is now available for use in projecting the likely socioeconomic impacts of a repository facility. This Technical Description is one of six major products that describe the SEARS modeling system. 61 refs., 11 figs., 9 tabs

  5. Description and validation of ANTEO, an optimised PC code for the thermalhydraulic analysis of fuel bundles

    International Nuclear Information System (INIS)

    Cevolani, S.

    1995-01-01

    The paper deals with the description of a Personal Computer oriented subchannel code, devoted to the steady-state thermal-hydraulic analysis of nuclear reactor fuel bundles. The development of such a code was made possible by two facts: firstly, the increase in the computing power of desktop machines; secondly, the fact that several years of experience in operating subchannel codes have shown how to simplify many of the physical models without a sensible loss of accuracy. For the sake of validation, the developed code was compared with a traditional subchannel code, COBRA. The results of the comparison show very good agreement between the two codes. (author)

  6. Sensory characterization of a ready-to-eat sweetpotato breakfast cereal by descriptive analysis

    Science.gov (United States)

    Dansby, M. A.; Bovell-Benjamin, A. C.

    2003-01-01

    The sweetpotato [Ipomoea batatas (L.) Lam], an important industry in the United States, has been selected as a candidate crop to be grown on future long-duration space missions by NASA. Raw sweetpotato roots were processed into flour, which was used to formulate ready-to-eat breakfast cereal (RTEBC). Twelve trained panelists evaluated the sensory attributes of the extruded RTEBC using descriptive analysis. The samples were significantly different; sensory attributes that could be used to differentiate the appearance, texture, and flavor of sweetpotato RTEBC were described. The data could be used to optimize the RTEBC and for designing studies to test its consumer acceptance.

  7. Augmenting health care failure modes and effects analysis with simulation

    DEFF Research Database (Denmark)

    Staub-Nielsen, Ditte Emilie; Dieckmann, Peter; Mohr, Marlene

    2014-01-01

    This study explores whether simulation plays a role in health care failure mode and effects analysis (HFMEA); it does this by evaluating whether additional data are found when a traditional HFMEA is augmented with simulation. Two multidisciplinary teams identified vulnerabilities in a process...... by brainstorming, followed by simulation. Two means of adding simulation were investigated as follows: just simulating the process and interrupting the simulation between substeps of the process. By adding simulation to a traditional HFMEA, both multidisciplinary teams identified additional data that were relevant...

  8. Final safety and hazards analysis for the Battelle LOCA simulation tests in the NRU reactor

    International Nuclear Information System (INIS)

    Axford, D.J.; Martin, I.C.; McAuley, S.J.

    1981-04-01

    This is the final safety and hazards report for the proposed Battelle LOCA simulation tests in NRU. A brief description of the test equipment design and operating procedure precedes a safety analysis and hazards review of the project. The hazards review addresses potential equipment failures as well as the potential for a metal/water reaction, and evaluates the consequences. Operation of the tests as proposed does not present an unacceptable risk to the NRU Reactor, CRNL personnel or members of the public. (author)

  9. The coupled chemistry-climate model LMDz-REPROBUS: description and evaluation of a transient simulation of the period 1980–1999

    Directory of Open Access Journals (Sweden)

    L. Jourdain

    2008-06-01

    Full Text Available We present a description and evaluation of the Chemistry-Climate Model (CCM) LMDz-REPROBUS, which interactively couples the extended version of the Laboratoire de Météorologie Dynamique General Circulation Model (LMDz GCM) and the stratospheric chemistry module of the REactive Processes Ruling the Ozone BUdget in the Stratosphere (REPROBUS) model. The transient simulation evaluated here covers the period 1980–1999. The introduction of an interactive stratospheric chemistry module improves the model's dynamical climatology, with a substantial reduction of the temperature biases in the lower tropical stratosphere. However, at high latitudes in the Southern Hemisphere, a negative temperature bias, already present in the GCM version albeit with a smaller magnitude, leads to an overestimation of the ozone depletion and its vertical extent in the CCM. This in turn contributes to maintaining low polar temperatures in the vortex and to delaying the break-up of the vortex and the recovery of polar ozone. The latitudinal and vertical variation of the mean age of air compares favourably with estimates derived from long-lived species measurements, though the model's mean age of air is 1–3 years too young in the middle stratosphere. The model also reproduces the observed "tape recorder" in tropical total hydrogen (=H2O+2×CH4), but its propagation is about 30% too fast and its signal fades away slightly too quickly. The analysis of the global distributions of CH4 and N2O suggests that the subtropical transport barriers are correctly represented in the simulation. LMDz-REPROBUS also reproduces fairly well most of the spatial and seasonal variations of the stratospheric chemical species, in particular ozone. However, because of the Antarctic cold bias, large discrepancies are found for most species at high latitudes in the Southern Hemisphere during spring and early summer. In the Northern Hemisphere, polar ozone depletion and its variability are underestimated.

  10. Experimental Design for Sensitivity Analysis of Simulation Models

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2001-01-01

    This introductory tutorial gives a survey of the use of statistical designs for what-if or sensitivity analysis in simulation. This analysis uses regression analysis to approximate the input/output transformation that is implied by the simulation model; the resulting regression model is also known as a metamodel.
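The regression metamodeling the tutorial surveys can be sketched as follows; the stand-in simulation function, its coefficients, and the 2^2 factorial design are invented for illustration:

```python
import numpy as np

# Approximate a simulation's input/output behaviour with a first-order
# polynomial (a regression metamodel) fitted to a small designed experiment.

def simulate(x1, x2):
    # Stand-in for an expensive simulation run (illustrative only).
    return 3.0 + 2.0 * x1 - 1.5 * x2

# 2^2 factorial design in coded units -1/+1.
design = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
y = np.array([simulate(x1, x2) for x1, x2 in design])

# Least-squares fit of the metamodel y = b0 + b1*x1 + b2*x2.
X = np.column_stack([np.ones(len(design)), design])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
```

With a real simulation the fitted coefficients would only approximate the input/output transformation; here the response is exactly first-order, so the fit recovers it.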

  11. An Integrated Model for Computer Aided Reservoir Description : from Outcrop Study to Fluid Flow Simulations Un logiciel intégré pour une description des gisements assistée par ordinateur : de l'étude d'un affleurement aux simulations de l'écoulement des fluides

    Directory of Open Access Journals (Sweden)

    Guerillot D.

    2006-11-01

    Full Text Available An accurate understanding of the internal architecture of a reservoir is required to improve reservoir management for oil recovery. Geostatistical methods give an image of this architecture. The purpose of this paper is to show how this lithological description could be used for reservoir simulation. For this purpose, scale averaging problems must be solved for non-additive variables. A method giving a full effective permeability matrix is proposed. The integrated software described here starts from core analysis and lithologic logs to provide data for reservoir simulators. Each of the steps of this interactive and graphic system is explained here. Pour faire de bonnes prévisions de production pour un gisement pétrolifère, il est nécessaire de connaître précisément son architecture interne. Les méthodes géostatistiques donnent une représentation de cette architecture. L'objectif de cet article est de montrer une façon d'utiliser cette description lithologique pour la simulation des réservoirs. Il faut alors résoudre des problèmes de changement d'échelle pour les variables qui ne sont pas additives. On propose une méthode d'estimation de la perméabilité effective sous la forme d'une matrice pleine. Le logiciel intégré que l'on décrit part de l'analyse des carottes et des diagraphies en lithologies et fournit des données pour les simulateurs de gisement. On détaille ici chaque étape de ce système interactif graphique.
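One flavor of the scale-averaging problem for non-additive variables mentioned above can be illustrated with the classical bounds for a layered medium; this is a textbook simplification, not the paper's full effective-permeability-matrix method:

```python
import numpy as np

# Classical upscaling bounds for layered media: flow parallel to the
# layers sees the arithmetic mean of the permeabilities, while flow
# perpendicular to the layers sees the harmonic mean. (Textbook bounds,
# not the paper's full effective-tensor method.)

def upscaled_permeability(k_layers, thickness):
    k = np.asarray(k_layers, dtype=float)
    h = np.asarray(thickness, dtype=float)
    k_parallel = np.sum(k * h) / np.sum(h)        # arithmetic (thickness-weighted) mean
    k_perpendicular = np.sum(h) / np.sum(h / k)   # harmonic (thickness-weighted) mean
    return k_parallel, k_perpendicular

# Two equally thick layers of 100 mD and 10 mD (illustrative values).
kp, kn = upscaled_permeability([100.0, 10.0], [1.0, 1.0])
```

The harmonic mean is always at most the arithmetic mean, which is why permeability cannot simply be volume-averaged like an additive variable.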

  12. TASAC a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    International Nuclear Information System (INIS)

    Stempniewicz, M.; Marks, P.; Salwa, K.

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code, developed at the Institute of Atomic Energy and written in FORTRAN 77, for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code can model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and the melting, relocation and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs
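The kind of rod energy balance such a code integrates can be caricatured with a lumped-parameter sketch; the property values and the lumping itself are illustrative assumptions, not TASAC's detailed models:

```python
# Lumped-parameter sketch of fuel rod heat-up: decay power in, convective
# and radiative losses out. A toy energy balance in the spirit of a severe
# accident code; all values are illustrative assumptions.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant [W/m^2 K^4]

def rod_temperature(t_init, t_coolant, power, h, area, mass, cp, dt, steps):
    """Explicit-Euler integration of a single lumped rod temperature [K]."""
    t = t_init
    history = [t]
    for _ in range(steps):
        q_conv = h * area * (t - t_coolant)                  # convective loss
        q_rad = SIGMA * area * (t ** 4 - t_coolant ** 4)     # radiative loss
        t += dt * (power - q_conv - q_rad) / (mass * cp)     # energy balance
        history.append(t)
    return history

# Illustrative numbers: rod starts at coolant temperature, 2 kW heating.
temps = rod_temperature(t_init=600.0, t_coolant=600.0, power=2.0e3,
                        h=50.0, area=0.1, mass=2.0, cp=300.0, dt=0.1, steps=500)
```

The rod heats monotonically toward the equilibrium where convective plus radiative losses balance the input power.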

  13. TASAC a computer program for thermal analysis of severe accident conditions. Version 3/01, Dec 1991. Model description and user's guide

    Energy Technology Data Exchange (ETDEWEB)

    Stempniewicz, M; Marks, P; Salwa, K

    1992-06-01

    TASAC (Thermal Analysis of Severe Accident Conditions) is a computer code, developed at the Institute of Atomic Energy and written in FORTRAN 77, for the digital computer analysis of PWR rod bundle behaviour during severe accident conditions. The code can model an early stage of core degradation, including heat transfer inside the rods, convective and radiative heat exchange, cladding interactions with coolant and fuel, hydrogen generation, and the melting, relocation and refreezing of fuel rod materials with dissolution of UO2 and ZrO2 in the liquid phase. The code was applied to the simulation of International Standard Problem number 28, performed on the PHEBUS test facility. This report contains a description of the program's physical models, a detailed description of the input data requirements, and the results of code verification. The main directions for future TASAC code development are formulated. (author). 20 refs, 39 figs, 4 tabs.

  14. A descriptive analysis of quantitative indices for multi-objective block layout

    Directory of Open Access Journals (Sweden)

    Amalia Medina Palomera

    2013-01-01

    Full Text Available Layout generation methods provide alternative solutions whose feasibility and quality must be evaluated. Indices must be used to distinguish among the feasible solutions (involving different criteria) obtained for block layout and to identify a solution's suitability according to the set objectives. This paper provides an accurate and descriptive analysis of the geometric indices used in designing facility layouts (during the block layout phase). The indices studied here have advantages and disadvantages which should be considered by an analyst before attempting to solve the facility layout problem. New equations are proposed for measuring the geometric indices. The analysis revealed redundant indices and showed that a minimum number of indices covering the overall quality criteria may be used when selecting among alternative solutions.
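A typical geometric index of this kind can be sketched as follows; rectangularity is a common block-layout measure, used here purely as an illustration rather than as one of the paper's specific equations:

```python
# Rectangularity: ratio of a department's area to the area of its
# axis-aligned bounding box (1.0 = perfect rectangle). One common
# geometric index for evaluating block layouts; illustrative only.

def rectangularity(cells):
    """cells: set of (row, col) unit squares occupied by one department."""
    rows = [r for r, _ in cells]
    cols = [c for _, c in cells]
    bbox = (max(rows) - min(rows) + 1) * (max(cols) - min(cols) + 1)
    return len(cells) / bbox

# L-shaped department of 3 unit cells inside a 2x2 bounding box.
score = rectangularity({(0, 0), (0, 1), (1, 0)})
```

Irregular (e.g. L-shaped) departments score below 1.0, flagging layouts that may complicate material handling or space use.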

  15. Data entry skills in a computer-based spread sheet amongst postgraduate medical students: A simulation based descriptive assessment

    Directory of Open Access Journals (Sweden)

    Amir Maroof Khan

    2014-01-01

    Full Text Available Background: In India, research work in the form of a thesis is a mandatory requirement for postgraduate (PG) medical students. Data entry in a computer-based spread sheet is one of the important basic skills for research, which has not yet been studied. This study was conducted to assess the data entry skills of 2nd year PG medical students of a medical college in North India. Materials and Methods: A cross-sectional, descriptive study was conducted among 111 second year PG students using four simulated filled case record forms and a computer-based spread sheet in which data entry was to be carried out. Results: On a scale of 0-10, only 17.1% of the students scored more than seven. The specific sub-skills that were found to be lacking in more than half of the respondents were as follows: inappropriate coding (93.7%), long variable names (51.4%), coding not being done for all the variables (76.6%), missing values entered in a non-uniform manner (84.7%), and two variables entered in the same column in the case of blood pressure readings (80.2%). Conclusion: PG medical students were not found to be proficient in data entry skills, and this can act as a barrier to doing research. This being the first study of its kind in India, more research is needed to understand this issue and then include this as-yet neglected aspect in teaching research methodology to medical students.

  16. Content analysis and thematic analysis: Implications for conducting a qualitative descriptive study.

    Science.gov (United States)

    Vaismoradi, Mojtaba; Turunen, Hannele; Bondas, Terese

    2013-09-01

    Qualitative content analysis and thematic analysis are two commonly used approaches in the data analysis of nursing research, but the boundaries between the two have not been clearly specified; they are used interchangeably, and it seems difficult for researchers to choose between them. This paper therefore describes and discusses the boundaries between qualitative content analysis and thematic analysis and presents implications for improving the consistency between the purpose of related studies and the method of data analysis. It is a discussion paper, comprising an analytical overview and discussion of the definitions, aims, philosophical background, data gathering, and analysis of content analysis and thematic analysis, and addressing their methodological subtleties. It is concluded that, despite many similarities between the approaches, including cutting across data and searching for patterns and themes, their main difference lies in the opportunity for quantification of data: measuring the frequency of different categories and themes is possible in content analysis, with caution, as a proxy for significance. © 2013 Wiley Publishing Asia Pty Ltd.

  17. Analysis and simulation of Wiseman hypocycloid engine

    Directory of Open Access Journals (Sweden)

    Priyesh Ray

    2014-12-01

    Full Text Available This research studies an alternative to the slider-crank mechanism for internal combustion engines, proposed by Wiseman Technologies Inc. Their design replaces the crankshaft with a hypocycloid gear assembly. The unique hypocycloid gear arrangement allows the piston and connecting rod to move in a straight line, creating a perfect sinusoidal motion without any side loads. In this work, the Wiseman hypocycloid engine was modeled in a commercial engine simulation software package and compared to a slider-crank engine of the same size. The engine's performance was studied while operating on diesel, ethanol, and gasoline fuel. Furthermore, a scaling analysis of the Wiseman engine prototypes was carried out to understand how the performance of the engine is affected by increasing the output power and cylinder displacement. It was found that the existing 30cc Wiseman engine produced about 7% less power at peak speeds than the slider-crank engine of the same size. These results were consistent with dynamometer tests performed in the past. It also produced lower torque and was about 6% less fuel efficient than the slider-crank engine. The four-stroke diesel variant of the same Wiseman engine performed better than the two-stroke gasoline version. The Wiseman engine with a contra piston (which allowed the compression ratio to be varied) showed poor fuel efficiency but produced higher torque when operating on E85 fuel. It also produced about 1.4% more power than when running on gasoline. In analyzing the effects of engine size on the Wiseman hypocycloid engine prototypes, it was found that the engines performed better in terms of power, torque, fuel efficiency, and cylinder brake mean effective pressure as the displacement increased. The 30 horsepower (HP) conceptual Wiseman prototype, operating on E85, produced the most optimal results in all aspects, and the diesel test for the same engine proved to be the most fuel efficient.

  18. Quantitative descriptive analysis of Italian polenta produced with different corn cultivars.

    Science.gov (United States)

    Zeppa, Giuseppe; Bertolino, Marta; Rolle, Luca

    2012-01-30

    Polenta is a porridge-like dish, generally made by mixing cornmeal with salt water and stirring constantly while cooking over a low heat. It can be eaten plain, straight from the pan, or topped with various foods (cheeses, meat, sausages, fish, etc.). It is most popular in northern Italy but can also be found in Switzerland, Austria, Croatia, Argentina and other countries in Eastern Europe and South America. Despite this diffusion, there are no data concerning the sensory characteristics of this product. A research study was therefore carried out to define the lexicon for a sensory profile of polenta and relationships with corn cultivars. A lexicon with 13 sensory parameters was defined and validated before references were determined. After panel training, the sensory profiles of 12 autochthonous maize cultivars were defined. The results of this research highlighted that quantitative descriptive analysis can also be used for the sensory description of polenta, and that the defined lexicon can be used to describe the sensory qualities of polenta for both basic research, such as maize selection, and product development. Copyright © 2011 Society of Chemical Industry.

  19. Virtual Environment Computer Simulations to Support Human Factors Engineering and Operations Analysis for the RLV Program

    Science.gov (United States)

    Lunsford, Myrtis Leigh

    1998-01-01

    The Army-NASA Virtual Innovations Laboratory (ANVIL) was recently created to provide virtual reality tools for performing human engineering and operations analysis for both NASA and the Army. The author's summer research project consisted of developing and refining these tools for NASA's Reusable Launch Vehicle (RLV) program. Several general simulations were developed for use by the ANVIL in evaluating the X34 Engine Changeout procedure. These simulations were developed with the software tool dVISE 4.0.0 produced by Division Inc. All software was run on an SGI Indigo2 High Impact. This paper describes the simulations, various problems encountered with the simulations, other summer activities, and possible future work. It begins with a brief description of virtual reality systems.

  20. Teaching Workflow Analysis and Lean Thinking via Simulation: A Formative Evaluation

    Science.gov (United States)

    Campbell, Robert James; Gantt, Laura; Congdon, Tamara

    2009-01-01

    This article presents the rationale for the design and development of a video simulation used to teach lean thinking and workflow analysis to health services and health information management students enrolled in a course on the management of health information. The discussion includes a description of the design process, a brief history of the use of simulation in healthcare, and an explanation of how video simulation can be used to generate experiential learning environments. Based on the results of a survey given to 75 students as part of a formative evaluation, the video simulation was judged effective because it allowed students to visualize a real-world process (concrete experience), contemplate the scenes depicted in the video along with the concepts presented in class in a risk-free environment (reflection), develop hypotheses about why problems occurred in the workflow process (abstract conceptualization), and develop solutions to redesign a selected process (active experimentation). PMID:19412533

  1. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2005-02-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.
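The FDTD update scheme underlying such tools can be conveyed with a minimal one-dimensional Cartesian sketch in normalized units; this is an illustrative Yee-style update, not the paper's spherical-coordinate formulation:

```python
import numpy as np

# Minimal 1-D FDTD (Yee) update in free space with normalized units and
# Courant number S = 0.5: E and H live on staggered grids and are updated
# in leapfrog fashion from each other's spatial differences. Illustrative
# Cartesian sketch only; the paper derives spherical-coordinate equations.

def fdtd_1d(n_cells=200, n_steps=300, courant=0.5):
    ez = np.zeros(n_cells)       # electric field at cell nodes
    hy = np.zeros(n_cells - 1)   # magnetic field at half-cells
    for t in range(n_steps):
        hy += courant * np.diff(ez)        # update H from the curl of E
        ez[1:-1] += courant * np.diff(hy)  # update E from the curl of H
        # Soft Gaussian pulse source at the grid center (UWB-style excitation).
        ez[n_cells // 2] += np.exp(-((t - 30) / 10.0) ** 2)
    return ez

field = fdtd_1d()
```

The fixed end nodes act as perfect electric conductors, so the injected pulse bounces inside the domain; a real antenna solver would add absorbing boundaries and material properties.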

  2. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  3. Simulation Development and Analysis of Crew Vehicle Ascent Abort

    Science.gov (United States)

    Wong, Chi S.

    2016-01-01

    Simulation code focuses on mimicking the physical world with some approximation and can have inaccuracies or numerical instabilities, unlike the coursework I have taken thus far, which focuses on pure logic. Learning from my mistakes, I adopted new methods to analyze these different simulations. One method I used was to numerically plot various physical parameters in MATLAB to confirm the mechanical behavior of the system, in addition to comparing the data to the output from a separate simulation tool called FAST. By having full control over what was output from the simulation, I could choose which parameters to change and to plot, as well as how to plot them, allowing for an in-depth analysis of the data. Another method of analysis was to convert the output data into a graphical animation. Unlike the numerical plots, where all of the physical components are displayed separately, this graphical display allows for a combined look at the simulation output that makes it much easier to see the physical behavior of the model. The process for converting SOMBAT output for EDGE graphical display had to be developed; with some guidance from other EDGE users, I created a script that easily allows one to display simulations graphically. Another limitation of the SOMBAT model was its inability to have the capsule's main parachutes instantly deployed with a large angle between the airspeed vector and the chute drag vector. To explore this problem, I had to learn about the different coordinate frames used in Guidance, Navigation & Control (J2000, ECEF, ENU, etc.) to describe the motion of a vehicle, and about Euler angles (e.g., roll, pitch, yaw) to describe its orientation. With a thorough explanation from my mentor of each coordinate frame, as well as of how to use a direction cosine matrix to transform one frame to another, I investigated the problem by simulating different capsule orientations.
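The direction cosine matrix transformation mentioned above can be sketched for the aerospace 3-2-1 (yaw-pitch-roll) sequence; the sequence convention and the angle values are assumptions for illustration:

```python
import numpy as np

# Direction cosine matrix for the aerospace 3-2-1 (yaw-pitch-roll)
# rotation sequence, mapping a vector from the reference frame into the
# body frame. The sequence convention is an assumption for illustration.

def dcm_321(yaw, pitch, roll):
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    r_yaw = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])    # rotate about z
    r_pitch = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])  # rotate about y
    r_roll = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])   # rotate about x
    return r_roll @ r_pitch @ r_yaw  # body <- reference

C = dcm_321(np.radians(30.0), np.radians(10.0), np.radians(5.0))
```

Because each factor is a proper rotation, the product is orthonormal with unit determinant, and its transpose performs the inverse (body-to-reference) transformation.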

  4. The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP): Overview and Description of Models, Simulations and Climate Diagnostics

    Science.gov (United States)

    Lamarque, J.-F.; Shindell, D. T.; Naik, V.; Plummer, D.; Josse, B.; Righi, M.; Rumbold, S. T.; Schulz, M.; Skeie, R. B.; Strode, S.; hide

    2013-01-01

    The Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) consists of a series of time slice experiments targeting the long-term changes in atmospheric composition between 1850 and 2100, with the goal of documenting composition changes and the associated radiative forcing. In this overview paper, we introduce the ACCMIP activity, the various simulations performed (with a requested set of 14) and the associated model output. The 16 ACCMIP models have a wide range of horizontal and vertical resolutions, vertical extent, chemistry schemes and interaction with radiation and clouds. While anthropogenic and biomass burning emissions were specified for all time slices in the ACCMIP protocol, it is found that the natural emissions are responsible for a significant range across models, mostly in the case of ozone precursors. The analysis of selected present-day climate diagnostics (precipitation, temperature, specific humidity and zonal wind) reveals biases consistent with state-of-the-art climate models. The model-to-model comparison of changes in temperature, specific humidity and zonal wind between 1850 and 2000 and between 2000 and 2100 indicates mostly consistent results. However, models that are clear outliers are different enough from the other models to significantly affect their simulation of atmospheric chemistry.

  5. Modern analysis of ion channeling data by Monte Carlo simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nowicki, Lech [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland)]. E-mail: lech.nowicki@fuw.edu.pl; Turos, Andrzej [Institute of Electronic Materials Technology, Wolczynska 133, 01-919 Warsaw (Poland); Ratajczak, Renata [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Stonert, Anna [Andrzej SoItan Institute for Nuclear Studies, ul. Hoza 69, 00-681 Warsaw (Poland); Garrido, Frederico [Centre de Spectrometrie Nucleaire et Spectrometrie de Masse, CNRS-IN2P3-Universite Paris-Sud, 91405 Orsay (France)

    2005-10-15

    The basic scheme of Monte Carlo simulation of ion channeling spectra is reformulated in terms of statistical sampling. The McChasy simulation code is described and two examples of its application are presented: calculation of the projectile flux in a uranium dioxide crystal, and defect analysis for an ion-implanted InGaAsP/InP superlattice. The virtues and pitfalls of defect analysis using Monte Carlo simulations are discussed.

  6. Image based SAR product simulation for analysis

    Science.gov (United States)

    Domik, G.; Leberl, F.

    1987-01-01

    SAR product simulation serves to predict SAR image gray values for various flight paths. Input typically consists of a digital elevation model and backscatter curves. A new product simulation method is described that also employs a real SAR image as input; this can be termed 'image-based simulation'. Different methods of performing this SAR prediction are presented, and their advantages and disadvantages are discussed. Ascending- and descending-orbit images from NASA's SIR-B experiment were used to verify the concept: input images from ascending orbits were converted into images from a descending orbit, and the results were compared with the available real imagery to verify that the prediction technique produces meaningful image data.
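The gray-value prediction step can be sketched as follows; the cosine-law backscatter curve and the DEM profile are invented stand-ins, not the paper's calibrated inputs:

```python
import numpy as np

# Toy image-based SAR gray-value prediction: derive the local incidence
# angle from a DEM profile, then map it through a backscatter curve.
# The cosine-law backscatter model is an illustrative stand-in for the
# measured backscatter curves a real simulator would use.

def simulate_sar_row(dem_row, cell_size, look_angle_rad):
    slope = np.arctan(np.gradient(dem_row, cell_size))  # terrain slope angle
    incidence = look_angle_rad - slope                  # local incidence angle
    backscatter = np.cos(incidence) ** 2                # toy backscatter curve
    return np.clip(backscatter, 0.0, 1.0)               # normalized gray values

# Illustrative DEM profile (heights in m), 10 m cells, 35-degree look angle.
row = simulate_sar_row(np.array([0.0, 5.0, 10.0, 10.0, 5.0, 0.0]),
                       10.0, np.radians(35.0))
```

Slopes facing the sensor have smaller incidence angles and thus brighter predicted returns than slopes facing away, which is the basic geometric effect the simulation reproduces.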

  7. Simulation Experiments in Practice: Statistical Design and Regression Analysis

    OpenAIRE

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time and use graphical analysis of the resulting input/output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic DOE and regression analysis assume a single simulation response that is normally and independently distributed …
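The advantage of DOE over one-factor-at-a-time experimentation can be illustrated with a 2^3 factorial design whose runs also identify an interaction effect; the response function below is an invented stand-in for a simulation:

```python
import itertools
import numpy as np

# A 2^3 full factorial design lets one set of runs estimate all main
# effects *and* an interaction that one-factor-at-a-time experimentation
# misses. The response function is an invented stand-in for a simulation.

def response(x1, x2, x3):
    return 10.0 + 4.0 * x1 - 2.0 * x2 + 1.0 * x3 + 3.0 * x1 * x2

design = np.array(list(itertools.product([-1.0, 1.0], repeat=3)))
y = np.array([response(*run) for run in design])

# Regress on the intercept, the three main effects, and the x1*x2 interaction.
X = np.column_stack([np.ones(8), design, design[:, 0] * design[:, 1]])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Because the factorial columns are orthogonal, all five coefficients are recovered from just eight runs; varying one factor at a time around a base point could never separate the interaction from the main effects.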

  8. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and to find cost-competitive solutions, including the ADOPT Vehicle Simulator for analyzing the performance and fuel economy of conventional and advanced light- and …

  9. Simulation analysis for hyperbola locating accuracy

    International Nuclear Information System (INIS)

    Wang Changli; Liu Daizhi

    2004-01-01

    In a hyperbola-location system, the geometric configuration of the detecting stations has an important influence on locating accuracy. This paper first simulates the process of hyperbola location on a computer, then analyzes the influence of the geometric configuration on the locating errors and gives the computer simulation results, and finally discusses the issues that require attention when selecting the detecting stations. The conclusions are of practical value. (authors)
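The influence of station geometry on hyperbolic (time-difference) location accuracy can be sketched with a small Monte Carlo experiment; the geometries, noise level, and grid-search solver below are illustrative choices, not the paper's setup:

```python
import numpy as np

# Monte Carlo sketch of hyperbolic (time-difference-of-arrival) location:
# noisy range differences are inverted by a coarse grid search, and the
# RMS position error shows how station geometry affects accuracy.
# Geometry, noise level, and grid are all illustrative choices.

rng = np.random.default_rng(0)

def rms_location_error(stations, target, sigma, trials=200):
    d = np.linalg.norm(stations - target, axis=1)
    tdoa_true = d[1:] - d[0]                    # range differences vs. station 0
    xs = np.linspace(-50.0, 50.0, 101)
    grid = np.array([(x, y) for x in xs for y in xs])
    gd = np.linalg.norm(grid[:, None, :] - stations[None, :, :], axis=2)
    gt = gd[:, 1:] - gd[:, :1]                  # predicted TDOAs per grid point
    errs = []
    for _ in range(trials):
        meas = tdoa_true + rng.normal(0.0, sigma, tdoa_true.shape)
        best = grid[np.argmin(((gt - meas) ** 2).sum(axis=1))]
        errs.append(np.linalg.norm(best - target))
    return float(np.sqrt(np.mean(np.square(errs))))

target = np.array([10.0, 15.0])
wide = np.array([[0.0, 0.0], [40.0, 0.0], [0.0, 40.0]])    # well-spread stations
narrow = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])    # clustered stations
err_wide = rms_location_error(wide, target, sigma=0.5)
err_narrow = rms_location_error(narrow, target, sigma=0.5)
```

With the same measurement noise, the clustered geometry produces nearly parallel hyperbolas and a much larger RMS error, which is exactly the geometry effect the paper studies.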

  10. Quantitative descriptive analysis and principal component analysis for sensory characterization of Indian milk product cham-cham.

    Science.gov (United States)

    Puri, Ritika; Khamrui, Kaushik; Khetra, Yogesh; Malhotra, Ravinder; Devraja, H C

    2016-02-01

    Promising development and expansion of the market for cham-cham, a traditional Indian dairy product, is expected in the coming future with the organized production of this milk product by some large dairies. The objective of this study was to document the extent of variation in the sensory properties of market samples of cham-cham collected from four different locations known for their excellence in cham-cham production, and to find out the attributes that govern much of the variation in the sensory scores of this product, using quantitative descriptive analysis (QDA) and principal component analysis (PCA). QDA revealed significant (p < 0.05) differences in the sensory attributes of cham-cham among the market samples. PCA identified four significant principal components that accounted for 72.4 % of the variation in the sensory data. Factor scores of each of the four principal components, which primarily correspond to sweetness/shape/dryness of interior, surface appearance/surface dryness, rancid and firmness attributes, specify the location of each market sample along each of the axes in 3-D graphs. These findings demonstrate the utility of quantitative descriptive analysis for identifying and measuring the attributes of cham-cham that contribute most to its sensory acceptability.
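The PCA step can be sketched on a toy sensory matrix; the scores below are invented solely to show the mechanics of extracting principal components and their variance shares:

```python
import numpy as np

# PCA via SVD on a small mean-centred sensory score matrix
# (samples x attributes). The scores are invented purely to illustrate
# the mechanics; a real study would use panel means per attribute.

scores = np.array([
    [7.0, 2.0, 5.5],
    [6.5, 2.5, 5.0],
    [3.0, 6.0, 2.5],
    [2.5, 6.5, 2.0],
])

centred = scores - scores.mean(axis=0)          # centre each attribute
u, s, vt = np.linalg.svd(centred, full_matrices=False)
explained = s ** 2 / np.sum(s ** 2)             # variance share per component
pc_scores = centred @ vt.T                      # sample coordinates on the PCs
```

Plotting `pc_scores` along the leading components is what places each market sample in the factor-score graphs the abstract describes.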

  11. Passion fruit juice with different sweeteners: sensory profile by descriptive analysis and acceptance.

    Science.gov (United States)

    Rocha, Izabela Furtado de Oliveira; Bolini, Helena Maria André

    2015-03-01

    This study evaluated the effect of different sweeteners on the sensory profile, acceptance, and drivers of preference of passion fruit juice samples sweetened with sucrose, aspartame, sucralose, stevia, a 2:1 cyclamate/saccharin blend, and neotame. Sensory profiling was performed by 12 trained assessors using quantitative descriptive analysis (QDA). Acceptance tests (appearance, aroma, flavor, texture and overall impression) were performed with 124 consumers of tropical fruit juice. Samples with sucrose, aspartame and sucralose showed similar sensory profiles (P > 0.05). Passion fruit flavor affected acceptance positively and sweet aftertaste affected it negatively. Samples sweetened with aspartame, sucralose, and sucrose presented higher acceptance scores for the attributes flavor, texture, and overall impression, with no significant (P > 0.05) differences among them for passion fruit juice.

  12. A Description and Linguistic Analysis of the Tai Khuen Writing System

    Directory of Open Access Journals (Sweden)

    R. Wyn Owen

    2017-03-01

    Full Text Available This article provides a description and linguistic analysis of the Tai Tham script-based orthography of Tai Khuen, a Tai-Kadai language spoken in Eastern Shan State, Myanmar. The language has a long history of writing flowing out of the literary and religious culture nurtured by the Lan Na Kingdom from the 13th century onwards. Comparison of the phoneme and grapheme inventories shows that the orthography is well able to represent the spoken language as well as ancient Pali religious texts. Apart from spelling conventions reflecting the etymology of borrowed Pali and Sanskrit morphemes, sound changes over time have also decreased the phonological transparency of the orthography, although some more conservative varieties still preserve distinctions lost in other varieties. Despite the complexities of the script, literacy rates in Khuen are remarkably high for a minority language not taught in the government school system.

  13. Aircraft vulnerability analysis by modeling and simulation

    Science.gov (United States)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. ManPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet, not all the launched missiles hit their targets; the miss being either attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. 
A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum
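The contrast-signature idea in this abstract can be illustrated with a minimal numeric sketch. All radiance values below are invented for illustration; they are not data from the paper's radiometric model.

```python
# Hedged sketch: apparent contrast of an aircraft against a background,
# in the spirit of the signature analysis described above.
# All radiance numbers are illustrative, not real radiometric data.

def contrast_signature(target_radiance, background_radiance):
    """Contrast = target minus background radiance (W/sr/m^2)."""
    return target_radiance - background_radiance

# Aircraft radiance at several view angles (invented values);
# the hot exhaust makes the tail aspect brightest.
aircraft = {"nose": 120.0, "beam": 300.0, "tail": 900.0}
sky_background = 150.0   # clear-sky path radiance (invented)

contrast = {view: contrast_signature(L, sky_background) for view, L in aircraft.items()}
# Against a bright background a cool aspect can even show negative contrast.
print(contrast)
```

The sign flip at the nose aspect is the point the abstract makes: the same aircraft signature can look very different to a contrast-tracking seeker depending on the background radiance.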

  14. Analysis of and reflection on bachelor thesis in nursing students: A descriptive study in Spain.

    Science.gov (United States)

    Roca, Judith; Gros, Silvia; Canet, Olga

    2018-06-11

The bachelor thesis, a final year subject to obtain a nursing degree, presents an ideal opportunity for the development and assessment of professional competencies. Thus, it is necessary to specify that the structure of the bachelor thesis works as an element of review and reflection from both a pedagogical and professional perspective. To analyse the main elements of the bachelor thesis in the nursing degree 2015-16 in Spain. A transversal descriptive study was conducted using a quantitative documentary analysis via study guides or grade reports. The variables were the main academic elements of the bachelor thesis subject (credits, competencies, learning outcomes, contents, methodologies, training activities and assessment). A probabilistic sample of 66 institutions was studied using descriptive statistics with statistical measures of central tendency and measures of variability. The results showed a maximum of 12 and a minimum of 6 European Credit Transfer and Accumulation System (ECTS) credits. The definition and number of competencies to be developed varied and the learning outcomes were formulated in only 40.9% of the guides consulted. The most widely used teaching methodologies and training activities were academic supervision (87.9%) and autonomous work (80.3%). Regarding types of work, basic investigation (34.8%), care plans (33.3%) and literature review (30.3%) ranked highest. No specific descriptors could be linked to the contents. Finally, two main assessment tools were found: process and product. The rubric is presented as a main element of the assessment. The bachelor thesis is conceived as autonomous, personal and original academic work. But no homogeneity was observed in the key development elements such as competencies, teaching strategies, or type of bachelor thesis.
Therefore, the findings from the analysis and the bibliographic review are presented as recommendations as regards the outcome, structure and/or teaching elements linked to the bachelor thesis

  15. Description of CORSET: a computer program for quantitative x-ray fluorescence analysis

    International Nuclear Information System (INIS)

    Stohl, F.V.

    1980-08-01

    Quantitative x-ray fluorescence analysis requires a method of correcting for absorption and secondary fluorescence effects due to the sample matrix. The computer program CORSET carries out these corrections without requiring a knowledge of the spectral distribution of the x-ray source, and only requires one standard per element or one standard containing all the elements. Sandia's version of CORSET has been divided into three separate programs to fit Sandia's specific requirements for on-line analysis in a melt facility. The melt facility is used to fabricate new alloys with very variable compositions and requires very rapid analyses during a run. Therefore, the standards must be analyzed several days in advance. Program DAT1 is used to set up a permanent file consisting of all the data related to the standards. Program UNINT is used to set up a permanent file with the intensities, background counts and counting times of the unknowns. Program CORSET uses the files created in UNINT and DAT1 to carry out the analysis. This report contains descriptions, listings, and sample runs for these programs. The accuracy of the analyses carried out with these three programs is about 1 to 2% relative with an elemental concentration of about 10 wt %
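The single-standard quantification idea can be sketched as a fixed-point iteration. This is a toy matrix-correction model for illustration only, not CORSET's actual correction algorithm; the `matrix_factor` and intensities are invented.

```python
# Hedged sketch of single-standard XRF quantification with an iterative
# matrix correction, loosely analogous to what CORSET automates.
# The correction function is a toy model, not CORSET's algorithm.

def quantify(i_unknown, i_standard, c_standard, matrix_factor, iters=50):
    """Estimate concentration (wt%) from the intensity ratio, iterating a
    simple absorption correction c_new = ratio * c_std * (1 + f * c)."""
    ratio = i_unknown / i_standard
    c = ratio * c_standard            # first guess: no matrix effect
    for _ in range(iters):
        c = ratio * c_standard * (1.0 + matrix_factor * c)
    return c

# Toy numbers: unknown fluoresces half as strongly as a 20 wt% standard.
c = quantify(i_unknown=500.0, i_standard=1000.0, c_standard=20.0,
             matrix_factor=0.005)
print(round(c, 2))
```

The iteration converges because the correction term is small; without it the estimate would simply be the intensity ratio times the standard concentration.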

  16. A structured hazard analysis and risk assessment method for automotive systems—A descriptive study

    International Nuclear Information System (INIS)

    Beckers, Kristian; Holling, Dominik; Côté, Isabelle; Hatebur, Denis

    2017-01-01

The 2011 release of the first version of the ISO 26262 standard for automotive systems demands the elicitation of safety goals following a rigorous method for hazard and risk analysis. Companies are struggling with the adoption of the standard due to ambiguities, documentation demands and the alignment of the standard's demands with existing processes. We previously proposed a structured engineering method, developed through action research with an OEM, to deal with these problems. In this work, we evaluate in a descriptive study how applicable the method is for junior automotive software engineers. We provided the method to 8 members of the master course Automotive Software Engineering (ASE) at the Technical University of Munich. The participants had each been working in the automotive industry for 1–4 years in parallel to their studies. We investigated their application of our method to an electronic steering column lock system. The participants applied our method in a first round alone and afterwards discussed their results in groups. Our data analysis revealed that the participants could apply the method successfully and that the hazard analysis and risk assessment achieved high precision and productivity. Moreover, the precision improved significantly during group discussions.

  17. Simulation modeling and analysis in safety. II

    International Nuclear Information System (INIS)

    Ayoub, M.A.

    1981-01-01

    The paper introduces and illustrates simulation modeling as a viable approach for dealing with complex issues and decisions in safety and health. The author details two studies: evaluation of employee exposure to airborne radioactive materials and effectiveness of the safety organization. The first study seeks to define a policy to manage a facility used in testing employees for radiation contamination. An acceptable policy is one that would permit the testing of all employees as defined under regulatory requirements, while not exceeding available resources. The second study evaluates the relationship between safety performance and the characteristics of the organization, its management, its policy, and communication patterns among various functions and levels. Both studies use models where decisions are reached based on the prevailing conditions and occurrence of key events within the simulation environment. Finally, several problem areas suitable for simulation studies are highlighted. (Auth.)

  18. An adaptive maneuvering logic computer program for the simulation of one-on-one air-to-air combat. Volume 1: General description

    Science.gov (United States)

    Burgin, G. H.; Fogel, L. J.; Phelps, J. P.

    1975-01-01

A technique for computer simulation of air combat is described. Volume 1 describes the computer program and its development in general terms. Two versions of the program exist. Both incorporate a logic for selecting and executing air combat maneuvers with performance models of specific fighter aircraft. In the batch processing version, the flight paths of two aircraft engaged in interactive aerial combat, both controlled by the same logic, are computed. The real-time version permits human pilots to fly air-to-air combat against the adaptive maneuvering logic (AML) in the Langley Differential Maneuvering Simulator (DMS). Volume 2 consists of a detailed description of the computer programs.

  19. Evaluating the Effect of Virtual Reality Temporal Bone Simulation on Mastoidectomy Performance: A Meta-analysis.

    Science.gov (United States)

    Lui, Justin T; Hoy, Monica Y

    2017-06-01

Background The increasing prevalence of virtual reality simulation in temporal bone surgery warrants an investigation to assess training effectiveness. Objectives To determine if temporal bone simulator use improves mastoidectomy performance. Data Sources Ovid Medline, Embase, and PubMed databases were systematically searched per the PRISMA guidelines. Review Methods Inclusion criteria were peer-reviewed publications that utilized quantitative data of mastoidectomy performance following the use of a temporal bone simulator. The search was restricted to human studies published in English. Studies were excluded if they were in non-peer-reviewed format, were descriptive in nature, or failed to provide surgical performance outcomes. Meta-analysis calculations were then performed. Results A meta-analysis based on the random-effects model revealed an improvement in overall mastoidectomy performance following training on the temporal bone simulator. A standardized mean difference of 0.87 (95% CI, 0.38-1.35) was generated in the setting of a heterogeneous study population (I² = 64.3%). Conclusions Across the included virtual reality temporal bone simulation studies, meta-analysis calculations demonstrate an improvement in trainee mastoidectomy performance with virtual simulation training.
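The random-effects pooling this abstract reports can be sketched with the DerSimonian-Laird estimator. The per-study effects and variances below are invented for illustration; they are not the studies from this meta-analysis.

```python
import math

# Hedged sketch of DerSimonian-Laird random-effects pooling of
# standardized mean differences (SMDs). Study values are invented.

def dersimonian_laird(effects, variances):
    w = [1.0 / v for v in variances]                     # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                        # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]       # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se)

effects = [0.4, 0.9, 1.2]        # per-study SMDs (invented)
variances = [0.05, 0.08, 0.10]   # per-study variances (invented)
pooled, ci = dersimonian_laird(effects, variances)
print(round(pooled, 2), [round(x, 2) for x in ci])
```

When the studies disagree (large Q), tau² grows and widens the confidence interval, which is exactly why a heterogeneous population like the one above calls for a random-effects rather than fixed-effect model.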

  20. Collection and analysis of training simulator data

    International Nuclear Information System (INIS)

    Krois, P.A.; Haas, P.M.

    1985-01-01

    The purposes of this paper are: (1) to review the objectives, approach, and results of a series of research experiments performed on nuclear power plant training simulators in support of regulatory and research programs of the US Nuclear Regulatory Commission (NRC), and (2) to identify general research issues that may lead to an improved research methodology using the training simulator as a field setting. Research products consist of a refined field research methodology, a data store on operator performance, and specific results pertinent to NRC regulatory positions. Issues and potential advances in operator performance measurement are discussed

  1. Reference values for muscle strength: a systematic review with a descriptive meta-analysis.

    Science.gov (United States)

    Benfica, Poliana do Amaral; Aguiar, Larissa Tavares; Brito, Sherindan Ayessa Ferreira de; Bernardino, Luane Helena Nunes; Teixeira-Salmela, Luci Fuscaldi; Faria, Christina Danielli Coelho de Morais

    2018-05-03

Muscle strength is an important component of health. To describe and evaluate the studies which have established the reference values for muscle strength on healthy individuals and to synthesize these values with a descriptive meta-analysis approach. A systematic review was performed in MEDLINE, LILACS, and SciELO databases. Studies that investigated the reference values for muscle strength of two or more appendicular/axial muscle groups of healthy individuals were included. Methodological quality, including risk of bias, was assessed by the QUADAS-2. Data extracted included: country of the study, sample size, population characteristics, equipment/method used, and muscle groups evaluated. Of the 414 studies identified, 46 were included. Most of the studies had adequate methodological quality. Included studies evaluated: appendicular (80.4%) and axial (36.9%) muscles; adults (78.3%), elderly (58.7%), adolescents (43.5%), children (23.9%); isometric (91.3%) and isokinetic (17.4%) strength. Six studies (13%) with similar procedures were synthesized with meta-analysis. Generally, the coefficient of variation values that resulted from the meta-analysis ranged from 20.1% to 30% and were similar to those reported by the original studies. The meta-analysis synthesized the reference values of isometric strength of 14 muscle groups of the dominant/non-dominant sides of the upper/lower limbs of adults/elderly from developed countries, using dynamometers/myometers. Most of the included studies had adequate methodological quality. The meta-analysis provided reference values for the isometric strength of 14 appendicular muscle groups of the dominant/non-dominant sides, measured with dynamometers/myometers, of men/women, of adults/elderly. These data may be used to interpret the results of the evaluations and establish appropriate treatment goals.
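The dispersion measure this abstract reports, the coefficient of variation, is simple to compute. The grip-strength sample below is invented for illustration, not data from the review.

```python
import statistics

# Hedged sketch: coefficient of variation (CV), the dispersion measure
# the descriptive meta-analysis above reports. Values are invented.

def coefficient_of_variation(values):
    """CV as a percentage: 100 * sample standard deviation / mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

grip_strength_kg = [28.0, 35.0, 31.0, 44.0, 38.0]  # invented sample
print(round(coefficient_of_variation(grip_strength_kg), 1))
```

A CV in the 20-30% range, as reported above, means the spread across healthy individuals is a substantial fraction of the mean, which is why reference values are stratified by age, sex, and side.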

  2. Relationships between Descriptive Sensory Attributes and Physicochemical Analysis of Broiler and Taiwan Native Chicken Breast Meat

    Directory of Open Access Journals (Sweden)

    Wanwisa Chumngoen

    2015-07-01

Unique organoleptic characteristics such as rich flavors and chewy texture contribute to the higher popularity of native chicken in many Asian areas, while the commercial broilers are well-accepted due to their fast-growing and higher yields of meat. Sensory attributes of foods are often used to evaluate food eating quality and serve as references during the selection of foods. In this study, a three-phase descriptive sensory study was conducted to evaluate the sensory attributes of commercial broiler (BR) and Taiwan native chicken (TNC) breast meat, and investigate correlations between these sensory attributes and instrumental measurements. The results showed that for the first bite (phase 1), TNC meat had significantly higher moisture release, hardness, springiness, and cohesiveness than BR meat. After chewing for 10 to 12 bites (phase 2), TNC meat presented significantly higher chewdown hardness and meat particle size, whereas BR meat had significantly higher cohesiveness of mass. After swallowing (phase 3), TNC meat had higher chewiness and oily mouthcoat and lower residual loose particles than BR meat. TNC meat also provided more intense chicken flavors. This study clearly demonstrates that descriptive sensory analysis provides more detailed and more objective information about the sensory attributes of meats from various chicken breeds. Additionally, sensory textural attributes vary between BR and TNC meat, and are highly correlated to the shear force value and collagen content which influence meat eating qualities greatly. The poultry industry and scientists should be able to recognize the sensory characteristics of different chicken meats more clearly. Accordingly, based on the meat’s unique sensory and physicochemical characteristics, future work might address how meat from various breeds could best satisfy consumer needs using various cooking methods.

  3. Relationships between Descriptive Sensory Attributes and Physicochemical Analysis of Broiler and Taiwan Native Chicken Breast Meat

    Science.gov (United States)

    Chumngoen, Wanwisa; Tan, Fa-Jui

    2015-01-01

Unique organoleptic characteristics such as rich flavors and chewy texture contribute to the higher popularity of native chicken in many Asian areas, while the commercial broilers are well-accepted due to their fast-growing and higher yields of meat. Sensory attributes of foods are often used to evaluate food eating quality and serve as references during the selection of foods. In this study, a three-phase descriptive sensory study was conducted to evaluate the sensory attributes of commercial broiler (BR) and Taiwan native chicken (TNC) breast meat, and investigate correlations between these sensory attributes and instrumental measurements. The results showed that for the first bite (phase 1), TNC meat had significantly higher moisture release, hardness, springiness, and cohesiveness than BR meat. After chewing for 10 to 12 bites (phase 2), TNC meat presented significantly higher chewdown hardness and meat particle size, whereas BR meat had significantly higher cohesiveness of mass. After swallowing (phase 3), TNC meat had higher chewiness and oily mouthcoat and lower residual loose particles than BR meat. TNC meat also provided more intense chicken flavors. This study clearly demonstrates that descriptive sensory analysis provides more detailed and more objective information about the sensory attributes of meats from various chicken breeds. Additionally, sensory textural attributes vary between BR and TNC meat, and are highly correlated to the shear force value and collagen content which influence meat eating qualities greatly. The poultry industry and scientists should be able to recognize the sensory characteristics of different chicken meats more clearly. Accordingly, based on the meat’s unique sensory and physicochemical characteristics, future work might address how meat from various breeds could best satisfy consumer needs using various cooking methods. PMID:26104409

  4. Relationships between Descriptive Sensory Attributes and Physicochemical Analysis of Broiler and Taiwan Native Chicken Breast Meat.

    Science.gov (United States)

    Chumngoen, Wanwisa; Tan, Fa-Jui

    2015-07-01

Unique organoleptic characteristics such as rich flavors and chewy texture contribute to the higher popularity of native chicken in many Asian areas, while the commercial broilers are well-accepted due to their fast-growing and higher yields of meat. Sensory attributes of foods are often used to evaluate food eating quality and serve as references during the selection of foods. In this study, a three-phase descriptive sensory study was conducted to evaluate the sensory attributes of commercial broiler (BR) and Taiwan native chicken (TNC) breast meat, and investigate correlations between these sensory attributes and instrumental measurements. The results showed that for the first bite (phase 1), TNC meat had significantly higher moisture release, hardness, springiness, and cohesiveness than BR meat. After chewing for 10 to 12 bites (phase 2), TNC meat presented significantly higher chewdown hardness and meat particle size, whereas BR meat had significantly higher cohesiveness of mass. After swallowing (phase 3), TNC meat had higher chewiness and oily mouthcoat and lower residual loose particles than BR meat. TNC meat also provided more intense chicken flavors. This study clearly demonstrates that descriptive sensory analysis provides more detailed and more objective information about the sensory attributes of meats from various chicken breeds. Additionally, sensory textural attributes vary between BR and TNC meat, and are highly correlated to the shear force value and collagen content which influence meat eating qualities greatly. The poultry industry and scientists should be able to recognize the sensory characteristics of different chicken meats more clearly. Accordingly, based on the meat's unique sensory and physicochemical characteristics, future work might address how meat from various breeds could best satisfy consumer needs using various cooking methods.

  5. Simulation and formal analysis of visual attention

    NARCIS (Netherlands)

    Bosse, T.; Maanen, P.P. van; Treur, J.

    2009-01-01

    In this paper a simulation model for visual attention is discussed and formally analysed. The model is part of the design of an agent-based system that supports a naval officer in its task to compile a tactical picture of the situation in the field. A case study is described in which the model is

  6. Mathematical analysis and simulation of crop micrometeorology

    NARCIS (Netherlands)

    Chen, J.

    1984-01-01

    In crop micrometeorology the transfer of radiation, momentum, heat and mass to or from a crop canopy is studied. Simulation models for these processes do exist but are not easy to handle because of their complexity and the long computing time they need. Moreover, up to now such models can

  7. What does 'recovery' mean to people with neck pain? Results of a descriptive thematic analysis.

    Science.gov (United States)

    Walton, David M; Macdermid, Joy C; Taylor, Todd

    2013-01-01

    To describe the meaning of being recovered as perceived by people with chronic mechanical neck pain. To determine the way people with neck pain would describe a recovered state a descriptive thematic approach was used. A nominal focus group technique, written reflections, and one-on-one semi-structured interviews were used to collect sufficient data. Data from the focus groups were analyzed both through vote tallying and thematic analysis. Reflections and interviews were analyzed thematically by two independent researchers. Triangulation and member-checking were employed to establish trustworthiness of results. A total of 35 people, primarily females with neck pain of traumatic origin, participated in this study. Thematic analysis identified 6 themes that adequately described the data: absent or manageable symptoms, having the physical capacity one ought to have, participation in life roles, feeling positive emotions, autonomy & spontaneity, and re-establishing a sense of self. Member checking and triangulation suggested data saturation and accuracy of the generated themes. Recovery from neck pain appears to be informed by factors that fit with existing models of health, quality of life and satisfaction. Basing recovery solely on symptom or activity-level measures risks inaccurate estimates of recovery trajectories from traumatic or non-traumatic neck pain.

  8. Key parameters analysis of hybrid HEMP simulator

    International Nuclear Information System (INIS)

    Mao Congguang; Zhou Hui

    2009-01-01

According to the new standards on the high-altitude electromagnetic pulse (HEMP) developed by the International Electrotechnical Commission (IEC), the target parameter requirements of the key structure of the hybrid HEMP simulator are decomposed. Firstly, the influences of different excitation sources and biconical structures on the key parameters of the radiated electric field wave shape are investigated and analyzed. Then, based on the influence curves, the target parameter requirements of the pulse generator are proposed. Finally, appropriate parameters of the biconical structure and the excitation sources are chosen, and the computational result of the electric field in free space is presented. The results are of great value for the design of the hybrid HEMP simulator. (authors)
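The target waveform such a simulator must reproduce is conventionally a double exponential. The sketch below uses the parameter values commonly quoted for the IEC 61000-2-9 early-time HEMP environment (E0 = 50 kV/m, k = 1.3, a = 4e7 1/s, b = 6e8 1/s); treat them as assumptions and check the standard before relying on them.

```python
import math

# Hedged sketch of the double-exponential early-time HEMP waveform of the
# kind standardized by IEC 61000-2-9. Parameters are the commonly quoted
# values, stated here as assumptions.
E0, K, A, B = 5.0e4, 1.3, 4.0e7, 6.0e8

def e_field(t):
    """Electric field (V/m) at time t (s); zero before the event."""
    return E0 * K * (math.exp(-A * t) - math.exp(-B * t)) if t >= 0 else 0.0

t_peak = math.log(B / A) / (B - A)   # analytic time of the maximum
print(round(t_peak * 1e9, 2), "ns, peak", round(e_field(t_peak) / 1e3, 1), "kV/m")
```

The k = 1.3 factor normalizes the double exponential so its peak is the nominal 50 kV/m; the few-nanosecond rise is what drives the demanding pulse-generator requirements discussed above.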

  9. Analysis and simulation of Wiseman hypocycloid engine

    OpenAIRE

    Priyesh Ray; Sangram Redkar

    2014-01-01

This research studies an alternative to the slider-crank mechanism for internal combustion engines, proposed by Wiseman Technologies Inc. Their design involved replacing the crankshaft with a hypocycloid gear assembly. The unique hypocycloid gear arrangement allowed the piston and connecting rod to move in a straight line, creating a perfect sinusoidal motion without any side loads. In this work, the Wiseman hypocycloid engine was modeled in a commercial engine simulation softwa...
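The "perfect sinusoidal motion" claim follows from the hypocycloid geometry: when the rolling circle's radius is half the fixed circle's (the Cardan-circle case the Wiseman design exploits), the traced point stays on a straight line. A minimal numerical check:

```python
import math

# Hedged sketch: standard hypocycloid parametric equations. With R = 2r
# the y-component cancels exactly, so the traced point moves on a line,
# purely sinusoidally, as the abstract describes.

def hypocycloid(R, r, t):
    x = (R - r) * math.cos(t) + r * math.cos((R - r) / r * t)
    y = (R - r) * math.sin(t) - r * math.sin((R - r) / r * t)
    return x, y

R, r = 2.0, 1.0                      # R = 2r: straight-line special case
points = [hypocycloid(R, r, 2 * math.pi * k / 100) for k in range(100)]
max_abs_y = max(abs(y) for _, y in points)
print(max_abs_y)   # 0.0: the point never leaves the x-axis
```

With y identically zero, x = 2r·cos(t) is exactly the sinusoidal piston motion, and the absence of any transverse component is why the mechanism produces no side loads on the piston.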

  10. Merging Galaxy Clusters: Analysis of Simulated Analogs

    Science.gov (United States)

    Nguyen, Jayke; Wittman, David; Cornell, Hunter

    2018-01-01

    The nature of dark matter can be better constrained by observing merging galaxy clusters. However, uncertainty in the viewing angle leads to uncertainty in dynamical quantities such as 3-d velocities, 3-d separations, and time since pericenter. The classic timing argument links these quantities via equations of motion, but neglects effects of nonzero impact parameter (i.e. it assumes velocities are parallel to the separation vector), dynamical friction, substructure, and larger-scale environment. We present a new approach using n-body cosmological simulations that naturally incorporate these effects. By uniformly sampling viewing angles about simulated cluster analogs, we see projected merger parameters in the many possible configurations of a given cluster. We select comparable simulated analogs and evaluate the likelihood of particular merger parameters as a function of viewing angle. We present viewing angle constraints for a sample of observed mergers including the Bullet cluster and El Gordo, and show that the separation vectors are closer to the plane of the sky than previously reported.
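The viewing-angle sampling step can be sketched as follows: draw lines of sight uniformly over the sphere and record the projected (plane-of-sky) separation of a merger with true 3-d separation d. This is an illustrative toy, not the authors' n-body pipeline; all quantities are invented.

```python
import math, random

# Hedged sketch of uniform viewing-angle sampling for a merging cluster.

def random_unit_vector(rng):
    """Uniform direction on the sphere via the z = cos(theta) trick."""
    z = rng.uniform(-1.0, 1.0)
    phi = rng.uniform(0.0, 2.0 * math.pi)
    s = math.sqrt(1.0 - z * z)
    return (s * math.cos(phi), s * math.sin(phi), z)

def projected_separation(d, separation_axis, los):
    """Component of the separation vector perpendicular to the line of sight."""
    dot = sum(a * b for a, b in zip(separation_axis, los))
    return d * math.sqrt(max(0.0, 1.0 - dot * dot))

rng = random.Random(0)
d = 1.0                                  # true separation, arbitrary units
axis = (0.0, 0.0, 1.0)                   # merger axis
samples = [projected_separation(d, axis, random_unit_vector(rng))
           for _ in range(20000)]
mean_proj = sum(samples) / len(samples)
print(round(mean_proj, 3))               # analytic mean is pi/4 ~ 0.785
```

Because the average projection factor is π/4 and the distribution piles up near 1, a randomly viewed merger most often looks close to the plane of the sky, which bears on the finding above about separation vectors.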

  11. Freud: a software suite for high-throughput simulation analysis

    Science.gov (United States)

    Harper, Eric; Spellings, Matthew; Anderson, Joshua; Glotzer, Sharon

Computer simulation is an indispensable tool for the study of a wide variety of systems. As simulations scale to fill petascale and exascale supercomputing clusters, so too does the size of the data produced, as well as the difficulty in analyzing these data. We present Freud, an analysis software suite for efficient analysis of simulation data. Freud makes no assumptions about the system being analyzed, allowing for general analysis methods to be applied to nearly any type of simulation. Freud includes standard analysis methods such as the radial distribution function, as well as new methods including the potential of mean force and torque and local crystal environment analysis. Freud combines a Python interface with fast, parallel C++ analysis routines to run efficiently on laptops, workstations, and supercomputing clusters. Data analysis on clusters reduces data transfer requirements, a prohibitive cost for petascale computing. Used in conjunction with simulation software, Freud allows for smart simulations that adapt to the current state of the system, enabling the study of phenomena such as nucleation and growth, intelligent investigation of phases and phase transitions, and determination of effective pair potentials.
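The radial distribution function named above reduces, at its core, to a pair-distance histogram. The sketch below is a naive O(N²) version over a tiny hand-made configuration; Freud's real implementation is parallel C++ and handles periodic boxes and normalization, none of which is shown here.

```python
import math
from collections import Counter

# Hedged sketch of the core of a radial distribution function:
# a naive pair-distance histogram over a tiny 2-d point set.

def pair_distance_histogram(points, bin_width):
    counts = Counter()
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            counts[int(d / bin_width)] += 1
    return counts

# Unit square: 4 side pairs at r = 1, 2 diagonal pairs at r = sqrt(2).
square = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
hist = pair_distance_histogram(square, bin_width=0.25)
print(dict(hist))
```

Normalizing such a histogram by the ideal-gas expectation at each radius yields g(r); the separation of the two peaks here is the kind of local-structure signal the suite's analysis methods build on.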

  12. Simulation and Analysis of Chain Drive Systems

    DEFF Research Database (Denmark)

    Pedersen, Sine Leergaard

… mathematical models, and compare to prior research. Even though the model was first developed for analysing chain drive systems in marine engines, the methods can, with small changes, be used in general, e.g. for chain drives in industrial machines, car engines and motorbikes. A novel … with a real tooth profile proves superior to other applied models. With this model it is possible to perform a dynamic simulation of large marine engine chain drives. Through the application of this method, it is shown that the interrelated dynamics of the elements in the chain drive system is captured …

  13. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  14. Descriptive Analysis on Flouting and Hedging of Conversational Maxims in the “Post Grad” Movie

    Directory of Open Access Journals (Sweden)

    Nastiti Rokhmania

    2012-11-01

This research focuses on analyzing the flouting and hedging of conversational maxims in utterances used by the main characters in the “Post Grad” movie. Conversational maxims are the rules of the cooperative principle, categorized into four categories: Maxim of Quality, Maxim of Quantity, Maxim of Relevance, and Maxim of Manner. If these maxims are observed in conversations, the conversations can go smoothly. However, people often break the maxims overtly (flouting maxims) and sometimes break the maxims secretly (hedging maxims) when they make a conversation. This research was conducted using a descriptive qualitative method based on the theory known as Grice’s Maxims. The data are in the form of utterances used by the characters in the “Post Grad” movie. The data analysis reveals findings covering the formulated research question. The maxims are flouted when the speaker breaks conversational maxims through rhetorical strategies such as tautology, metaphor, hyperbole, irony, and rhetorical question. On the other hand, conversational maxims are hedged when the information is not totally accurate or is unclearly stated but still seems informative, well-founded, and relevant.

  15. Blunt traumatic injury during pregnancy: a descriptive analysis from a level 1 trauma center.

    Science.gov (United States)

    Al-Thani, Hassan; El-Menyar, Ayman; Sathian, Brijesh; Mekkodathil, Ahammed; Thomas, Sam; Mollazehi, Monira; Al-Sulaiti, Maryam; Abdelrahman, Husham

    2018-03-27

The precise incidence of trauma in pregnancy is not well known, but trauma is estimated to complicate nearly 1 in 12 pregnancies, and it is the leading non-obstetrical cause of maternal death. A retrospective study of all pregnant women presented to a national level 1 trauma center from July 2013 to June 2015 was conducted. Descriptive and inferential statistics were applied for data analysis. Across the study period, a total of 95 pregnant women presented to the trauma center. The average incidence rate of traumatic injuries was 250 per 1000 women of childbearing age presented to the Hamad Trauma Center. The mean age of patients was 30.4 ± 5.6 (SD) years, with age ranging from 20 to 42 years. The mean gestational age at the time of injury was 24.7 ± 8.7 weeks, ranging from 5 to 37 weeks. The majority (47.7%) were in the third trimester of pregnancy. In addition, the large majority of injuries were due to MVCs (74.7%), followed by falls (15.8%). Trauma during pregnancy is not an uncommon event, particularly in traffic-related crashes. As it is a complex condition for trauma surgeons and obstetricians, an appropriate management protocol and a multidisciplinary team are needed to improve outcomes and save the lives of both mother and fetus.

  16. A descriptive analysis of psychological traits among the health-care providers

    Directory of Open Access Journals (Sweden)

    Farah Ahmed

    2017-01-01

Objective: The objective of this study is to assess various personality traits of doctors across hospital subspecialties. Introduction: One of the most common perceptions in our society is that of medicine being a very stressful profession. The demands of practicing medicine can have significant effects on general health, work satisfaction, and professional and nonprofessional life. To increase profitability, organizations curtail staff to reduce costs. Hence, it can be argued that doctors are subjected to extreme amounts of psychiatric duress. Methodology: A descriptive study was conducted in a tertiary care hospital, in which 121 doctors from different specialties were approached randomly. The short form of the Psychopathic Personality Inventory (PPI-SF) was used as a questionnaire. Results: One hundred and one doctors (81 female and 20 male) from various specialties responded and completed the PPI-SF questionnaire. The subspecialty analysis of the doctors' responses was subdivided into pediatrics, gynecology, medical specialties, surgery, anesthetics, and radiology. Surgeons and GyneObs were the highest scorers on the PPI-SF, with scores of 138 and 149, respectively. Conclusion: This study showed that doctors score higher on a scale of psychopathic personality than the general population. This study also showed that stress immunity is the overriding personality trait in doctors, which may, in turn, facilitate better overall patient care. Stress immunity may better facilitate empathy in certain acute situations, which plays a vital role in being a proficient doctor and providing satisfactory patient care and counseling.

  17. Characterization of Sensory Differences in Mixing and Premium Rums Through the Use of Descriptive Sensory Analysis.

    Science.gov (United States)

    Ickes, Chelsea M; Cadwallader, Keith R

    2017-11-01

    This study identified and quantitated perceived sensory differences between 7 premium rums and 2 mixing rums using a hybrid of the Quantitative Descriptive Analysis and Spectrum methods. In addition, the results of this study validated the previously developed rum flavor wheel created from web-based materials. Results showed that the use of the rum flavor wheel aided in sensory term generation, as 17 additional terms were generated after the wheel was provided to panelists. Thirty-eight sensory terms encompassing aroma, aroma-by-mouth, mouthfeel, taste and aftertaste modalities, were generated and evaluated by the panel. Of the finalized terms, only 5 did not exist previously on the rum flavor wheel. Twenty attributes were found to be significantly different among rums. The majority of rums showed similar aroma profiles with the exception of 2 rums, which were characterized by higher perceived intensities of brown sugar, caramel, vanilla, and chocolate aroma, caramel, maple, and vanilla aroma-by-mouth and caramel aftertaste. These results demonstrate the previously developed rum flavor wheel can be used to adequately describe the flavor profile of rum. Additionally, results of this study document the sensory differences among premium rums and may be used to correlate with analytical data to better understand how changes in chemical composition of the product affect sensory perception. © 2017 Institute of Food Technologists®.

  18. Descriptive epidemiology of chronic liver disease in northeastern Italy: an analysis of multiple causes of death.

    Science.gov (United States)

    Fedeli, Ugo; Schievano, Elena; Lisiero, Manola; Avossa, Francesco; Mastrangelo, Giuseppe; Saugo, Mario

    2013-10-10

    The analysis of multiple causes of death data has been applied in the United States to examine the population burden of chronic liver disease (CLD) and to assess time trends of alcohol-related and hepatitis C virus (HCV)-related CLD mortality. The aim of this study was to assess the mortality for CLD by etiology in the Veneto Region (northeastern Italy). Using the 2008-2010 regional archive of mortality, all causes registered on death certificates were extracted and different descriptive epidemiological measures were computed for HCV-related, alcohol-related, and overall CLD-related mortality. The crude mortality rate of all CLD was close to 40 per 100,000 residents. In middle age (35 to 74 years), CLD was mentioned in about 10% and 6% of all deaths in males and females, respectively. Etiology was unspecified in about half of CLD deaths. In females and males, respectively, HCV was mentioned in 44% and 21% and alcohol in 11% and 26% of overall CLD deaths. A bimodal distribution with age was observed for HCV-related proportional mortality among females, reflecting the available seroprevalence data. Multiple causes of death analyses can provide useful insights into the burden of CLD mortality according to etiology among different population subgroups.
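
    Multiple-cause analysis counts every condition mentioned on a certificate, not only the underlying cause. A minimal sketch of how proportional mortality and etiology-specific shares could be computed from such records (the records and cause categories are invented for illustration):

    ```python
    # Hypothetical multiple-cause death records: every cause mentioned on the
    # certificate is kept, not just the underlying cause of death.
    records = [
        {"sex": "M", "age": 62, "causes": {"CLD", "HCV"}},
        {"sex": "M", "age": 58, "causes": {"CLD", "alcohol"}},
        {"sex": "F", "age": 70, "causes": {"CLD", "HCV"}},
        {"sex": "F", "age": 45, "causes": {"cardiac"}},
        {"sex": "M", "age": 67, "causes": {"cancer"}},
        {"sex": "F", "age": 74, "causes": {"CLD"}},
    ]

    cld_deaths = [r for r in records if "CLD" in r["causes"]]

    # Proportional mortality: share of all deaths mentioning CLD anywhere
    # on the certificate.
    proportional_mortality = len(cld_deaths) / len(records)

    # Etiology-specific shares among CLD deaths (etiology may be unspecified).
    hcv_share = sum("HCV" in r["causes"] for r in cld_deaths) / len(cld_deaths)
    alcohol_share = sum("alcohol" in r["causes"] for r in cld_deaths) / len(cld_deaths)

    print(proportional_mortality, hcv_share, alcohol_share)
    ```

    Stratifying these counts by age and sex reproduces the kind of subgroup comparisons reported in the study.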

  19. A Descriptive Analysis of Decision Support Systems Research Between 1990 and 2003

    Directory of Open Access Journals (Sweden)

    David Arnott

    2005-05-01

    Full Text Available This paper is the first major report of a project that is investigating the theoretic foundations of decision support systems (DSS). The project was principally motivated by a concern for the direction and relevance of DSS research. The main areas of research focus are the decision and judgement theoretic base of the discipline, the research strategies used in published articles, and the professional relevance of DSS research. The project has analysed 926 DSS articles published in 14 major journals from 1990 to 2003. The findings indicate that DSS research is more dominated by positivist research than general information systems (in particular experiments, surveys, and descriptions of specific applications and systems), is heavily influenced by the work of Herbert Simon, is poorly grounded in contemporary judgement and decision-making research, and is weak in identifying the nature of clients and users. Of great concern is the finding that DSS research has relatively low professional relevance. An overview of the direction of further analysis is presented.

  20. A Descriptive Analysis of the Use of Twitter by Emergency Medicine Residency Programs.

    Science.gov (United States)

    Diller, David; Yarris, Lalena M

    2018-02-01

    Twitter is increasingly recognized as an instructional tool by the emergency medicine (EM) community. In 2012, the Council of Residency Directors in Emergency Medicine (CORD) recommended that EM residency programs' Twitter accounts be managed solely by faculty. To date, little has been published regarding the patterns of Twitter use by EM residency programs. We analyzed current patterns in Twitter use among EM residency programs with accounts and assessed conformance with CORD recommendations. In this mixed methods study, a 6-question, anonymous survey was distributed via e-mail using SurveyMonkey. In addition, a Twitter-based search was conducted, and the public profiles of EM residency programs' Twitter accounts were analyzed. We calculated descriptive statistics and performed a qualitative analysis on the data. Of 168 Accreditation Council for Graduate Medical Education-accredited EM programs, 88 programs (52%) responded. Of those programs, 58% (51 of 88) reported having a program-level Twitter account. Residents served as content managers for those accounts in the majority of survey respondents (61%, 28 of 46). Most programs did not publicly disclose the identity or position of their Twitter content manager. We found a wide variety of applications for Twitter, with EM programs most frequently using Twitter for educational and promotional purposes. There is significant variability in the numbers of followers for EM programs' Twitter accounts. Applications and usage among EM residency programs are varied, and are frequently not consistent with current CORD recommendations.

  1. Shock Mechanism Analysis and Simulation of High-Power Hydraulic Shock Wave Simulator

    Directory of Open Access Journals (Sweden)

    Xiaoqiu Xu

    2017-01-01

    Full Text Available The simulation of a regular shock wave (e.g., half-sine) can be achieved by a traditional rubber shock simulator, but a practical high-power shock wave, characterized by a steep pre-peak and a gentle post-peak, is hard to reproduce with such simulators. To tackle this disadvantage, a novel high-power hydraulic shock wave simulator based on the live-firing muzzle shock principle is proposed in the current work. The influence of the typical shock characteristic parameters on the shock force wave was investigated via both theoretical deduction and software simulation. Comparison of the obtained data with the results indicates that the developed hydraulic shock wave simulator can be applied to simulate the real conditions of the shock system. Further, a similarity evaluation of the shock wave simulation was carried out based on the curvature distance, and the results showed that the simulation method is reasonable and that structural optimization based on software simulation also improves efficiency. Finally, the combination of theoretical analysis and simulation provides a comprehensive approach to the design and structural optimization of the artillery recoil test system.
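
    The curvature-distance similarity evaluation mentioned above can be sketched as a comparison of the discrete curvature profiles of two force-time waveforms. The waveforms, sampling, and distance definition below are illustrative assumptions, not the paper's formulation:

    ```python
    import numpy as np

    def curvature(t, f):
        """Discrete curvature of a sampled curve f(t)."""
        df = np.gradient(f, t)
        d2f = np.gradient(df, t)
        return d2f / (1.0 + df**2) ** 1.5

    def curvature_distance(t, f_ref, f_sim):
        """RMS difference between the curvature profiles of two waveforms."""
        return np.sqrt(np.mean((curvature(t, f_ref) - curvature(t, f_sim)) ** 2))

    t = np.linspace(0.0, 1.0, 500)
    # Reference pulse: steep pre-peak rise, gentle post-peak decay (peak at t=0.1).
    reference = np.where(t < 0.1, t / 0.1, np.exp(-(t - 0.1) / 0.3))
    close_sim = np.where(t < 0.1, t / 0.1, np.exp(-(t - 0.1) / 0.28))
    half_sine = np.sin(np.pi * t)  # a conventional rubber-simulator pulse

    d_close = curvature_distance(t, reference, close_sim)
    d_sine = curvature_distance(t, reference, half_sine)
    print(d_close, d_sine)
    ```

    A faithful simulation of the steep/gentle pulse yields a much smaller curvature distance than the smooth half-sine, which is the sense in which the metric scores waveform similarity.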

  2. An Overview of the Design and Analysis of Simulation Experiments for Sensitivity Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2004-01-01

    Sensitivity analysis may serve validation, optimization, and risk analysis of simulation models. This review surveys classic and modern designs for experiments with simulation models. Classic designs were developed for real, non-simulated systems in agriculture, engineering, etc. These designs assume a
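
    A classic design in this literature is the 2^k factorial experiment combined with a low-order polynomial metamodel, whose fitted coefficients estimate the factor effects. A toy sketch (the "simulation model" here is a stand-in function; in practice each design row is an expensive simulation run):

    ```python
    import numpy as np
    from itertools import product

    def simulate(x1, x2, x3):
        # Stand-in "simulation model"; in practice this is an expensive run.
        return 10.0 + 4.0 * x1 - 2.0 * x2 + 0.5 * x3 + 1.5 * x1 * x2

    # Full 2^3 factorial design: every factor at coded levels -1 and +1.
    design = np.array(list(product([-1.0, 1.0], repeat=3)))
    y = np.array([simulate(*row) for row in design])

    # First-order polynomial metamodel with two-factor interactions.
    x1, x2, x3 = design.T
    X = np.column_stack([np.ones(8), x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)

    effects = dict(zip(["mean", "x1", "x2", "x3", "x1x2", "x1x3", "x2x3"], beta))
    print(effects)
    ```

    Because the factorial design columns are orthogonal, the least-squares fit recovers each main effect and interaction independently; large estimated effects identify the inputs to which the simulation output is most sensitive.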

  3. Regional hydrogeological simulations for Forsmark - numerical modelling using CONNECTFLOW. Preliminary site description Forsmark area - version 1.2

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, Lee; Cox, Ian; Hunter, Fiona; Jackson, Peter; Joyce, Steve; Swift, Ben [Serco Assurance, Risley (United Kingdom); Gylling, Bjoern; Marsic, Niko [Kemakta Konsult AB, Stockholm (Sweden)

    2005-05-01

    The Swedish Nuclear Fuel and Waste Management Company (SKB) carries out site investigations in two different candidate areas in Sweden with the objective of describing the in-situ conditions for a bedrock repository for spent nuclear fuel. The site characterisation work is divided into two phases, an initial site investigation phase (IPLU) and a complete site investigation phase (KPLU). The results of IPLU are used as a basis for deciding on a subsequent KPLU phase. On the basis of the KPLU investigations a decision is made as to whether detailed characterisation will be performed (including the sinking of a shaft). An integrated component of the site characterisation work is the development of site descriptive models. These comprise basic models in three dimensions with an accompanying text description. Central to the modelling work is the geological model, which provides the geometrical context in terms of a model of deformation zones and the rock mass between the zones. Using the geological and geometrical description models as a basis, descriptive models for the other geo-disciplines (hydrogeology, hydro-geochemistry, rock mechanics, thermal properties and transport properties) are developed. Great care is taken to achieve overall consistency in the description of the various models and in the assessment of uncertainty and the possible need for alternative models. Here, a numerical model is developed on a regional scale (hundreds of square kilometres) to understand the zone of influence for groundwater flow that affects the Forsmark area. Transport calculations are then performed by particle tracking from a local-scale release area (a few square kilometres) to identify potential discharge areas for the site, using greater grid resolution. The main objective of this study is to support the development of a preliminary Site Description of the Forsmark area on a regional scale based on the data available as of 30 June 2004 and the previous Site Description.
A more specific
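
    The particle-tracking step can be illustrated with a minimal forward-Euler tracker in a synthetic 2-D velocity field; the field, time step, and domain below are invented for illustration and bear no relation to the actual CONNECTFLOW model:

    ```python
    import numpy as np

    def velocity(p):
        """Hypothetical steady groundwater velocity field (m/yr): uniform
        eastward flow plus upwelling toward a discharge zone near x = 1000."""
        x, y = p
        return np.array([50.0, 5.0 * np.exp(-((x - 1000.0) / 200.0) ** 2)])

    def track(start, dt=0.5, x_exit=1200.0, max_steps=10000):
        """Forward-Euler particle track from a release point to the model edge."""
        p = np.array(start, dtype=float)
        for _ in range(max_steps):
            p += dt * velocity(p)
            if p[0] >= x_exit:  # particle leaves the local-scale domain
                break
        return p

    # Release particles over a small source area and collect discharge points.
    releases = [(0.0, y0) for y0 in (-50.0, 0.0, 50.0)]
    discharge = [track(s) for s in releases]
    for s, d in zip(releases, discharge):
        print(f"release y={s[1]:+6.1f} -> discharge y={d[1]:+6.1f}")
    ```

    Clustering the exit points of many such tracks is the basic mechanism for identifying potential discharge areas from a release area.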

  4. Analysis on descriptions of precautionary statements in package inserts of medicines

    Directory of Open Access Journals (Sweden)

    Tsuchiya F

    2012-02-01

    Full Text Available Keita Nabeta,1 Masaomi Kimura,2 Michiko Ohkura,2 Fumito Tsuchiya3. 1Graduate School of Engineering and Science, Shibaura Institute of Technology, Toyosu 3-7-5, Koto-ku, Tokyo, 135-8548 Japan; 2Faculty of Engineering, Shibaura Institute of Technology, Toyosu 3-7-5, Koto-ku, Tokyo, 135-8548 Japan; 3School of Pharmacy, International University of Health and Welfare, Minami-Aoyama 1-24-1, Minato-ku, Tokyo, 107-0062 Japan. Background: To prevent medical accidents, users must be informed of the cautions written in medical package inserts. To realize countermeasures by utilizing information systems, we must also implement a drug information database. However, this is not easy to develop, since the descriptions in package inserts are too complex and their information poorly structured. It is necessary to analyze package insert information and propose a data structure. Methods: We analyzed the descriptions of 'precautions for application' in package inserts via text mining methods. To summarize the statements, we applied dependency analysis and visualized the relations between predicate words and other words. Furthermore, we extracted words representing the timing at which to execute the order. Results: We found that there are four types of statements: direct orders such as "使用する" (use), causative orders such as "使用させる" (make someone use), direct interdictions such as "使用しない" (do not use), and causative interdictions such as "使用させない" (do not make someone use). As for words representing timing, we extracted six groups: "at the time of delivery," "at the time of preparation," "in use," "after use," and "at the time of storage." From these results, we obtained points of consideration concerning the subjects of orders in the statements and the timing of their execution. Conclusion: From the obtained knowledge, we can define the information structure used to describe the precautionary statement. It should contain information such
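
    The four statement types could, as a first approximation, be distinguished by the predicate endings themselves. A hypothetical pattern-matching sketch (a simplification for illustration, not the authors' dependency-analysis method):

    ```python
    import re

    # Hypothetical classifier for the four statement types identified in the
    # study, keyed on Japanese predicate endings (verb "使用する" = "use").
    PATTERNS = [
        (re.compile(r"させない$"), "causative interdiction"),  # do not make someone use
        (re.compile(r"させる$"), "causative order"),           # make someone use
        (re.compile(r"しない$"), "direct interdiction"),       # do not use
        (re.compile(r"する$"), "direct order"),                # use
    ]

    def classify(statement):
        # Longer (causative) endings are checked first so they are not
        # shadowed by the shorter direct-form patterns.
        for pattern, label in PATTERNS:
            if pattern.search(statement):
                return label
        return "unclassified"

    for s in ("使用する", "使用させる", "使用しない", "使用させない"):
        print(s, "->", classify(s))
    ```

    A structured database entry could then store the statement text together with this type label and a timing category.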

  5. Descriptive analysis of the masticatory and salivary functions and gustatory sensitivity in healthy children.

    Science.gov (United States)

    Marquezin, Maria Carolina Salomé; Pedroni-Pereira, Aline; Araujo, Darlle Santos; Rosar, João Vicente; Barbosa, Taís S; Castelo, Paula Midori

    2016-08-01

    To better understand salivary and masticatory characteristics, this study evaluated the relationship among salivary parameters, bite force (BF), masticatory performance (MP) and gustatory sensitivity in healthy children. The secondary outcome was to evaluate possible gender differences. One hundred and sixteen eutrophic subjects aged 7-11 years old were evaluated, caries-free and with no definite need of orthodontic treatment. Salivary flow rate and pH, total protein (TP), alpha-amylase (AMY), calcium (CA) and phosphate (PHO) concentrations were determined in stimulated (SS) and unstimulated saliva (US). BF and MP were evaluated using a digital gnathodynamometer and the fractional sieving method, respectively. Gustatory sensitivity was determined by detecting the four primary tastes (sweet, salty, sour and bitter) at three different concentrations. Data were evaluated using descriptive statistics, Mann-Whitney/t-test, Spearman correlation and multiple regression analysis, considering α = 0.05. A significant positive correlation between taste and age was observed. CA and PHO concentrations correlated negatively with salivary flow and pH; sweet taste scores correlated with AMY concentrations and bitter taste sensitivity correlated with US flow rate (p < 0.05); thus, a relationship among salivary and masticatory characteristics and gustatory sensitivity was observed. The regression analysis showed a weak relationship between the distribution of chewed particles among the different sieves and BF. The concentration of some analytes was influenced by salivary flow and pH. Age, saliva flow and AMY concentrations influenced gustatory sensitivity. In addition, salivary, masticatory and taste characteristics did not differ between genders, and only a weak relation between MP and BF was observed.

  6. Who uses nursing theory? A univariate descriptive analysis of five years' research articles.

    Science.gov (United States)

    Bond, A Elaine; Eshah, Nidal Farid; Bani-Khaled, Mohammed; Hamad, Atef Omar; Habashneh, Samira; Kataua', Hussein; al-Jarrah, Imad; Abu Kamal, Andaleeb; Hamdan, Falastine Rafic; Maabreh, Roqia

    2011-06-01

    Since the early 1950s, nursing leaders have worked diligently to build the Scientific Discipline of Nursing, integrating Theory, Research and Practice. Recently, the role of theory has again come into question, with some scientists claiming nurses are not using theory to guide their research, with which to improve practice. The purposes of this descriptive study were to determine: (i) Were nursing scientists' research articles in leading nursing journals based on theory? (ii) If so, were the theories nursing theories or borrowed theories? (iii) Were the theories integrated into the studies, or were they used as organizing frameworks? Research articles from seven top ISI journals were analysed, excluding regularly featured columns, meta-analyses, secondary analysis, case studies and literature reviews. The authors used King's dynamic Interacting system and Goal Attainment Theory as an organizing framework. They developed consensus on how to identify the integration of theory, searching the Title, Abstract, Aims, Methods, Discussion and Conclusion sections of each research article, whether quantitative or qualitative. Of 2857 articles published in the seven journals from 2002 to, and including, 2006, 2184 (76%) were research articles. Of the 837 (38%) authors who used theories, 460 (55%) used nursing theories, 377 (45%) used other theories: 776 (93%) of those who used theory integrated it into their studies, including qualitative studies, while 51 (7%) reported they used theory as an organizing framework for their studies. Closer analysis revealed theory principles were implicitly implied, even in research reports that did not explicitly report theory usage. Increasing numbers of nursing research articles (though not percentagewise) continue to be guided by theory, and not always by nursing theory. Newer nursing research methods may not explicitly state the use of nursing theory, though it is implicitly implied. © 2010 The Authors. Scandinavian Journal of Caring

  7. Framing health for land-use planning legislation: A qualitative descriptive content analysis.

    Science.gov (United States)

    Harris, Patrick; Kent, Jennifer; Sainsbury, Peter; Thow, Anne Marie

    2016-01-01

    Framing health as a relevant policy issue for other sectors is not well understood. A recent review of the New South Wales (Australia) land-use planning system resulted in the drafting of legislation with an internationally unprecedented focus on human health. We apply a political science approach to investigate the question 'how and to what extent were health and wider issues framed in submissions to the review?' We investigated a range of stakeholder submissions including health focussed agencies (n = 31), purposively identified key stakeholders with influence on the review (n = 24), and a random sample of other agencies and individuals (n = 47). Using qualitative descriptive analysis we inductively coded for the term 'health' and sub-categories. We deductively coded for 'wider concerns' using a locally endorsed 'Healthy Urban Development Checklist'. Additional inductive analysis uncovered further 'wider concerns'. Health was explicitly identified as a relevant issue for planning policy only in submissions by health-focussed agencies. This framing concerned the new planning system promoting and protecting health as well as connecting health to wider planning concerns including economic issues, transport, public open space and, to a slightly lesser extent, environmental sustainability. Key stakeholder and other agency submissions focussed on these and other wider planning concerns but did not mention health in detail. Health agency submissions did not emphasise infrastructure, density or housing as explicitly as others. Framing health as a relevant policy issue has the potential to influence legislative change governing the business of other sectors. Without submissions from health agencies arguing the importance of having health as an objective in the proposed legislation it is unlikely health considerations would have gained prominence in the draft bill. 
The findings have implications for health agency engagement with legislative change processes and beyond in

  8. Aircraft vulnerability analysis by modelling and simulation

    CSIR Research Space (South Africa)

    Willers, CJ

    2014-09-01

    Full Text Available attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft...

  9. Description, Usage, and Validation of the MVL-15 Modified Vortex Lattice Analysis Capability

    Science.gov (United States)

    Ozoroski, Thomas A.

    2015-01-01

    , thereby providing a pathway to map viscous data to the inviscid results. However, a number of factors can sidetrack the analysis consistency during various stages of this process. For example, it should be expected that the final airplane lift curve and drag polar results depend strongly on the geometry and aerodynamics of the airfoil section; however, flap deflections and flap chord extensions change the local reference geometry of the input airfoil, the airplane wing, the tabulated non-dimensional viscous aerodynamics, and the spanwise links between the linear and the viscous aerodynamics. These changes also affect the bound circulation and therefore, calculation and integration of the induced angle of attack and induced drag. MVL-15 is configured to ensure these types of challenges are properly addressed. This report is a comprehensive manual describing the theory, use, and validation of the MVL-15 analysis tool. Section 3 summarizes theoretical, procedural, and characteristic features of MVL-15, and includes a list of the files required to setup, execute, and summarize an analysis. Section 4, Section 5, Section 6, and Section 7 combine to comprise the User's Guide portions of this report. The MVL-15 input and output files are described in Section 4 and Section 5, respectively; the descriptions are supplemented with example files and information about the file formats, parameter definitions, and typical parameter values. Section 6 describes the Wing Geometry Setup Utility and the 2d-Variants Utility files that simplify and assist setting up a consistent set of MVL-15 geometry and aerodynamics input parameters and input files. Section 7 describes the use of the 3d-Results Presentation Utility file that can be used to automatically create summary tables and charts from the MVL-15 output files. 
Section 8 documents the Validation Results of an extensive and varied validation test matrix, including results of an airplane analysis representative of the ERA Program. A start

  10. Conformational analysis of oligosaccharides and polysaccharides using molecular dynamics simulations.

    Science.gov (United States)

    Frank, Martin

    2015-01-01

    Complex carbohydrates usually have a large number of rotatable bonds and consequently a large number of theoretically possible conformations can be generated (combinatorial explosion). The application of systematic search methods for conformational analysis of carbohydrates is therefore limited to disaccharides and trisaccharides in a routine analysis. An alternative approach is to use Monte-Carlo methods or (high-temperature) molecular dynamics (MD) simulations to explore the conformational space of complex carbohydrates. This chapter describes how to use MD simulation data to perform a conformational analysis (conformational maps, hydrogen bonds) of oligosaccharides and how to build realistic 3D structures of large polysaccharides using Conformational Analysis Tools (CAT).
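
    A conformational map of the kind described is typically a free-energy surface G = -kT ln P over two glycosidic torsions, estimated from a 2-D histogram of the MD trajectory. A sketch with a synthetic phi/psi time series standing in for real trajectory data (the basin locations and populations are invented):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical phi/psi glycosidic torsion trajectory (degrees) sampled
    # from an MD run: two conformational basins with unequal populations.
    basin = rng.random(20000) < 0.7
    phi = np.where(basin, rng.normal(-70, 15, 20000), rng.normal(60, 15, 20000))
    psi = np.where(basin, rng.normal(-110, 15, 20000), rng.normal(100, 15, 20000))

    # Conformational (free-energy) map: G = -kT ln P over a phi/psi grid.
    edges = np.arange(-180, 181, 10)
    counts, _, _ = np.histogram2d(phi, psi, bins=[edges, edges])
    prob = counts / counts.sum()
    kT = 2.494  # kJ/mol at 300 K
    with np.errstate(divide="ignore"):
        free_energy = -kT * np.log(prob)
    free_energy -= free_energy.min()  # set the global minimum to zero

    i, j = np.unravel_index(np.argmin(free_energy), free_energy.shape)
    print(f"global minimum near phi={edges[i] + 5}, psi={edges[j] + 5} deg")
    ```

    The deepest well on the map corresponds to the dominant basin; low-energy representative conformers picked from such maps are the building blocks for assembling 3D polysaccharide models.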

  11. Tornado missile simulation and risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Chu, J.

    1978-01-01

    Mathematical models of the contributing events to the tornado missile hazard at nuclear power plants have been developed, in which the major sources of uncertainty are treated in a probabilistic framework. These models have been structured into a sequential event formalism which permits the treatment of both single and multiple missile generation events. A simulation computer code utilizing these models has been developed to obtain estimates of tornado missile event likelihoods. Two case studies have been analyzed; the results indicate that the probability of a single missile from the sampling population impacting any of the plant's targets is less than about 10⁻⁷ per reactor-year. Additional work is needed for verification and sensitivity studies.
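
    The sequential event formalism lends itself to Monte Carlo estimation: each stage must occur for the next to be reached, and the rare initiating events can be factored out analytically (conditional Monte Carlo). The stage probabilities below are invented for illustration and are not the study's values:

    ```python
    import random

    random.seed(42)

    # Hypothetical stage probabilities in a sequential-event formalism:
    # tornado strike -> missile injection -> transport -> target impact.
    P_TORNADO = 1e-3    # tornado strikes the site in a given year
    P_INJECTION = 0.2   # a missile from the sampling population is lofted
    P_TRANSPORT = 0.05  # the missile is carried toward the target area
    P_IMPACT = 0.01     # it impacts a safety-related target

    # The initiating events are rare, so sample only the later stages and
    # weight by the analytic initiating-event probability (conditional MC).
    TRIALS = 1_000_000
    hits = sum(
        random.random() < P_TRANSPORT and random.random() < P_IMPACT
        for _ in range(TRIALS)
    )
    estimate = P_TORNADO * P_INJECTION * (hits / TRIALS)
    print(f"estimated impact probability: {estimate:.2e} per reactor-year")
    ```

    Sampling the rare initiating stages directly would require vastly more trials to see even one hit; conditioning keeps the estimator efficient while preserving the product structure of the event sequence.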

  12. Analysis of ship maneuvering data from simulators

    Science.gov (United States)

    Frette, V.; Kleppe, G.; Christensen, K.

    2011-03-01

    We analyze complex maneuvering histories of ships obtained from training sessions on bridge simulators. Advanced ships are used in fields like offshore oil exploration: dive support vessels, supply vessels, anchor handling vessels, tugs, cable layers, and multi-purpose vessels. Due to the high demands of the operations carried out, these ships need to have very high maneuverability. This is achieved through a propulsion system with several thrusters, water jets, and rudders in addition to standard propellers. For some operations, like subsea maintenance, it is crucial that the ship accurately keeps a fixed position. Therefore, bridge systems usually incorporate equipment for Dynamic Positioning (DP). DP is a method of keeping ships and semi-submersible rigs in a fixed position using the propulsion systems instead of anchors. It may also be used for sailing a vessel from one position to another along a predefined route. Like an autopilot on an airplane, DP may operate without human involvement. The method relies on accurate determination of position from external reference systems like GPS, as well as a continuously adjusted mathematical model of the ship and the external forces from wind, waves, and currents. In a specific simulator exercise for offshore crews, a ship is to be taken up to an installation consisting of three nearby oil platforms connected by bridges (Frigg field, North Sea), where a subsea inspection is to be carried out. Due to the many degrees of freedom during maneuvering, including partial or full use of DP, the chosen routes vary significantly. In this poster we report preliminary results on representations of the complex maneuvering histories; representations that allow comparison between crew groups and, possibly, sorting of the different strategic choices behind them.
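
    At its core, DP station-keeping is feedback control of position using the thrusters. A deliberately simplified 1-D sketch with a PD law holding a vessel against a steady current (all gains and parameters are illustrative; real DP systems use full 3-DOF model-based control with integral action, wind feed-forward, and thrust allocation):

    ```python
    # Minimal 1-D dynamic-positioning sketch: a PD controller commands thrust
    # to hold a vessel at the setpoint x = 0 against a steady current force.
    MASS = 5.0e6         # vessel mass incl. added mass, kg (illustrative)
    DISTURBANCE = 2.0e5  # steady environmental force, N (illustrative)
    KP, KD = 4.0e5, 8.0e6
    DT = 0.5             # integration step, s

    x, v = 20.0, 0.0     # start 20 m off station, at rest
    for step in range(4000):        # 2000 s of simulated time
        thrust = -KP * x - KD * v   # PD law on position error and velocity
        a = (thrust + DISTURBANCE) / MASS
        v += a * DT                 # semi-implicit Euler integration
        x += v * DT

    print(f"final offset {x:.2f} m")
    ```

    With a pure PD law the vessel settles about 0.5 m off station, since proportional thrust must balance the current (x = DISTURBANCE / KP); adding an integral term is what drives this residual offset to zero in practice.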

  13. Assessment of competence in simulated flexible bronchoscopy using motion analysis

    DEFF Research Database (Denmark)

    Collela, Sara; Svendsen, Morten Bo Søndergaard; Konge, Lars

    2015-01-01

    Background: Flexible bronchoscopy should be performed with a correct posture and a straight scope to optimize bronchoscopy performance and at the same time minimize the risk of work-related injuries and endoscope damage. Objectives: We aimed to test whether an automatic motion analysis system could ... intermediates and 9 experienced bronchoscopy operators performed 3 procedures each on a bronchoscopy simulator. The Microsoft Kinect system was used to automatically measure the total deviation of the scope from a perfectly straight, vertical line. Results: The low-cost motion analysis system could measure ... with the performance on the simulator (virtual-reality simulator score; p < 0.05). The motion analysis system could discriminate between different levels of experience. Automatic feedback on correct movements during self-directed training on simulators might help new bronchoscopists learn how to handle
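
    The measure used here, total deviation of the scope from a straight vertical line, can be approximated by summing the horizontal distances of tracked points from a vertical line through their mean horizontal position. A sketch with invented tracks (not the study's data or exact metric):

    ```python
    import numpy as np

    def total_deviation(points):
        """Total horizontal deviation of tracked points from a straight
        vertical line through their mean horizontal position (a simplified
        stand-in for the Kinect-based measure described in the study)."""
        points = np.asarray(points, dtype=float)  # rows: (x, y, z), z vertical
        center = points[:, :2].mean(axis=0)       # vertical line: fixed (x, y)
        return np.linalg.norm(points[:, :2] - center, axis=1).sum()

    # Hypothetical tracks: a steady operator vs. one who drifts sideways.
    z = np.linspace(0.8, 1.6, 50)
    steady = np.column_stack([np.full(50, 0.01), np.full(50, -0.01), z])
    drifting = np.column_stack([0.15 * np.sin(np.linspace(0, 6, 50)),
                                0.10 * np.cos(np.linspace(0, 6, 50)), z])

    print(total_deviation(steady), total_deviation(drifting))
    ```

    A lower score indicates a straighter scope and posture, which is how such a metric can separate experienced from novice operators.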

  14. Simulated pre-industrial climate in Bergen Climate Model (version 2): model description and large-scale circulation features

    Directory of Open Access Journals (Sweden)

    O. H. Otterå

    2009-11-01

    Full Text Available The Bergen Climate Model (BCM) is a fully-coupled atmosphere-ocean-sea-ice model that provides state-of-the-art computer simulations of the Earth's past, present, and future climate. Here, a pre-industrial multi-century simulation with an updated version of BCM is described and compared to observational data. The model is run without any form of flux adjustments and is stable for several centuries. The simulated climate reproduces the general large-scale circulation in the atmosphere reasonably well, except for a positive bias in the high latitude sea level pressure distribution. Also, by introducing an updated turbulence scheme in the atmosphere model, a persistent cold bias has been eliminated. For the ocean part, the model drifts in sea surface temperatures and salinities are considerably reduced compared to earlier versions of BCM. Improved conservation properties in the ocean model have contributed to this. Furthermore, by choosing a reference pressure at 2000 m and including thermobaric effects in the ocean model, a more realistic meridional overturning circulation is simulated in the Atlantic Ocean. The simulated sea-ice extent in the Northern Hemisphere is in general agreement with observational data except for summer, where the extent is somewhat underestimated. In the Southern Hemisphere, large negative biases are found in the simulated sea-ice extent. This is partly related to problems with the mixed layer parametrization, causing the mixed layer in the Southern Ocean to be too deep, which in turn makes it hard to maintain a realistic sea-ice cover here. However, despite some problematic issues, the pre-industrial control simulation presented here should still be appropriate for climate change studies requiring multi-century simulations.

  15. Analysis of the CVT Efficiency by Simulation

    Directory of Open Access Journals (Sweden)

    Valerian Croitorescu

    2011-09-01

    Full Text Available All vehicle manufacturers desire an ideal vehicle that has the highest powertrain efficiency, the best safety factor, and ease of maintenance while being environmentally friendly. These highly valued vehicle development characteristics are only reachable after countless research hours. One major powertrain component to be studied in relation to these demands is the Continuously Variable Transmission (CVT) with which a Hybrid Electric Vehicle (HEV) may be equipped. The CVT can increase the overall powertrain efficiency, offering continuously variable gear ratios between established minimum and maximum limits. This paper aims to determine the losses of a CVT operating in a HEV. Using simulation, the losses were computed and the fuel economy was analyzed. The losses and their dependence on the control properties were analyzed during various modes of operation, such as electric drive, regenerative braking, and engine charging to maintain the battery state of charge. An accurate determination of the losses makes it possible to reduce them by using appropriate materials for the components and fluids, more efficient manufacturing and usage solutions, and an innovative control strategy.

  16. Analysis and simulation of BGK electron holes

    Directory of Open Access Journals (Sweden)

    L. Muschietti

    1999-01-01

    Full Text Available Recent observations from satellites crossing regions of magnetic-field-aligned electron streams reveal solitary potential structures that move at speeds much greater than the ion acoustic/thermal velocity. The structures appear as positive potential pulses rapidly drifting along the magnetic field, and are electrostatic in their rest frame. We interpret them as BGK electron holes supported by a drifting population of trapped electrons. Using Laplace transforms, we analyse the behavior of one phase-space electron hole. The resulting potential shapes and electron distribution functions are self-consistent and compatible with the field and particle data associated with the observed pulses. In particular, the spatial width increases with increasing amplitude. The stability of the analytic solution is tested by means of a two-dimensional particle-in-cell simulation code with open boundaries. We consider a strongly magnetized parameter regime in which the bounce frequency of the trapped electrons is much less than their gyrofrequency. Our investigation includes the influence of the ions, which in the frame of the hole appear as an incident beam, and impinge on the BGK potential with considerable energy. The nonlinear structure is remarkably resilient

  17. DESCRIPTIVE ANALYSIS OF THE INTERNATIONAL MIGRATION PHENOMENON IN ROMANIA BETWEEN 1991 AND 2008

    Directory of Open Access Journals (Sweden)

    Bac Dorin Paul

    2011-07-01

    Full Text Available Migration represented and represents a very important phenomenon at the global level, taking into consideration, besides its demographic implications, its extremely diverse socio-economic, socio-cultural, territorial, and environmental implications. This is probably the main reason why research on migration is interdisciplinary, having strong connections with sociology, political science, history, economics, geography, demography, psychology, and law, among others. All these disciplines target different aspects of population migration, and a proper comprehension of the phenomenon requires a contribution from all of them. Although migration has been manifest since ancient times, it has never been such a universal or significant phenomenon from the socio-economic or political perspective as it is in present times. International migration has both negative and positive impacts on both sending and receiving countries, in general playing a very important role in the structure and size of the population of a country. Romania is not an exception to this statement; furthermore, after the fall of the communist regime, migration became one of the most important socio-economic phenomena for Romania. The present paper aims to analyze, in a descriptive manner and from a quantitative perspective, the international migration phenomenon in Romania between 1991 and 2008. Based on data from the Statistical Yearbook of Romania (2008 and 2009 editions), the analysis revealed that both immigration and emigration flows registered oscillatory evolutions in the analysed period, but the general trend was increasing for immigration and decreasing for emigration. Immigration was dominated by males, by persons aged between 26 and 40, and by persons coming from the Republic of Moldova. On the other hand, in the case of emigration the significant

  18. Descriptive sensory analysis of marinated and non-marinated wooden breast fillet portions.

    Science.gov (United States)

    Maxwell, A D; Bowker, B C; Zhuang, H; Chatterjee, D; Adhikari, K

    2018-05-14

    The wooden breast (WB) myopathy influences muscle composition and texture characteristics in broiler breast meat. It is unknown whether marination reduces the negative influence of WB on meat sensory quality or whether WB effects are uniform throughout the Pectoralis major. The objective of this study was to determine the effects of marination on the sensory attributes and instrumental shear force measurements of the ventral (skin-side) and dorsal (bone-side) portions of normal and severe WB meat. Sixty butterfly fillets (30 normal and 30 severe WB) were selected from the deboning line of a commercial processing plant. Individual fillets were portioned into ventral and dorsal halves. Portions from one side of each butterfly were used as non-marinated controls, and portions from the other side were vacuum-tumble marinated (16 rpm, -0.6 atm, 4°C, 20 min) with a 20% (wt/wt) marinade-to-meat ratio. Marinade was formulated to target a concentration of 0.75% (w/v) salt and 0.45% (w/v) sodium tripolyphosphate in the final product. Descriptive sensory analysis (9 trained panelists) was conducted to evaluate visual, texture, and flavor attributes (0-15 point scale) of breast portions along with Warner-Bratzler shear force. Significant interaction effects between WB and marination were not observed for the sensory attributes. Greater springiness, cohesiveness, hardness, fibrousness, and chewiness scores were observed in WB samples (P < 0.05), and differences in sensory texture attributes were more apparent in the ventral portions of the breast fillets. Flavor attributes (salty and brothy) increased (P < 0.05) with marination. Results demonstrated that sensory quality is not uniform throughout the Pectoralis major and that WB-related differences in cooked meat sensory texture attributes are lessened but not eliminated by vacuum-tumbling marination.

  19. [Descriptive Analysis of Health Economics of Intensive Home Care of Ventilated Patients].

    Science.gov (United States)

    Lehmann, Yvonne; Ostermann, Julia; Reinhold, Thomas; Ewers, Michael

    2018-05-14

    Long-term ventilated patients in Germany receive intensive care mainly in the patients' homes or in assisted-living facilities. There is a lack of knowledge about the nature and extent of resource use and the costs associated with the care of this small, heterogeneous, but overall growing patient group. A sub-study within the research project SHAPE descriptively analyzed the costs of 29 patients from a societal perspective. Direct and indirect costs of intensive home care over a period of three months were recorded and analyzed retrospectively. Standardized written self-reports from patients and relatives, as well as information from interviews with nursing staff and from nursing documentation, formed the basis for this analysis. The average total cost of intensive home care for three months per patient was 61,194 € (95% CI 53,884-68,504), including hospital stays. The main costs were directly linked to outpatient medical and nursing care provided according to the Code of Social Law V and XI. Services provided by home nursing care services according to § 37(2) Code of Social Law V (65%) were the largest cost item. Approximately 13% of the total costs were attributable to indirect costs. Intensive home care for ventilated patients is resource- and cost-intensive and has so far received little attention from a health economics perspective. Valid information and transparency about the cost structures are required for an effective and economical design and management of the long-term care of this patient group. © Georg Thieme Verlag KG Stuttgart · New York.

  20. Physicomathematical Simulation Analysis for Small Bullets

    Directory of Open Access Journals (Sweden)

    D. P. Margaris

    2008-01-01

    Full Text Available A full six-degrees-of-freedom (6-DOF) flight dynamics model is proposed for the accurate prediction of short- and long-range trajectories of small bullets through atmospheric flight to the final impact point. The mathematical model is based on the full equations of motion, set up in the no-roll body reference frame and integrated numerically from given initial conditions at the firing site. The projectile maneuvering motion depends on the most significant force and moment variations, in addition to gravity and the Magnus effect. The computational flight analysis takes into account the Mach number and total angle-of-attack effects by means of variable aerodynamic coefficients. For the purposes of the present work, linear interpolation has been applied to the aerodynamic coefficients from the official tabulated database. The developed computational method gives satisfactory agreement with published data from verified experiments and computational codes on atmospheric projectile trajectory analysis for various initial firing conditions.
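
    The trajectory computation summarized above can be sketched in miniature. The following is a simplified point-mass (3-DOF) version, not the paper's full 6-DOF no-roll-frame model: it integrates gravity and Mach-dependent drag, with the drag coefficient obtained, as in the paper, by linear interpolation from a table. The table values and bullet parameters here are illustrative assumptions, not the official database.

```python
import math

# Hypothetical drag table: Mach number -> drag coefficient (illustrative values only)
MACH_TABLE = [(0.0, 0.20), (0.9, 0.25), (1.1, 0.45), (2.0, 0.30), (3.0, 0.25)]

def drag_coefficient(mach):
    """Linear interpolation of Cd from the tabulated values."""
    pts = MACH_TABLE
    if mach <= pts[0][0]:
        return pts[0][1]
    for (m0, c0), (m1, c1) in zip(pts, pts[1:]):
        if mach <= m1:
            return c0 + (c1 - c0) * (mach - m0) / (m1 - m0)
    return pts[-1][1]

def simulate(v0, angle_deg, dt=1e-3, diameter=0.00782, mass=0.0095,
             rho=1.225, sound=340.0):
    """Integrate a point-mass trajectory (explicit Euler) until ground impact."""
    area = math.pi * (diameter / 2) ** 2
    x, y = 0.0, 0.0
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    while y >= 0.0:
        v = math.hypot(vx, vy)
        cd = drag_coefficient(v / sound)
        a_drag = 0.5 * rho * cd * area * v * v / mass  # deceleration from drag
        vx += (-a_drag * vx / v) * dt
        vy += (-a_drag * vy / v - 9.81) * dt
        x += vx * dt
        y += vy * dt
    return x  # impact range in metres

print(round(simulate(800.0, 10.0), 1))
```

    The full model adds the remaining rotational degrees of freedom, the Magnus force, and moment coefficients; the integration loop stays structurally the same.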

  1. PDB4DNA: Implementation of DNA geometry from the Protein Data Bank (PDB) description for Geant4-DNA Monte-Carlo simulations

    Science.gov (United States)

    Delage, E.; Pham, Q. T.; Karamitros, M.; Payno, H.; Stepan, V.; Incerti, S.; Maigne, L.; Perrot, Y.

    2015-07-01

    This paper describes PDB4DNA, a new Geant4 user application based on an independent, cross-platform, free and open-source C++ library, PDBlib, which enables the use of an atomic-level description of the DNA molecule in Geant4 Monte Carlo particle transport simulations. To evaluate the direct damage induced on the DNA molecule by ionizing particles, the application uses an algorithm that determines the atom of the DNA molecule closest to each energy deposition. Both the PDB4DNA application and the PDBlib library are available as free and open-source software under the Geant4 license.
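
    The damage-scoring step described here reduces to a nearest-neighbour search between energy-deposition points and atom coordinates. A minimal brute-force sketch follows; the real PDBlib parses atom positions from a PDB file, whereas the coordinates below are made up for illustration.

```python
import math

# Hypothetical atom coordinates (nm), standing in for positions parsed
# from a PDB file.
atoms = [(0.00, 0.00, 0.00), (0.33, 0.00, 0.00),
         (0.33, 0.33, 0.10), (0.10, 0.50, 0.20)]

def closest_atom(deposit, atoms):
    """Return (index, distance) of the atom nearest an energy-deposition point."""
    best_i, best_d = -1, float("inf")
    for i, pos in enumerate(atoms):
        d = math.dist(deposit, pos)
        if d < best_d:
            best_i, best_d = i, d
    return best_i, best_d

idx, dist = closest_atom((0.30, 0.05, 0.00), atoms)
print(idx)  # → 1: the deposition is scored against the atom at (0.33, 0.0, 0.0)
```

    For whole-genome geometries with millions of atoms, a spatial index (k-d tree or octree) would replace the linear scan, but the scoring logic is the same.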

  2. SOSA – a new model to simulate the concentrations of organic vapours and sulphuric acid inside the ABL – Part 1: Model description and initial evaluation

    DEFF Research Database (Denmark)

    Boy, M.; Sogachev, Andrey; Lauros, J.

    2010-01-01

    Chemistry in the atmospheric boundary layer (ABL) is controlled by complex processes of surface fluxes, flow, turbulent transport, and chemical reactions. We present a new model SOSA (model to simulate the concentration of organic vapours and sulphuric acid) and attempt to reconstruct the emissions...... in the surface layer we were able to get a reasonable description of turbulence and other quantities through the ABL. As a first application of the model, we present vertical profiles of organic compounds and discuss their relation to newly formed particles....

  4. ROSA-V large scale test facility (LSTF) system description for the third and fourth simulated fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Mitsuhiro; Nakamura, Hideo; Ohtsu, Iwao [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment] [and others]

    2003-03-01

    The Large Scale Test Facility (LSTF) is a full-height, 1/48 volumetrically scaled test facility of the Japan Atomic Energy Research Institute (JAERI) for system integral experiments simulating the thermal-hydraulic responses, at full-pressure conditions, of a 1100 MWe-class pressurized water reactor (PWR) during small-break loss-of-coolant accidents (SBLOCAs) and other transients. The LSTF can also simulate next-generation PWR designs such as the AP600 reactor. In the fifth phase of the Rig-of-Safety Assessment (ROSA-V) Program, eighty-nine experiments had been conducted at the LSTF with the third simulated fuel assembly by June 2001, and five experiments had been conducted with the newly installed fourth simulated fuel assembly by December 2002. In the ROSA-V program, various system integral experiments have been conducted to certify the effectiveness both of accident management (AM) measures in beyond-design-basis accidents (BDBAs) and of improved safety systems in next-generation reactors. In addition, various separate-effect tests have been conducted to verify and develop computer codes and analytical models that predict non-homogeneous and multi-dimensional phenomena, such as heat transfer across the steam generator U-tubes in the presence of non-condensable gases, in both current and next-generation reactors. This report presents detailed information on the LSTF system with the third and fourth simulated fuel assemblies to aid experiment planning and the analysis of experiment results. (author)

  5. Analysis of Medication Errors in Simulated Pediatric Resuscitation by Residents

    Directory of Open Access Journals (Sweden)

    Evelyn Porter

    2014-07-01

    Full Text Available Introduction: The objective of our study was to estimate the incidence of prescribing medication errors made specifically by trainees, and to identify factors associated with these errors, during the simulated resuscitation of a critically ill child. Methods: We analyzed data from the simulated resuscitations for the occurrence of prescribing medication errors. We compared each variable to the medication error rate in univariate analysis, and performed a separate multiple logistic regression analysis on the significant univariate variables to assess the association between the selected variables. Results: We reviewed 49 simulated resuscitations. The final medication error rate for the simulations was 26.5% (95% CI 13.7%-39.3%). On univariate analysis, statistically significant findings for decreased prescribing medication error rates included senior residents in charge, presence of a pharmacist, sleeping more than 8 hours prior to the simulation, and a visual analog scale score showing more confidence in caring for critically ill children. Multiple logistic regression analysis using the above significant variables showed that only the presence of a pharmacist remained significantly associated with decreased medication error, with an odds ratio of 0.09 (95% CI 0.01-0.64). Conclusion: Our results indicate that the presence of a clinical pharmacist during the resuscitation of a critically ill child reduces the medication errors made by resident physician trainees.
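
    The reported 26.5% rate with its confidence interval is consistent with roughly 13 errors in 49 simulations. A quick sketch of the proportion-plus-Wald-interval arithmetic follows; the 13/49 split is inferred from the rounded figures, and the paper's exact CI method may differ slightly, so the computed interval will not match the published 13.7%-39.3% digit for digit.

```python
import math

def wald_ci(successes, n, z=1.96):
    """Point estimate and Wald 95% confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# 13 errors in 49 simulated resuscitations reproduces the reported 26.5% rate
# (an inference from the rounded figures in the abstract, not stated data).
p, lo, hi = wald_ci(13, 49)
print(f"{p:.1%} (95% CI {lo:.1%} - {hi:.1%})")
```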

  6. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. Statistical theory proves that more information is obtained when applying Design Of Experiments (DOE) and linear regression analysis. Unfortunately, classic
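
    The contrast Kleijnen draws, between changing one factor at a time and a designed experiment with regression analysis, can be illustrated with a 2^2 factorial design, where orthogonality turns the least-squares coefficients into simple signed averages. The response function below is a made-up stand-in for a simulation model.

```python
import itertools

def simulate_response(x1, x2):
    """Stand-in for a simulation model: linear effects plus an interaction."""
    return 10.0 + 3.0 * x1 - 2.0 * x2 + 0.5 * x1 * x2

# Full 2^2 factorial design with factors coded to -1/+1
design = list(itertools.product([-1, 1], repeat=2))
y = [simulate_response(x1, x2) for x1, x2 in design]

# Because a two-level factorial design is orthogonal, each least-squares
# regression coefficient reduces to a signed average of the responses:
n = len(design)
b0 = sum(y) / n
b1 = sum(yi * x1 for yi, (x1, _) in zip(y, design)) / n
b2 = sum(yi * x2 for yi, (_, x2) in zip(y, design)) / n
b12 = sum(yi * x1 * x2 for yi, (x1, x2) in zip(y, design)) / n
print(b0, b1, b2, b12)  # recovers the true coefficients 10.0, 3.0, -2.0, 0.5
```

    Changing one factor at a time from a base point spends the same number of runs but can never estimate the interaction term b12, which is the extra information DOE provides.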

  7. Simulation Experiments in Practice : Statistical Design and Regression Analysis

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2007-01-01

    In practice, simulation analysts often change only one factor at a time, and use graphical analysis of the resulting Input/Output (I/O) data. The goal of this article is to change these traditional, naïve methods of design and analysis, because statistical theory proves that more information is

  8. Facility/equipment performance evaluation using microcomputer simulation analysis

    International Nuclear Information System (INIS)

    Chockie, A.D.; Hostick, C.J.

    1985-08-01

    A computer simulation analysis model was developed at the Pacific Northwest Laboratory to assist in assuring the adequacy of the Monitored Retrievable Storage facility design to meet the specified spent nuclear fuel throughput requirements. The microcomputer-based model was applied to the analysis of material flow, equipment capability and facility layout. The simulation analysis evaluated uncertainties concerning both facility throughput requirements and process duration times as part of the development of a comprehensive estimate of facility performance. The evaluations provided feedback into the design review task to identify areas where design modifications should be considered

  9. Descriptive analysis of Easter eggs (Perfil sensorial de ovos de Páscoa)

    Directory of Open Access Journals (Sweden)

    Valéria P. R. MINIM

    2000-04-01

    Full Text Available Easter egg is a popular chocolate candy in egg form commercialized in Brazil during Easter time. In this research, Quantitative Descriptive Analysis was applied to select the sensory attributes that best define the modifications in appearance, aroma, flavor and texture when cocoa butter equivalent (CBE) is added to Easter eggs. Samples with and without CBE were evaluated by a selected panel, and fourteen attributes best describing the similarities and differences between them were defined. Term definitions, reference materials and a consensus ballot were developed. After a training period, panelists evaluated the samples in a complete block design using a 9 cm unstructured scale. Principal Component Analysis, ANOVA and the Tukey test (p<0.05) were applied to the data in order to select the attributes that best discriminated and characterized the samples. Samples showed significant differences (p<0.05) in all attributes. The Easter egg without CBE showed higher intensities (p<0.05) for the following descriptors: brown color, characteristic aroma, cocoa mass aroma, cocoa butter aroma, characteristic flavor, cocoa mass flavor, hardness and brittleness.

  10. Simulation of Rn-222 decay products concentration deposited on a filter. Description of radon1.pas computer program

    International Nuclear Information System (INIS)

    Machaj, B.

    1996-01-01

    A computer program is presented that simulates the activity distribution over time of the short-lived 222Rn decay products deposited on a filter, for any radioactive equilibrium degree of the decay products. Deposition of the decay products is simulated by summing discrete samples every 1/10 min over a sampling time of 1 to 10 min. The concentration (activity) of the decay products is computed at one-minute intervals in the range 1-100 min. The alpha concentration and the total activity of the 218Po + 214Po produced are likewise computed in the range 1 to 100 min. (author). 10 refs, 4 figs
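
    The buildup-and-decay computation such a program performs can be sketched as follows. The half-lives are the physical values for the 222Rn progeny; the unit deposition rate and the equal-deposition (equilibrium) assumption are simplifications, since the original handles arbitrary equilibrium degrees.

```python
import math

# Physical half-lives of the short-lived 222Rn progeny, in minutes
HALF_LIVES = {"Po218": 3.05, "Pb214": 26.8, "Bi214": 19.7}
LAM = {k: math.log(2) / t for k, t in HALF_LIVES.items()}

def filter_activity(sampling_min=10.0, rate=1.0, dt=0.1):
    """Atoms of each progeny nuclide on the filter, recorded every 0.1 min.

    Deposition is modelled as a discrete sample of rate*dt atoms of each
    nuclide added every dt minutes while sampling (equilibrium assumed),
    after which the chain decays via explicit Euler steps."""
    n = {"Po218": 0.0, "Pb214": 0.0, "Bi214": 0.0}
    history = []
    steps = int(100.0 / dt)
    for i in range(steps):
        t = i * dt
        if t < sampling_min:
            for k in n:
                n[k] += rate * dt
        d_po = LAM["Po218"] * n["Po218"] * dt
        d_pb = LAM["Pb214"] * n["Pb214"] * dt
        d_bi = LAM["Bi214"] * n["Bi214"] * dt
        n["Po218"] -= d_po
        n["Pb214"] += d_po - d_pb
        n["Bi214"] += d_pb - d_bi
        history.append((round(t + dt, 1), dict(n)))
    return history

hist = filter_activity()
# 214Po follows 214Bi almost instantly (T1/2 = 164 us), so the total alpha
# activity on the filter tracks the 218Po + 214Bi decay rates.
```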

  11. Equilibration and analysis of first-principles molecular dynamics simulations of water

    Science.gov (United States)

    Dawson, William; Gygi, François

    2018-03-01

    First-principles molecular dynamics (FPMD) simulations based on density functional theory are becoming increasingly popular for the description of liquids. In view of the high computational cost of these simulations, the choice of an appropriate equilibration protocol is critical. We assess two methods of estimation of equilibration times using a large dataset of first-principles molecular dynamics simulations of water. The Gelman-Rubin potential scale reduction factor [A. Gelman and D. B. Rubin, Stat. Sci. 7, 457 (1992)] and the marginal standard error rule heuristic proposed by White [Simulation 69, 323 (1997)] are evaluated on a set of 32 independent 64-molecule simulations of 58 ps each, amounting to a combined cumulative time of 1.85 ns. The availability of multiple independent simulations also allows for an estimation of the variance of averaged quantities, both within MD runs and between runs. We analyze atomic trajectories, focusing on correlations of the Kohn-Sham energy, pair correlation functions, number of hydrogen bonds, and diffusion coefficient. The observed variability across samples provides a measure of the uncertainty associated with these quantities, thus facilitating meaningful comparisons of different approximations used in the simulations. We find that the computed diffusion coefficient and average number of hydrogen bonds are affected by a significant uncertainty in spite of the large size of the dataset used. A comparison with classical simulations using the TIP4P/2005 model confirms that the variability of the diffusivity is also observed after long equilibration times. Complete atomic trajectories and simulation output files are available online for further analysis.
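
    Of the two equilibration estimators assessed, the marginal standard error rule of White (1997) is easy to state: truncate the first d samples so as to minimize a squared standard-error proxy of the remaining mean. A sketch on a synthetic transient follows; the trace is an illustrative stand-in for, e.g., the Kohn-Sham energy along an FPMD trajectory.

```python
import math
import random

def mser_truncation(series):
    """Marginal standard error rule (White, 1997): choose the truncation
    point d minimizing sum_{i>=d} (x_i - mean_d)^2 / (n - d)^2."""
    n = len(series)
    best_d, best_val = 0, float("inf")
    for d in range(n // 2):  # by convention, search only the first half
        tail = series[d:]
        m = sum(tail) / len(tail)
        val = sum((x - m) ** 2 for x in tail) / len(tail) ** 2
        if val < best_val:
            best_d, best_val = d, val
    return best_d

# Synthetic trace: an equilibration transient decaying into a noisy steady state
random.seed(0)
trace = [math.exp(-i / 20.0) + 0.05 * random.gauss(0.0, 1.0) for i in range(400)]
print(mser_truncation(trace))  # discards the initial transient
```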

  12. Cost Analysis of Poor Quality Using a Software Simulation

    Directory of Open Access Journals (Sweden)

    Jana Fabianová

    2017-02-01

    Full Text Available The issues of quality, the cost of poor quality, and the factors affecting quality are crucial to maintaining competitiveness in business activities. The use of software applications and computer simulation enables more effective quality management. Simulation tools make it possible to incorporate the variability of several variables in experiments and to evaluate their joint impact on the final output. The article presents a case study focused on the possibility of using Monte Carlo computer simulation in the field of quality management. Two approaches for determining the cost of poor quality are introduced. The first is retrospective: the costs of poor quality in the production process are calculated from historical data. The second uses the probabilistic characteristics of the input variables by means of simulation, and thus gives a prospective view of the costs of poor quality. Simulation outputs in the form of tornado and sensitivity charts complement the risk analysis.
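
    The prospective approach can be sketched directly: draw each uncertain input from a distribution, propagate through the cost model, and summarize the resulting distribution. The triangular input distributions below are illustrative assumptions, not data from the study.

```python
import random

random.seed(42)

def cost_of_poor_quality(n_trials=10_000):
    """Prospective Monte Carlo estimate of the annual cost of poor quality."""
    costs = []
    for _ in range(n_trials):
        units = random.triangular(9_000, 11_000, 10_000)    # annual production (units)
        defect_rate = random.triangular(0.01, 0.05, 0.02)   # share of defective units
        unit_cost = random.triangular(8.0, 15.0, 10.0)      # rework/scrap cost per unit
        costs.append(units * defect_rate * unit_cost)
    costs.sort()
    mean = sum(costs) / n_trials
    return mean, costs[int(0.05 * n_trials)], costs[int(0.95 * n_trials)]

mean, p5, p95 = cost_of_poor_quality()
print(f"mean cost {mean:,.0f}; 90% interval [{p5:,.0f}, {p95:,.0f}]")
```

    A tornado chart is then just these same trials re-used: rank each input by its correlation with the output to see which source of variability drives the cost.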

  13. Psychometric analysis of simulated psychopathology during sick leave

    Directory of Open Access Journals (Sweden)

    Ignacio Jáuregui Lobera

    2018-01-01

    Full Text Available Simulation, once approached from a categorical or diagnostic perspective, has come to be viewed from a more dimensional point of view, so that it is possible to establish different "levels" of simulation. In order to analyse, from a psychometric perspective, the possible prediction of simulated behaviour based on common measures of general psychopathology, the objective of the current study was to analyse possible predictors of the Structured Symptomatic Simulation Inventory (SIMS) scores, considering as dependent variables the total SIMS score, the SIMS subscale scores, and the cut-off points usually suggested to discriminate between "no suspected simulation" and "suspected simulation", which are usually 14 and 16. As possible predictors, a set of variables was established: (a) categorical: sex; type of treatment (psychopharmacological, psychotherapeutic, combined); type of work activity; being self-employed or not; presence or absence of a history of psychopathology (both familial and personal); presence or absence of associated physical pathology; diagnosis (according to ICD-10); and the final proposal (return to work, extended sick leave, proposal of permanent work incapacity); and (b) continuous: perceived stress (general and current); self-esteem; the results of a screening questionnaire for personality disorders; and scores on a symptoms questionnaire. In addition, a descriptive study of all variables was carried out and possible gender differences were analysed.

  14. AN ANALYSIS OF STUDENT‘S DESCRIPTIVE TEXT: SYSTEMIC FUNCTIONAL LINGUISTICS PERSPECTIVES

    Directory of Open Access Journals (Sweden)

    Rizka Maulina Wulandari

    2017-12-01

    Full Text Available In Indonesia, where different languages co-exist and English is used as a foreign language, learners' ability to write in English plays an important role in formulating effective learning methods. This descriptive qualitative research aimed to investigate a student's errors in writing descriptive text from an SFL perspective. A secondary, yet important, objective of this research was to design appropriate pedagogical plans, based on the results, that can be used for junior high school students in Indonesian education. The results indicated that the student has good control of the schematic structure of descriptive text, although many of his ideas still rely on an Indonesian context, which can confuse the reader. It can be concluded that there is interference from the L1, Indonesian, in his descriptive writing. Hence, the study highlights that cooperative learning could be an appropriate learning method to address students' problems in writing descriptive text.

  15. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    Science.gov (United States)

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix. © 2016 Society for Risk Analysis.
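
    The tutorial's examples are in MATLAB and R; the same embarrassingly-parallel pattern can be sketched in Python with the standard library. Each replication is independent, so seeds are simply mapped across a pool of worker processes; the replication body below is an illustrative stand-in for a full risk-model run.

```python
import random
from multiprocessing import Pool

def one_replication(seed):
    """One independent simulation replication (illustrative: the mean of
    1000 uniform draws stands in for a full risk-model run)."""
    rng = random.Random(seed)
    return sum(rng.random() for _ in range(1000)) / 1000

def run_serial(n):
    return [one_replication(s) for s in range(n)]

def run_parallel(n, workers=4):
    # Replications share no state ("embarrassingly parallel"), so the seeds
    # can be mapped across worker processes without any coordination.
    with Pool(workers) as pool:
        return pool.map(one_replication, range(n))

if __name__ == "__main__":
    # Per-replication seeding makes the parallel run reproduce the serial one.
    print(run_serial(8) == run_parallel(8))
```

    Seeding per replication, rather than sharing one generator, is what keeps results reproducible regardless of how the work is split across processes.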

  16. Simulation and Analysis of Roller Chain Drive Systems

    DEFF Research Database (Denmark)

    Pedersen, Sine Leergaard

    The subject of this thesis is the simulation and analysis of large roller chain drive systems, such as those used in marine diesel engines. The aim of developing a chain drive simulation program is to analyse the dynamic phenomena of chain drive systems and to investigate different design changes... mathematical models, and compare to prior research. Even though the model was developed first for the analysis of chain drive systems in marine engines, the methods can with small changes be used in general, e.g. for chain drives in industrial machines, car engines and motorbikes. A novel...

  17. Angular filter refractometry analysis using simulated annealing.

    Science.gov (United States)

    Angland, P; Haberberger, D; Ivancic, S T; Froula, D H

    2017-10-01

    Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical uncertainty calculation are based on the minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.
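
    The fitting loop described here (perturb parameters, accept improvements, occasionally accept worsenings as the temperature cools) can be sketched on a toy problem. The two-parameter exponential profile below is an illustrative stand-in for the paper's eight-parameter density profile, and the synthetic "data" are generated from known parameters.

```python
import math
import random

random.seed(1)

def model(x, amp, scale):
    """Hypothetical two-parameter profile (stand-in for the 8-parameter one)."""
    return amp * math.exp(-x / scale)

xs = [i * 0.1 for i in range(50)]
data = [model(x, 2.0, 1.5) for x in xs]  # synthetic "measurement", true params (2.0, 1.5)

def chi2(params):
    """Sum of squared residuals between the synthetic data and the model."""
    return sum((model(x, *params) - d) ** 2 for x, d in zip(xs, data))

def anneal(start, steps=5000, t0=1.0):
    """Simulated annealing: always accept improvements, accept worsenings
    with probability exp(-delta/T) as the temperature T cools to zero."""
    params, current = list(start), chi2(start)
    best, best_val = list(start), current
    for i in range(steps):
        temp = t0 * (1 - i / steps) + 1e-9
        trial = [p + random.gauss(0.0, 0.05) for p in params]
        val = chi2(trial)
        if val < current or random.random() < math.exp(-(val - current) / temp):
            params, current = trial, val
            if current < best_val:
                best, best_val = list(params), current
    return best, best_val

params_fit, chi2_fit = anneal([1.0, 1.0])
print("chi2 reduced from", round(chi2([1.0, 1.0]), 3), "to", round(chi2_fit, 3))
```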

  18. An Innovative Tool for Intraoperative Electron Beam Radiotherapy Simulation and Planning: Description and Initial Evaluation by Radiation Oncologists

    Energy Technology Data Exchange (ETDEWEB)

    Pascau, Javier, E-mail: jpascau@mce.hggm.es [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Departamento de Bioingenieria e Ingenieria Aeroespacial, Universidad Carlos III de Madrid, Madrid (Spain); Santos Miranda, Juan Antonio [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Calvo, Felipe A. [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Departamento de Oncologia, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Bouche, Ana; Morillo, Virgina [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); Gonzalez-San Segundo, Carmen [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Ferrer, Carlos; Lopez Tarjuelo, Juan [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); and others

    2012-06-01

    Purpose: Intraoperative electron beam radiation therapy (IOERT) involves a modified strategy of conventional radiation therapy and surgery. The lack of specific planning tools limits the spread of this technique. The purpose of the present study is to describe a new simulation and planning tool and its initial evaluation by clinical users. Methods and Materials: The tool works on a preoperative computed tomography scan. A physician contours regions to be treated and protected and simulates applicator positioning, calculating isodoses and the corresponding dose-volume histograms depending on the selected electron energy. Three radiation oncologists evaluated data from 15 IOERT patients, including different tumor locations. Segmentation masks, applicator positions, and treatment parameters were compared. Results: High parameter agreement was found in the following cases: three breast and three rectal cancer cases, a retroperitoneal sarcoma, and rectal and ovarian monotopic recurrences. All radiation oncologists performed similar segmentations of tumors and high-risk areas. The average applicator position difference was 1.2 ± 0.95 cm. The remaining cancer sites showed higher deviations because of differences in the criteria for segmenting high-risk areas (one rectal, one pancreas) and different simulated surgical access (two rectal, one Ewing sarcoma). Conclusions: The results show that this new tool can be used to simulate IOERT cases involving different anatomic locations, and that preplanning has to be carried out with specialized surgical input.

  19. Simulating potential growth and yield of oil palm (Elaeis guineensis) with PALMSIM: Model description, evaluation and application

    NARCIS (Netherlands)

    Hoffmann, M.; Castaneda Vera, A.; Wijk, van M.T.; Giller, K.E.; Oberthür, T.; Donough, C.; Whitbread, A.M.

    2014-01-01

    Reducing the gap between water-limited potential yield and actual yield in oil palm production systems through intensification is seen as an important option for sustainably increasing palm oil production. Simulation models can play an important role in quantifying water-limited potential yield, and

  20. Computational simulation of migration and dispersion in free capillary zone electrophoresis, I: Description of the theoretical model

    NARCIS (Netherlands)

    Reijenga, J.C.; Kenndler, E.

    1994-01-01

    An instrument simulator was developed for high-performance capillary electrophoresis which allows for fast graphic illustration of the effect of a large number of variables on the shape of the electropherogram. The input data of the separands are values of pK and mobilities at 25°C and infinite

  1. A Descriptive Study of Registers Found in Spoken and Written Communication (A Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Nurul Hidayah

    2016-07-01

    Full Text Available This research is a descriptive study of registers found in spoken and written communication. The type of this research is descriptive qualitative research. The data of the study are registers in spoken and written communication found in a book entitled "Communicating! Theory and Practice" and on the internet. The data can be in the form of words, phrases, and abbreviations. As the method of data collection, the writer uses the library method as her instrument, relating it to the study of registers in spoken and written communication. The data are analyzed using the descriptive method. The registers are classified into formal and informal registers, and the meaning of each register is identified.

  2. Analysis of Time Delay Simulation in Networked Control System

    OpenAIRE

    Nyan Phyo Aung; Zaw Min Naing; Hla Myo Tun

    2016-01-01

    The paper presents a PD controller for networked control systems (NCS) with delay. The major challenge in a networked control system is the delay of data transmission across the communication network. A comparative performance analysis is carried out for different network delays. In this paper, simulation is carried out on an AC servo motor control system using CAN bus as the communication network medium. The TrueTime toolbox of MATLAB is used for simulation to analy...

  3. Hydrodynamic analysis and simulation of a flow cell ammonia electrolyzer

    International Nuclear Information System (INIS)

    Diaz, Luis A.; Botte, Gerardine G.

    2015-01-01

    Highlights: • The NH3 electro-oxidation mechanism was validated in a bench-scale electrolyzer. • All kinetic parameters for NH3 electro-oxidation were calculated and verified. • The hydrodynamic behavior of the NH3 electrolyzer was properly described as a CSTR. • The CSTR model was successfully applied to simulate a flow ammonia electrolyzer. - Abstract: The hydrodynamic analysis and simulation of a non-ideal, single-pass, flow-cell alkaline ammonia electrolyzer was performed after the scale-up of a well-characterized deposited polycrystalline Pt on Ni anode. The hydrodynamic analysis was performed using the residence time distribution (RTD) test. The results of the hydrodynamic investigation provide additional insight for the kinetic analysis of the ammonia electro-oxidation reaction on polycrystalline Pt electrocatalysts (typically obtained under a controlled flow regime, e.g., with a rotating disk electrode) by including the flow non-uniformity present in the electrolyzer. Based on the RTD function, the ammonia electrolyzer performance was simulated as a non-steady continuous stirred-tank reactor (CSTR), and the unknown kinetic parameters were obtained by fitting the simulation results to an experimental current profile, yielding an adequate prediction of the ammonia conversion. This simplified approach to the simulation of the ammonia electrolyzer could be implemented in process simulation packages and used for the design and scale-up of the process for hydrogen production and wastewater remediation.
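
    The RTD-to-reactor-model step can be sketched numerically. For an ideal CSTR the residence-time distribution is E(t) = exp(-t/τ)/τ, and for a first-order reaction the conversion predicted from the RTD reduces to the familiar kτ/(1+kτ). The rate constant and residence time below are illustrative numbers, not the actual ammonia electro-oxidation kinetics.

```python
import math

def cstr_rtd(t, tau):
    """Ideal CSTR residence-time distribution: E(t) = exp(-t/tau) / tau."""
    return math.exp(-t / tau) / tau

def conversion_first_order(k, tau, dt=0.01, t_max=200.0):
    """Conversion of a first-order reaction from the RTD:
    X = 1 - integral of E(t) * exp(-k t) dt, via the midpoint rule.
    For an ideal CSTR this equals k*tau / (1 + k*tau)."""
    unreacted = 0.0
    steps = int(t_max / dt)
    for i in range(steps):
        t_mid = (i + 0.5) * dt
        unreacted += cstr_rtd(t_mid, tau) * math.exp(-k * t_mid) * dt
    return 1.0 - unreacted

k, tau = 0.5, 2.0  # illustrative rate constant (1/min) and mean residence time (min)
x_num = conversion_first_order(k, tau)
x_analytic = k * tau / (1 + k * tau)
print(round(x_num, 4), round(x_analytic, 4))  # the two agree closely
```

    With a measured (non-ideal) RTD, the same integral is evaluated against the experimental E(t) instead of the ideal exponential, which is how the flow non-uniformity enters the kinetic fit.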

  4. Raised fields in the Llanos de Moxos, Bolivia - description and analysis of their morphology

    Science.gov (United States)

    Rodrigues, Leonor; Lombardo, Umberto; Veit, Heinz

    2014-05-01

    The impact of Pre-Columbian populations on Amazonian ecosystems is being actively debated. The traditional view of Amazonia as an untouched landscape, owing to its poor soils and harsh climate, has been challenged, and the opposite idea of highly modified landscapes with complex societies is gaining ground. Recent research has provided new impulses and raised questions about the agricultural strategies people developed to survive in this climate. The Llanos de Moxos, situated in the Bolivian lowlands in south-eastern Amazonia, is one important region that was densely altered and where a great variety of earthworks can be found. Among the most impressive earthworks are the raised fields: earth platforms for cultivation, of differing shape and dimension, that are elevated above the landscape's natural surface. In contrast to "terra preta" formations, where artefacts and amendments like charcoal and kitchen waste have been clearly identified, raised fields have proven to be artefact-poor, and studies to date have found no evidence of additional amendments that could have improved soil quality in the long term. As a result, the function and productivity of raised fields are still not well understood and are being actively discussed. Detailed investigations of raised fields located in the indigenous community of Bermeo, in the vicinity of San Ignacio de Moxos, provide data supporting a novel explanation of the Pre-Columbian management of raised fields and a chronological sequence of their utilization and abandonment. OSL dating has shown that the raised fields were in use as early as 600 AD. Comparison of the geochemistry with a reference profile away from the raised fields showed no evidence of manure amendments deriving from kitchen waste or animal residues, suggesting a rather extensive use of these fields. Complementarily, the description of the internal morphology and laboratory analysis of these raised fields, combined with radiocarbon

  5. Development and simulation of various methods for neutron activation analysis

    International Nuclear Information System (INIS)

    Otgooloi, B.

    1993-01-01

    Simple methods for neutron activation analysis have been developed. Results are presented for an installation that determines fluorine in fluorite ores directly on the lorry by fast-neutron activation analysis. Nitrogen in organic materials was determined via N-14 and N-15 activation. A brief description is given of the new 'FLUORITE' equipment for the fluorite factory. Pu and Be isotopes in organic materials, including wheat, were measured. 25 figs, 19 tabs. (Author, Translated by J.U)

  6. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Directory of Open Access Journals (Sweden)

    Xuezhong Wu

    2014-02-01

    Full Text Available Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of the working modes were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were validated by testing the prototype gyroscope, which was fabricated by micro-electromechanical systems (MEMS) technology. Therefore, the frequencies of the drive mode and the sense mode can be matched through the structure optimization and simulation analysis of the quartz micromachined gyroscope, which is helpful in the design of a high-sensitivity quartz micromachined gyroscope.

  7. Kinematics Simulation Analysis of Packaging Robot with Joint Clearance

    Science.gov (United States)

    Zhang, Y. W.; Meng, W. J.; Wang, L. Q.; Cui, G. H.

    2018-03-01

    Considering the influence of joint clearance on motion error, repeated positioning accuracy, and the overall position of the machine, this paper presents a simulation analysis of a packaging robot, a 2-degrees-of-freedom (DOF) planar parallel robot, based on the high-precision and high-speed characteristics of packaging equipment. The motion constraint equation of the mechanism is established, and the analysis and simulation of the motion error are carried out for the case of clearance in the revolute joints. The simulation results show that the size of the joint clearance affects the movement accuracy and packaging efficiency of the packaging robot. The analysis provides a reference for packaging equipment design and selection criteria and is of great significance for packaging industry automation.
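The effect of joint clearance on positioning error can be illustrated with a toy model of a 2-DOF planar parallel (five-bar) mechanism. The geometry, the clearance model (a random radial offset of each joint center within the gap), and all values below are assumptions for illustration, not the paper's.

```python
import math
import random

# Toy five-bar (2-DOF planar parallel) forward kinematics with revolute
# joint clearance modeled as a random radial offset within the gap.
def circle_intersection(c1, c2, r1, r2):
    """Upper intersection point of two circles (assumed to intersect)."""
    dx, dy = c2[0] - c1[0], c2[1] - c1[1]
    d = math.hypot(dx, dy)
    a = (d * d + r1 * r1 - r2 * r2) / (2 * d)
    h = math.sqrt(max(r1 * r1 - a * a, 0.0))
    xm, ym = c1[0] + a * dx / d, c1[1] + a * dy / d
    return (xm - h * dy / d, ym + h * dx / d)

def five_bar_fk(t1, t2, clearance=0.0):
    base, l1, l2 = 0.2, 0.15, 0.25   # assumed link lengths, m
    def jitter():
        ang = random.uniform(0.0, 2.0 * math.pi)
        r = random.uniform(0.0, clearance)
        return r * math.cos(ang), r * math.sin(ang)
    jx, jy = jitter()
    b1 = (l1 * math.cos(t1) + jx, l1 * math.sin(t1) + jy)
    jx, jy = jitter()
    b2 = (base + l1 * math.cos(t2) + jx, l1 * math.sin(t2) + jy)
    return circle_intersection(b1, b2, l2, l2)

random.seed(0)
nominal = five_bar_fk(1.8, 1.3)                 # ideal joints
errors = []
for _ in range(200):                            # 0.5 mm clearance
    px, py = five_bar_fk(1.8, 1.3, clearance=0.0005)
    errors.append(math.hypot(px - nominal[0], py - nominal[1]))
```

Monte Carlo sampling of the clearance offsets turns the joint gap into a distribution of end-effector position errors, which is the kind of accuracy degradation the paper analyzes.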

  8. In situ visualization and data analysis for turbidity currents simulation

    Science.gov (United States)

    Camata, Jose J.; Silva, Vítor; Valduriez, Patrick; Mattoso, Marta; Coutinho, Alvaro L. G. A.

    2018-01-01

    Turbidity currents are underflows responsible for sediment deposits that generate geological formations of interest for the oil and gas industry. LibMesh-sedimentation is an application built upon the libMesh library to simulate turbidity currents. In this work, we present the integration of libMesh-sedimentation with in situ visualization and in transit data analysis tools. DfAnalyzer is a solution based on provenance data that extracts and relates strategic simulation data in transit from multiple sources for online queries. We integrate libMesh-sedimentation and ParaView Catalyst to perform in situ data analysis and visualization. We present a parallel performance analysis for two turbidity currents simulations showing that the overhead for both in situ visualization and in transit data analysis is negligible. We show that our tools enable monitoring the appearance of sediments at runtime and steering the simulation based on the solver convergence and on visual information about the sediment deposits, thus enhancing the analytical power of turbidity currents simulations.

  9. A computer simulation of the turbocharged turbo compounded diesel engine system: A description of the thermodynamic and heat transfer models

    Science.gov (United States)

    Assanis, D. N.; Ekchian, J. E.; Frank, R. M.; Heywood, J. B.

    1985-01-01

    A computer simulation of the turbocharged turbocompounded direct-injection diesel engine system was developed in order to study the performance characteristics of the total system as major design parameters and materials are varied. Quasi-steady flow models of the compressor, turbines, manifolds, intercooler, and ducting are coupled with a multicylinder reciprocator diesel model, where each cylinder undergoes the same thermodynamic cycle. The master cylinder model describes the reciprocator intake, compression, combustion and exhaust processes in sufficient detail to define the mass and energy transfers in each subsystem of the total engine system. Appropriate thermal loading models relate the heat flow through critical system components to material properties and design details. From this information, the simulation predicts the performance gains, and assesses the system design trade-offs which would result from the introduction of selected heat transfer reduction materials in key system components, over a range of operating conditions.

  10. A Decade of International Activities by U.S. Nurse Faculty: A Descriptive Analysis.

    Science.gov (United States)

    Lusk, Brigid; Lash, Ayhan Aytekin

    2002-01-01

    A study to assess scholarly activities of U.S. nursing faculty abroad from 1985-1995 resulted in descriptions of 805 visits to 109 countries by 247 scholars. Results showed that U.S. nurse faculty were involved in diverse and widespread international nursing activities. (Contains 25 references.) (JOW)

  11. Pakistani English Newspaper Paid Obituary Announcements: A Descriptive Analysis of the Transliterated Vocabulary

    Science.gov (United States)

    Chaudhry, Sajid M.; Christopher, Anne A.; Krishnasamy, Hariharan A/L N.

    2016-01-01

    The study, qualitative and descriptive in nature, examines the use of transliteration in the paid Pakistani obituary announcements authored in the English language. Primarily, it identifies the frequently used transliterated vocabulary in these linguistic messages and reconnoiters the functional relationship that emerges in and between the textual…

  12. Dinaka/kiba: A descriptive analysis of a Northern Sotho song-dance ...

    African Journals Online (AJOL)

    A disjuncture in the description of dinaka/kiba persists between practitioners of the genre, deemed custodians of Northern Sotho culture, and some scholars. Drawing from extensive fieldwork and ... song-dance performative compound. Keywords: Indigenous music, African drumming, African performance, African folklore, African dance.

  13. ASAS: Computational code for Analysis and Simulation of Atomic Spectra

    Directory of Open Access Journals (Sweden)

    Jhonatha R. dos Santos

    2017-01-01

    Full Text Available The laser isotope separation process is based on the selective photoionization principle and, because of this, it is necessary to know the absorption spectrum of the desired atom. Computational resources have become indispensable for the planning of experiments and the analysis of the acquired data. The ASAS (Analysis and Simulation of Atomic Spectra) software presented here is a helpful tool for studies involving atomic spectroscopy. The input for the simulations is user-friendly and essentially requires a database containing the energy levels and spectral lines of the atoms to be studied.
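The core computation behind such a tool can be sketched simply: given a table of energy levels, line positions follow from level differences. The level values below are invented for illustration and are not from any real database.

```python
# Sketch of the kind of computation a spectra tool performs: line positions
# derived from an energy-level table (values in cm^-1; levels are made up).
levels_cm = {"ground": 0.0, "level_a": 14904.0, "level_b": 25700.0}

def line_nm(e_upper, e_lower):
    """Vacuum wavelength (nm) of the transition between two levels."""
    return 1e7 / (e_upper - e_lower)

# every downward transition in the (tiny) database
lines = {
    (upper, lower): round(line_nm(eu, el), 2)
    for upper, eu in levels_cm.items()
    for lower, el in levels_cm.items()
    if eu > el
}
```

A real tool such as ASAS additionally carries line intensities, selection rules, and isotope shifts, but the wavelength bookkeeping reduces to this level-difference computation.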

  14. Two-step rating-based 'double-faced applicability' test for sensory analysis of spread products as an alternative to descriptive analysis with trained panel.

    Science.gov (United States)

    Kim, In-Ah; den-Hollander, Elyn; Lee, Hye-Seong

    2018-03-01

    Descriptive analysis with a trained sensory panel has thus far been the most well-defined methodology to characterize various products. However, in practical terms, the intensive training required for descriptive analysis has been recognized as a serious drawback. To overcome this limitation, various novel rapid sensory profiling methodologies have been suggested in the literature. Among these, attribute-based methodologies such as check-all-that-apply (CATA) questions showed results comparable to those of conventional sensory descriptive analysis. Kim, Hopkinson, van Hout, and Lee (2017a, 2017b) have proposed a novel attribute-based methodology termed the two-step rating-based 'double-faced applicability' test with a novel output measure of applicability magnitude (d'A) for measuring consumers' product usage experience throughout various product usage stages. In this paper, the potential of the two-step rating-based 'double-faced applicability' test with d'A was investigated as an alternative to conventional sensory descriptive analysis in terms of sensory characterization and product discrimination. Twelve commercial spread products were evaluated using both conventional sensory descriptive analysis with a trained sensory panel and the two-step rating-based 'double-faced applicability' test with an untrained sensory panel. The results demonstrated that the 'double-faced applicability' test can provide a direct measure of the applicability magnitude of the sensory attributes of the samples tested, in terms of d'A, for the sensory characterization of individual samples and for multiple-sample comparisons. This suggests that when an appropriate list of attributes for the questionnaire is already available, the two-step rating-based 'double-faced applicability' test with d'A can be used as a more efficient alternative to conventional descriptive analysis, without requiring any intensive training process. Copyright © 2017 Elsevier Ltd. All rights reserved.
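The d'A measure is estimated from applicability ratings via a Thurstonian model. A simplified signal-detection-style sketch (not the authors' exact two-step estimator) converts two 'applicable' proportions into a d'-type index with the inverse normal CDF:

```python
from statistics import NormalDist

# Simplified stand-in for a Thurstonian applicability index: the
# difference of normal quantiles of two 'applicable' proportions.
# This illustrates the idea, not the authors' exact d'A estimator.
def d_prime(p_target, p_baseline):
    """z-transform difference of two proportions (signal-detection style)."""
    z = NormalDist().inv_cdf
    return z(p_target) - z(p_baseline)

# e.g. an attribute judged applicable by 84% of consumers for the target
# product but only 50% for the comparison product:
delta = d_prime(0.84, 0.50)
```

Larger values indicate that the attribute discriminates the target sample more strongly, which is how a d'-type magnitude supports both single-sample characterization and multiple-sample comparisons.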

  15. Contribution to aroma characteristics of mutton process flavor from the enzymatic hydrolysate of sheep bone protein assessed by descriptive sensory analysis and gas chromatography olfactometry.

    Science.gov (United States)

    Zhan, Ping; Tian, Honglei; Zhang, Xiaoming; Wang, Liping

    2013-03-15

    Changes in the aroma characteristics of mutton process flavors (MPFs) prepared from sheep bone protein hydrolysates (SBPHs) with different degrees of hydrolysis (DH) were evaluated using gas chromatography-mass spectrometry (GC-MS), gas chromatography-olfactometry (GC-O), and descriptive sensory analysis (DSA). Five attributes (muttony, meaty, roasted, mouthful, and simulate) were selected to assess MPFs. The results of DSA showed a distinct difference among the control sample MPF0 and other MPF samples with added SBPHs for different DHs of almost all sensory attributes. MPF5 (DH 25.92%) was the strongest in the muttony, meaty, and roasted attributes, whereas MPF6 (DH 30.89%) was the strongest in the simulate and roasted attributes. Thirty-six compounds were identified as odor-active compounds for the evaluation of the sensory characteristics of MPFs via GC-MS-O analysis. The results of correlation analysis among odor-active compounds, molecular weight, and DSA further confirmed that the SBPH with a DH range of 25.92-30.89% may be a desirable precursor for the sensory characteristics of MPF. Copyright © 2013 Elsevier B.V. All rights reserved.

  16. Flow analysis of HANARO flow simulated test facility

    International Nuclear Information System (INIS)

    Park, Yong-Chul; Cho, Yeong-Garp; Wu, Jong-Sub; Jun, Byung-Jin

    2002-01-01

    The HANARO, a multi-purpose research reactor of 30 MWth open-tank-in-pool type, has been under normal operation since its initial criticality in February 1995. Many experiments should be safely performed to promote the utilization of the HANARO. A flow simulated test facility is being developed for the endurance testing of reactivity control units over extended lifetimes and for the verification of the structural integrity of experimental facilities prior to loading in the HANARO. This test facility is composed of three major parts: a half-core structure assembly, a flow circulation system, and a support system. The half-core structure assembly is composed of a plenum, a grid plate, core channels with flow tubes, a chimney, and a dummy pool. The flow channels are to be fitted with flow orifices to simulate the core channels. This test facility must reproduce flow characteristics similar to those of the HANARO. This paper, therefore, describes an analysis of the flow behavior of the test facility. A computational flow analysis was performed to verify the flow structure and similarity of this test facility, assuming that the flow rates and pressure differences of the core channel are constant. The shapes of the flow orifices were determined by trial and error based on the design requirements of the core channel. A computational analysis program with the standard k-ε turbulence model was applied in three dimensions. The results of the flow simulation showed flow characteristics similar to those of the HANARO and satisfied the design requirements of the test facility. The orifice shapes used in this numerical simulation can be adapted to manufacturing requirements. The flow rate and the pressure difference through the core channel proved by this simulation can be used as design requirements for the flow system. The analysis results will be verified against the results of the flow test after construction of the flow system. (author)
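The trial-and-error orifice sizing mentioned above amounts to matching a target channel pressure drop at the design velocity. A minimal sketch using the standard loss-coefficient relation ΔP = K·ρ·v²/2 (all numbers are illustrative, not HANARO design data):

```python
# Orifice sizing sketch: choose a loss coefficient K that reproduces a
# target channel pressure drop at the design velocity, using the standard
# relation dP = K * rho * v^2 / 2. Numbers are illustrative only.
RHO = 998.0  # water density, kg/m^3

def pressure_drop(k_loss, vel):
    """Pressure drop (Pa) for loss coefficient k_loss at velocity vel (m/s)."""
    return k_loss * RHO * vel ** 2 / 2.0

def required_k(target_dp, vel):
    """Invert the relation: the K that gives target_dp (Pa) at vel (m/s)."""
    return 2.0 * target_dp / (RHO * vel ** 2)

k = required_k(target_dp=50e3, vel=4.0)   # e.g. 50 kPa at 4 m/s
```

In practice the geometric orifice shape that realizes a given K is what the CFD trial-and-error iterations determine; the loss-coefficient relation only fixes the hydraulic target.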

  17. Simulation Analysis of Helicopter Ground Resonance Nonlinear Dynamics

    Science.gov (United States)

    Zhu, Yan; Lu, Yu-hui; Ling, Ai-min

    2017-07-01

    In order to accurately predict the dynamic instability of helicopter ground resonance, a modeling and simulation method for helicopter ground resonance considering the nonlinear dynamic characteristics of components (rotor lead-lag damper, landing gear wheel and absorber) is presented. A numerical integration method is used to calculate the transient responses of the body and rotor under a simulated disturbance. To obtain quantitative instabilities, a Fast Fourier Transform (FFT) is conducted to estimate the modal frequencies, and the moving rectangular window method is employed to predict the modal damping from the response time history. Simulation results show that the ground resonance simulation accurately captures the blade lead-lag regressive mode frequency, and the modal damping obtained from the attenuation curves is close to the test results. The simulation results are in accordance with the actual accident situation and prove the correctness of the simulation method. This analysis method for ground resonance simulation can give results consistent with real helicopter engineering tests.
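The identification steps described (modal frequency from the response, damping from the decay envelope) can be illustrated with a simplified log-decrement sketch on a synthetic decaying response; this is a textbook stand-in for, not a reproduction of, the paper's FFT and moving-rectangular-window method.

```python
import math

# Given a decaying free response, estimate the modal frequency from peak
# spacing and the damping ratio from the logarithmic decrement of
# successive peaks (simplified analog of the paper's identification step).
def damped_response(freq_hz, zeta, dt, n):
    wn = 2.0 * math.pi * freq_hz
    wd = wn * math.sqrt(1.0 - zeta ** 2)
    return [math.exp(-zeta * wn * i * dt) * math.cos(wd * i * dt)
            for i in range(n)]

def estimate_modal_params(x, dt):
    # interior local maxima of the oscillatory response
    peaks = [(i, v) for i, v in enumerate(x[1:-1], 1)
             if x[i - 1] < v > x[i + 1]]
    (i1, p1), (i2, p2) = peaks[0], peaks[1]
    period = (i2 - i1) * dt
    delta = math.log(p1 / p2)                      # logarithmic decrement
    zeta = delta / math.sqrt(4.0 * math.pi ** 2 + delta ** 2)
    return 1.0 / period, zeta

sig = damped_response(freq_hz=3.0, zeta=0.05, dt=0.001, n=2000)
f_est, z_est = estimate_modal_params(sig, 0.001)
```

Applied to a synthetic 3 Hz mode with 5% damping, the estimator recovers both parameters closely; in a ground-resonance study the same quantities would come from the simulated body and rotor responses.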

  18. Development of a dynamic model to evaluate economic recovery following a nuclear attack. Volume 1. Description and simulations. Final report

    International Nuclear Information System (INIS)

    Peterson, D.W.; Silverman, W.S.; Weil, H.B.; Willard, S.

    1980-11-01

    A highly robust, dynamic simulation model of the US economy has been constructed to evaluate the likely economic response after various nuclear attacks or other severe disruptions, under various policies and assumptions. The model consists of a large system of nonlinear, recursive, time-difference equations. The solution interval of the model is adjustable, with a maximum value of three weeks. The model represents the economy in thirteen sectors. Each sector contains a detailed representation of production, distribution, supply constraints, finance, employment, pricing, and wages. Also included are a full input-output representation of the interconnections among the sectors, and the psychological responses of corporate planners, consumers, and the labor force. The model's equations are formulated to remain consistent and realistic for all values of the variables, including the most extreme conditions. Therefore, the model can realistically simulate any degree or time sequence of nuclear attacks, pre-attack surges, mobilization, or policy shifts. Simulation experiments with the model suggest that the economy is highly vulnerable to nuclear attack, and that recovery requires extensive preparation, including psychological readiness, technology maintenance, special financial policies, and (if possible) maintenance of foreign trade. Civil defense policies must be adaptive (contingent on the nature of the damage) and must strive for balance among sectors, rather than maximum survival. This volume includes two appendices. Appendix A defines the aggregation of the model. Appendix B outlines the range of attack scenarios, pre-attack civil defense policies, and post-attack civil defense policies that can be evaluated with the model, including the model variables applicable to implementing those policies
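The model's mathematical form, recursive nonlinear time-difference equations with an adjustable solution interval, can be illustrated by a single toy sector; the real model couples thirteen sectors, and all numbers here are invented.

```python
# Toy illustration of the model's form: a recursive nonlinear
# time-difference equation with an adjustable solution interval dt
# (in weeks, up to the model's three-week maximum).
def simulate_sector(capacity, demand, adjust_time=6.0, dt=3.0, n_steps=40):
    """One sector: output chases demand but is clipped by surviving capacity."""
    output = 0.5 * capacity
    path = []
    for _ in range(n_steps):
        target = min(demand, capacity)        # nonlinearity: capacity limit
        output += dt / adjust_time * (target - output)
        path.append(output)
    return path

pre_attack = simulate_sector(capacity=100.0, demand=80.0)
post_attack = simulate_sector(capacity=40.0, demand=80.0)  # capacity destroyed
```

The capacity clip is the kind of nonlinearity that keeps such equations "consistent and realistic for all values of the variables": after a simulated attack the sector converges to its surviving capacity rather than to unmet demand.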

  19. Abnormal transient analysis by using PWR plant simulator, (2)

    International Nuclear Information System (INIS)

    Naitoh, Akira; Murakami, Yoshimitsu; Yokobayashi, Masao.

    1983-06-01

    This report describes the results of abnormal transient analysis using a PWR plant simulator. The simulator is based on an existing 822 MWe power plant with 3 loops and is designed to cover a wide range of plant operation, from cold shutdown to full power at EOL. In the simulator, malfunctions are provided for abnormal conditions arising from equipment failures; in this report, 17 malfunctions of the secondary system and 4 malfunctions of the nuclear instrumentation systems were simulated. The abnormal conditions include turbine and generator trips; failures of the condenser, feedwater system, and valves; and detector failures for pressure and water level. Furthermore, failures of nuclear instrumentation are included, such as the source range channel, intermediate range channel, and audio counter. The transient behaviors caused by the added malfunctions were reasonable, and detailed information on the dynamic characteristics of the turbine-condenser system was obtained. (author)

  20. Environmental management policy analysis using complex system simulation

    International Nuclear Information System (INIS)

    Van Eeckhout, E.; Roberts, D.; Oakes, R.; Shieh, A.; Hardie, W.; Pope, P.

    1999-01-01

    The two primary modules of Envirosim (the model of Los Alamos TA-55 and the WIPP transport/storage model) have been combined into one application, with the simulated waste generated by TA-55 operations being fed to storage, packaging, and transport simulation entities. Three simulation scenarios were executed which demonstrate the usefulness of Envirosim as a policy analysis tool for use in planning shipments to WIPP. A graphical user interface (GUI) has been implemented using IDL (Interactive Data Language) which allows the analyst to easily view simulation results. While IDL is not necessarily the graphics interface that would be selected for a production version of Envirosim, it does provide some powerful data manipulation capabilities, and it runs on a variety of platforms

  1. Toward verifying fossil fuel CO2 emissions with the CMAQ model: motivation, model description and initial simulation.

    Science.gov (United States)

    Liu, Zhen; Bambha, Ray P; Pinto, Joseph P; Zeng, Tao; Boylan, Jim; Huang, Maoyi; Lei, Huimin; Zhao, Chun; Liu, Shishi; Mao, Jiafu; Schwalm, Christopher R; Shi, Xiaoying; Wei, Yaxing; Michelsen, Hope A

    2014-04-01

    Motivated by the question of whether and how a state-of-the-art regional chemical transport model (CTM) can facilitate characterization of CO2 spatiotemporal variability and verify CO2 fossil-fuel emissions, we for the first time applied the Community Multiscale Air Quality (CMAQ) model to simulate CO2. This paper presents methods, input data, and initial results for CO2 simulation using CMAQ over the contiguous United States in October 2007. Modeling experiments have been performed to understand the roles of fossil-fuel emissions, biosphere-atmosphere exchange, and meteorology in regulating the spatial distribution of CO2 near the surface over the contiguous United States. Three sets of net ecosystem exchange (NEE) fluxes were used as input to assess the impact of the uncertainty of NEE on CO2 concentrations simulated by CMAQ. Observational data from six tall-tower sites across the country were used to evaluate model performance. In particular, at the Boulder Atmospheric Observatory (BAO), a tall-tower site that receives urban emissions from Denver, CO, the CMAQ model using hourly varying, high-resolution CO2 fossil-fuel emissions from the Vulcan inventory and CarbonTracker-optimized NEE reproduced the observed diurnal profile of CO2 reasonably well but with a low bias in the early morning. The spatial distribution of CO2 was found to correlate with those of NOx, SO2, and CO because of their similar fossil-fuel emission sources and common transport processes. These initial results from CMAQ demonstrate the potential of using a regional CTM to help interpret CO2 observations and understand CO2 variability in space and time. The ability to simulate a full suite of air pollutants in CMAQ will also facilitate investigations of their use as tracers for CO2 source attribution. This work serves as a proof of concept and the foundation for more comprehensive examinations of CO2 spatiotemporal variability and various uncertainties in the future.

  2. Needs analysis for developing a virtual-reality NOTES simulator.

    Science.gov (United States)

    Sankaranarayanan, Ganesh; Matthes, Kai; Nemani, Arun; Ahn, Woojin; Kato, Masayuki; Jones, Daniel B; Schwaitzberg, Steven; De, Suvranu

    2013-05-01

    INTRODUCTION AND STUDY AIM: Natural orifice translumenal endoscopic surgery (NOTES) is an emerging surgical technique that requires a cautious adoption approach to ensure patient safety. High-fidelity virtual-reality-based simulators allow development of new surgical procedures and tools and train medical personnel without risk to human patients. As part of a project funded by the National Institutes of Health, we are developing the virtual transluminal endoscopic surgery trainer (VTEST) for this purpose. The objective of this study is to conduct a structured needs analysis to identify the design parameters for such a virtual-reality-based simulator for NOTES. A 30-point questionnaire was distributed at the 2011 Natural Orifice Surgery Consortium for Assessment and Research meeting to obtain responses from experts. Ordinal logistic regression and the Wilcoxon rank-sum test were used for analysis. A total of 22 NOTES experts participated in the study. Cholecystectomy (CE, 68 %) followed by appendectomy (AE, 63 %) (CE vs AE, p = 0.0521) was selected as the first choice for simulation. Flexible (FL, 47 %) and hybrid (HY, 47 %) approaches were equally favored over the rigid approach (RI, 6 %). The importance of a virtual NOTES simulator in training and testing new tools for NOTES was rated very high by the participants. Our study reinforces the importance of developing a virtual NOTES simulator and clearly presents expert preferences. The results of this analysis will direct our initial development of the VTEST platform.

  3. The Satisfaction of Entrepreneurs in Terengganu Private Limited Companies toward the Concept of Corporate Entrepreneurship: A Descriptive Analysis

    OpenAIRE

    Muhammad Abi Sofian Abdul Halim; Hasyaniza Yahya; Syahrul Hezrin Mahmud; Fatimah Zainab

    2011-01-01

    This study determines the level of satisfaction among Terengganu private limited companies towards the concept of corporate entrepreneurship, in the context of corporate venturing, strategic renewal, and internalization. The study was based on a survey carried out via a questionnaire administered to 105 private limited companies operating their businesses in Terengganu. Using descriptive analysis of the level of satisfaction among the Terengganu private limi...

  4. Thermodynamic description of the Al-Cu-Mg-Mn-Si quinary system and its application to solidification simulation

    International Nuclear Information System (INIS)

    Chang, Keke; Liu, Shuhong; Zhao, Dongdong; Du, Yong; Zhou, Liangcai; Chen, Li

    2011-01-01

    By means of first-principles calculations, the enthalpy of formation for the quaternary phase in the Al-Cu-Mg-Si system was computed. A set of self-consistent thermodynamic parameters for the Al-Cu-Mg-Si and Al-Cu-Mn-Si systems was then obtained using the CALPHAD approach, taking into account the reliable experimental data and the first-principles calculations. The thermodynamic database for the Al-Cu-Mg-Mn-Si system was developed based on the constituent binary, ternary, and quaternary systems. Comprehensive comparisons between the calculated and measured phase diagrams and invariant reactions showed that the experimental information was satisfactorily accounted for by the present thermodynamic description. The obtained database was used to describe the solidification behavior of Al alloys B319.1 (90.2Al-6Si-3.5Cu-0.3Mg, in wt.%) and B319.1 + xMn (x = 0.5-2, in wt.%) under the Gulliver-Scheil non-equilibrium condition. The reliability of the present thermodynamic database was also verified by the good agreement between calculation and experiment for Gulliver-Scheil non-equilibrium solidification.
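The Gulliver-Scheil condition invoked above has a simple closed form for a binary alloy with a constant partition coefficient k (no diffusion in the solid, complete mixing in the liquid). The sketch below compares it with the incremental form a solidification simulator would march; k and the compositions are illustrative, not the fitted Al-alloy values.

```python
# Closed-form Gulliver-Scheil liquid composition for a binary alloy with
# constant partition coefficient k: C_L = C0 * (1 - fs)**(k - 1).
def scheil_liquid_composition(c0, k, fs):
    return c0 * (1.0 - fs) ** (k - 1.0)

# Incremental form, as a solidification simulator would march it:
# dC_L = C_L * (1 - k) * dfs / (1 - fs)  (solute rejected into the liquid).
def scheil_numeric(c0, k, fs_end=0.9, steps=10000):
    cl, fs = c0, 0.0
    dfs = fs_end / steps
    for _ in range(steps):
        cl += cl * (1.0 - k) * dfs / (1.0 - fs)
        fs += dfs
    return cl

analytic = scheil_liquid_composition(1.0, 0.13, 0.9)  # k ~ Si in Al (rough)
numeric = scheil_numeric(1.0, 0.13)
```

A multicomponent CALPHAD calculation replaces the constant-k assumption with partition coefficients evaluated from the thermodynamic database at each step, but the marching structure is the same.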

  5. Development of Nuclear Plant Specific Analysis Simulators with ATLAS

    International Nuclear Information System (INIS)

    Jakubowski, Z.; Draeger, P.; Horche, W.; Pointner, W.

    2006-01-01

    The simulation software ATLAS, based on the best-estimate code ATHLET, has been developed by GRS for a range of applications in the field of nuclear plant safety analysis. Through versatile simulation tools and graphical interfaces, the user is able to analyse all essential accident scenarios with ATLAS. Detailed analysis simulators for several German and Russian NPPs are being constructed on the basis of ATLAS. An overview of ATLAS is presented in the paper, describing its configuration, the functions performed by the main components, and the relationships among them. A significant part of any power plant simulator is the balance-of-plant (BOP) models, not only because all plant transients and non-LOCA accidents can be initiated by the operation of BOP systems, but also because the response of the plant to transients or accidents is strongly influenced by the automatic operation of BOP systems. Modelling aspects of BOP systems are shown in detail, as is the interface between the process model and the BOP systems. Special emphasis has been put on the BOP model builder, based on the methodology developed at GRS. The BOP modeller, called GCSM-Generator, is an object-oriented tool which runs on the online expert system G2. It is equipped with utilities to edit the BOP models, to verify them, and to generate GCSM code specific to ATLAS. The communication system of ATLAS graphically presents the results of the simulation and allows the user to interactively influence the execution of the simulation process (malfunctions, manual control). Displays for communication with simulated processes and presentation of calculation results are also presented. In the framework of the verification of simulation models, different tools are used, e.g. the PC code MATHCAD for calculation and documentation, ATHLET-Input-Graphic for control of geometry data, and the expert system G2 for the development of BOP models. The validation procedure and selected analyses results

  6. Description of the TREBIL, CRESSEX and STREUSL computer programs, that belongs to RALLY computer code pack for the analysis of reliability systems

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.

    1982-11-01

    The RALLY computer code pack (RALLY pack) is a set of computer codes intended for the reliability analysis of complex systems, with a view to risk analysis. Three of the six codes are described, presenting their purpose, input description, calculation methods, and the results obtained with each. The computer codes are: TREBIL, to obtain the logical equivalent of the fault tree; CRESSEX, to obtain the minimal cut sets and the point values of the unreliability and unavailability of the system; and STREUSL, for calculating the dispersion of those values around the mean. Although CRESSEX, in the version available at CNEN, uses a somewhat lengthy method to obtain the minimal cut sets in an HB-CNEN system, the three computer programs show good results, especially STREUSL, which permits the simulation of various components. (E.G.) [pt
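The minimal-cut-set computation that CRESSEX performs can be illustrated with a toy MOCUS-style top-down expansion; this is the generic textbook algorithm, not the RALLY pack's actual implementation.

```python
# Toy MOCUS-style minimal cut set expansion for a small fault tree.
def minimal_cut_sets(gates, top):
    """gates: name -> ('AND'|'OR', [children]); other names are basic events."""
    cuts = [{top}]
    expanded = True
    while expanded:
        expanded = False
        new_cuts = []
        for cut in cuts:
            gate = next((g for g in cut if g in gates), None)
            if gate is None:
                new_cuts.append(cut)      # only basic events left
                continue
            expanded = True
            op, children = gates[gate]
            rest = cut - {gate}
            if op == 'AND':               # AND: all children in the same cut
                new_cuts.append(rest | set(children))
            else:                         # OR: one new cut per child
                new_cuts.extend(rest | {c} for c in children)
        cuts = new_cuts
    # discard non-minimal sets (supersets of another cut set)
    return [c for c in cuts if not any(o < c for o in cuts)]

tree = {
    'TOP': ('OR', ['G1', 'E3']),
    'G1': ('AND', ['E1', 'E2']),
}
cuts = minimal_cut_sets(tree, 'TOP')
```

For this tree the algorithm yields {E3} and {E1, E2}; point unavailability estimates then follow by combining basic-event probabilities over the cut sets, which is the step a code like CRESSEX performs at scale.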

  7. ANCA: Anharmonic Conformational Analysis of Biomolecular Simulations.

    Science.gov (United States)

    Parvatikar, Akash; Vacaliuc, Gabriel S; Ramanathan, Arvind; Chennubhotla, S Chakra

    2018-05-08

    Anharmonicity in time-dependent conformational fluctuations is noted to be a key feature of functional dynamics of biomolecules. Although anharmonic events are rare, long-timescale (μs-ms and beyond) simulations facilitate probing of such events. We have previously developed quasi-anharmonic analysis to resolve higher-order spatial correlations and characterize anharmonicity in biomolecular simulations. In this article, we have extended this toolbox to resolve higher-order temporal correlations and built a scalable Python package called anharmonic conformational analysis (ANCA). ANCA has modules to: 1) measure anharmonicity in the form of higher-order statistics and its variation as a function of time, 2) output a storyboard representation of the simulations to identify key anharmonic conformational events, and 3) identify putative anharmonic conformational substates and visualization of transitions between these substates. Copyright © 2018 Biophysical Society. Published by Elsevier Inc. All rights reserved.
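A simple proxy for the anharmonicity that ANCA quantifies (far simpler than its higher-order spatial and temporal correlation analysis) is the excess kurtosis of a coordinate's fluctuations, computed over sliding windows; the synthetic traces below are illustrative.

```python
import random

# Excess kurtosis of coordinate fluctuations as a crude anharmonicity
# flag: near zero for harmonic (Gaussian) fluctuations, strongly negative
# for bimodal traces that hop between conformational substates.
def excess_kurtosis(xs):
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 * m2) - 3.0

def windowed_kurtosis(xs, window):
    """Kurtosis in consecutive windows: a time-resolved anharmonicity trace."""
    return [excess_kurtosis(xs[i:i + window])
            for i in range(0, len(xs) - window + 1, window)]

random.seed(1)
harmonic = [random.gauss(0.0, 1.0) for _ in range(2000)]        # one basin
hopping = harmonic[:1000] + [x + 6.0 for x in harmonic[1000:]]  # substate jump
```

The full hopping trace is strongly non-Gaussian (negative excess kurtosis from its bimodality), while each window alone looks harmonic, which is why time-resolved higher-order statistics, as in ANCA, are needed to localize the rare anharmonic events.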

  8. Simulation analysis for integrated evaluation of technical and commercial risk

    International Nuclear Information System (INIS)

    Gutleber, D.S.; Heiberger, E.M.; Morris, T.D.

    1995-01-01

    Decisions to invest in oil- and gasfield acquisitions or participating interests often are based on the perceived ability to enhance the economic value of the underlying asset. A multidisciplinary approach integrating reservoir engineering, operations and drilling, and deal structuring with Monte Carlo simulation modeling can overcome weaknesses of deterministic analysis and significantly enhance investment decisions. This paper discusses the use of spreadsheets and Monte Carlo simulation to generate probabilistic outcomes for key technical and economic parameters for ultimate identification of the economic volatility and value of potential deal concepts for a significant opportunity. The approach differs from a simple risk analysis for an individual well by incorporating detailed, full-field simulations that vary the reservoir parameters, capital and operating cost assumptions, and schedules on timing in the framework of various deal structures
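The Monte Carlo approach described above can be sketched with a toy spreadsheet-style model that propagates uncertain technical and economic inputs into a distribution of outcomes. All distributions and figures below are illustrative assumptions, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 50_000

# Hypothetical input distributions (illustrative, not field data):
reserves = rng.lognormal(mean=np.log(20), sigma=0.3, size=n)  # recoverable volume
price = rng.normal(18.0, 3.0, size=n)                         # unit price
capex = rng.triangular(80, 100, 140, size=n)                  # capital cost
opex_frac = rng.uniform(0.25, 0.40, size=n)                   # operating cost fraction

revenue = reserves * price
npv = revenue * (1 - opex_frac) - capex                       # simplified, undiscounted value

p10, p50, p90 = np.percentile(npv, [10, 50, 90])
prob_loss = np.mean(npv < 0)
print(f"P10={p10:.0f}  P50={p50:.0f}  P90={p90:.0f}  P(loss)={prob_loss:.2%}")
```

The percentile spread and probability of loss are the kind of volatility measures the paper uses to compare deal structures, which a single deterministic case cannot provide.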

  9. Man-machine interfaces analysis system based on computer simulation

    International Nuclear Information System (INIS)

    Chen Xiaoming; Gao Zuying; Zhou Zhiwei; Zhao Bingquan

    2004-01-01

    The paper describes a software assessment system, Dynamic Interaction Analysis Support (DIAS), based on computer simulation technology for the man-machine interfaces (MMI) of a control room. It employs a computer to simulate operators' procedures on the man-machine interfaces of a control room, provides a quantified assessment, and at the same time analyses operators' error rates by means of human error rate prediction techniques. Problems in the placement of man-machine interfaces and the arrangement of instruments in a control room can be detected from the simulation results. DIAS can provide good technical support for the design and improvement of the man-machine interfaces of the main control room of a nuclear power plant

  10. Fluid analysis of electromagnetic ballooning modes in a fully toroidal description

    International Nuclear Information System (INIS)

    Andersson, P.; Weiland, J.

    1986-01-01

    A comparatively complete two-fluid description of collisionless electromagnetic ballooning modes has been derived. Using an unexpanded ion density response, it has been shown that a necessary and sufficient condition for an instability below the MHD β limit is the presence of an ion temperature gradient exceeding a threshold. The cause of this instability has been identified and an analytical dispersion relation is given. (authors)

  11. Descriptive business intelligence analysis: cutting edge strategic asset for SMEs, is it really worth it?

    OpenAIRE

    Sivave Mashingaidze

    2014-01-01

    The purpose of this article is to provide a framework for understanding and adoption of Business Intelligence by (SMEs) within the Zimbabwean economy. The article explores every facet of Business Intelligence, including internal and external BI as cutting edge strategic asset. A descriptive research methodology has been adopted. The article revealed some BI critical success factors for better BI implementation. Findings revealed that organizations which have the greatest success with BI trave...

  12. Management of Industrial Performance Indicators: Regression Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Walter Roberto Hernandez Vergara

    2017-11-01

    Full Text Available Stochastic methods can be used in problem solving and explanation of natural phenomena through the application of statistical procedures. The article aims to associate the regression analysis and systems simulation, in order to facilitate the practical understanding of data analysis. The algorithms were developed in Microsoft Office Excel software, using statistical techniques such as regression theory, ANOVA and Cholesky Factorization, which made it possible to create models of single and multiple systems with up to five independent variables. For the analysis of these models, the Monte Carlo simulation and analysis of industrial performance indicators were used, resulting in numerical indices that aim to improve the goals’ management for compliance indicators, by identifying systems’ instability, correlation and anomalies. The analytical models presented in the survey indicated satisfactory results with numerous possibilities for industrial and academic applications, as well as the potential for deployment in new analytical techniques.
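The Cholesky-factorization step mentioned in the abstract is the standard way to induce a target correlation among sampled inputs before a Monte Carlo run. A minimal sketch with assumed values (not the article's indicator data): factor the correlation matrix, then multiply independent standard normals by the factor.

```python
import numpy as np

rng = np.random.default_rng(7)

# Target correlation between two performance indicators (assumed value)
corr = np.array([[1.0, 0.8],
                 [0.8, 1.0]])
L = np.linalg.cholesky(corr)             # lower-triangular factor, corr = L @ L.T

z = rng.standard_normal((100_000, 2))    # independent standard normals
x = z @ L.T                              # rows now have the target correlation

sample_corr = np.corrcoef(x, rowvar=False)[0, 1]
print(round(sample_corr, 2))             # close to the target 0.8
```

The same construction extends to the article's five-variable case with a 5x5 correlation matrix, provided it is positive definite.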

  13. Discrete event simulation versus conventional system reliability analysis approaches

    DEFF Research Database (Denmark)

    Kozine, Igor

    2010-01-01

    Discrete Event Simulation (DES) environments are rapidly developing and appear to be promising tools for building reliability and risk analysis models of safety-critical systems and human operators. If properly developed, they are an alternative to the conventional human reliability analysis models...... and systems analysis methods such as fault and event trees and Bayesian networks. As one part, the paper describes briefly the author’s experience in applying DES models to the analysis of safety-critical systems in different domains. The other part of the paper is devoted to comparing conventional approaches...

  14. Digital Simulation-Based Training: A Meta-Analysis

    Science.gov (United States)

    Gegenfurtner, Andreas; Quesada-Pallarès, Carla; Knogler, Maximilian

    2014-01-01

    This study examines how design characteristics in digital simulation-based learning environments moderate self-efficacy and transfer of learning. Drawing on social cognitive theory and the cognitive theory of multimedia learning, the meta-analysis psychometrically cumulated k = 15 studies of 25 years of research with a total sample size of…

  15. Sensitivity analysis for oblique incidence reflectometry using Monte Carlo simulations

    DEFF Research Database (Denmark)

    Kamran, Faisal; Andersen, Peter E.

    2015-01-01

    profiles. This article presents a sensitivity analysis of the technique in turbid media. Monte Carlo simulations are used to investigate the technique and its potential to distinguish the small changes between different levels of scattering. We present various regions of the dynamic range of optical...

  16. Social interaction, globalization and computer-aided analysis a practical guide to developing social simulation

    CERN Document Server

    Osherenko, Alexander

    2014-01-01

    This thorough, multidisciplinary study discusses the findings of social interaction and social simulation using understandable global examples. It shows the reader how to acquire intercultural data, illustrating each step with descriptive comments and program code.

  17. An in-depth description of bipolar resistive switching in Cu/HfOx/Pt devices, a 3D kinetic Monte Carlo simulation approach

    Science.gov (United States)

    Aldana, S.; Roldán, J. B.; García-Fernández, P.; Suñe, J.; Romero-Zaliz, R.; Jiménez-Molinos, F.; Long, S.; Gómez-Campos, F.; Liu, M.

    2018-04-01

    A simulation tool based on a 3D kinetic Monte Carlo algorithm has been employed to analyse bipolar conductive bridge RAMs fabricated with Cu/HfOx/Pt stacks. Resistive switching mechanisms are described accounting for the electric field and temperature distributions within the dielectric. The formation and destruction of conductive filaments (CFs) are analysed taking into consideration redox reactions and the joint action of metal ion thermal diffusion and electric field induced drift. Filamentary conduction is considered when different percolation paths are formed in addition to other conventional transport mechanisms in dielectrics. The simulator was tuned by using the experimental data for Cu/HfOx/Pt bipolar devices that were fabricated. Our simulation tool allows for the study of different experimental results, in particular, the current variations due to the electric field changes between the filament tip and the electrode in the High Resistance State. In addition, the density of metallic atoms within the CF can also be characterized along with the corresponding CF resistance description.
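The kinetic Monte Carlo core such a simulator builds on can be sketched in a few lines: each step selects one event with probability proportional to its rate, and advances time by an exponentially distributed waiting time with the total rate as parameter. The rates below are placeholders, not the paper's physical model of ion drift, diffusion, and redox reactions.

```python
import math
import random

def kmc_step(rates, rng):
    """One kinetic Monte Carlo step: pick an event with probability
    proportional to its rate, and draw the exponential waiting time."""
    total = sum(rates)
    r = rng.random() * total
    acc = 0.0
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            event = i
            break
    dt = -math.log(1.0 - rng.random()) / total  # exponential waiting time
    return event, dt

rng = random.Random(1)
rates = [5.0, 1.0, 0.1]   # placeholder rates for three event types
counts = [0, 0, 0]
t = 0.0
for _ in range(10_000):
    e, dt = kmc_step(rates, rng)
    counts[e] += 1
    t += dt
print(counts, round(t, 1))  # fast events dominate the event counts
```

In a full simulator the rates themselves depend on the local electric field and temperature, so they are recomputed after every step; this sketch keeps them fixed for clarity.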

  18. Simulating the formation and evolution of galaxies: multi-phase description of the interstellar medium, star formation, and energy feedback

    Science.gov (United States)

    Merlin, E.; Chiosi, C.

    2007-10-01

    Context: Modelling the gaseous component of the interstellar medium (ISM) by Smoothed Particles Hydrodynamics in N-Body simulations (NB-TSPH) is still very crude when compared to the complex real situation. In the real ISM, many different and almost physically decoupled components (phases) coexist for long periods of time, and since they spread over wide ranges of density and temperature, they cannot be correctly represented by a unique continuous fluid. This would influence star formation which is thought to take place in clumps of cold, dense, molecular clouds, embedded in a warmer, neutral medium, that are almost freely moving throughout the tenuous hot ISM. Therefore, assuming that star formation is simply related to the gas content without specifying the component in which this is both observed and expected to occur may not be physically sound. Aims: We consider a multi-phase representation of the ISM in NB-TSPH simulations of galaxy formation and evolution with particular attention to the case of early-type galaxies. Methods: Cold gas clouds are described by the so-called sticky particles algorithm. They can freely move throughout the hot ISM medium; stars form within these clouds and the mass exchange among the three baryonic phases (hot gas, cold clouds, stars) is governed by radiative and Compton cooling and energy feedback by supernova (SN) explosions, stellar winds, and UV radiation. We also consider thermal conduction, cloud-cloud collisions, and chemical enrichment. Results: Our model agrees with and improves upon previous studies on the same subject. The results for the star formation rate agree with recent observational data on early-type galaxies. Conclusions: These models lend further support to the revised monolithic scheme of galaxy formation, which has recently been strengthened by high redshift data leading to the so-called downsizing and top-down scenarios.

  19. An attempt for a unified description of mechanical testing on Zircaloy-4 cladding subjected to simulated LOCA transient

    Directory of Open Access Journals (Sweden)

    Desquines Jean

    2016-01-01

    Full Text Available During a Loss Of Coolant Accident (LOCA), an important safety requirement is that the reflooding of the core by the emergency core cooling system should not lead to a complete rupture of the fuel rods. Several types of mechanical tests are usually performed in the industry to determine the degree of cladding embrittlement, such as ring compression tests or four-point bending of rodlets. Many other tests can be found in the open literature. However, there is presently no real intrinsic understanding of the failure conditions in these tests which would allow translation of the results from one kind of mechanical testing to another. The present study is an attempt to provide a unified description of the failure not directly depending on the tested geometry. This effort aims at providing a better understanding of the link between several existing safety criteria relying on very different mechanical testing. To achieve this objective, the failure mechanisms of pre-oxidized and pre-hydrided cladding samples are characterized by comparing the behavior of two different mechanical tests: Axial Tensile (AT) test and “C”-shaped Ring Compression Test (CCT). The failure of samples in both cases can be described by usual linear elastic fracture mechanics theory. Using interrupted mechanical tests, metallographic examinations have evidenced that a set of parallel cracks are nucleated at the inner and outer surface of the samples just before failure, crossing both the oxide layer and the oxygen rich alpha layer. The stress intensity factors for multiple crack geometry are determined for both AT and CCT samples using finite element calculations. After each mechanical test performed on high temperature steam oxidized samples, metallography is then used to individually determine the crack depth and crack spacing. Using these two important parameters and considering the applied load at fracture, the stress intensity factor at failure is derived for each tested

  20. The Atmospheric Chemistry and Canopy Exchange Simulation System (ACCESS: model description and application to a temperate deciduous forest canopy

    Directory of Open Access Journals (Sweden)

    R. D. Saylor

    2013-01-01

    Full Text Available Forest canopies are primary emission sources of biogenic volatile organic compounds (BVOCs) and have the potential to significantly influence the formation and distribution of secondary organic aerosol (SOA) mass. Biogenically-derived SOA formed as a result of emissions from the widespread forests across the globe may affect air quality in populated areas, degrade atmospheric visibility, and affect climate through direct and indirect forcings. In an effort to better understand the formation of SOA mass from forest emissions, a 1-D column model of the multiphase physical and chemical processes occurring within and just above a vegetative canopy is being developed. An initial, gas-phase-only version of this model, the Atmospheric Chemistry and Canopy Exchange Simulation System (ACCESS), includes processes accounting for the emission of BVOCs from the canopy, turbulent vertical transport within and above the canopy and throughout the height of the planetary boundary layer (PBL), near-explicit representation of chemical transformations, mixing with the background atmosphere and bi-directional exchange between the atmosphere and canopy and the atmosphere and forest floor. The model formulation of ACCESS is described in detail and results are presented for an initial application of the modeling system to Walker Branch Watershed, an isoprene-emission-dominated forest canopy in the southeastern United States which has been the focal point for previous chemical and micrometeorological studies. Model results of isoprene profiles and fluxes are found to be consistent with previous measurements made at the simulated site and with other measurements made in and above mixed deciduous forests in the southeastern United States. Sensitivity experiments are presented which explore how canopy concentrations and fluxes of gas-phase precursors of SOA are affected by background anthropogenic nitrogen oxides (NOx). Results from these experiments suggest that the

  1. Rethinking Sensitivity Analysis of Nuclear Simulations with Topology

    Energy Technology Data Exchange (ETDEWEB)

    Dan Maljovec; Bei Wang; Paul Rosen; Andrea Alfonsi; Giovanni Pastore; Cristian Rabiti; Valerio Pascucci

    2016-01-01

    In nuclear engineering, understanding the safety margins of the nuclear reactor via simulations is arguably of paramount importance in predicting and preventing nuclear accidents. It is therefore crucial to perform sensitivity analysis to understand how changes in the model inputs affect the outputs. Modern nuclear simulation tools rely on numerical representations of the sensitivity information -- inherently lacking in visual encodings -- offering limited effectiveness in communicating and exploring the generated data. In this paper, we design a framework for sensitivity analysis and visualization of multidimensional nuclear simulation data using partition-based, topology-inspired regression models and report on its efficacy. We rely on the established Morse-Smale regression technique, which allows us to partition the domain into monotonic regions where easily interpretable linear models can be used to assess the influence of inputs on the output variability. The underlying computation is augmented with an intuitive and interactive visual design to effectively communicate sensitivity information to the nuclear scientists. Our framework is being deployed into the multi-purpose probabilistic risk assessment and uncertainty quantification framework RAVEN (Reactor Analysis and Virtual Control Environment). We evaluate our framework using a simulation dataset studying nuclear fuel performance.

  2. SADE: system of acquisition of experimental data. Definition and analysis of an experiment description language

    International Nuclear Information System (INIS)

    Gagniere, Jean-Michel

    1983-01-01

    This research thesis presents a computer system for the acquisition of experimental data. It is aimed at acquiring, at processing and at storing information from particle detectors. The acquisition configuration is described by an experiment description language. The system comprises a lexical analyser, a syntactic analyser, a translator, and a data processing module. It also comprises a control language and a statistics management and plotting module. The translator builds up series of tables which allow, during an experiment, different sequences to be executed: experiment running, calculations to be performed on this data, building up of statistics. Short execution time and ease of use are always looked for [fr

  3. Analysis, Analysis Practices and Implications for Modeling and Simulation

    Science.gov (United States)

    2007-01-01

    the Somme, New York: Penguin, 1983. Kent, Glenn A., “Looking Back: Four Decades of Analysis,” Operations Research, Vol. 50, No. 1, 2002, pp. 122–224...to many sources is http://www.saunalahti.fi/fta/EBO.htm (as of December 18, 2006). Effects-based operations are controversial in some respects (Davis

  4. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    Science.gov (United States)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two part model framework characterizes both the demand using a probability distribution for each type of service request as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
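The kind of discrete event simulation described here can be sketched as a minimal single-server queueing model driven by a future event list: arrivals and departures are processed in time order from a heap, and response time measures service performance. The workload parameters are illustrative, not the paper's cloud model.

```python
import heapq
import random

def mm1_des(n_jobs, lam, mu, seed=0):
    """Tiny discrete event simulation of a single service node with
    Poisson arrivals (rate lam) and exponential service (rate mu).
    Returns the mean response (sojourn) time over n_jobs requests."""
    rng = random.Random(seed)
    fel = []                         # future event list: (time, seq, kind, job)
    seq = 0
    heapq.heappush(fel, (rng.expovariate(lam), seq, "arr", 0))
    waiting = []                     # FIFO queue of waiting job ids
    server_free = True
    arrived = 1
    arr_time = {}
    resp = []
    while fel and len(resp) < n_jobs:
        t, _, kind, job = heapq.heappop(fel)
        if kind == "arr":
            arr_time[job] = t
            if arrived < n_jobs:     # schedule the next arrival
                seq += 1
                heapq.heappush(fel, (t + rng.expovariate(lam), seq, "arr", arrived))
                arrived += 1
            if server_free:          # start service immediately
                server_free = False
                seq += 1
                heapq.heappush(fel, (t + rng.expovariate(mu), seq, "dep", job))
            else:
                waiting.append(job)
        else:                        # departure: record sojourn, start next job
            resp.append(t - arr_time[job])
            if waiting:
                nxt = waiting.pop(0)
                seq += 1
                heapq.heappush(fel, (t + rng.expovariate(mu), seq, "dep", nxt))
            else:
                server_free = True
    return sum(resp) / len(resp)

mean_response = mm1_des(20_000, lam=0.5, mu=1.0)
print(round(mean_response, 2))       # M/M/1 theory predicts 1/(mu - lam) = 2.0
```

A cloud-architecture study layers per-request-type demand distributions and multiple contended resources on top of this same event-list mechanism.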

  5. Battery Simulation Tool for Worst Case Analysis and Mission Evaluations

    Directory of Open Access Journals (Sweden)

    Lefeuvre Stéphane

    2017-01-01

    The first part of this paper presents the PSpice models, including their respective variable parameters at SBS and cell level. The second part of the paper introduces the model parameters that were chosen and identified to perform Monte Carlo Analysis (MCA) simulations. The third part presents some MCA results for a VES16 battery module. Finally, the reader will see some other simulations that were performed by re-using the battery model for another Saft battery cell type (MP XTD) for a specific space application, at high temperature.

  6. Two-dimensional, time-dependent MHD description of interplanetary disturbances: simulation of high speed solar wind interactions

    International Nuclear Information System (INIS)

    Wu, S.T.; Han, S.M.; Dryer, M.

    1979-01-01

    A two-dimensional, time-dependent, magnetohydrodynamic, numerical model is used to investigate multiple, transient solar wind flows which start close to the Sun and then extend into interplanetary space. The initial conditions are assumed to be appropriate for steady, homogeneous solar wind conditions with an average, spiral magnetic field configuration. Because both radial and azimuthal dimensions are included, it is possible to place two or more temporally-developing streams side-by-side at the same time. Thus, the evolution of the ensuing stream interaction is simulated by this numerical code. Advantages of the present method are as follows: (1) the development and decay of asymmetric MHD shocks and their interactions are clearly indicated; and (2) the model allows flexibility in the specification of evolutionary initial conditions in the azimuthal direction, thereby making it possible to gain insight concerning the interplanetary consequences of real physical situations more accurately than by use of the one-dimensional approach. Examples of such situations are the occurrence of near-simultaneous solar flares in adjacent active regions and the sudden appearance or enlargement of coronal holes as a result of a transient re-arrangement from a closed to an open magnetic field topology. (author)

  7. Novel 3D/VR interactive environment for MD simulations, visualization and analysis.

    Science.gov (United States)

    Doblack, Benjamin N; Allis, Tim; Dávila, Lilian P

    2014-12-18

    The increasing development of computing (hardware and software) in the last decades has impacted scientific research in many fields including materials science, biology, chemistry and physics among many others. A new computational system for the accurate and fast simulation and 3D/VR visualization of nanostructures is presented here, using the open-source molecular dynamics (MD) computer program LAMMPS. This alternative computational method uses modern graphics processors, NVIDIA CUDA technology and specialized scientific codes to overcome processing speed barriers common to traditional computing methods. In conjunction with a virtual reality system used to model materials, this enhancement allows the addition of accelerated MD simulation capability. The motivation is to provide a novel research environment which simultaneously allows visualization, simulation, modeling and analysis. The research goal is to investigate the structure and properties of inorganic nanostructures (e.g., silica glass nanosprings) under different conditions using this innovative computational system. The work presented outlines a description of the 3D/VR Visualization System and basic components, an overview of important considerations such as the physical environment, details on the setup and use of the novel system, a general procedure for the accelerated MD enhancement, technical information, and relevant remarks. The impact of this work is the creation of a unique computational system combining nanoscale materials simulation, visualization and interactivity in a virtual environment, which is both a research and teaching instrument at UC Merced.

  8. Advances in the indirect, descriptive, and experimental approaches to the functional analysis of problem behavior.

    Science.gov (United States)

    Wightman, Jade; Julio, Flávia; Virués-Ortega, Javier

    2014-05-01

    Experimental functional analysis is an assessment methodology to identify the environmental factors that maintain problem behavior in individuals with developmental disabilities and in other populations. Functional analysis provides the basis for the development of reinforcement-based approaches to treatment. This article reviews the procedures, validity, and clinical implementation of the methodological variations of functional analysis and function-based interventions. We present six variations of functional analysis methodology in addition to the typical functional analysis: brief functional analysis, single-function tests, latency-based functional analysis, functional analysis of precursors, and trial-based functional analysis. We also present the three general categories of function-based interventions: extinction, antecedent manipulation, and differential reinforcement. Functional analysis methodology is a valid and efficient approach to the assessment of problem behavior and the selection of treatment strategies.

  9. Analysis by simulation of the disposition of nuclear fuel waste

    International Nuclear Information System (INIS)

    Turek, J.L.

    1980-09-01

    A descriptive simulation model is developed which includes all aspects of nuclear waste disposition. The model comprises two systems, the second orchestrated by GASP IV. A spent fuel generation prediction module is interfaced with the AFR Program Management Information System and a repository scheduling information module. The user is permitted a wide range of options with which to tailor the simulation to any desired storage scenario. The model projects storage requirements through the year 2020. The outputs are evaluations of the impact that alternative decision policies and milestone date changes have on the demand for, the availability of, and the utilization of spent fuel storage capacities. Both graphs and detailed listings are available. These outputs give a comprehensive view of the particular scenario under observation, including the tracking, by year, of each discharge from every reactor. Included within the work is a review of the status of spent fuel disposition based on input data accurate as of August 1980. The results indicate that some temporary storage techniques (e.g., transshipment of fuel and/or additional at-reactor storage pools) must be utilized to prevent reactor shutdowns. These techniques will be required until the 1990s, when several AFR facilities, and possibly one repository, can become operational

  10. Descriptive analysis of individual and community factors among African American youths in urban public housing.

    Science.gov (United States)

    Nebbitt, Von E; Williams, James Herbert; Lombe, Margaret; McCoy, Henrika; Stephens, Jennifer

    2014-07-01

    African American adolescents are disproportionately represented in urban public housing developments. These neighborhoods are generally characterized by high rates of poverty, crime, violence, and disorganization. Although evidence is emerging on youths in these communities, little is known about their depressive symptoms, perceived efficacy, or frequency of substance use and sex-risk behavior. Further, even less is known about their exposure to community and household violence, their parents' behavior, or their sense of connection to their communities. Using a sample of 782 African American adolescents living in public housing neighborhoods located in four large U.S. cities, this article attempts to rectify the observed gap in knowledge by presenting a descriptive overview of their self-reported depressive symptoms; self-efficacy; frequencies of delinquent and sexual-risk behavior; and alcohol, tobacco, and other drug use. The self-reported ratings of their parents' behavior as well as their exposure to community and household violence are presented. Analytic procedures include descriptive statistics and mean comparisons between genders and across research cities. Results suggest several differences between genders and across research sites. However, results are not very different from national data. Implications for social work practice are discussed.

  11. Description and quantitative analysis of the dentition of the southern thorny skate Amblyraja doellojuradoi.

    Science.gov (United States)

    Delpiani, G; Spath, M C; Deli Antoni, M; Delpiani, M

    2017-06-01

    A description of the tooth morphology of 234 jaws from the southern thorny skate Amblyraja doellojuradoi in the south-west Atlantic Ocean is given. Seven rows of teeth were selected and length and width of each tooth in these rows were measured. It was found that functional series corresponds to the third teeth and the average width and length of these teeth were compared among jaws, maturity stages, sexes and rows. Generalized linear models were used to determine the subset of measures that most contribute to explain the variability between groups. It was observed that males have longer teeth than females, but the teeth of females are wider. These differences are attributed to reproductive behaviour, in which males bite females to hold them during copulation. This study provides a description of the teeth of A. doellojuradoi, supplying a valuable tool for identification of species. In addition, the establishment of the main variations observed in the dentition, improves the understanding of the species' biology. © 2017 The Fisheries Society of the British Isles.

  12. Critical slowing down and error analysis in lattice QCD simulations

    Energy Technology Data Exchange (ETDEWEB)

    Schaefer, Stefan [Humboldt-Universitaet, Berlin (Germany). Inst. fuer Physik; Sommer, Rainer; Virotta, Francesco [Deutsches Elektronen-Synchrotron (DESY), Zeuthen (Germany). John von Neumann-Inst. fuer Computing NIC

    2010-09-15

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)

  13. Critical slowing down and error analysis in lattice QCD simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Sommer, Rainer; Virotta, Francesco

    2010-09-01

    We study the critical slowing down towards the continuum limit of lattice QCD simulations with Hybrid Monte Carlo type algorithms. In particular for the squared topological charge we find it to be very severe with an effective dynamical critical exponent of about 5 in pure gauge theory. We also consider Wilson loops which we can demonstrate to decouple from the modes which slow down the topological charge. Quenched observables are studied and a comparison to simulations of full QCD is made. In order to deal with the slow modes in the simulation, we propose a method to incorporate the information from slow observables into the error analysis of physical observables and arrive at safer error estimates. (orig.)
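The error-analysis problem described here, slow modes inflating statistical errors, is commonly handled by estimating the integrated autocorrelation time of an observable. The sketch below uses a generic windowed estimator on a synthetic slow chain; it illustrates the idea and is not the authors' specific method.

```python
import numpy as np

def tau_int(x, c=6.0):
    """Integrated autocorrelation time with an automatic window:
    accumulate normalized autocorrelations until the lag exceeds c * tau."""
    x = np.asarray(x, dtype=float)
    d = x - x.mean()
    denom = np.dot(d, d)
    tau = 0.5
    for t in range(1, len(x) // 2):
        tau += np.dot(d[:-t], d[t:]) / denom
        if t >= c * tau:             # window grows with the estimate itself
            break
    return tau

# Synthetic slow mode: AR(1) chain with known tau_int = 0.5 + a/(1-a) = 4.5
rng = np.random.default_rng(3)
a = 0.8
x = np.empty(200_000)
x[0] = 0.0
for i in range(1, len(x)):
    x[i] = a * x[i - 1] + rng.standard_normal()

tau = tau_int(x)
print(round(tau, 1))  # should be close to the exact value 4.5
```

The true statistical error of the mean is then inflated by a factor of sqrt(2 * tau) relative to the naive estimate that assumes independent samples, which is why ignoring slow modes leads to underestimated errors.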

  14. Transient analysis of multifailure conditions by using PWR plant simulator

    International Nuclear Information System (INIS)

    Morisaki, Hidetoshi; Yokobayashi, Masao.

    1984-11-01

    This report describes the results of an analysis of abnormal transients caused by multiple failures, using a PWR plant simulator. The simulator is based on an existing 822 MWe power plant with 3 loops, and is designed to cover a wide range of plant operation from cold shutdown to full power at the end of life. Various malfunctions to simulate abnormal conditions caused by equipment failures are provided. In this report, features of abnormal transients caused by concurrent malfunctions are discussed. The abnormal conditions studied are leak of primary coolant, loss of charging and feedwater flows, and control systems failure. From the results, it was observed that the transient responses caused by some combinations of malfunctions are almost the same as the superposition of the behaviors caused by each single malfunction. It can therefore be said that the kinds of concurrent malfunctions may be inferred from the transient characteristics of each single malfunction. (author)

  15. Uncertainty and sensitivity analysis of control strategies using the benchmark simulation model No1 (BSM1).

    Science.gov (United States)

    Flores-Alsina, Xavier; Rodriguez-Roda, Ignasi; Sin, Gürkan; Gernaey, Krist V

    2009-01-01

The objective of this paper is to perform an uncertainty and sensitivity analysis of the predictions of the Benchmark Simulation Model (BSM) No. 1 when comparing four activated sludge control strategies. The Monte Carlo simulation technique is used to evaluate the uncertainty in the BSM1 predictions, considering the ASM1 bio-kinetic parameters and influent fractions as input uncertainties, while the Effluent Quality Index (EQI) and the Operating Cost Index (OCI) are the model outputs of interest. The resulting Monte Carlo simulations are presented using descriptive statistics indicating the degree of uncertainty in the predicted EQI and OCI. Next, the Standardized Regression Coefficients (SRC) method is used for sensitivity analysis to identify which input parameters influence the uncertainty in the EQI predictions the most. The results show that control strategies including an ammonium (S(NH)) controller reduce uncertainty in both overall pollution removal and effluent total Kjeldahl nitrogen. Also, control strategies with an external carbon source reduce the effluent nitrate (S(NO)) uncertainty, increasing both their economic cost and its variability as a trade-off. Finally, the maximum specific autotrophic growth rate (micro(A)) causes most of the variance in the effluent for all the evaluated control strategies. The influence of denitrification-related parameters, e.g. eta(g) (anoxic growth rate correction factor) and eta(h) (anoxic hydrolysis rate correction factor), becomes less important when a S(NO) controller manipulating an external carbon source addition is implemented.
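The SRC step can be illustrated generically: when the Monte Carlo inputs are sampled independently, each input's standardized regression coefficient reduces to its Pearson correlation with the output. The two-parameter "model" below only borrows parameter names from the abstract; it is an invented linear toy, not BSM1.

```python
import random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = sum((a - mx) ** 2 for a in xs)
    vy = sum((b - my) ** 2 for b in ys)
    return cov / (vx * vy) ** 0.5

random.seed(0)
samples = []
for _ in range(5000):
    mu_a = random.gauss(0.50, 0.05)    # autotrophic growth rate (invented spread)
    eta_g = random.gauss(0.80, 0.08)   # anoxic growth correction (invented spread)
    noise = random.gauss(0.0, 0.02)
    eqi = 3.0 - 4.0 * mu_a + 0.5 * eta_g + noise   # toy linear "effluent index"
    samples.append((mu_a, eta_g, eqi))

mu, eg, y = zip(*samples)
# For independently sampled inputs, each standardized regression coefficient
# equals the input's Pearson correlation with the output (up to sampling noise).
src_mu, src_eg = pearson(mu, y), pearson(eg, y)
```

In this toy, mu_a dominates the output variance (|SRC| near 1) while eta_g contributes weakly, mirroring the kind of ranking the paper extracts; with correlated inputs a full least-squares regression on standardized variables is needed instead.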

  16. SPECTRAL INDEX AS A FUNCTION OF MASS ACCRETION RATE IN BLACK HOLE SOURCES: MONTE CARLO SIMULATIONS AND AN ANALYTICAL DESCRIPTION

    International Nuclear Information System (INIS)

    Laurent, Philippe; Titarchuk, Lev

    2011-01-01

We present herein a theoretical study of correlations between the spectral indexes of X-ray emergent spectra and the mass accretion rate ( m-dot ) in black hole (BH) sources, which provide a definitive signature for BHs. It has been firmly established, using the Rossi X-ray Timing Explorer (RXTE) in numerous BH observations during hard-soft state spectral evolution, that the photon index of X-ray spectra increases when m-dot increases and, moreover, that the index saturates at high values of m-dot . In this paper, we present theoretical arguments that the observationally established index saturation effect versus mass accretion rate is a signature of the bulk (converging) flow onto the BH. We also demonstrate that the index saturation value depends on the plasma temperature of the converging flow. We self-consistently calculate the Compton cloud (CC) plasma temperature as a function of mass accretion rate using the energy balance between energy dissipation and Compton cooling. We explain the observed index- m-dot correlations using a Monte Carlo simulation of radiative processes in the innermost part (CC) of a BH source, accounting for Comptonization in the presence of thermal and bulk motions as the basic types of plasma motion. We show that, when m-dot increases, BH sources evolve to the high and very soft states (HSS and VSS, respectively), in which strong blackbody(BB)-like and steep power-law components are formed in the resulting X-ray spectrum. The simultaneous detection of these two components depends strongly on the sensitivity of high-energy instruments, given that the relative contribution of the hard power-law tail in the resulting VSS spectrum can be very low, which is why, to date, RXTE observations of the VSS X-ray spectrum have been characterized by the presence of the strong BB-like component only. We also predict specific patterns for the evolution of the high-energy e-fold (cutoff) energy (E fold ) with m-dot for thermal and dynamical (bulk

  17. On Monte Carlo Simulation and Analysis of Electricity Markets

    International Nuclear Information System (INIS)

    Amelin, Mikael

    2004-07-01

This dissertation is about how Monte Carlo simulation can be used to analyse electricity markets. There is a wide range of applications for simulation; for example, players in the electricity market can use simulation to decide whether or not an investment can be expected to be profitable, and authorities can by means of simulation find out which consequences a certain market design can be expected to have on electricity prices, environmental impact, etc. The first part of the dissertation focuses on which electricity market models are suitable for Monte Carlo simulation. The starting point is a definition of an ideal electricity market. Such an electricity market is partly practical from a mathematical point of view (it is simple to formulate and does not require overly complex calculations) and partly a representation of the best possible resource utilisation. The definition of the ideal electricity market is followed by an analysis of how reality differs from the ideal model, what consequences the differences have on the rules of the electricity market and the strategies of the players, and how non-ideal properties can be included in a mathematical model. In particular, questions about environmental impact, forecast uncertainty and grid costs are studied. The second part of the dissertation treats the Monte Carlo technique itself. To reduce the number of samples necessary to obtain accurate results, variance reduction techniques can be used. Here, six different variance reduction techniques are studied and possible applications are pointed out. The conclusions of these studies are turned into a method for efficient simulation of basic electricity markets. The method is applied to some test systems and the results show that the chosen variance reduction techniques can produce equal or better results using 99% fewer samples compared to when the same system is simulated without any variance reduction technique. More complex electricity market models
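One classic member of the variance-reduction family discussed here is complementary (antithetic) random numbers: each uniform draw u is paired with 1−u, and for a monotone cost function the paired estimates are negatively correlated, cutting the estimator variance at equal sample count. The cost curve below is invented for illustration; it is not one of the thesis's market models.

```python
import random
import statistics

def cost(load):
    """Toy generation-cost curve, convex and monotone in system load (invented)."""
    return 10.0 * load + 2.0 * load ** 2

random.seed(7)
N = 2000

# Plain Monte Carlo: N independent load samples.
plain = [cost(random.random()) for _ in range(N)]

# Complementary random numbers: N/2 pairs (u, 1-u), each pair averaged.
anti = []
for _ in range(N // 2):
    u = random.random()
    anti.append(0.5 * (cost(u) + cost(1.0 - u)))

# Both estimate the same expected cost, but the paired samples are negatively
# correlated because cost() is monotone in u, so the estimator variance drops.
var_plain = statistics.variance(plain) / N
var_anti = statistics.variance(anti) / (N // 2)
```

For this near-linear cost curve the variance reduction is dramatic (orders of magnitude); real market models are less well-behaved, which is why the thesis compares several techniques and combinations before claiming large sample savings.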

  18. Intelligent simulations for on-line transient analysis

    International Nuclear Information System (INIS)

    Hassberger, J.A.; Lee, J.C.

    1987-01-01

A unique combination of simulation, parameter estimation and expert systems technology is applied to the problem of diagnosing nuclear power plant transients. Knowledge-based reasoning is used to monitor plant data and hypothesize about the status of the plant. Fuzzy logic is employed as the inferencing mechanism, and an implication scheme based on observations is developed and employed to handle scenarios involving competing failures. Hypothesis testing is performed by simulating the behavior of faulted components using numerical models. A filter has been developed for systematically adjusting key model parameters to force agreement between simulations and actual plant data. Pattern recognition is employed as a decision-analysis technique for choosing among several hypotheses based on simulation results. An artificial intelligence framework based on a critical-functions approach is used to deal with the complexity of a nuclear plant system. Detailed simulation results for various nuclear power plant accident scenarios are presented to demonstrate the performance and robustness of the diagnostic algorithm developed. The system is shown to be successful in diagnosing and identifying fault parameters for a normal reactor scram, loss-of-feedwater (LOFW) and small loss-of-coolant (LOCA) transients occurring together in a scenario similar to the accident at Three Mile Island.
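The filter idea — systematically adjusting a key model parameter until the simulation agrees with plant data — can be sketched with a one-parameter damped Gauss-Newton update on a hypothetical first-order cooling model. The model, constants and update rule are all invented for illustration; this is not the authors' filter.

```python
def simulate(k, t0=300.0, t_amb=20.0, dt=0.1, steps=50):
    """Euler integration of the toy model dT/dt = -k (T - T_amb)."""
    xs, temp = [], t0
    for _ in range(steps):
        temp += dt * (-k * (temp - t_amb))
        xs.append(temp)
    return xs

K_TRUE = 0.35
plant = simulate(K_TRUE)          # stand-in for measured plant data

k_est = 0.10                      # deliberately poor initial guess
for _ in range(50):
    model = simulate(k_est)
    eps = 1e-5
    sens = [(me - m) / eps        # finite-difference sensitivity dT/dk
            for m, me in zip(model, simulate(k_est + eps))]
    num = sum(j * (p - m) for j, p, m in zip(sens, plant, model))
    den = sum(j * j for j in sens)
    k_est += 0.5 * num / den      # damped Gauss-Newton step on the residual
```

After the loop, k_est has converged to the "plant" value, forcing the simulated trajectory onto the data. A real diagnostic filter works on noisy multivariate measurements and several fault parameters at once; this only shows the agreement-forcing loop.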

  19. An efficiency improvement in warehouse operation using simulation analysis

    Science.gov (United States)

    Samattapapong, N.

    2017-11-01

    In general, industry requires an efficient system for warehouse operation. There are many important factors that must be considered when designing an efficient warehouse system. The most important is an effective warehouse operation system that can help transfer raw material, reduce costs and support transportation. By all these factors, researchers are interested in studying about work systems and warehouse distribution. We start by collecting the important data for storage, such as the information on products, information on size and location, information on data collection and information on production, and all this information to build simulation model in Flexsim® simulation software. The result for simulation analysis found that the conveyor belt was a bottleneck in the warehouse operation. Therefore, many scenarios to improve that problem were generated and testing through simulation analysis process. The result showed that an average queuing time was reduced from 89.8% to 48.7% and the ability in transporting the product increased from 10.2% to 50.9%. Thus, it can be stated that this is the best method for increasing efficiency in the warehouse operation.

  20. Descriptive business intelligence analysis: cutting edge strategic asset for SMEs, is it really worth it?

    Directory of Open Access Journals (Sweden)

    Sivave Mashingaidze

    2014-10-01

    Full Text Available The purpose of this article is to provide a framework for the understanding and adoption of Business Intelligence (BI) by small and medium enterprises (SMEs) within the Zimbabwean economy. The article explores every facet of Business Intelligence, including internal and external BI, as a cutting-edge strategic asset. A descriptive research methodology was adopted. The article revealed some BI critical success factors for better BI implementation. Findings revealed that organizations which have the greatest success with BI travel an evolutionary path, starting with basic data and analytical tools and transitioning to increasingly more sophisticated capabilities until BI becomes an intrinsic part of their business culture and ROI is realized. The findings are useful for managers, policy makers, business analysts, and IT specialists dealing with the planning and implementation of BI systems in SMEs.

  1. Radioactive Solid Waste Storage and Disposal at Oak Ridge National Laboratory, Description and Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bates, L.D.

    2001-01-30

    Oak Ridge National Laboratory (ORNL) is a principal Department of Energy (DOE) research institution operated by the Union Carbide Corporation - Nuclear Division (UCC-ND) under the direction of the DOE Oak Ridge Operations Office (DOE-ORO). The Laboratory was established in east Tennessee, near what is now the city of Oak Ridge, in the mid-1940s as part of the World War II effort to develop a nuclear weapon. Since its inception, disposal of radioactively contaminated materials, both solid and liquid, has been an integral part of Laboratory operations. The purpose of this document is to provide a detailed description of the ORNL Solid Waste Storage Areas, to describe the practice and procedure of their operation, and to address the health and safety impacts and concerns of that operation.

  2. [Professor Xu Fu-song's traditional Chinese medicine protocols for male diseases: A descriptive analysis].

    Science.gov (United States)

    Liu, Cheng-yong; Xu, Fu-song

    2015-04-01

    To analyze the efficacy and medication principles of Professor Xu Fu-song's traditional Chinese medicine (TCM) protocols for male diseases, we reviewed and descriptively analyzed the unpublished complete medical records of 100 male cases treated by Professor Xu Fu-song with his TCM protocols from 1978 to 1992. The 100 cases involved 32 male diseases, most of which were difficult and complicated cases. The drug compliance was 95%. Each prescription was made up of 14 traditional Chinese drugs on average. The cure rate was 32%, and the effective rate was 85%. Professor Xu Fu-song advanced and proved some new theories and therapeutic methods. His TCM protocols can be applied to a wide range of male diseases, mostly complicated, and are characterized by accurate differentiation of symptoms and signs, high drug compliance, and excellent therapeutic efficacy.

  3. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    International Nuclear Information System (INIS)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems

  4. Descriptions of reference LWR facilities for analysis of nuclear fuel cycles

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, K.J.; Kabele, T.J.

    1979-09-01

    To contribute to the Department of Energy's identification of needs for improved environmental controls in nuclear fuel cycles, a study was made of a light water reactor system. A reference LWR fuel cycle was defined, and each step in this cycle was characterized by facility description and mainline and effluent treatment process performance. The reference fuel cycle uses fresh uranium in light water reactors. Final treatment and ultimate disposition of waste from the fuel cycle steps were not included, and the waste is assumed to be disposed of by approved but currently undefined means. The characterization of the reference fuel cycle system is intended as basic information for further evaluation of alternative effluent control systems.

  5. Nursing students' evaluation of a new feedback and reflection tool for use in high-fidelity simulation - Formative assessment of clinical skills. A descriptive quantitative research design.

    Science.gov (United States)

    Solheim, Elisabeth; Plathe, Hilde Syvertsen; Eide, Hilde

    2017-11-01

    Clinical skills training is an important part of nurses' education programmes. Clinical skills are complex, and a common understanding of what characterizes clinical skills and learning outcomes needs to be established. The aim of the study was to develop and evaluate a new reflection and feedback tool for formative assessment. The study has a descriptive quantitative design. 129 students at the end of the first year of a Bachelor's degree in nursing participated. After high-fidelity simulation, data were collected using a questionnaire with 19 closed-ended and 2 open-ended questions. The tool stimulated peer assessment and enabled students to be more thorough in what to assess as an observer of clinical skills. The tool provided a structure for self-assessment and made visible items that are important to be aware of in clinical skills. This article adds to the simulation literature and provides a tool that is useful in enhancing peer learning, which is essential for nurses in practice. The tool has potential for enabling students to learn about reflection and to develop skills for guiding others in practice after they have graduated. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Unified Modeling Language description of the object-oriented multi-scale adaptive finite element method for Step-and-Flash Imprint Lithography Simulations

    International Nuclear Information System (INIS)

    Paszynski, Maciej; Gurgul, Piotr; Sieniek, Marcin; Pardo, David

    2010-01-01

    In the first part of the paper we present the multi-scale simulation of the Step-and-Flash Imprint Lithography (SFIL), a modern patterning process. The simulation utilizes the hp adaptive Finite Element Method (hp-FEM) coupled with a Molecular Statics (MS) model. Thus, we consider the multi-scale problem, with molecular statics applied in the areas of the mesh where the highest accuracy is required, and continuous linear elasticity with a thermal expansion coefficient applied in the remaining part of the domain. The degrees of freedom from macro-scale element nodes located on the macro-scale side of the interface have been identified with particles from nano-scale elements located on the nano-scale side of the interface. In the second part of the paper we present a Unified Modeling Language (UML) description of the resulting multi-scale application (hp-FEM coupled with MS). We investigated classical, procedural codes from the point of view of the object-oriented (O-O) programming paradigm. The discovered hierarchical structure of classes and algorithms makes the UML project as independent of the spatial dimension of the problem as possible. The O-O UML project was defined at an abstract level, independent of the programming language used.

  7. The description of dense hydrogen with Wave Packet Molecular Dynamics (WPMD) simulations; Die Beschreibung von dichtem Wasserstoff mit der Methode der Wellenpaket-Molekulardynamik (WPMD)

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, B.

    2006-10-10

    In this work the wave packet molecular dynamics (WPMD) method is presented and applied to dense hydrogen. In the WPMD method the electrons are described by a Slater determinant of periodic Gaussian wave packets. Each single-particle wave function is parametrised by 8 coordinates, which can be interpreted as the position and momentum, the width, and the width's conjugate momentum. The equations of motion for these coordinates can be derived from a time-dependent variational principle. Equilibrium properties can be ascertained by a Monte Carlo simulation. With the now fully implemented antisymmetrisation, the simulation yields fundamentally different behavior for dense hydrogen compared to earlier simplified models. The results show a phase transition to metallic hydrogen with a higher density than in the molecular phase. This behavior has, for example, large implications for the physics of giant planets. This work describes the model used and explains in particular the calculation of the energy and forces. The periodicity of the wave function leads to a description in Fourier space. The antisymmetrisation is carried out by matrix operations. Moreover, the numerical implementation is described in detail to allow further development of the code. The results provided in this work give the equation of state in the temperature range 300 K - 50000 K and the density range 10²³-10²⁴ cm⁻³, corresponding to pressures of 1 GPa - 1000 GPa. The phase transition to metallic hydrogen can be read off a phase diagram. The electrical conductivity of both phases is determined. (orig.)
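The "8 coordinates" per electron can be written schematically. Up to normalization and sign conventions (which vary between WPMD formulations), a single periodic-free Gaussian wave packet is often parametrised as

```latex
\psi_k(\mathbf{r}) \;\propto\;
\exp\!\left[
  -\left(\frac{1}{4\gamma_k^{2}} - \frac{i\,p_{\gamma_k}}{2\hbar\gamma_k}\right)
  \left(\mathbf{r}-\mathbf{q}_k\right)^{2}
  + \frac{i}{\hbar}\,\mathbf{p}_k\cdot\left(\mathbf{r}-\mathbf{q}_k\right)
\right]
```

with $\mathbf{q}_k,\mathbf{p}_k \in \mathbb{R}^3$ (position and momentum, six real parameters) and $\gamma_k, p_{\gamma_k}$ (width and its conjugate momentum, two more), giving the eight coordinates per single-particle wave function mentioned above.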

  8. Simulation analysis of globally integrated logistics and recycling strategies

    Energy Technology Data Exchange (ETDEWEB)

    Song, S.J.; Hiroshi, K. [Hiroshima Inst. of Tech., Graduate School of Mechanical Systems Engineering, Dept. of In formation and Intelligent Systems Engineering, Hiroshima (Japan)

    2004-07-01

    This paper focuses on the optimal analysis of worldwide recycling activities associated with managing the logistics and production activities of global manufacturing, whose activities stretch across national boundaries. The globally integrated logistics and recycling strategies consist of the home country and two free-trading economic blocs, NAFTA and ASEAN, where significant differences are found in production and disassembly costs, tax rates, local content rules and regulations. Moreover, an optimal analysis of the globally integrated value chain was developed by applying simulation optimization as a decision-making tool. The simulation model was developed and analyzed using the ProModel package, and the results help to identify some of the conditions required to make well-performing logistics and recycling plans in a worldwide collaborative manufacturing environment. (orig.)

  9. Analysis and simulation of an electrostatic FN Tandem accelerator

    International Nuclear Information System (INIS)

    Ugarte, Ricardo

    2007-01-01

    An analysis, modeling, and simulation of a positive-ion FN Tandem electrostatic accelerator has been carried out. This involved a detailed study of all the physical components inside the accelerator tank: the terminal control stabilizer (TPS), the corona point, the capacitor pick-off (CPO), and the generating voltmeter (GVM) signals. The parameters of the model were estimated using Prediction Error Methods (PEM) together with classical circuit-analysis techniques. The results obtained were used to check and improve the stability of the terminal voltage using Matlab software tools. The simulation results were compared with reality, and the stability of the terminal voltage was successfully improved. The facility belongs to the ARN (Argentina) and was originally installed to develop an AMS system. (author)

  10. Efficient Analysis of Simulations of the Sun's Magnetic Field

    Science.gov (United States)

    Scarborough, C. W.; Martínez-Sykora, J.

    2014-12-01

    Dynamics in the solar atmosphere, including solar flares, coronal mass ejections, micro-flares and different types of jets, are powered by the evolution of the Sun's intense magnetic field. 3D radiative magnetohydrodynamics (MHD) computer simulations have furthered our understanding of the processes involved: when non-aligned magnetic field lines reconnect, the alteration of the magnetic topology causes stored magnetic energy to be converted into thermal and kinetic energy. Detailed analysis of this evolution entails tracing magnetic field lines, an operation which is not time-efficient on a single processor. By utilizing a graphics card (GPU) to trace lines in parallel, such analysis becomes feasible. We applied our GPU implementation to the most advanced 3D radiative-MHD simulations (Bifrost, Gudiksen et al. 2011) of the solar atmosphere in order to better understand the evolution of the modeled field lines.
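Field line tracing integrates dr/ds = B(r)/|B(r)| along the local field direction, and each seed point is independent of the others, which is what makes the operation embarrassingly parallel on a GPU. A serial sketch with a hypothetical analytic field (not Bifrost data):

```python
import math

def b_field(x, y):
    """Hypothetical 2D field with circular field lines around the origin."""
    return (-y, x)

def trace(x, y, step=0.01, n=1000):
    """Integrate dr/ds = B/|B| with 2nd-order Runge-Kutta (midpoint rule)."""
    pts = [(x, y)]
    for _ in range(n):
        bx, by = b_field(x, y)
        norm = math.hypot(bx, by) or 1.0
        # half-step to the midpoint, then re-evaluate the field there
        mx, my = x + 0.5 * step * bx / norm, y + 0.5 * step * by / norm
        bx, by = b_field(mx, my)
        norm = math.hypot(bx, by) or 1.0
        x, y = x + step * bx / norm, y + step * by / norm
        pts.append((x, y))
    return pts

line = trace(1.0, 0.0)   # should stay on the unit circle
```

On a GPU, each thread runs `trace` for one seed point, interpolating B from the simulation grid instead of evaluating an analytic field; the integration loop itself is unchanged.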

  11. Simulation and Analysis of Converging Shock Wave Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Ramsey, Scott D. [Los Alamos National Laboratory; Shashkov, Mikhail J. [Los Alamos National Laboratory

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  12. Visualization and analysis of eddies in a global ocean simulation

    Energy Technology Data Exchange (ETDEWEB)

    Williams, Sean J [Los Alamos National Laboratory; Hecht, Matthew W [Los Alamos National Laboratory; Petersen, Mark [Los Alamos National Laboratory; Strelitz, Richard [Los Alamos National Laboratory; Maltrud, Mathew E [Los Alamos National Laboratory; Ahrens, James P [Los Alamos National Laboratory; Hlawitschka, Mario [UC DAVIS; Hamann, Bernd [UC DAVIS

    2010-10-15

    Eddies at a scale of approximately one hundred kilometers have been shown to be surprisingly important to understanding large-scale transport of heat and nutrients in the ocean. Due to difficulties in observing the ocean directly, the behavior of eddies below the surface is not very well understood. To fill this gap, we employ a high-resolution simulation of the ocean developed at Los Alamos National Laboratory. Using large-scale parallel visualization and analysis tools, we produce three-dimensional images of ocean eddies, and also generate a census of eddy distribution and shape averaged over multiple simulation time steps, resulting in a world map of eddy characteristics. As expected from observational studies, our census reveals a higher concentration of eddies at the mid-latitudes than the equator. Our analysis further shows that mid-latitude eddies are thicker, within a range of 1000-2000 m, while equatorial eddies are less than 100 m thick.

  13. Exploratory Modeling and the use of Simulation for Policy Analysis

    Science.gov (United States)

    1992-01-01

    Exploratory Modeling and the Use of Simulation for Policy Analysis. Steven C. Bankes. Prepared for the United States Army. RAND. Approved for public release; distribution... [The remainder of this record is garbled front-matter and reference-list residue, including a citation to Operations Research, Vol. 39, No. 3, May-June 1991, pp. 355-365.]

  14. Simulation and Analysis of the Hybrid Operating Mode in ITER

    International Nuclear Information System (INIS)

    Kessel, C.E.; Budny, R.V.; Indireshkumar, K.

    2005-01-01

    The hybrid operating mode in ITER is examined with 0D systems analysis and 1.5D discharge scenario simulations using TSC and TRANSP, and the ideal MHD stability is discussed. The hybrid mode has the potential to provide very long pulses and significant neutron fluence if the physics regime can be produced in ITER. This paper reports progress in establishing the physics basis and engineering limitations for the hybrid mode in ITER.

  15. Analysis of manufacturing based on object oriented discrete event simulation

    Directory of Open Access Journals (Sweden)

    Eirik Borgen

    1990-01-01

    Full Text Available This paper describes SIMMEK, a computer-based tool for performing analysis of manufacturing systems, developed at the Production Engineering Laboratory, NTH-SINTEF. Its main use will be in the analysis of job shop type manufacturing, but certain facilities make it suitable for FMS as well as production-line manufacturing. This type of simulation is very useful in the analysis of any type of change that occurs in a manufacturing system. These changes may be investments in new machines or equipment, a change in layout, a change in product mix, use of late shifts, etc. The effects these changes have on, for instance, the throughput, the amount of WIP, the costs or the net profit can be analysed. And this can be done before the changes are made, and without disturbing the real system. Simulation takes into consideration, unlike other tools for analysis of manufacturing systems, uncertainty in arrival rates, process and operation times, and machine availability. It also shows the interaction effects that a job which is late in one machine has on the remaining machines in its route through the layout. It is these effects that cause every production plan not to be fulfilled completely. SIMMEK is based on discrete event simulation, and the modeling environment is object oriented. The object-oriented models are transformed by an object linker into data structures executable by the simulation kernel. The processes of the entity objects, i.e. the products, are broken down into events and put into an event list. The user-friendly graphical modeling environment makes it possible for end users to build models in a quick and reliable way, using terms from manufacturing. Various tests and a check of model logic are helpful functions when testing the validity of the models. Integration with software packages, with business graphics and statistical functions, is convenient in the result presentation phase.
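The event-list mechanics described here — entity processes broken down into time-stamped events executed in order by a simulation kernel — can be sketched minimally with a heap keyed on event time. The job-shop numbers are invented; this is a generic illustration, not SIMMEK code.

```python
import heapq

class Sim:
    """Minimal discrete-event kernel: a time-ordered event list (binary heap)."""
    def __init__(self):
        self.now, self._events, self._seq = 0.0, [], 0
    def schedule(self, delay, action):
        heapq.heappush(self._events, (self.now + delay, self._seq, action))
        self._seq += 1   # tie-breaker keeps FIFO order for simultaneous events
    def run(self, until):
        while self._events and self._events[0][0] <= until:
            self.now, _, action = heapq.heappop(self._events)
            action()

# Toy job shop: 10 jobs arrive every 4 min; one machine takes 3 min per job.
sim, queue, busy, done = Sim(), [], [False], []

def finish(job):
    done.append((job, sim.now))
    busy[0] = False
    if queue:
        start(queue.pop(0))

def start(job):
    busy[0] = True
    sim.schedule(3.0, lambda j=job: finish(j))

def arrive(job):
    if busy[0]:
        queue.append(job)
    else:
        start(job)
    if job < 9:
        sim.schedule(4.0, lambda j=job + 1: arrive(j))

sim.schedule(0.0, lambda: arrive(0))
sim.run(until=100.0)
```

Since the 3-minute service time is shorter than the 4-minute inter-arrival time, no job queues and job j completes at time 4j + 3; adding stochastic arrival and service times (the uncertainty the abstract emphasizes) only changes the `schedule` delays, not the kernel.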

  16. A detailed description of the analysis of the decay of neutral kaons to $\\pi^+ \\pi^-$ in the CPLEAR experiment

    CERN Document Server

    Apostolakis, Alcibiades J; Backenstoss, Gerhard; Bargassa, P; Behnke, O; Benelli, A; Bertin, V; Blanc, F; Bloch, P; Carlson, P J; Carroll, M; Cawley, E; Chertok, M B; Danielsson, M; Dejardin, M; Derré, J; Ealet, A; Eleftheriadis, C; Fetscher, W; Fidecaro, Maria; Filipcic, A; Francis, D; Fry, J; Gabathuler, Erwin; Gamet, R; Gerber, H J; Go, A; Haselden, A; Hayman, P J; Henry-Coüannier, F; Hollander, R W; Jon-And, K; Kettle, P R; Kokkas, P; Kreuger, R; Le Gac, R; Leimgruber, F; Mandic, I; Manthos, N; Marel, Gérard; Mikuz, M; Miller, J; Montanet, François; Müller, A; Nakada, Tatsuya; Pagels, B; Papadopoulos, I M; Pavlopoulos, P; Polivka, G; Rickenbach, R; Roberts, B L; Ruf, T; Sakelliou, L; Schäfer, M; Schaller, L A; Schietinger, T; Schopper, A; Tauscher, Ludwig; Thibault, C; Touchard, F; Touramanis, C; van Eijk, C W E; Vlachos, S; Weber, P; Wigger, O; Wolter, M; Yéche, C; Zavrtanik, D; Zimmerman, D

    2000-01-01

    A detailed description is given of the analysis of neutral kaons decaying to π+π-, based on the complete data sample collected with the CPLEAR experiment. Using a novel approach involving initially strangeness-tagged K0 and anti-K0, the time-dependent decay-rate asymmetry has been measured. This asymmetry, resulting from the interference between the KS and KL decay amplitudes, has enabled both the magnitude and phase of the CP-violation parameter, η+-, to be measured, with a precision comparable to that of the current world-average values.

  17. A detailed description of the analysis of the decay of neutral kaons to π+π- in the CPLEAR experiment

    International Nuclear Information System (INIS)

    Apostolakis, A.; Aslanides, E.

    2000-01-01

    A detailed description is given of the analysis of neutral kaons decaying to π + π - , based on the complete set of data collected with the CPLEAR experiment. Using a novel approach involving initially strangeness-tagged K 0 and anti K 0 , the time-dependent decay-rate asymmetry has been measured. This asymmetry, resulting from the interference between the K S and K L decay amplitudes, has enabled both the magnitude and phase of the CP-violation parameter, η +- , to be measured, with a precision comparable to that of the current world-average values. (orig.)
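Schematically (neglecting normalization, regeneration and detector effects, and up to sign conventions), the two strangeness-tagged rates and their interference asymmetry have the form

```latex
R_{K^0,\,\bar K^0 \to \pi^+\pi^-}(t) \;\propto\;
  e^{-\Gamma_S t} + |\eta_{+-}|^2\, e^{-\Gamma_L t}
  \pm 2\,|\eta_{+-}|\, e^{-(\Gamma_S+\Gamma_L)t/2}\,
  \cos\!\left(\Delta m\, t - \phi_{+-}\right),

A_{+-}(t) \;=\; \frac{R_{K^0}(t) - R_{\bar K^0}(t)}{R_{K^0}(t) + R_{\bar K^0}(t)}
          \;=\; \frac{2\,|\eta_{+-}|\, e^{-(\Gamma_S+\Gamma_L)t/2}\,
                      \cos\!\left(\Delta m\, t - \phi_{+-}\right)}
                     {e^{-\Gamma_S t} + |\eta_{+-}|^2\, e^{-\Gamma_L t}}
```

so a fit to the measured time-dependent asymmetry yields both |η+-| (amplitude) and φ+- (phase of the oscillation), given Γ_S, Γ_L and Δm.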

  18. Self-care 3 months after attending chronic obstructive pulmonary disease patient education: a qualitative descriptive analysis

    DEFF Research Database (Denmark)

    Mousing, Camilla A; Lomborg, Kirsten

    2012-01-01

    Purpose: The authors performed a qualitative descriptive analysis to explore how group patient education influences the self-care of patients with chronic obstructive pulmonary disease. Patients and methods: In the period 2009–2010, eleven patients diagnosed with chronic obstructive pulmonary...... their symptoms, and that the social aspect of patient education had motivated them to utilize their new habits after finishing the course. The data indicate that patients need a period of adjustment (a "ripening period"): it took time for patients to integrate new habits and competencies into everyday life...

  19. Tool Support for Parametric Analysis of Large Software Simulation Systems

    Science.gov (United States)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
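The "n-factor combinatorial parameter variation" idea can be sketched for n = 2 (all-pairs): choose test cases so that every value pair of every two parameters appears in at least one case, which typically needs far fewer cases than the full Cartesian product. The greedy generator below is a generic illustration, not the tool's algorithm; the parameter names are invented.

```python
import itertools
import random

def pairwise_cases(params, tries=50, seed=0):
    """Greedy all-pairs (2-factor) test-case generation.

    params: dict mapping parameter name -> list of candidate values.
    Returns a list of full assignments covering every 2-way value pair.
    """
    rng = random.Random(seed)
    names = list(params)
    uncovered = {(a, va, b, vb)
                 for a, b in itertools.combinations(names, 2)
                 for va in params[a] for vb in params[b]}
    cases = []
    while uncovered:
        best, best_cov = None, -1
        for _ in range(tries):            # sample candidates, keep the greediest
            cand = {n: rng.choice(params[n]) for n in names}
            cov = sum(cand[a] == va and cand[b] == vb
                      for a, va, b, vb in uncovered)
            if cov > best_cov:
                best, best_cov = cand, cov
        if best_cov == 0:                 # force progress on one remaining pair
            a, va, b, vb = next(iter(uncovered))
            best[a], best[b] = va, vb
        cases.append(best)
        uncovered = {p for p in uncovered
                     if not (best[p[0]] == p[1] and best[p[2]] == p[3])}
    return cases

# Hypothetical simulation parameters: 3 factors x 3 levels = 27 full combinations.
params = {"mass": [1, 2, 3], "thrust": [10, 20, 30], "drag": ["lo", "mid", "hi"]}
cases = pairwise_cases(params)
```

Here `cases` covers all 27 two-way value pairs with well under the 27 exhaustive combinations, and the savings grow rapidly with more parameters, which is the point of combining combinatorial variation with Monte Carlo generation in the text above.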

  20. Magnetic Testing, and Modeling, Simulation and Analysis for Space Applications

    Science.gov (United States)

    Boghosian, Mary; Narvaez, Pablo; Herman, Ray

    2012-01-01

    The Aerospace Corporation (Aerospace) and Lockheed Martin Space Systems (LMSS) participated with the Jet Propulsion Laboratory (JPL) in the implementation of a magnetic cleanliness program for the NASA/JPL JUNO mission. The magnetic cleanliness program was applied from early flight system development up through system-level environmental testing. The JUNO magnetic cleanliness program required setting up a specialized magnetic test facility at Lockheed Martin Space Systems for testing the flight system, and a testing program, with its own facility, for testing system parts and subsystems at JPL. The magnetic modeling, simulation and analysis capability was set up and operated by Aerospace to provide qualitative and quantitative magnetic assessments of the magnetic parts, components, and subsystems prior to, or in lieu of, magnetic tests. Because of the sensitive nature of the fields-and-particles scientific measurements conducted by the JUNO space mission to Jupiter, stringent magnetic control specifications required a magnetic control program to ensure that the spacecraft's science magnetometers and plasma wave search coil were not magnetically contaminated by flight system magnetic interference. With Aerospace's magnetic modeling, simulation and analysis, JPL's system modeling and testing approach, and LMSS's test support, the project achieved a magnetically clean spacecraft cost-effectively. This paper presents lessons learned from the JUNO magnetic testing approach and from Aerospace's modeling, simulation and analysis activities, which were used to solve problems such as remnant magnetization and the performance of hard and soft magnetic materials within the targeted space system under applied external magnetic fields.

  1. Variable-density groundwater flow simulations and particle tracking. Numerical modelling using DarcyTools. Preliminary site description of the Simpevarp area, version 1.1

    Energy Technology Data Exchange (ETDEWEB)

    Follin, Sven [SF GeoLogic AB, Stockholm (Sweden); Stigsson, Martin; Berglund, Sten [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Svensson, Urban [Computer-aided Fluid Engineering AB, Norrkoeping (Sweden)

    2004-12-01

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden, Forsmark and Simpevarp. The investigations started in 2002 and have been planned since the late 1990s. The work presented here investigates the possibility of using hydrogeochemical measurements in deep boreholes to reduce parameter uncertainty in a regional modelling of groundwater flow in fractured rock. The work was conducted with the aim of improving the palaeohydrogeological understanding of the Simpevarp area and to give recommendations to the preparations of the next version of the Preliminary Site Description (1.2). The study is based on a large number of numerical simulations of transient variable density groundwater flow through a strongly heterogeneous and anisotropic medium. The simulations were conducted with the computer code DarcyTools, the development of which has been funded by SKB. DarcyTools is a flexible porous media code specifically designed to treat groundwater flow and salt transport in sparsely fractured crystalline rock and it is noted that some of the features presented in this report are still under development or subjected to testing and verification. The simulations reveal the sensitivity of the results to different hydrogeological modelling assumptions, e.g. the sensitivity to the initial groundwater conditions at 10,000 BC, the size of the model domain and boundary conditions, and the hydraulic properties of deterministically and stochastically modelled deformation zones. The outcome of these simulations was compared with measured salinities and calculated relative proportions of different water types (mixing proportions) from measurements in two deep core drilled boreholes in the Laxemar subarea. In addition to the flow simulations, the statistics of flow related transport parameters were calculated for particle flowpaths from repository depth to ground surface for two subareas within the

  2. Variable-density groundwater flow simulations and particle tracking. Numerical modelling using DarcyTools. Preliminary site description of the Simpevarp area, version 1.1

    International Nuclear Information System (INIS)

    Follin, Sven; Stigsson, Martin; Berglund, Sten; Svensson, Urban

    2004-12-01

    SKB is conducting site investigations for a high-level nuclear waste repository in fractured crystalline rocks at two coastal areas in Sweden, Forsmark and Simpevarp. The investigations started in 2002 and have been planned since the late 1990s. The work presented here investigates the possibility of using hydrogeochemical measurements in deep boreholes to reduce parameter uncertainty in a regional modelling of groundwater flow in fractured rock. The work was conducted with the aim of improving the palaeohydrogeological understanding of the Simpevarp area and to give recommendations to the preparations of the next version of the Preliminary Site Description (1.2). The study is based on a large number of numerical simulations of transient variable density groundwater flow through a strongly heterogeneous and anisotropic medium. The simulations were conducted with the computer code DarcyTools, the development of which has been funded by SKB. DarcyTools is a flexible porous media code specifically designed to treat groundwater flow and salt transport in sparsely fractured crystalline rock and it is noted that some of the features presented in this report are still under development or subjected to testing and verification. The simulations reveal the sensitivity of the results to different hydrogeological modelling assumptions, e.g. the sensitivity to the initial groundwater conditions at 10,000 BC, the size of the model domain and boundary conditions, and the hydraulic properties of deterministically and stochastically modelled deformation zones. The outcome of these simulations was compared with measured salinities and calculated relative proportions of different water types (mixing proportions) from measurements in two deep core drilled boreholes in the Laxemar subarea. In addition to the flow simulations, the statistics of flow related transport parameters were calculated for particle flowpaths from repository depth to ground surface for two subareas within the
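The flow-related transport statistics in the two records above are built from particle flowpaths traced through a simulated velocity field. A minimal forward-Euler tracker is sketched below; this is an illustrative toy only, not the scheme used in DarcyTools:

```python
import math

def track_particle(velocity, start, dt=0.01, steps=1000):
    """Forward-Euler particle tracking in a steady 2-D velocity field.

    velocity: callable (x, y) -> (vx, vy)
    Returns the list of visited positions, i.e. the particle's flowpath.
    """
    x, y = start
    path = [(x, y)]
    for _ in range(steps):
        vx, vy = velocity(x, y)
        x, y = x + dt * vx, y + dt * vy
        path.append((x, y))
    return path

# Toy field: solid-body rotation about the origin; the exact path is a circle.
path = track_particle(lambda x, y: (-y, x), start=(1.0, 0.0))
```

On this toy field forward Euler drifts slightly outward from the exact circular path, which is one reason production codes prefer higher-order or cell-wise semi-analytical tracking schemes.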

  3. Descriptions of verbal communication errors between staff. An analysis of 84 root cause analysis-reports from Danish hospitals

    DEFF Research Database (Denmark)

    Rabøl, Louise Isager; Andersen, Mette Lehmann; Østergaard, Doris

    2011-01-01

    Introduction: Poor teamwork and communication between healthcare staff are correlated to patient safety incidents. However, the organisational factors responsible for these issues are unexplored. Root cause analyses (RCA) use human factors thinking to analyse the systems behind severe patient safety incidents. The objective of this study is to review RCA reports (RCAR) for characteristics of verbal communication errors between hospital staff in an organisational perspective. Method: Two independent raters analysed 84 RCARs, conducted in six Danish hospitals between 2004 and 2006, for descriptions and characteristics of verbal communication errors such as handover errors and error during teamwork. Results: Raters found description of verbal communication errors in 44 reports (52%). These included handover errors (35 (86%)), communication errors between different staff groups (19 (43%)), misunderstandings (13 …

  4. Simulation and analysis of plutonium reprocessing plant data

    International Nuclear Information System (INIS)

    Burr, T.; Coulter, A.; Wangen, L.

    1996-01-01

    It will be difficult for large-throughput reprocessing plants to meet International Atomic Energy Agency (IAEA) detection goals for protracted diversion of plutonium by materials accounting alone. Therefore, the IAEA is considering supplementing traditional material balance analysis with analysis of solution monitoring data (frequent snapshots of such solution parameters as level, density, and temperature for all major process vessels). Analysis of solution monitoring data will enhance safeguards by improving anomaly detection and resolution, maintaining continuity of knowledge, and validating and improving measurement error models. However, there are costs associated with accessing and analyzing the data. To minimize these costs, analysis methods should be as complete as possible, simple to implement, and require little human effort. As a step toward that goal, the authors have implemented simple analysis methods for use in an off-line situation. These methods use solution level to recognize major tank activities, such as tank-to-tank transfers and sampling. In this paper, the authors describe the application of these methods to realistic simulated data (the methods were developed by using both real and simulated data), and they present some quantifiable benefits of solution monitoring
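Using solution level to recognize major tank activities can be illustrated with a toy change-point rule over the level snapshots (the threshold below is hypothetical; the authors' actual methods are more complete):

```python
def detect_transfers(levels, threshold=0.5):
    """Flag candidate transfer events in a tank-level time series.

    A transfer appears as a jump between consecutive level snapshots larger
    than `threshold` (in level units). Returns (index, delta) tuples.
    """
    return [(i, levels[i] - levels[i - 1])
            for i in range(1, len(levels))
            if abs(levels[i] - levels[i - 1]) > threshold]

# Simulated trace: steady, a receipt (+5 units), steady, a shipment (-5).
trace = [10.0] * 5 + [15.0] * 5 + [10.0] * 5
events = detect_transfers(trace)  # -> [(5, 5.0), (10, -5.0)]
```

The sign of each delta distinguishes receipts from shipments; matching a negative delta in one tank against a simultaneous positive delta in another is what identifies a tank-to-tank transfer.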

  5. A CAD based geometry model for simulation and analysis of particle detector data

    Energy Technology Data Exchange (ETDEWEB)

    Milde, Michael; Losekamm, Martin; Poeschl, Thomas; Greenwald, Daniel; Paul, Stephan [Technische Universitaet Muenchen, 85748 Garching (Germany)

    2016-07-01

    The development of a new particle detector requires a good understanding of its setup. A detailed model of the detector's geometry is needed not only during construction, but also for simulation and data analysis. To arrive at a consistent description of the detector geometry, a representation is needed that can be easily implemented in the different software tools used during data analysis. We developed a geometry representation based on CAD files that can be readily used within the Geant4 simulation framework and in analysis tools based on the ROOT framework. This talk presents the structure of the geometry model and shows its implementation using the example of the event reconstruction developed for the Multi-purpose Active-target Particle Telescope (MAPT). The detector consists of scintillating plastic fibers and can be used as a tracking detector and calorimeter with omnidirectional acceptance. To optimize the angular resolution and the energy reconstruction of measured particles, a detailed detector model is needed at all stages of the reconstruction.

  6. Brain Based Learning in Science Education in Turkey: Descriptive Content and Meta Analysis of Dissertations

    Science.gov (United States)

    Yasar, M. Diyaddin

    2017-01-01

    This study aimed at performing content analysis and meta-analysis on dissertations related to brain-based learning in science education to find out the general trend and tendency of brain-based learning in science education and find out the effect of such studies on achievement and attitude of learners with the ultimate aim of raising awareness…

  7. Hanford Site Composite Analysis Technical Approach Description: Hanford Site Disposition Baseline.

    Energy Technology Data Exchange (ETDEWEB)

    Cobb, M. A. [CH2M HILL Plateau Remediation Company, Richland, WA (United States); Dockter, R. E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-10-02

    The permeability of ground surfaces within the U.S. Department of Energy’s (DOE) Hanford Site strongly influences boundary conditions when simulating the movement of groundwater using the Subsurface Transport Over Multiple Phases model. To conduct site-wide modeling of cumulative impacts to groundwater from past, current, and future waste management activities, a site-wide assessment of the permeability of surface conditions is needed. The surface condition of the vast majority of the Hanford Site has been and continues to be native soils vegetated with dryland grasses and shrubs.

  8. Coupling an analytical description of anti-scatter grids with simulation software of radiographic systems using Monte Carlo code; Couplage d'une methode de description analytique de grilles anti diffusantes avec un logiciel de simulation de systemes radiographiques base sur un code Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Rinkel, J.; Dinten, J.M.; Tabary, J

    2004-07-01

    The use of focused anti-scatter grids on digital radiographic systems with two-dimensional detectors produces acquisitions with a decreased scatter-to-primary ratio and thus improved contrast and resolution. Simulation software is of great interest for optimizing the grid configuration for a specific application. Classical simulators are based on complete, detailed geometric descriptions of the grid. They are accurate but very time-consuming, since they use Monte Carlo code to simulate scatter within the high-frequency grids. We propose a new practical method that couples an analytical simulation of the grid interaction with a radiographic system simulation program. First, a two-dimensional matrix of probabilities characterizing the grid is created offline, in which the first dimension represents the angle of impact with respect to the normal to the grid lines, and the other the energy of the photon. This probability matrix is then used by the Monte Carlo simulation software to provide the final scattered-flux image. To evaluate the gain in CPU time, we define the increase factor as the increase in the simulation's CPU time with, as opposed to without, the grid. Increase factors were calculated with the new model and with classical methods that represent the grid by its CAD model as part of the object. With the new method, increase factors are smaller by one to two orders of magnitude compared with the classical approach. These results were obtained with a difference in calculated scatter of less than five percent between the new and the classical method. (authors)
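The paper's core idea, replacing the detailed grid geometry with a precomputed (angle, energy) probability table consulted during the Monte Carlo run, can be sketched as follows. The transmission model below is a made-up toy, not the authors' physics:

```python
import random

def make_grid_lut(angles, energies, transmit):
    """Tabulate transmission probability offline, indexed by the photon's
    incidence angle (vs. the normal to the grid lines) and its energy."""
    return {(a, e): transmit(a, e) for a in angles for e in energies}

def count_transmitted(lut, photons, rng):
    """Monte Carlo step: each photon survives the grid with the tabulated
    probability instead of being traced through the grid geometry."""
    return sum(rng.random() < lut[(angle, energy)]
               for angle, energy in photons)

# Toy model: transmission falls linearly with angle, independent of energy.
angles, energies = range(0, 91, 10), (30, 60, 90)  # degrees, keV
lut = make_grid_lut(angles, energies, lambda a, e: 1.0 - a / 90.0)
rng = random.Random(0)
n = count_transmitted(lut, [(0, 60)] * 100 + [(90, 60)] * 100, rng)
```

The expensive per-photon geometry tracing is paid once, offline, when the table is built; each simulated photon then costs only a table lookup and one random draw, which is where the one-to-two-orders-of-magnitude speedup comes from.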

  9. Protein Data Bank Japan (PDBj): updated user interfaces, resource description framework, analysis tools for large structures.

    Science.gov (United States)

    Kinjo, Akira R; Bekker, Gert-Jan; Suzuki, Hirofumi; Tsuchiya, Yuko; Kawabata, Takeshi; Ikegawa, Yasuyo; Nakamura, Haruki

    2017-01-04

    The Protein Data Bank Japan (PDBj, http://pdbj.org), a member of the worldwide Protein Data Bank (wwPDB), accepts and processes the deposited data of experimentally determined macromolecular structures. While maintaining the archive in collaboration with other wwPDB partners, PDBj also provides a wide range of services and tools for analyzing structures and functions of proteins. We herein outline the updated web user interfaces together with RESTful web services and the backend relational database that support the former. To enhance the interoperability of the PDB data, we have previously developed PDB/RDF, PDB data in the Resource Description Framework (RDF) format, which is now a wwPDB standard called wwPDB/RDF. We have enhanced the connectivity of the wwPDB/RDF data by incorporating various external data resources. Services for searching, comparing and analyzing the ever-increasing large structures determined by hybrid methods are also described. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  10. Descriptive scientific analysis: progress of the educational system of Saudi Arabia

    Directory of Open Access Journals (Sweden)

    Annemarie PROFANTER

    2017-06-01

    Saudi Arabia has set itself the goal of developing a knowledge-based society. Demographic changes and economic growth within a short time in the Kingdom of Saudi Arabia have led to radical changes in the educational system. Higher education institutions are expanding and international collaborations are being intensified. The policy of gender segregation, based on a neopatriarchal society favored by the tribal system, is an important cultural element of Saudi society and influences the Saudi educational culture. This article provides a scientific description and analyzes the main elements of the Saudi higher education system using the few data available, given the limited release of official statistics. Prince Mohammad Bin Fahd University is analyzed as a case study based on the experience of the author, who taught there in the academic years 2006 and 2008. Being the first private institution to admit both male and female students, it had to face several challenges. Furthermore, the impact of international collaborations is identified by exploring the «King Abdullah Scholarship Programme», which gives thousands of students the opportunity to study abroad. Education, while having a global function, also fulfills a national function. Therefore, collaborations with Western universities in the Kingdom have created challenges for recent generations in balancing the Western values encountered throughout their higher education with their traditional culture. Due to the policy of gender segregation, the Saudi educational system presents different obstructions and opportunities, particularly for female students.

  11. Descriptive distribution and phylogenetic analysis of feline infectious peritonitis virus isolates of Malaysia

    Directory of Open Access Journals (Sweden)

    Arshad Habibah

    2010-01-01

    The descriptive distribution and phylogeny of feline coronaviruses (FCoVs) were studied in cats suspected of having feline infectious peritonitis (FIP) in Malaysia. Ascitic fluids and/or biopsy samples were subjected to a reverse transcription polymerase chain reaction (RT-PCR) targeted at a conserved region of the 3' untranslated region (3'UTR) of the FCoV genome. Eighty-nine percent of the sampled animals were positive for the presence of FCoV. Among the FCoV-positive cats, 80% were male and 64% were below 2 years of age. The FCoV-positive cases comprised 56% domestic short hair (DSH), 40% Persian, and 4% Siamese cats. The nucleotide sequences of 10 selected amplified products from FIP cases were determined. The sequence comparison revealed that the field isolates had 96% homology, with a few point mutations. The extent of homology decreased to 93% when compared with reference strains. The overall branching pattern of the phylogenetic tree showed two distinct clusters, with all Malaysian isolates falling into one main genetic cluster. These findings provide the first genetic information on FCoV in Malaysia.

  12. Descriptive distribution and phylogenetic analysis of feline infectious peritonitis virus isolates of Malaysia.

    Science.gov (United States)

    Sharif, Saeed; Arshad, Siti S; Hair-Bejo, Mohd; Omar, Abdul R; Zeenathul, Nazariah A; Fong, Lau S; Rahman, Nor-Alimah; Arshad, Habibah; Shamsudin, Shahirudin; Isa, Mohd-Kamarudin A

    2010-01-06

    The descriptive distribution and phylogeny of feline coronaviruses (FCoVs) were studied in cats suspected of having feline infectious peritonitis (FIP) in Malaysia. Ascitic fluids and/or biopsy samples were subjected to a reverse transcription polymerase chain reaction (RT-PCR) targeted at a conserved region of the 3' untranslated region (3'UTR) of the FCoV genome. Eighty-nine percent of the sampled animals were positive for the presence of FCoV. Among the FCoV-positive cats, 80% were male and 64% were below 2 years of age. The FCoV-positive cases comprised 56% domestic short hair (DSH), 40% Persian, and 4% Siamese cats. The nucleotide sequences of 10 selected amplified products from FIP cases were determined. The sequence comparison revealed that the field isolates had 96% homology, with a few point mutations. The extent of homology decreased to 93% when compared with reference strains. The overall branching pattern of the phylogenetic tree showed two distinct clusters, with all Malaysian isolates falling into one main genetic cluster. These findings provide the first genetic information on FCoV in Malaysia.
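The homology percentages quoted in the two records above come down to pairwise identity over aligned sequences, which for equal-length alignments is straightforward to compute (a sketch only; real analyses also handle gaps and use dedicated alignment tools):

```python
def percent_identity(seq1, seq2):
    """Percent identity between two aligned, equal-length sequences."""
    if len(seq1) != len(seq2):
        raise ValueError("sequences must be aligned to equal length")
    matches = sum(a == b for a, b in zip(seq1, seq2))
    return 100.0 * matches / len(seq1)

# One substitution in ten aligned bases -> 90% identity.
ident = percent_identity("ATGCTAGCTA", "ATGCTAGGTA")
```

A matrix of such pairwise identities over all isolates and reference strains is the usual input to distance-based tree building, which is how branching patterns like the two clusters reported above are obtained.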

  13. Cost analysis of simulated base-catalyzed biodiesel production processes

    International Nuclear Information System (INIS)

    Tasić, Marija B.; Stamenković, Olivera S.; Veljković, Vlada B.

    2014-01-01

    Highlights: • Two semi-continuous biodiesel production processes from sunflower oil are simulated. • Simulations were based on the kinetics of base-catalyzed methanolysis reactions. • The total energy consumption was influenced by the kinetic model. • The heterogeneously base-catalyzed process is the preferable industrial technology. - Abstract: The simulation and economic feasibility evaluation of semi-continuous biodiesel production from sunflower oil were based on the kinetics of homogeneously (Process I) and heterogeneously (Process II) base-catalyzed methanolysis reactions. The plant’s annual capacity was determined to be 8356 tonnes of biodiesel. The total energy consumption was influenced by the unit model describing the methanolysis reaction kinetics. The energy consumption of Process II was more than 2.5 times lower than that of Process I. The simulation also showed that Process I required more and larger process equipment units than Process II. Based on its lower total capital investment costs and biodiesel selling price, Process II was economically more feasible than Process I. Sensitivity analysis was conducted using variable sunflower oil and biodiesel prices. At a biodiesel selling price of 0.990 $/kg, Processes I and II were shown to be economically profitable if the sunflower oil price was 0.525 $/kg and 0.696 $/kg, respectively

  14. Simulated interprofessional education: an analysis of teaching and learning processes.

    Science.gov (United States)

    van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott

    2011-11-01

    Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators representing a range of health professions, participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes related to teaching and learning processes related to this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context and point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.

  15. LArSoft: toolkit for simulation, reconstruction and analysis of liquid argon TPC neutrino detectors

    Science.gov (United States)

    Snider, E. L.; Petrillo, G.

    2017-10-01

    LArSoft is a set of detector-independent software tools for the simulation, reconstruction and analysis of data from liquid argon (LAr) neutrino experiments. The common features of LAr time projection chambers (TPCs) enable sharing of algorithm code across detectors of very different size and configuration. LArSoft is currently used in production simulation and reconstruction by the ArgoNeuT, DUNE, LArIAT, MicroBooNE, and SBND experiments. The software suite offers a wide selection of algorithms and utilities, including those for associated photo-detectors and the handling of auxiliary detectors outside the TPCs. Available algorithms cover the full range of simulation and reconstruction, from raw waveforms to high-level reconstructed objects, event topologies and classification. The common code within LArSoft is contributed by adopting experiments, which also provide detector-specific geometry descriptions and code for the treatment of electronic signals. LArSoft is also a collaboration of experiments, Fermilab and associated software projects which cooperate in setting requirements, priorities, and schedules. In this talk, we outline the general architecture of the software and its interaction with external libraries and detector-specific code. We also describe the dynamics of LArSoft software development between the contributing experiments, the projects supporting the software infrastructure LArSoft relies on, and the core LArSoft support project.

  16. Image-Based Reconstruction and Analysis of Dynamic Scenes in a Landslide Simulation Facility

    Science.gov (United States)

    Scaioni, M.; Crippa, J.; Longoni, L.; Papini, M.; Zanzi, L.

    2017-12-01

    The application of image processing and photogrammetric techniques to dynamic reconstruction of landslide simulations in a scaled-down facility is described. Simulations are also used here for active-learning purposes: students are helped to understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to the physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.

  17. IMAGE-BASED RECONSTRUCTION AND ANALYSIS OF DYNAMIC SCENES IN A LANDSLIDE SIMULATION FACILITY

    Directory of Open Access Journals (Sweden)

    M. Scaioni

    2017-12-01

    The application of image processing and photogrammetric techniques to dynamic reconstruction of landslide simulations in a scaled-down facility is described. Simulations are also used here for active-learning purposes: students are helped to understand how physical processes happen and which kinds of observations may be obtained from a sensor network. In particular, the use of digital images to obtain multi-temporal information is presented. On one side, using a multi-view sensor setup based on four synchronized GoPro 4 Black® cameras, a 4D (3D spatial position and time) reconstruction of the dynamic scene is obtained through the composition of several 3D models derived from dense image matching. The final textured 4D model allows one to revisit a completed experiment at any time in a dynamic and interactive mode. On the other side, a digital image correlation (DIC) technique has been used to track surface point displacements in the image sequence obtained from the camera in front of the simulation facility. While the 4D model provides a qualitative description and documentation of the running experiment, the DIC analysis outputs quantitative information, such as local point displacements and velocities, to be related to the physical processes and to other observations. All the hardware and software equipment adopted for the photogrammetric reconstruction is based on low-cost and open-source solutions.
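The DIC point-tracking step in the two records above amounts to finding, for a reference patch, the pixel shift that best matches the next frame. A minimal integer-shift sum-of-squared-differences search is sketched below; production DIC adds normalization and subpixel interpolation:

```python
def best_shift(ref, cur, patch, search):
    """Displacement (drow, dcol) of `patch` between frames `ref` and `cur`,
    found by minimizing the sum of squared differences (SSD) over integer
    shifts up to `search` pixels in each direction.

    ref, cur: equal-size 2-D lists of pixel intensities
    patch: (row, col, height, width) of the tracked patch in `ref`
    """
    r0, c0, h, w = patch

    def ssd(dr, dc):
        return sum((ref[r0 + i][c0 + j] - cur[r0 + dr + i][c0 + dc + j]) ** 2
                   for i in range(h) for j in range(w))

    shifts = [(dr, dc) for dr in range(-search, search + 1)
                       for dc in range(-search, search + 1)]
    return min(shifts, key=lambda s: ssd(*s))

# A bright 2x2 feature moves from (2, 2) to (3, 4) between frames.
ref = [[0] * 8 for _ in range(8)]
cur = [[0] * 8 for _ in range(8)]
for i in (0, 1):
    for j in (0, 1):
        ref[2 + i][2 + j] = 1
        cur[3 + i][4 + j] = 1
shift = best_shift(ref, cur, patch=(2, 2, 2, 2), search=2)
```

Dividing the recovered displacement by the inter-frame time (and by the image scale of the scaled-down facility) yields the local surface velocities the analysis reports.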

  18. Application of subset simulation methods to dynamic fault tree analysis

    International Nuclear Information System (INIS)

    Liu Mengyun; Liu Jingquan; She Ding

    2015-01-01

    Although fault tree analysis has been implemented in the nuclear safety field over the past few decades, it has recently been criticized for its inability to model time-dependent behaviors. Several methods have been proposed to overcome this disadvantage, and the dynamic fault tree (DFT) has become one of the research highlights. By introducing additional dynamic gates, a DFT can describe dynamic behaviors such as the replacement of spare components or the priority of failure events. Using the Monte Carlo simulation (MCS) approach to solve DFTs has gained rising attention, because it can model the authentic behavior of systems and avoid the limitations of analytical methods. This paper provides an overview of and MCS details for DFT analysis, including the sampling of basic events and the propagation rules for logic gates. When calculating rare-event probabilities, standard MCS requires a large number of simulations. To address this weakness, the subset simulation (SS) approach is applied. Using the concept of conditional probability and the Markov chain Monte Carlo (MCMC) technique, the SS method accelerates the exploration of the failure region. Two cases are tested to illustrate the performance of the SS approach, and the numerical results suggest that it gives high efficiency when calculating complicated systems with small failure probabilities. (author)
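The subset simulation idea can be sketched for a rare-event probability P(g(U) ≥ b) with U standard normal: adaptive intermediate thresholds keep the best p0 fraction of samples, and component-wise Metropolis chains regrow the population at each level. This toy follows the standard SS recipe in spirit and is not the paper's implementation:

```python
import math
import random

def subset_simulation(g, dim, b, n=1000, p0=0.1, seed=1):
    """Estimate P(g(U) >= b) for U ~ N(0, I_dim) by subset simulation.

    Each level keeps the p0*n best samples as seeds and regrows the
    population with component-wise Metropolis chains conditioned on the
    current intermediate threshold.
    """
    rng = random.Random(seed)
    samples = [[rng.gauss(0, 1) for _ in range(dim)] for _ in range(n)]
    prob = 1.0
    for _ in range(20):                          # cap on number of levels
        vals = sorted((g(u) for u in samples), reverse=True)
        b_i = vals[int(p0 * n) - 1]              # adaptive threshold
        if b_i >= b:                             # final level reached
            return prob * sum(v >= b for v in vals) / n
        prob *= p0
        seeds = [u for u in samples if g(u) >= b_i]
        samples = []
        for s in seeds:
            u = list(s)
            for _ in range(n // len(seeds)):     # one chain per seed
                cand = list(u)
                for j in range(dim):
                    c = u[j] + rng.gauss(0, 1)
                    # Accept per component with the standard-normal ratio.
                    if rng.random() < math.exp(min(0.0, (u[j] ** 2 - c ** 2) / 2)):
                        cand[j] = c
                if g(cand) >= b_i:               # stay inside the level set
                    u = cand
                samples.append(list(u))
    return prob

# Toy check: P(U1 >= 2) for a single standard normal is about 0.0228.
p_hat = subset_simulation(lambda u: u[0], dim=1, b=2.0)
```

The estimator multiplies the conditional probabilities of each level (each about p0), so a probability of 1e-6 needs only a handful of levels of n samples instead of the ~1e8 runs a crude Monte Carlo estimate would require.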

  19. Simulated, Emulated, and Physical Investigative Analysis (SEPIA) of networked systems.

    Energy Technology Data Exchange (ETDEWEB)

    Burton, David P.; Van Leeuwen, Brian P.; McDonald, Michael James; Onunkwo, Uzoma A.; Tarman, Thomas David; Urias, Vincent E.

    2009-09-01

    This report describes recent progress made in developing and utilizing hybrid Simulated, Emulated, and Physical Investigative Analysis (SEPIA) environments. Many organizations require advanced tools to analyze their information systems' security, reliability, and resilience against cyber attack. Today's security analyses utilize real systems such as computers, network routers and other network equipment, computer emulations (e.g., virtual machines) and simulation models separately to analyze the interplay between threats and safeguards. In contrast, this work developed new methods to combine these three approaches to provide integrated hybrid SEPIA environments. Our SEPIA environments enable an analyst to rapidly configure hybrid environments that pass network traffic and perform, from the outside, like real networks. This provides higher-fidelity representations of key network nodes while still leveraging the scalability and cost advantages of simulation tools. The result is the rapid production of large yet relatively low-cost multi-fidelity SEPIA networks of computers and routers that let analysts quickly investigate threats and test protection approaches.

  20. Automated analysis for detecting beams in laser wakefield simulations

    International Nuclear Information System (INIS)

    Ushizima, Daniela M.; Rubel, Oliver; Prabhat, Mr.; Weber, Gunther H.; Bethel, E. Wes; Aragon, Cecilia R.; Geddes, Cameron G.R.; Cormier-Michel, Estelle; Hamann, Bernd; Messmer, Peter; Hagen, Hans

    2008-01-01

    Laser wakefield particle accelerators have shown the potential to generate electric fields thousands of times higher than those of conventional accelerators. The resulting extremely short particle acceleration distance could yield a potential new compact source of energetic electrons and radiation, with wide applications from medicine to physics. Physicists investigate laser-plasma internal dynamics by running particle-in-cell simulations; however, this generates large datasets that require time-consuming, manual inspection by experts in order to detect key features such as beam formation. This paper describes a framework to automate the data analysis and classification of simulation data. First, we propose a new method to identify locations with high density of particles in the space-time domain, based on maximum extremum point detection on the particle distribution. We analyze high-density electron regions using a lifetime diagram, by organizing and pruning the maximum extrema as nodes in a minimum spanning tree. Second, we partition the multivariate data using fuzzy clustering to detect time steps in an experiment that may contain a high-quality electron beam. Finally, we combine results from fuzzy clustering and bunch lifetime analysis to estimate spatially confined beams. We demonstrate our algorithms successfully on four different simulation datasets.
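The maximum-extremum detection step can be illustrated with a strict 1-D local-maxima scan over a density profile (a toy stand-in for the paper's space-time extremum detection on the full particle distribution):

```python
def local_maxima(values, min_height=0.0):
    """Indices of strict local maxima in a 1-D profile, ignoring endpoints
    and any peak below `min_height`."""
    return [i for i in range(1, len(values) - 1)
            if values[i] > values[i - 1]
            and values[i] > values[i + 1]
            and values[i] >= min_height]

# A density profile with two peaks, at indices 1 and 5.
profile = [0, 2, 1, 0, 3, 5, 3, 1, 0]
peaks = local_maxima(profile)
```

Raising `min_height` plays the role of the pruning step: weak extrema are discarded, and in the paper the surviving extrema are then organized by lifetime in a minimum spanning tree rather than by a simple threshold.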

  1. PRANAS: A New Platform for Retinal Analysis and Simulation

    Directory of Open Access Journals (Sweden)

    Bruno Cessac

    2017-09-01

    Full Text Available The retina encodes visual scenes as trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe a new freely accessible end-user software package that helps users better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. PRANAS integrates a retina simulator allowing large-scale simulations while keeping strong biological plausibility, and a toolbox for the analysis of spike-train population statistics. The statistical method (entropy maximization under constraints) takes both spatial and temporal correlations into account as constraints, making it possible to analyze the effects of memory on statistics. PRANAS also integrates a tool for computing and representing (time-space) receptive fields in 3D. All these tools are accessible through a friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.

  2. Fluid Flow Simulation and Energetic Analysis of Anomalocarididae Locomotion

    Science.gov (United States)

    Mikel-Stites, Maxwell; Staples, Anne

    2014-11-01

    While an abundance of animal locomotion simulations have been performed modeling the motions of living arthropods and aquatic animals, little quantitative simulation and reconstruction of gait parameters has been done to model the locomotion of extinct animals, many of which bear little physical resemblance to their modern descendants. To that end, this project seeks to analyze potential swimming patterns used by the anomalocaridid family (specifically Anomalocaris canadensis, a Cambrian Era aquatic predator) and to determine the most probable modes of movement. This will serve either to verify or to cast into question the currently assumed movement patterns and properties of these animals, and to create a bridge between similar flexible-bodied swimmers and their robotic counterparts. This will be accomplished by particle-based fluid flow simulations of the flow around the fins of the animal, as well as an energy analysis of a variety of sample gaits. The energy analysis will then be compared to the extant information regarding speed/energy use curves in an attempt to determine which modes of swimming were most energy efficient for a given range of speeds. These results will provide a better understanding of how these long-extinct animals moved, possibly allowing an improved understanding of their behavioral patterns, and may also lead to a novel potential platform for bio-inspired autonomous underwater vehicles (AUVs).

  3. Descriptive sensory analysis of Aceto Balsamico Tradizionale di Modena DOP and Aceto Balsamico Tradizionale di Reggio Emilia DOP.

    Science.gov (United States)

    Zeppa, Giuseppe; Gambigliani Zoccoli, Mario; Nasi, Enrico; Masini, Giovanni; Meglioli, Giuseppe; Zappino, Matteo

    2013-12-01

    Aceto Balsamico Tradizionale (ABT) is a typical Italian vinegar available in two different forms: Aceto Balsamico Tradizionale di Modena DOP (ABTM) and Aceto Balsamico Tradizionale di Reggio Emilia DOP (ABTRE). ABT is obtained by alcoholic fermentation and acetic bio-oxidation of cooked grape must and aged at least 12 years in wooden casks and is known and sold around the world. Despite this widespread recognition, data on sensory characteristics of these products are very scarce. Therefore a descriptive analysis was conducted to define a lexicon for the ABT sensory profile and to create a simple, stable and reproducible synthetic ABT for training panellists. A lexicon of 20 sensory parameters was defined and validated and a synthetic ABT was prepared as standard reference. Simple standards for panellist training were also defined and the sensory profiles of ABTM and ABTRE were obtained. The obtained results confirm that descriptive analysis can be used for the sensory characterisation of ABT and that the sensory profiles of ABTM and ABTRE are very different. Furthermore, the results demonstrate that a lexicon and proper standard references are essential for describing the sensory qualities of ABT both for technical purposes and to protect the product from commercial fraud. © 2013 Society of Chemical Industry.

  4. Sex Life Satisfaction in Sub-Saharan Africa: A Descriptive and Exploratory Analysis.

    Science.gov (United States)

    Cranney, Stephen

    2017-10-01

    Nearly all of the sex life satisfaction literature has dealt with developed-country settings, and nothing has been published on sex life satisfaction in sub-Saharan Africa. Not only is sub-Saharan Africa a substantively relevant area in its own right, but it also provides a useful point of comparison for patterns and relations found in developed-world contexts. A brief descriptive and exploratory study of sex life satisfaction in sub-Saharan Africa was conducted using the World Gallup Poll, a dataset with representative sex life satisfaction data for 31 countries and 25,483 cases. In general, there was little variation in weighted averages across countries, and most of the samples surveyed were satisfied with their sex lives, with the modal score being a perfect 10. Furthermore, what variation did exist could not be attributed to level of economic development or gender inequality. Within countries, sociodemographic associations generally comported with patterns found in other contexts: income, education, and being partnered were generally associated with sex life satisfaction, and for two of the four UN subregions (West Africa and East Africa), men were significantly more satisfied with their sex lives than women. Age showed a curvilinear relationship, with the peak age of sexual satisfaction in the late 20s to early 30s depending on the geographic region. The age pattern was not due to health differences, but combining estimators after a seemingly unrelated regression suggests that 4-12% of the effect of income on sex life satisfaction was attributable to better health. In general, religiosity and perceived gravity of the HIV/AIDS problem in one's country were not significantly related to sexual satisfaction.

  5. Toward crustacean without chemicals: a descriptive analysis of consumer response using price comparisons

    Directory of Open Access Journals (Sweden)

    Charles Odilichukwu R. Okpala

    2016-10-01

    Full Text Available Background: To date, there seems to be limited-to-zero emphasis on how consumers perceive crustacean products subject to either chemical and/or non-chemical preservative treatments. In addition, studies that investigate price comparisons of crustacean products subject to either chemical or chemical-free preservative methods seem unreported. Objective: This study focused on providing some foundational knowledge about how consumers perceive traditionally harvested crustaceans that are either chemical-treated and/or free of chemicals, incorporating price comparisons using a descriptive approach. Design: The study design employed a questionnaire approach via interview using a computer-assisted telephone system and sampled 1,540 participants across five key locations in Italy. To capture consumer sensitivity, 'price' was the focus, given its crucial role as a consumption barrier. Prior to this, variables such as the demographic characteristics of participants, frequency of purchasing, and quality attributes/factors that limit the consumption of crustaceans were equally considered. Results: By price comparisons, consumers are likely to favor chemical-free (modified atmosphere packaging) crustacean products amid a price increase of up to 15%. However, a further price increase, such as by 25%, could markedly damage consumers' feelings, which might lead to a considerable number opting out in favor of either chemical-treated or other seafood products. Comparing locations, the studied variables showed no statistical differences (p>0.05). On the contrary, the response weightings fluctuated across the studied categories. Both response weightings and coefficient of variation helped reveal more about how responses deviated per variable category. Conclusions: This study has revealed some foundational knowledge about how consumers perceive traditionally harvested crustaceans that were either chemical-treated or subject to chemical-free preservative up to price

  6. Toward crustacean without chemicals: a descriptive analysis of consumer response using price comparisons.

    Science.gov (United States)

    Okpala, Charles Odilichukwu R; Bono, Gioacchino; Pipitone, Vito; Vitale, Sergio; Cannizzaro, Leonardo

    2016-01-01

    To date, there seems to be limited-to-zero emphasis on how consumers perceive crustacean products subject to either chemical and/or non-chemical preservative treatments. In addition, studies that investigate price comparisons of crustacean products subject to either chemical or chemical-free preservative methods seem unreported. This study focused on providing some foundational knowledge about how consumers perceive traditionally harvested crustaceans that are either chemical-treated and/or free of chemicals, incorporating price comparisons using a descriptive approach. The study design employed a questionnaire approach via interview using a computer-assisted telephone system and sampled 1,540 participants across five key locations in Italy. To capture consumer sensitivity, 'price' was the focus, given its crucial role as a consumption barrier. Prior to this, variables such as the demographic characteristics of participants, frequency of purchasing, and quality attributes/factors that limit the consumption of crustaceans were equally considered. By price comparisons, consumers are likely to favor chemical-free (modified atmosphere packaging) crustacean products amid a price increase of up to 15%. However, a further price increase, such as by 25%, could markedly damage consumers' feelings, which might lead to a considerable number opting out in favor of either chemical-treated or other seafood products. Comparing locations, the studied variables showed no statistical differences (p>0.05). On the contrary, the response weightings fluctuated across the studied categories. Both response weightings and coefficient of variation helped reveal more about how responses deviated per variable category.
This study has revealed some foundational knowledge about how consumers perceive traditionally harvested crustaceans that were either chemical-treated or subject to chemical-free preservative up to price sensitivity using Italy as a reference case, which is applicable to other parts

  7. Analysis of the tight-binding description of the structure of metallic 2D systems

    International Nuclear Information System (INIS)

    Baquero, R.

    1990-12-01

    Bidimensional metallic systems such as interfaces, quantum wells and superlattices with sharp interfaces have recently become available, and their properties can now be studied experimentally in detail. To calculate the Local Density of States (LDOS) for surfaces, interfaces, quantum wells and superlattices we use empirical tight-binding Hamiltonians together with the Green function matching (GFM) method. In this paper we show some examples of our results employing this method to describe metallic 2D systems. In particular, we refer briefly to the effect on the LDOS of the recently established contraction of the first interatomic layer distance at the Ta(001) surface. We then discuss the ideal Nb-V (100) interface and conclude that under certain conditions the V side of an interface can show magnetism, as the V(001) surface does. As a last example, we present a calculation that relates the changes with gold coverage of the reaction rate of the catalytic conversion of cyclohexene into benzene on a Pt(001) surface to the changes in the LDOS of the outermost Pt atomic layer. We show that the behavior of the LDOS around the Fermi level is an important factor in explaining the behavior of this catalytic reaction. We conclude that the empirical tight-binding method is a very simple and useful tool for the description of 2D metallic systems. The advantage is that the computational demands are low and all the ingredients needed to take full advantage of this method are available (reliable tight-binding parameters and suitable methods for the calculation of the Green function). (author). 14 refs, 3 figs
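
    The flavor of such a Green-function LDOS calculation can be conveyed by the simplest possible case: the surface LDOS of a semi-infinite one-dimensional tight-binding chain, obtained from the self-consistent recursion g = 1/(E - e0 - t²g) (a textbook sketch, not the Green function matching method used in the paper; parameters are illustrative):

```python
import numpy as np

def surface_ldos(energy, onsite=0.0, hopping=1.0, eta=0.05, n_iter=2000):
    """Surface LDOS of a semi-infinite 1D chain from the fixed-point
    iteration g = 1/(E + i*eta - onsite - hopping**2 * g)."""
    z = energy + 1j * eta - onsite
    g = 0.0 + 0.0j
    for _ in range(n_iter):
        g = 1.0 / (z - hopping ** 2 * g)
    return -g.imag / np.pi          # LDOS = -Im(G)/pi

energies = np.linspace(-3.0, 3.0, 121)
ldos = np.array([surface_ldos(e) for e in energies])
# The LDOS is finite inside the band |E| < 2t and vanishes outside it.
```

    For a real multi-orbital slab the scalar recursion becomes a matrix equation, but the structure of the calculation is the same.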

  8. Lay involvement in the analysis of qualitative data in health services research: a descriptive study.

    Science.gov (United States)

    Garfield, S; Jheeta, S; Husson, F; Jacklin, A; Bischler, A; Norton, C; Franklin, B D

    2016-01-01

    There is a consensus that patients and the public should be involved in research in a meaningful way. However, to date, lay people have mostly been involved in developing research ideas and commenting on patient information. We previously published a paper describing our experience with lay partners conducting observations in a study of how patients in hospital are involved with their medicines. In a later part of the same study, lay partners were also involved in analysing interviews that a researcher had conducted with patients, carers and healthcare professionals about patient and carer involvement with medicines in hospital. We therefore wanted to build on our previous paper and report on our experiences with lay partners helping to conduct data analysis, so we interviewed the lay members and researchers involved in the analysis to find out their views. Both lay members and researchers reported that lay partners added value to the study by bringing their own perspectives and identifying further areas for the researcher to look for in the interviews. In this way researchers and lay partners were able to work together to produce a richer analysis than would have been possible from either alone. Background: It is recognised that involving lay people in research in a meaningful rather than tokenistic way is both important and challenging. In this paper, we contribute to this debate by describing our experiences of lay involvement in data analysis. Methods: We conducted semi-structured interviews with the lay partners and researchers involved in qualitative data analysis in a wider study of inpatient involvement in medication safety. The interviews were transcribed verbatim and coded using open thematic analysis. Results: We interviewed three lay partners and the three researchers involved.
These interviews demonstrated that the lay members added value to the analysis by bringing their own perspectives; these were systematically integrated into the analysis by the

  9. Feature-Based Statistical Analysis of Combustion Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, J; Krishnamoorthy, V; Liu, S; Grout, R; Hawkes, E; Chen, J; Pascucci, V; Bremer, P T

    2011-11-18

    We present a new framework for feature-based statistical analysis of large-scale scientific data and demonstrate its effectiveness by analyzing features from Direct Numerical Simulations (DNS) of turbulent combustion. Turbulent flows are ubiquitous and account for transport and mixing processes in combustion, astrophysics, fusion, and climate modeling, among other disciplines. They are also characterized by coherent structure or organized motion, i.e., nonlocal entities whose geometrical features can directly impact molecular mixing and reactive processes. While traditional multi-point statistics provide correlative information, they lack nonlocal structural information and hence fail to provide mechanistic causality information between organized fluid motion and mixing and reactive processes. Hence, it is of great interest to capture and track flow features and their statistics together with their correlation with relevant scalar quantities, e.g., temperature or species concentrations. In our approach we encode the set of all possible flow features by pre-computing merge trees augmented with attributes, such as statistical moments of various scalar fields (e.g., temperature) as well as length scales computed via spectral analysis. The computation is performed in an efficient streaming manner in a pre-processing step and results in a collection of metadata that is orders of magnitude smaller than the original simulation data. This metadata is sufficient to support a fully flexible and interactive analysis of the features, allowing for arbitrary thresholds, providing per-feature statistics, and creating various global diagnostics such as Cumulative Distribution Functions (CDFs), histograms, or time series. We combine the analysis with a rendering of the features in a linked-view browser that enables scientists to interactively explore, visualize, and analyze the equivalent of one terabyte of simulation data.
We highlight the utility of this new framework for combustion
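
    The core idea, thresholding a scalar field into features and attaching per-feature statistics, can be sketched with connected-component labeling (a much simpler stand-in for the attribute-augmented merge trees the paper actually uses; the field and thresholds are synthetic):

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)

# Synthetic 2D scalar field: noisy background with two hot regions.
field = rng.normal(0.0, 0.1, size=(64, 64))
field[10:18, 10:18] += 2.0
field[40:50, 30:38] += 3.0

# Features = connected regions above a freely adjustable threshold,
# with per-feature statistics attached.
labels, n_features = ndimage.label(field > 1.0)
index = list(range(1, n_features + 1))
means = ndimage.mean(field, labels, index=index)               # per-feature mean
sizes = ndimage.sum(np.ones_like(field), labels, index=index)  # per-feature area
```

    A merge tree generalizes this by recording how such regions appear and merge as the threshold sweeps through all values, so any threshold can be queried afterwards without revisiting the raw data.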

  10. Nuclear Fuel Cycle Analysis and Simulation Tool (FAST)

    Energy Technology Data Exchange (ETDEWEB)

    Ko, Won Il; Kwon, Eun Ha; Kim, Ho Dong

    2005-06-15

    This paper describes the Nuclear Fuel Cycle Analysis and Simulation Tool (FAST), which has been developed by the Korea Atomic Energy Research Institute (KAERI). Categorizing various mixes of nuclear reactors and fuel cycles into 11 scenario groups, FAST calculates all the required quantities for each nuclear fuel cycle component, such as mining, conversion, enrichment and fuel fabrication, for each scenario. A major advantage of FAST is that the code employs an MS Excel spreadsheet with Visual Basic for Applications, allowing users to manipulate it with ease. The calculation is also quick enough to make comparisons among different options in a considerably short time. This user-friendly simulation code is expected to be beneficial to further studies on the nuclear fuel cycle to find the best options for the future, with proliferation risk, environmental impact and economic costs all considered.
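
    The front-end mass-flow bookkeeping that such a tool performs can be illustrated with the standard feed and separative-work formulas (a generic sketch of the textbook relations, not FAST's implementation; assay values are illustrative):

```python
import math

def value_fn(x):
    """Separation potential V(x) for a U-235 mass fraction x."""
    return (2 * x - 1) * math.log(x / (1 - x))

def enrichment_requirements(product_kg, xp, xf=0.00711, xt=0.0025):
    """Natural-uranium feed (kg) and separative work (kg SWU) needed to
    produce `product_kg` of uranium at product assay xp, from feed assay
    xf with tails assay xt."""
    feed = product_kg * (xp - xt) / (xf - xt)
    tails = feed - product_kg
    swu = (product_kg * value_fn(xp) + tails * value_fn(xt)
           - feed * value_fn(xf))
    return feed, swu

# 1 kg of 4.5%-enriched uranium at a 0.25% tails assay:
feed, swu = enrichment_requirements(1.0, xp=0.045)
```

    About 9.2 kg of natural uranium and roughly 7 SWU per kg of 4.5% product, which a scenario code then scales by reactor fleet and reload schedule.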

  11. Contact Thermal Analysis and Wear Simulation of a Brake Block

    Directory of Open Access Journals (Sweden)

    Nándor Békési

    2013-01-01

    Full Text Available The present paper describes an experimental test and a coupled contact-thermal-wear analysis of a railway wheel/brake block system through the braking process. During the test, the friction, the generated heat, and the wear were evaluated. It was found that the contact between the brake block and the wheel occurs in relatively small and slowly moving hot spots, caused by the wear and the thermal effects. A coupled simulation method was developed including numerical frictional contact, transient thermal and incremental wear calculations. In the 3D simulation, the effects of the friction, the thermal expansion, the wear, and the temperature-dependent material properties were also considered. A good agreement was found between the results of the test and the calculations, both for the thermal and wear results. The proposed method is suitable for modelling the slowly oscillating wear caused by the thermal expansions in the contact area.
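
    The incremental wear calculation in a coupled simulation of this kind is commonly based on Archard's law, h = k·p·s/H, accumulated over time steps (a minimal sketch with invented coefficients, not the paper's calibrated model):

```python
import numpy as np

def archard_wear_depth(p_contact, sliding_dist, k_wear, hardness):
    """Incremental wear depth from Archard's law: h = k * p * s / H."""
    return k_wear * p_contact * sliding_dist / hardness

# One braking event discretized into 100 steps of 2 m sliding each,
# at an assumed uniform contact pressure of 0.5 MPa.
pressures = np.full(100, 0.5e6)   # Pa
step_slide = 2.0                  # m of sliding per step
depth = sum(archard_wear_depth(p, step_slide, k_wear=1e-4, hardness=2.0e9)
            for p in pressures)   # accumulated wear depth in m
```

    In the coupled scheme described, the pressure field would come from the contact solver at each step, and the computed wear would in turn modify the contact geometry and temperatures.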

  12. An educational model for ensemble streamflow simulation and uncertainty analysis

    Directory of Open Access Journals (Sweden)

    A. AghaKouchak

    2013-02-01

    Full Text Available This paper presents the hands-on modeling toolbox HBV-Ensemble, designed as a complement to theoretical hydrology lectures, to teach hydrological processes and their uncertainties. HBV-Ensemble can be used for in-class lab practices and homework assignments, and for assessment of students' understanding of hydrological processes. Using this modeling toolbox, students can gain more insight into how hydrological processes (e.g., precipitation, snowmelt and snow accumulation, soil moisture, evapotranspiration and runoff generation) are interconnected. The educational toolbox includes a MATLAB Graphical User Interface (GUI) and an ensemble simulation scheme that can be used for teaching uncertainty analysis, parameter estimation, ensemble simulation and model sensitivity. HBV-Ensemble was administered in a class for both in-class instruction and a final project, and students submitted their feedback about the toolbox. The results indicate that this educational software had a positive impact on students' understanding and knowledge of uncertainty in hydrological modeling.
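
    The ensemble-simulation idea behind such a toolbox can be conveyed with a toy rainfall-runoff model, a single linear reservoir (far simpler than HBV), whose uncertain recession coefficient is sampled to produce an uncertainty band:

```python
import numpy as np

rng = np.random.default_rng(0)

def linear_reservoir(precip, k, s0=0.0):
    """Toy runoff model: storage S gains precipitation and drains at rate k*S."""
    s, q = s0, []
    for p in precip:
        s += p
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

precip = np.array([10, 0, 5, 0, 0, 20, 0, 0, 0, 0], dtype=float)  # mm/step
# Ensemble: sample the uncertain recession coefficient k.
ks = rng.uniform(0.2, 0.6, size=200)
ensemble = np.array([linear_reservoir(precip, k) for k in ks])
q_mean = ensemble.mean(axis=0)
q_lo, q_hi = np.percentile(ensemble, [5, 95], axis=0)   # uncertainty band
```

    Plotting the 5-95% band around the ensemble mean gives students a direct visual sense of how parameter uncertainty propagates into streamflow.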

  13. Nuclear fuel cycle cost analysis using a probabilistic simulation technique

    International Nuclear Information System (INIS)

    Won, Il Ko; Jong, Won Choi; Chul, Hyung Kang; Jae, Sol Lee; Kun, Jai Lee

    1998-01-01

    A simple approach was described to incorporate the Monte Carlo simulation technique into a fuel cycle cost estimate. As a case study, the once-through and recycle fuel cycle options were tested with some alternatives (i.e., changes of distribution type for the input parameters), and the simulation results were compared with the values calculated by a deterministic method. A three-estimate approach was used for converting cost inputs into the statistical parameters of assumed probabilistic distributions. It was indicated that Monte Carlo simulation by a Latin Hypercube Sampling technique and subsequent sensitivity analyses were useful for examining the uncertainty propagation of fuel cycle costs, and could provide information to decision makers more efficiently than a deterministic method. It was shown from the change of distribution types of the input parameters that the values calculated by the deterministic method fell around the 40th-50th percentile of the output distribution function calculated by probabilistic simulation. Assuming lognormal distributions of the inputs, however, the values calculated by the deterministic method fell around the 85th percentile of the output distribution function. The sensitivity analysis also indicated that the front-end components were generally more sensitive than the back-end components, of which the uranium purchase cost was the most important factor of all. The discount rate also contributed substantially to the fuel cycle cost, ranking third to fifth among all components. The results of this study could be useful in applications to other options, such as the DUPIC (Direct Use of PWR spent fuel In CANDU reactors) cycle, which carries high cost uncertainty.
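
    The approach described, Latin Hypercube Sampling over assumed cost distributions and locating the deterministic estimate within the resulting output distribution, can be sketched as follows (unit costs and distribution bounds are invented for illustration; triangular distributions stand in for the paper's three-estimate inputs):

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_vars):
    """Stratified uniform(0,1) samples: one draw per stratum per variable."""
    u = (rng.random((n_samples, n_vars)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_vars):
        rng.shuffle(u[:, j])      # decorrelate strata across variables
    return u

# Hypothetical triangular (low, mode, high) unit costs per cycle component.
components = {
    "uranium":     (20.0,  40.0,  80.0),
    "conversion":  (4.0,    6.0,  10.0),
    "enrichment":  (80.0, 100.0, 140.0),
    "fabrication": (200.0, 250.0, 350.0),
}
lows, modes, highs = map(np.array, zip(*components.values()))

u = latin_hypercube(5000, len(components))
fc = (modes - lows) / (highs - lows)       # CDF value at the mode
samples = np.where(                         # inverse triangular CDF
    u < fc,
    lows + np.sqrt(u * (highs - lows) * (modes - lows)),
    highs - np.sqrt((1 - u) * (highs - lows) * (highs - modes)),
)
total_cost = samples.sum(axis=1)

# Where the deterministic (all-modes) estimate sits in the output distribution:
deterministic = modes.sum()
percentile = (total_cost < deterministic).mean() * 100
```

    With right-skewed inputs the mode-based deterministic estimate lands below the output median, the same qualitative effect the study reports.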

  14. Analysis of bending process using forming simulation; Seikei simulation ni yoru press niji seikei kaiseki

    Energy Technology Data Exchange (ETDEWEB)

    Hamaguchi, T; Ogawa, T; Tamai, H [Mazda Motor Corp., Hiroshima (Japan)

    1997-10-01

    FEM simulation systems are becoming an effective tool in production engineering, especially in the evaluation of press formability. We have been applying such a system to the evaluation of defect phenomena, such as breakage and wrinkling, which occur in the drawing process used to produce auto body parts. We tried a new application that treats dimensional precision and other defects in the flanging or bending processes after trimming. In this paper, we introduce the results of this development and an example of its application in analysis. 1 refs., 8 figs.

  15. Carbon dioxide capture processes: Simulation, design and sensitivity analysis

    DEFF Research Database (Denmark)

    Zaman, Muhammad; Lee, Jay Hyung; Gani, Rafiqul

    2012-01-01

    Carbon dioxide is the main greenhouse gas and its major source is combustion of fossil fuels for power generation. The objective of this study is to carry out a steady-state sensitivity analysis for chemical absorption of carbon dioxide capture from flue gas using monoethanolamine solvent. First … equilibrium and associated property models are used. Simulations are performed to investigate the sensitivity of the process variables to changes in the design variables, including process inputs and disturbances in the property model parameters. Results of the sensitivity analysis of the steady-state … performance of the process with respect to the L/G ratio to the absorber, CO2 lean solvent loadings, and stripper pressure are presented in this paper. Based on the sensitivity analysis, process optimization problems have been defined and solved, and a preliminary control structure selection has been made.

  16. TMAP/Mod 1: Tritium Migration Analysis Program code description and user's manual

    International Nuclear Information System (INIS)

    Merrill, B.J.; Jones, J.L.; Holland, D.F.

    1986-01-01

    The Tritium Migration Analysis Program (TMAP) has been developed by the Fusion Safety Program of EG&G Idaho, Inc., at the Idaho National Engineering Laboratory (INEL) as a safety analysis code to analyze tritium loss from fusion systems during normal operation and under accident conditions. TMAP is a one-dimensional code that determines tritium movement and inventories in a system of interconnected enclosures and wall structures. In addition, the thermal response of structures is modeled to provide the temperature information required for calculations of tritium movement. The program is written in FORTRAN IV and has been implemented on the National Magnetic Fusion Energy Computing Center (NMFECC)
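
    The kind of one-dimensional transport problem such a code resolves can be sketched with an explicit finite-difference solution of the diffusion equation in a wall with absorbing boundaries (a crude illustration, not TMAP's actual numerical scheme; all coefficients are invented):

```python
import numpy as np

def diffuse_1d(c0, d_coeff, dx, dt, n_steps):
    """Explicit finite-difference solution of dc/dt = D * d2c/dx2 in a slab
    with zero-concentration (absorbing) boundaries."""
    r = d_coeff * dt / dx ** 2
    assert r <= 0.5, "explicit scheme stability limit violated"
    c = c0.copy()
    for _ in range(n_steps):
        c[1:-1] = c[1:-1] + r * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = c[-1] = 0.0
    return c

c0 = np.zeros(51)
c0[25] = 1.0   # unit tritium inventory loaded mid-wall
c = diffuse_1d(c0, d_coeff=1e-9, dx=1e-4, dt=4.0, n_steps=500)
retained = c.sum()   # inventory still in the wall; the rest was released
```

    A full migration code couples many such wall segments to enclosures and adds surface reactions and the thermal response, but the inventory bookkeeping per structure has this basic form.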

  17. Hanford Site Composite Analysis Technical Approach Description: Groundwater Pathway Dose Calculation.

    Energy Technology Data Exchange (ETDEWEB)

    Morgans, D. L. [CH2M Hill Plateau Remediation Company, Richland, WA (United States); Lindberg, S. L. [Intera Inc., Austin, TX (United States)

    2017-09-20

    The purpose of this technical approach document (TAD) is to document the assumptions, equations, and methods used to perform the groundwater pathway radiological dose calculations for the revised Hanford Site Composite Analysis (CA). DOE M 435.1-1 states: “The composite analysis results shall be used for planning, radiation protection activities, and future use commitments to minimize the likelihood that current low-level waste disposal activities will result in the need for future corrective or remedial actions to adequately protect the public and the environment.”

  18. Visualization and Analysis of Climate Simulation Performance Data

    Science.gov (United States)

    Röber, Niklas; Adamidis, Panagiotis; Behrens, Jörg

    2015-04-01

    Visualization is the key process of transforming abstract (scientific) data into a graphical representation to aid the understanding of the information hidden within the data. Climate simulation data sets are typically quite large, time varying, and consist of many different variables sampled on an underlying grid. A large variety of climate models - and sub-models - exist to simulate various aspects of the climate system. Generally, one is mainly interested in the physical variables produced by the simulation runs, but model developers are also interested in performance data measured along with these simulations. Climate simulation models are carefully developed complex software systems, designed to run in parallel on large HPC systems. An important goal thereby is to utilize the entire hardware as efficiently as possible, that is, to distribute the workload as evenly as possible among the individual components. This is a very challenging task, and detailed performance data, such as timings, cache misses, etc., have to be used to locate and understand performance problems in order to optimize the model implementation. Furthermore, the correlation of performance data to the processes of the application and the sub-domains of the decomposed underlying grid is vital when addressing communication and load imbalance issues. High resolution climate simulations are carried out on tens to hundreds of thousands of cores, thus yielding a vast amount of profiling data, which cannot be analyzed without appropriate visualization techniques. This PICO presentation displays and discusses the ICON simulation model, which is jointly developed by the Max Planck Institute for Meteorology and the German Weather Service, in partnership with DKRZ. The visualization and analysis of the model's performance data allow us to optimize and fine-tune the model, as well as to understand its execution on the HPC system. We show and discuss our workflow, as well as present new ideas and
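
    A basic load-imbalance diagnostic of the kind used in such performance analysis can be computed directly from per-rank timings (hypothetical numbers; the max/mean ratio and time lost at a synchronization point are generic metrics, not ICON-specific ones):

```python
import numpy as np

# Hypothetical per-rank timings (seconds) for one simulation time step.
rank_times = np.array([1.00, 1.05, 0.98, 1.60, 1.02, 0.99, 1.01, 1.03])

# Load imbalance: max over mean (1.0 = perfectly balanced).
imbalance = rank_times.max() / rank_times.mean()

# Core-seconds lost to waiting at the next synchronization point.
wasted = (rank_times.max() - rank_times).sum()
```

    Mapping such per-rank values back onto the decomposed grid is what reveals whether the slow ranks correspond to particular sub-domains.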

  19. Descriptive Analysis and Strategic Options to Defeat Commodity-Based Threat Financing Methodologies Related to Gold

    Science.gov (United States)

    2015-09-01

    SUBJECT TERMS: counter threat finance, commodity-based money laundering, terrorist financing, social network analysis. Bodies discussed include the Asia/Pacific Group on Money Laundering, the Caribbean Financial Action Task Force, and the Eurasian Group on Combating Money Laundering and Financing of Terrorism

  20. An Evaluation of Normal versus Lognormal Distribution in Data Description and Empirical Analysis

    Science.gov (United States)

    Diwakar, Rekha

    2017-01-01

    Many existing methods of statistical inference and analysis rely heavily on the assumption that the data are normally distributed. However, the normality assumption is not fulfilled when dealing with data that do not contain negative values or are otherwise skewed--a common occurrence in diverse disciplines such as finance, economics, political…
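
    The point is easy to demonstrate numerically: strictly positive, right-skewed data violate the normality assumption, while a log transform restores approximate normality (synthetic data for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Strictly positive, right-skewed data (e.g. incomes): lognormal by construction.
data = rng.lognormal(mean=0.0, sigma=1.0, size=100_000)

def skewness(x):
    """Sample skewness: third standardized moment."""
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

skew_raw = skewness(data)          # strongly positive: normality violated
skew_log = skewness(np.log(data))  # near zero: log transform restores normality
```

    Inference carried out on the raw data under a normal assumption would misstate tail probabilities badly; on the log scale the usual machinery applies.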

  1. Self-Injurious Behavior and Functional Analysis: Where Are the Descriptions of Participant Protections?

    Science.gov (United States)

    Weeden, Marc; Mahoney, Amanda; Poling, Alan

    2010-01-01

    This study examined the reporting of participant protections in studies involving functional analysis and self-injurious behavior and published from 1994 through 2008. Results indicated that session termination criteria were rarely reported and other specific participant safeguards were seldom described. The absence of such information in no way…

  2. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    International Nuclear Information System (INIS)

    Sperling, M.; Shreve, D.C.

    1978-01-01

    The computer code DELPHI is an interactive English-language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operating instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal, and finally a listing of the code

  3. Analysis of airborne radiometric data. Volume 2. Description, listing, and operating instructions for the code DELPHI/MAZAS. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Sperling, M.; Shreve, D.C.

    1978-12-01

    The computer code DELPHI is an interactive English-language command system for the analysis of airborne radiometric data. The code includes modules for data reduction, data simulation, time filtering, data adjustment and graphical presentation of the results. DELPHI is implemented in FORTRAN on a DEC-10 computer. This volume gives a brief set of operating instructions, samples of the output obtained from hard copies of the display on a Tektronix terminal, and finally a listing of the code.

  4. National Geo-Database for Biofuel Simulations and Regional Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Izaurralde, Roberto C.; Zhang, Xuesong; Sahajpal, Ritvik; Manowitz, David H.

    2012-04-01

    performance of EPIC and, when necessary, improve its parameterization. We investigated three scenarios. In the first, we simulated a historical (current) baseline scenario composed mainly of corn-, soybean-, and wheat-based rotations as grown on existing croplands east of the Rocky Mountains in 30 states. In the second scenario, we simulated a modified baseline in which we harvested corn and wheat residues to supply feedstocks to potential cellulosic ethanol biorefineries distributed within the study area. In the third scenario, we simulated the productivity of perennial cropping systems such as switchgrass or perennial mixtures grown on either marginal or Conservation Reserve Program (CRP) lands. In all cases we evaluated the environmental impacts (e.g., soil carbon changes, soil erosion, nitrate leaching, etc.) associated with the practices. In summary, we have reported on the development of a spatially explicit national geodatabase for conducting biofuel simulation studies and provided initial simulation results on the potential of annual and perennial cropping systems to serve as feedstocks for the production of cellulosic ethanol. To accomplish this, we employed sophisticated spatial analysis methods in combination with the process-based biogeochemical model EPIC. This work provided the opportunity to test the hypothesis that marginal lands can serve as sources of cellulosic feedstocks and thus help avoid potential conflicts between bioenergy and food production systems. This work, we believe, opens the door for further analysis of the characteristics of cellulosic feedstocks as major contributors to the development of a sustainable bioenergy economy.

  5. Tube Bulge Process : Theoretical Analysis and Finite Element Simulations

    International Nuclear Information System (INIS)

    Velasco, Raphael; Boudeau, Nathalie

    2007-01-01

    This paper focuses on the determination of mechanical characteristics of tubular materials using the tube bulge process. A comparative study is made between two different models: a theoretical model and finite element analysis. The theoretical model is completely developed, based first on a geometrical analysis of the tube profile during bulging, which is assumed to deform in arcs of circles. Strain and stress analyses complete the theoretical model, which allows evaluation of tube thickness and the state of stress at any point of the free bulge region. Free bulging of a 304L stainless steel is simulated using Ls-Dyna 970. To validate the FE simulation approach, the theoretical and finite element models are compared on several parameters: thickness variation at the free bulge region pole with bulge height, tube thickness variation with the z axial coordinate, and von Mises stress variation with plastic strain. Finally, the influence of deviations in geometrical parameters on the flow stress curve is examined using the analytical model: deviations in the tube outer diameter, its initial thickness, and the bulge height measurement are propagated to obtain the resulting error in plastic strain and von Mises stress.
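    The comparison includes von Mises stress versus plastic strain. The von Mises equivalent stress follows from the principal stresses by the standard formula; the sketch below assumes a plane-stress state (σ₃ ≈ 0), a common simplification for the thin-walled free bulge region, not the paper's exact stress evaluation:

```python
import math

def von_mises(s1, s2, s3=0.0):
    """Von Mises equivalent stress from principal stresses.

    For a thin tube's free bulge region a plane-stress state (s3 ~ 0)
    is a common assumption; units follow the inputs (e.g., MPa).
    """
    return math.sqrt(0.5 * ((s1 - s2) ** 2 + (s2 - s3) ** 2 + (s3 - s1) ** 2))
```

For a uniaxial state (σ₁ = σ, σ₂ = σ₃ = 0) the formula reduces to σ itself, a quick sanity check on any implementation.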

  6. Kinematic simulation and analysis of robot based on MATLAB

    Science.gov (United States)

    Liao, Shuhua; Li, Jiong

    2018-03-01

    The history of industrial automation is characterized by rapidly updating technology, and the industrial robot is a representative piece of such special equipment. Using MATLAB's matrix and plotting capabilities, each link coordinate system is established with the Denavit-Hartenberg (D-H) parameter method, and the equations of motion of the structure are derived. The Robotics Toolbox and GUIDE are then applied jointly to analyze and simulate inverse kinematics and path planning, providing a preliminary solution to the positioning problem of a student-built vehicle-mounted mechanical arm and achieving the intended positioning goal.
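    The D-H parameter method assigns each link a frame and a homogeneous transform; chaining the per-link transforms gives the end-effector pose. A minimal sketch of the standard convention (the abstract's work uses MATLAB and the Robotics Toolbox; the 2-link planar arm below is a hypothetical example):

```python
import math

def dh_matrix(theta, d, a, alpha):
    """Standard Denavit-Hartenberg homogeneous transform (4x4, row lists)."""
    ct, st = math.cos(theta), math.sin(theta)
    ca, sa = math.cos(alpha), math.sin(alpha)
    return [
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ]

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def forward_kinematics(dh_rows):
    """Chain the per-link transforms; returns the end-effector pose."""
    T = [[float(i == j) for j in range(4)] for i in range(4)]
    for row in dh_rows:
        T = mat_mul(T, dh_matrix(*row))
    return T

# Hypothetical 2-link planar arm: link lengths 1.0 and 0.5,
# joint angles +90 deg and -90 deg.
T = forward_kinematics([(math.pi / 2, 0.0, 1.0, 0.0),
                        (-math.pi / 2, 0.0, 0.5, 0.0)])
x, y = T[0][3], T[1][3]
```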

  7. Markov chain analysis of single spin flip Ising simulations

    International Nuclear Information System (INIS)

    Hennecke, M.

    1997-01-01

    The Markov processes defined by random and loop-based schemes for single spin flip attempts in Monte Carlo simulations of the 2D Ising model are investigated by explicitly constructing their transition matrices. Their analysis reveals that loops over all lattice sites using a Metropolis-type single spin flip probability often do not define ergodic Markov chains, and have distorted dynamical properties even if they are ergodic. The transition matrices also enable a comparison of the dynamics of random versus loop spin selection and Glauber versus Metropolis probabilities.
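    For a lattice small enough to enumerate, the transition matrix of the random-site Metropolis scheme can be built explicitly and the Boltzmann distribution verified to be stationary. A sketch for a 2×2 lattice (illustrative only; the paper also analyzes loop-based site selection and Glauber probabilities, which this sketch does not reproduce):

```python
import itertools
import math

def ising_energy(spins, L):
    """Energy of an L x L Ising configuration, periodic boundaries, J = 1."""
    E = 0
    for i in range(L):
        for j in range(L):
            s = spins[i * L + j]
            E -= s * spins[((i + 1) % L) * L + j]
            E -= s * spins[i * L + (j + 1) % L]
    return E

def metropolis_matrix(L, beta):
    """Explicit transition matrix for random-site Metropolis spin flips."""
    n = L * L
    states = list(itertools.product([-1, 1], repeat=n))
    index = {s: k for k, s in enumerate(states)}
    N = len(states)
    P = [[0.0] * N for _ in range(N)]
    for k, s in enumerate(states):
        Es = ising_energy(s, L)
        for site in range(n):
            t = list(s)
            t[site] = -t[site]
            m = index[tuple(t)]
            accept = min(1.0, math.exp(-beta * (ising_energy(t, L) - Es)))
            P[k][m] += accept / n      # pick a site uniformly, then accept/reject
        P[k][k] = 1.0 - sum(P[k])      # rejected moves stay put
    return states, P
```

Because the random-site scheme satisfies detailed balance, multiplying the Boltzmann weights by this matrix returns them unchanged, which the loop-based sweep matrices in the paper need not do.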

  8. Predicting SPE Fluxes: Coupled Simulations and Analysis Tools

    Science.gov (United States)

    Gorby, M.; Schwadron, N.; Linker, J.; Caplan, R. M.; Wijaya, J.; Downs, C.; Lionello, R.

    2017-12-01

    Presented here is a nuts-and-bolts look at the coupled framework of Predictive Science Inc's Magnetohydrodynamics Around a Sphere (MAS) code and the Energetic Particle Radiation Environment Module (EPREM). MAS-simulated coronal mass ejection output from a variety of events can be selected as the MHD input to EPREM, and a variety of parameters can be set: background seed particle spectra, mean free path, perpendicular diffusion efficiency, etc. A standard set of visualizations is produced, as well as a library of analysis tools for deeper inquiries. All steps will be covered end-to-end, as well as the framework's user interface and availability.

  9. A Description of the Revised ATHEANA (A Technique for Human Event Analysis)

    International Nuclear Information System (INIS)

    FORESTER, JOHN A.; BLEY, DENNIS C.; COOPER, SUSANE; KOLACZKOWSKI, ALAN M.; THOMPSON, CATHERINE; RAMEY-SMITH, ANN; WREATHALL, JOHN

    2000-01-01

    This paper describes the most recent version of a human reliability analysis (HRA) method called "A Technique for Human Event Analysis" (ATHEANA). The new version is documented in NUREG-1624, Rev. 1 [1] and reflects improvements to the method based on comments received from a peer review that was held in 1998 (see [2] for a detailed discussion of the peer review comments) and on the results of an initial trial application of the method conducted at a nuclear power plant in 1997 (see Appendix A in [3]). A summary of the more important recommendations resulting from the peer review and trial application is provided, and critical and unique aspects of the revised method are discussed.

  10. Description of a portable wireless device for high-frequency body temperature acquisition and analysis.

    Science.gov (United States)

    Cuesta-Frau, David; Varela, Manuel; Aboy, Mateo; Miró-Martínez, Pau

    2009-01-01

    We describe a device for dual channel body temperature monitoring. The device can operate as a real time monitor or as a data logger, and has Bluetooth capabilities to enable wireless data download to the computer used for data analysis. The proposed device is capable of sampling temperature at a rate of 1 sample per minute with a resolution of 0.01 °C. The internal memory allows for stand-alone data logging of up to 10 days. The device has a battery life of 50 hours in continuous real-time mode. In addition to describing the proposed device in detail, we report the results of a statistical analysis conducted to assess its accuracy and reproducibility.
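    The stated specifications imply modest storage needs. A back-of-the-envelope sketch, assuming each 0.01 °C reading is stored as a 16-bit integer (the device's actual memory layout is not described in the abstract):

```python
def logger_bytes(days, channels=2, samples_per_min=1, bytes_per_sample=2):
    """Raw storage needed for stand-alone logging.

    Assumes a 0.01 degC reading fits a 16-bit integer; an assumption
    for illustration, not the documented memory format.
    """
    samples = days * 24 * 60 * samples_per_min * channels
    return samples * bytes_per_sample
```

Under these assumptions, 10 days of dual-channel logging at 1 sample per minute needs only about 56 KiB.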

  11. Detailed description and user's manual of high burnup fuel analysis code EXBURN-I

    Energy Technology Data Exchange (ETDEWEB)

    Suzuki, Motoe [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Saitou, Hiroaki

    1997-11-01

    EXBURN-I has been developed for the analysis of LWR high burnup fuel behavior in normal operation and power transient conditions. In the high burnup region, phenomena occur that differ qualitatively from extrapolations of mid-burnup behavior. To analyze these phenomena, EXBURN-I was formed by incorporating new models (pellet thermal conductivity change, burnup-dependent FP gas release rate, and cladding oxide layer growth) into the basic structure of the low- and mid-burnup fuel analysis code FEMAXI-IV. The present report describes in detail the whole structure of the code, its models, and materials properties. Also, it includes a detailed input manual and sample output, etc. (author). 55 refs.

  12. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  13. The NWRA Classification Infrastructure: description and extension to the Discriminant Analysis Flare Forecasting System (DAFFS)

    Science.gov (United States)

    Leka, K. D.; Barnes, Graham; Wagner, Eric

    2018-04-01

    A classification infrastructure built upon Discriminant Analysis (DA) has been developed at NorthWest Research Associates for examining the statistical differences between samples of two known populations. Originating to examine the physical differences between flare-quiet and flare-imminent solar active regions, we describe herein some details of the infrastructure including: parametrization of large datasets, schemes for handling "null" and "bad" data in multi-parameter analysis, application of non-parametric multi-dimensional DA, an extension through Bayes' theorem to probabilistic classification, and methods invoked for evaluating classifier success. The classifier infrastructure is applicable to a wide range of scientific questions in solar physics. We demonstrate its application to the question of distinguishing flare-imminent from flare-quiet solar active regions, updating results from the original publications that were based on different data and much smaller sample sizes. Finally, as a demonstration of "Research to Operations" efforts in the space-weather forecasting context, we present the Discriminant Analysis Flare Forecasting System (DAFFS), a near-real-time operationally-running solar flare forecasting tool that was developed from the research-directed infrastructure.
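    The extension of DA to probabilistic classification via Bayes' theorem can be sketched in one dimension with Gaussian class densities. This is a deliberately simplified parametric sketch (the infrastructure itself uses non-parametric, multi-dimensional DA, and the function name, parameter statistics, and prior below are hypothetical):

```python
import math

def gaussian_pdf(x, mean, var):
    """Density of a normal distribution with the given mean and variance."""
    return math.exp(-(x - mean) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def flare_probability(x, stats_quiet, stats_imminent, prior_imminent=0.1):
    """Posterior P(flare-imminent | x) for one parameter via Bayes' theorem.

    stats_* are (mean, variance) of the parameter in each population;
    all values here are illustrative assumptions, not NWRA's fits.
    """
    pq = gaussian_pdf(x, *stats_quiet) * (1.0 - prior_imminent)
    pi_ = gaussian_pdf(x, *stats_imminent) * prior_imminent
    return pi_ / (pq + pi_)
```

With equal priors and mirror-image class densities, a parameter value exactly between the class means yields a posterior of 0.5, as expected.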

  14. Description of the Northwest hazardous waste site data base and preliminary analysis of site characteristics

    International Nuclear Information System (INIS)

    Woodruff, D.L.; Hartz, K.E.; Triplett, M.B.

    1988-08-01

    The Northwest Hazardous Waste RD and D Center (the Center) conducts research, development, and demonstration (RD and D) activities for hazardous and radioactive mixed-waste technologies applicable to remediating sites in the states of Idaho, Montana, Oregon, and Washington. To properly set priorities for these RD and D activities and to target development efforts it is necessary to understand the nature of the sites requiring remediation. A data base of hazardous waste site characteristics has been constructed to facilitate this analysis. The data base used data from EPA's Region X Comprehensive Environmental Response, Compensation, and Liability Information System (CERCLIS) and from Preliminary Assessment/Site Investigation (PA/SI) forms for sites in Montana. The Center's data base focuses on two sets of sites: those on the National Priorities List (NPL) and other sites that are denoted as "active" CERCLIS sites. Active CERCLIS sites are those sites that are undergoing active investigation and analysis. The data base contains information for each site covering site identification and location, type of industry associated with the site, waste categories present (e.g., heavy metals, pesticides, etc.), methods of disposal (e.g., tanks, drums, land, etc.), waste forms (e.g., liquid, solid, etc.), and hazard targets (e.g., surface water, groundwater, etc.). As part of this analysis, the Northwest region was divided into three geographic subregions to identify differences in disposal site characteristics within the Northwest. 2 refs., 18 figs., 5 tabs

  15. Dimensional analysis, similarity, analogy, and the simulation theory

    International Nuclear Information System (INIS)

    Davis, A.A.

    1978-01-01

    Dimensional analysis, similarity, analogy, and cybernetics are shown to be four consecutive steps in the application of simulation theory. This paper introduces the classes of phenomena that follow the same formal mathematical equations as models of the natural laws, and the restricted groups of phenomena in which one can introduce simplified nondimensional mathematical equations. Simulation by similarity in a specific field of physics, by analogy in two or more different fields of physics, and by cybernetics in nature in two or more fields of mathematics, physics, biology, economics, politics, sociology, etc., appears as a unified theory that permits one to transport the results of experiments from the models, suitably selected to meet the conditions of research, construction, and laboratory measurement, to the originals that are the primary objects of the research. Some unavoidable conclusions arising from the use of simplified nondimensional mathematical equations as models of natural laws are presented, and limitations on the use of simulation theory based on assumed simplifications are recognized. This paper argues that, in scientific research, one must write mathematical models of general laws that apply to nature in its entirety, and proposes extending the second law of thermodynamics, as a generalized law of entropy, to model life and its activities. This paper shows that the physical studies and philosophical interpretations of phenomena and natural laws cannot be separated in scientific work; they are interconnected, and one cannot be put above the others.
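    Dimensional analysis as the first step of simulation theory can be made concrete: a product of variables forms a valid nondimensional group when its mass-length-time exponents cancel. A sketch using the classic Reynolds-number combination (the variable set is an illustrative assumption, not drawn from the paper):

```python
# Dimensions of each variable as (mass, length, time) exponents.
DIMS = {
    "rho": (1, -3, 0),   # density,   kg m^-3
    "V":   (0, 1, -1),   # velocity,  m s^-1
    "D":   (0, 1, 0),    # diameter,  m
    "mu":  (1, -1, -1),  # viscosity, kg m^-1 s^-1
}

def combined_dimension(powers):
    """Net (M, L, T) exponents of a product of powers of the variables."""
    total = [0, 0, 0]
    for name, p in powers.items():
        for k in range(3):
            total[k] += p * DIMS[name][k]
    return tuple(total)

# The Reynolds number rho * V * D / mu is dimensionless:
reynolds = {"rho": 1, "V": 1, "D": 1, "mu": -1}
```

A combination whose net exponents are not all zero, such as V·D alone, fails the check and is not a similarity parameter.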

  16. Bragg's Law diffraction simulations for electron backscatter diffraction analysis

    International Nuclear Information System (INIS)

    Kacher, Josh; Landon, Colin; Adams, Brent L.; Fullwood, David

    2009-01-01

    In 2006, Angus Wilkinson introduced a cross-correlation-based electron backscatter diffraction (EBSD) texture analysis system capable of measuring lattice rotations and elastic strains to high resolution. A variation of the cross-correlation method is introduced using Bragg's Law-based simulated EBSD patterns as strain-free reference patterns that facilitates the use of the cross-correlation method with polycrystalline materials. The lattice state is found by comparing simulated patterns to collected patterns at a number of regions on the pattern using the cross-correlation function and calculating the deformation from the measured shifts of each region. A new pattern can be simulated at the deformed state, and the process can be iterated a number of times to converge on the absolute lattice state. By analyzing an iteratively rotated single crystal silicon sample and recovering the rotation, this method is shown to have an angular resolution of ∼0.04° and an elastic strain resolution of ∼7×10⁻⁴. As an example of applications, elastic strain and curvature measurements are used to estimate the dislocation density in a single grain of a compressed polycrystalline Mg-based AZ91 alloy.
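    The core measurement is the shift that maximizes the cross-correlation between a collected pattern and a simulated reference. A 1-D, integer-shift sketch of that principle (the actual method works on 2-D pattern regions with sub-pixel interpolation, which this sketch omits):

```python
def best_shift(reference, pattern, max_shift=5):
    """Integer shift maximizing cross-correlation between two 1-D signals.

    EBSD analysis does this in 2-D, on many regions of interest, with
    sub-pixel interpolation; the 1-D integer version shows the principle.
    """
    n = len(reference)
    best, best_score = 0, float("-inf")
    for s in range(-max_shift, max_shift + 1):
        score = sum(reference[i] * pattern[i + s]
                    for i in range(n) if 0 <= i + s < n)
        if score > best_score:
            best, best_score = s, score
    return best
```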

  17. Analysis of Measured and Simulated Supraglottal Acoustic Waves.

    Science.gov (United States)

    Fraile, Rubén; Evdokimova, Vera V; Evgrafova, Karina V; Godino-Llorente, Juan I; Skrelin, Pavel A

    2016-09-01

    To date, although much attention has been paid to the estimation and modeling of the voice source (ie, the glottal airflow volume velocity), the measurement and characterization of the supraglottal pressure wave have been much less studied. Some previous results have unveiled that the supraglottal pressure wave has some spectral resonances similar to those of the voice pressure wave. This makes the supraglottal wave partially intelligible. Although the explanation for such effect seems to be clearly related to the reflected pressure wave traveling upstream along the vocal tract, the influence that nonlinear source-filter interaction has on it is not as clear. This article provides an insight into this issue by comparing the acoustic analyses of measured and simulated supraglottal and voice waves. Simulations have been performed using a high-dimensional discrete vocal fold model. Results of such comparative analysis indicate that spectral resonances in the supraglottal wave are mainly caused by the regressive pressure wave that travels upstream along the vocal tract and not by source-tract interaction. On the contrary and according to simulation results, source-tract interaction has a role in the loss of intelligibility that happens in the supraglottal wave with respect to the voice wave. This loss of intelligibility mainly corresponds to spectral differences for frequencies above 1500 Hz. Copyright © 2016 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  18. Fast Monte Carlo for ion beam analysis simulations

    International Nuclear Information System (INIS)

    Schiettekatte, Francois

    2008-01-01

    A Monte Carlo program for the simulation of ion beam analysis data is presented. It combines mainly four features: (i) ion slowdown is computed separately from the main scattering/recoil event, which is directed towards the detector. (ii) A virtual detector, that is, a detector larger than the actual one, can be used, followed by trajectory correction. (iii) For each collision during ion slowdown, scattering angle components are extracted from tables. (iv) Tables of scattering angle components, stopping power and energy straggling are indexed using the binary representation of floating point numbers, which allows logarithmic distribution of these tables without the computation of logarithms to access them. Tables are sufficiently fine-grained that interpolation is not necessary. Ion slowdown computation thus avoids trigonometric, inverse and transcendental function calls and, as much as possible, divisions. All these improvements make possible the computation of 10^7 collisions/s on current PCs. Results for transmitted ions of several masses in various substrates compare well with those obtained using SRIM-2006 in terms of both angular and energy distributions, as long as a sufficiently large number of collisions is considered for each ion. Examples of simulated spectra show good agreement with experimental data, although a large detector rather than the virtual detector has to be used to properly simulate background signals that are due to plural collisions. The program, written in standard C, is open-source and distributed under the terms of the GNU General Public License.
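    Feature (iv), indexing tables by the binary representation of floating point numbers, can be sketched as follows: keeping the IEEE-754 exponent field plus a few leading mantissa bits yields geometrically spaced bins with no logarithm call. The function below is an assumed illustration of the idea, not the program's actual table layout:

```python
import struct

def log_table_index(x, mantissa_bits=3):
    """Log-spaced table index taken straight from the IEEE-754 bits of x.

    Reinterprets the positive double x as a 64-bit integer and keeps the
    exponent field plus `mantissa_bits` leading mantissa bits, so bin
    widths grow geometrically without computing a logarithm.
    """
    bits = struct.unpack("<Q", struct.pack("<d", x))[0]
    return bits >> (52 - mantissa_bits)
```

Each doubling of the argument advances the index by exactly 2^mantissa_bits, i.e., the bins are uniform in log₂(x).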

  19. Design and Simulation of MEMS Devices using Interval Analysis

    International Nuclear Information System (INIS)

    Shanmugavalli, M; Uma, G; Vasuki, B; Umapathy, M

    2006-01-01

    Modeling and simulation of MEMS devices are used to optimize the design, to improve the performance of the device, and to reduce time to market by avoiding unnecessary design cycles and foundry runs. The major design objectives in any device design are to meet the required functional parameters and to ensure the reliability of the device. The functional parameters depend on the geometry of the structure, material properties and process parameters. All model parameters act as input to optimize the functional parameters. The major difficulty the designer faces is that the dimensions and properties used in the simulation of MEMS devices cannot be exactly reproduced during fabrication. In order to overcome this problem, the designer must test the device in simulation over bounds on the parameters involved. The paper demonstrates the use of interval methods to assess the electromechanical behaviour of micro electromechanical systems (MEMS) under the presence of manufacturing and process uncertainties. The interval method guides the pull-in voltage analysis of a fixed-fixed beam to achieve a robust and reliable design in a most efficient way. The methods are implemented numerically using Coventorware and analytically using Intlab.
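    For a response that is monotone in each uncertain parameter, such as the textbook parallel-plate pull-in voltage, interval bounds follow from evaluating at the interval endpoints. A lumped-model sketch (the paper analyses a fixed-fixed beam with Intlab and Coventorware; the formula and parameter values here are the standard parallel-plate approximation, assumed for illustration):

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def pull_in_voltage(k, g, A):
    """Parallel-plate pull-in voltage: V = sqrt(8 k g^3 / (27 eps0 A)).

    k: spring constant (N/m), g: initial gap (m), A: electrode area (m^2).
    """
    return math.sqrt(8.0 * k * g ** 3 / (27.0 * EPS0 * A))

def pull_in_interval(k_iv, g_iv, A_iv):
    """Bounds on V for interval-valued (lo, hi) inputs.

    V increases with k and g and decreases with A, so the extremes
    occur at the interval endpoints -- the key simplification that a
    general interval-arithmetic package handles automatically.
    """
    lo = pull_in_voltage(k_iv[0], g_iv[0], A_iv[1])
    hi = pull_in_voltage(k_iv[1], g_iv[1], A_iv[0])
    return lo, hi
```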

  20. Spatially continuous approach to the description of incoherencies in fast reactor accident analysis

    International Nuclear Information System (INIS)

    Luck, L.B.

    1976-12-01

    A generalized cell-type approach is developed in which individual subassemblies are represented as a unit. By appropriate characterization of the results of separate detailed investigations, spatial variations within a cell are represented as a superposition. The advantage of this approach is that costly detailed cell-type information is generated only once or a very few times. Spatial information obtained by the cell treatment is properly condensed in order to drastically reduce the transient computation time. Approximate treatments of transient phenomena are developed based on the use of distributions of volume and reactivity worth with temperature and other reactor parameters. Incoherencies during the transient are physically dependent on the detailed variations in the initial state. Therefore, stationary volumetric distributions, which contain in condensed form the detailed initial incoherency information, provide a proper basis for the transient treatment. Approximate transient volumetric distributions are generated by a suitable transformation of the stationary distribution to reflect the changes in the transient temperature field. Evaluation of transient changes is based on results of conventional uniform channel calculations and a superposition of lateral variations as they are derived from prior cell investigations. Specific formulations are developed for the treatment of reactivity feedback. Doppler and sodium expansion reactivity feedback is related to condensed temperature-worth distributions. Transient evaluation of the worth distribution is based on the relation between stationary and transient volumetric distributions, which contains the condensed temperature field information. Coolant voiding is similarly treated with proper distribution information. Results show that the treatments developed for the transient phase up to and including sodium boiling constitute a fast and effective simulation of inter- and intra-subassembly incoherence effects.

  1. Comparing conventional Descriptive Analysis and Napping®-UFP against physiochemical measurements: a case study using apples.

    Science.gov (United States)

    Pickup, William; Bremer, Phil; Peng, Mei

    2018-03-01

    The extensive time and cost associated with conventional sensory profiling methods has spurred sensory researchers to develop rapid method alternatives, such as Napping® with Ultra-Flash Profiling (UFP). Napping®-UFP generates sensory maps by requiring untrained panellists to separate samples based on perceived sensory similarities. Evaluations of this method have been restricted to manufactured/formulated food models, and predominantly structured on comparisons against the conventional descriptive method. The present study aims to extend the validation of Napping®-UFP (N = 72) to natural biological products, and to evaluate this method against Descriptive Analysis (DA; N = 8) with physiochemical measurements as an additional evaluative criterion. The results revealed that sample configurations generated by DA and Napping®-UFP were not significantly correlated (RV = 0.425, P = 0.077); however, they were both correlated with the product map generated based on the instrumental measures (P Napping®-UFP were driven by different sensory attributes, indicating potential structural differences between these two methods in configuring samples. Overall, these findings lent support to the extended use of Napping®-UFP for evaluations of natural biological products. Although DA was shown to be a better method for establishing sensory-instrumental relationships, Napping®-UFP exhibited strengths in generating informative sample configurations based on holistic perception of products. © 2017 Society of Chemical Industry.
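    The RV coefficient used above to compare the DA and Napping®-UFP configurations (RV = 0.425) measures the similarity of two sample maps, reaching 1 for configurations identical up to rotation and scaling. A sketch of its standard computation (matrix shapes and the helper name are assumptions):

```python
import numpy as np

def rv_coefficient(X, Y):
    """RV coefficient between two configuration matrices (samples x dims).

    Both matrices are column-centred; RV = trace(S T) /
    sqrt(trace(S^2) trace(T^2)) with S = X X', T = Y Y'.
    """
    X = X - X.mean(axis=0)
    Y = Y - Y.mean(axis=0)
    S, T = X @ X.T, Y @ Y.T
    return np.trace(S @ T) / np.sqrt(np.trace(S @ S) * np.trace(T @ T))
```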

  2. A Descriptive Analysis of Care Provided by Law Enforcement Prior to EMS Arrival in the United States.

    Science.gov (United States)

    Klassen, Aaron B; Core, S Brent; Lohse, Christine M; Sztajnkrycer, Matthew D

    2018-04-01

    Study Objectives: Law enforcement is increasingly viewed as a key component in the out-of-hospital chain of survival, with expanded roles in cardiac arrest, narcotic overdose, and traumatic bleeding. Little is known about the nature of care provided by law enforcement prior to the arrival of Emergency Medical Services (EMS) assets. The purpose of the current study was to perform a descriptive analysis of events reported to a national EMS database. This study was a descriptive analysis of the 2014 National Emergency Medical Services Information System (NEMSIS) public release research data set, containing EMS emergency response data from 41 states. Code E09_02 1200 specifically identifies care provided by law enforcement prior to EMS arrival. A total of 25,835,729 unique events were reported. Of events in which pre-arrival care was documented, 2.0% received prior aid by law enforcement. Patients receiving law enforcement care prior to EMS arrival were more likely to be younger (52.8 [SD=23.3] years versus 58.7 [SD=23.3] years), male (54.8% versus 46.7%), and white (80.3% versus 77.5%). Basic Life Support (BLS) EMS response was twice as likely in patients receiving prior aid by law enforcement. Multiple-casualty incidents were five times more likely with prior aid by law enforcement. Compared with prior aid by other services, law enforcement pre-arrival care was more likely with motor vehicle accidents, firearm assaults, knife assaults, blunt assaults, and drug overdoses, and less likely at falls and childbirths. Cardiac arrest was significantly more common in patients receiving prior aid by law enforcement (16.5% versus 2.6%). Tourniquet application and naloxone administration were more common in the law enforcement prior aid group. Where noted, law enforcement pre-arrival care occurs in 2.0% of EMS patient encounters. The majority of cases involve cardiac arrest, motor vehicle accidents, and assaults. Better understanding of the nature of law enforcement care is

  3. A Student Assessment Tool for Standardized Patient Simulations (SAT-SPS): Psychometric analysis.

    Science.gov (United States)

    Castro-Yuste, Cristina; García-Cabanillas, María José; Rodríguez-Cornejo, María Jesús; Carnicer-Fuentes, Concepción; Paloma-Castro, Olga; Moreno-Corral, Luis Javier

    2018-05-01

    The evaluation of the level of clinical competence acquired by the student is a complex process that must meet various requirements to ensure its quality. The psychometric analysis of the data collected by the assessment tools used is a fundamental aspect to guarantee the student's competence level. To conduct a psychometric analysis of an instrument which assesses clinical competence in nursing students at simulation stations with standardized patients in OSCE-format tests. The construct of clinical competence was operationalized as a set of observable and measurable behaviors, measured by the newly-created Student Assessment Tool for Standardized Patient Simulations (SAT-SPS), which was comprised of 27 items. The categories assigned to the items were 'incorrect or not performed' (0), 'acceptable' (1), and 'correct' (2). 499 nursing students. Data were collected by two independent observers during the assessment of the students' performance at a four-station OSCE with standardized patients. Descriptive statistics were used to summarize the variables. The difficulty levels and floor and ceiling effects were determined for each item. Reliability was analyzed using internal consistency and inter-observer reliability. The validity analysis was performed considering face validity, content and construct validity (through exploratory factor analysis), and criterion validity. Internal reliability and inter-observer reliability were higher than 0.80. The construct validity analysis suggested a three-factor model accounting for 37.1% of the variance. These three factors were named 'Nursing process', 'Communication skills', and 'Safe practice'. A significant correlation was found between the scores obtained and the students' grades in general, as well as with the grades obtained in subjects with clinical content. 
The assessment tool has proven to be sufficiently reliable and valid for the assessment of the clinical competence of nursing students using standardized patients.
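    The internal-consistency reliability reported above (>0.80) is conventionally estimated with Cronbach's alpha, which can be computed directly from the item-score matrix. A sketch under that assumption (the abstract states internal consistency was used but not the exact estimator; the 0/1/2 coding follows the SAT-SPS categories):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of totals),
    the standard internal-consistency estimate; items here are scored
    0 ('incorrect or not performed'), 1 ('acceptable'), 2 ('correct').
    """
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)
```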

  4. Hanford Site Composite Analysis Technical Approach Description: Automated Quality Assurance Process Design.

    Energy Technology Data Exchange (ETDEWEB)

    Dockter, Randy E. [CH2M HILL Plateau Remediation Company, Richland, WA (United States)

    2017-07-31

    The U.S. Department of Energy (DOE) in DOE O 435.1 Chg. 1, Radioactive Waste Management, requires the preparation and maintenance of a composite analysis (CA). The primary purpose of the CA is to provide a reasonable expectation that the primary public dose limit is not likely to be exceeded by multiple source terms that may significantly interact with plumes originating at a low-level waste disposal facility. The CA is used to facilitate planning and land use decisions that help assure disposal facility authorization will not result in long-term compliance problems; or, to determine management alternatives, corrective actions, or assessment needs if potential problems are identified.

  5. Hanford Site Composite Analysis Technical Approach Description: Radionuclide Inventory and Waste Site Selection Process.

    Energy Technology Data Exchange (ETDEWEB)

    Nichols, Will E.; Mehta, Sunil

    2017-09-13

    The updated Hanford Site Composite Analysis will provide an all-pathways dose projection to a hypothetical future member of the public from all planned low-level radioactive waste disposal facilities and potential contributions from all other projected end-state sources of radioactive material left at Hanford following site closure. Its primary purpose is to support the decision-making process of the U.S. Department of Energy (DOE) under DOE O 435.1-1, Radioactive Waste Management (DOE, 2001), related to managing low-level waste disposal facilities at the Hanford Site.

  6. Performance-Based Technology Selection Filter description report. INEL Buried Waste Integrated Demonstration System Analysis project

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, M.C.; Morrison, J.L.; Morneau, R.A.; Rudin, M.J.; Richardson, J.G.

    1992-05-01

    A formal methodology has been developed for identifying technology gaps and assessing innovative or postulated technologies for inclusion in proposed Buried Waste Integrated Demonstration (BWID) remediation systems. Called the Performance-Based Technology Selection Filter, the methodology provides a formalized selection process where technologies and systems are rated and assessments made based on performance measures, and regulatory and technical requirements. The results are auditable, and can be validated with field data. This analysis methodology will be applied to the remedial action of transuranic contaminated waste pits and trenches buried at the Idaho National Engineering Laboratory (INEL).

  7. ANALYSIS OF INPATIENT HOSPITAL STAFF MENTAL WORKLOAD BY MEANS OF DISCRETE-EVENT SIMULATION

    Science.gov (United States)

    2016-03-24

    Analysis of Inpatient Hospital Staff Mental Workload by Means of Discrete-Event Simulation. AFIT-ENV-MS-16-M-166. Erich W

  8. Moment analysis description of wetting and redistribution plumes in wettable and water-repellent soils

    Science.gov (United States)

    Xiong, Yunwu; Furman, Alex; Wallach, Rony

    2012-02-01

    Water repellency has a significant impact on water flow patterns in the soil profile. Transient 2D flow in wettable and natural water-repellent soils was monitored in a transparent flow chamber. The substantial differences in plume shape and spatial water content distribution during the wetting and subsequent redistribution stages were related to the variation of contact angle while in contact with water. The observed plume shapes and internal water content distribution in general, and the saturation overshoot behind the wetting front in particular, were associated with unstable flow in the repellent soils. Moment analysis was applied to characterize the measured plumes during the wetting and subsequent redistribution. The center of mass and spatial variances determined for the measured evolving plumes were fitted by a model that accounts for capillary and gravitational driving forces in a medium of temporally varying wettability. Ellipses defined around the stable and unstable plumes' centers of mass, with semi-axes representing a particular number of spatial variances, were used to characterize plume shape and internal moisture distribution. A single probability curve was able to characterize the corresponding fractions of the total added water in the different ellipses for all measured plumes, which attests to the capability and advantage of the moment analysis method.
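    The moment analysis itself reduces each measured plume to a few numbers: the zeroth moment normalizes, the first gives the center of mass, and the second the spatial variances about it. A sketch on a gridded water-content field (the array conventions and function name are assumptions):

```python
import numpy as np

def plume_moments(theta, x, z):
    """Centre of mass and spatial variances of a 2-D water-content field.

    theta: water content on a (len(z), len(x)) grid; x, z: coordinate
    arrays (e.g., metres). The zeroth moment normalises, the first
    gives the centroid, the second the spread about it.
    """
    X, Z = np.meshgrid(x, z)
    m0 = theta.sum()
    xc, zc = (theta * X).sum() / m0, (theta * Z).sum() / m0
    var_x = (theta * (X - xc) ** 2).sum() / m0
    var_z = (theta * (Z - zc) ** 2).sum() / m0
    return (xc, zc), (var_x, var_z)
```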

  9. 3-D description of fracture surfaces and stress-sensitivity analysis for naturally fractured reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, S.Q.; Jioa, D.; Meng, Y.F.; Fan, Y.

    1997-08-01

    Three kinds of reservoir cores (limestone, sandstone, and shale with natural fractures) were used to study the effect of the morphology of fracture surfaces on stress sensitivity. The cores, obtained from reservoirs at depths of 2170 to 2300 m, have fractures which are mated on a large scale, but unmated on a fine scale. A specially designed photoelectric scanner with a computer was used to describe the topography of the fracture surfaces. Then, a theoretical analysis of the fracture closure was carried out based on the fracture topography generated. The scanning results show that the asperity heights have an almost normal distribution for all three types of samples. For the tested samples, the fracture closure predicted by elastic-contact theory differs from the laboratory measurements because plastic deformation of the asperities plays an important role over the tested range of normal stresses. In this work, the traditionally used elastic-contact theory has been modified to better predict the stress sensitivity of reservoir fractures. Analysis shows that the standard deviation of the probability density function of the asperity distribution has a great effect on the fracture closure rate.
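    The role of the asperity-height distribution can be illustrated with a Greenwood-Williamson-style elastic-contact sketch, in which the fraction of normally distributed asperities touching the opposite face falls off with surface separation. This is a hedged toy model; it does not reproduce the paper's modified elasto-plastic treatment.

```python
import math

def contact_fraction(separation, mean=0.0, std=1.0):
    """Fraction of normally distributed asperity heights exceeding the
    surface separation, i.e. P(h > separation) for h ~ N(mean, std^2).
    Illustrative only; units and parameters are hypothetical."""
    z = (separation - mean) / (std * math.sqrt(2.0))
    return 0.5 * math.erfc(z)
```

    A larger standard deviation leaves more tall asperities in contact at a given separation, which is one way to read the paper's finding that the spread of the asperity distribution strongly affects the closure rate.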

  10. Analysis of forest structure using thematic mapper simulator data

    Science.gov (United States)

    Peterson, D. L.; Westman, W. E.; Brass, J. A.; Stephenson, N. J.; Ambrosia, V. G.; Spanner, M. A.

    1986-01-01

    The potential of Thematic Mapper Simulator (TMS) data for sensing forest structure information has been explored by principal components and feature selection techniques. In a survey of forest structural properties conducted for 123 field sites of the Sequoia National Park, the canopy closure could be well estimated (r = 0.62 to 0.69) by a variety of channel bands and band ratios, without reference to the forest type. Estimation of the basal area was less successful (r = 0.51 or less) on the average, but could be improved for certain forest types when data were stratified by floristic composition. To achieve such a stratification, individual sites were ordinated by a detrended correspondence analysis based on the canopy of dominant species. The analysis of forest structure in the Sequoia data suggests that total basal area can be best predicted in stands of lower density, and in younger even-aged managed stands.

  11. Dynamic Simulation and Analysis of Human Walking Mechanism

    Science.gov (United States)

    Azahari, Athirah; Siswanto, W. A.; Ngali, M. Z.; Salleh, S. Md.; Yusup, Eliza M.

    2017-01-01

    Behaviour such as gait or posture may affect a person's physiological condition during daily activities. The characteristics of the human gait cycle phases are among the important parameters used to describe human movement, whether the gait is normal or abnormal. This research investigates four types of crouch walking (upright, interpolated, crouched and severe) by a simulation approach. The assessment is conducted by examining the parameters of the hamstring muscle joint, knee joint and ankle joint. The analysis results show that, from a gait analysis standpoint, crouch walking exhibits a weak walking pattern and posture. A short hamstring and the knee joint are the factors that contribute most to crouch walking, due to the excessive hip flexion that typically accompanies knee flexion.

  12. Probability theory versus simulation of petroleum potential in play analysis

    Science.gov (United States)

    Crovelli, R.A.

    1987-01-01

    An analytic probabilistic methodology for resource appraisal of undiscovered oil and gas resources in play analysis is presented. This play-analysis methodology is a geostochastic system for petroleum resource appraisal in explored as well as frontier areas. An objective was to replace an existing Monte Carlo simulation method in order to increase the efficiency of the appraisal process. Underlying the two methods is a single geologic model which considers both the uncertainty of the presence of the assessed hydrocarbon and its amount if present. The results of the model are resource estimates of crude oil, nonassociated gas, dissolved gas, and gas for a geologic play in terms of probability distributions. The analytic method is based upon conditional probability theory and a closed form solution of all means and standard deviations, along with the probabilities of occurrence. © 1987 J.C. Baltzer A.G., Scientific Publishing Company.
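    The analytic idea, replacing Monte Carlo simulation with closed-form moments of a conditional (presence times amount) model, can be sketched as follows. The distributions and parameter values are hypothetical; the actual play-analysis model is considerably richer.

```python
import random

def resource_mean_sd(p, mu, sigma):
    """Analytic mean and standard deviation of a play's resource R,
    where R = 0 with probability 1 - p (play barren) and otherwise
    is drawn from a distribution with mean mu and st.dev. sigma."""
    mean = p * mu
    var = p * (sigma ** 2 + mu ** 2) - mean ** 2
    return mean, var ** 0.5

def resource_mc(p, mu, sigma, n=100_000, seed=1):
    """Monte Carlo estimate of the same two numbers, i.e. the kind
    of simulation the analytic approach was designed to replace."""
    rng = random.Random(seed)
    draws = [rng.gauss(mu, sigma) if rng.random() < p else 0.0
             for _ in range(n)]
    m = sum(draws) / n
    v = sum((x - m) ** 2 for x in draws) / n
    return m, v ** 0.5
```

    With p = 1 the analytic formulas reduce to (mu, sigma), and for p < 1 the Monte Carlo estimate converges to the closed-form answer, only far more slowly.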

  13. Ethical reasoning through simulation: a phenomenological analysis of student experience.

    Science.gov (United States)

    Lewis, Gareth; McCullough, Melissa; Maxwell, Alexander P; Gormley, Gerard J

    2016-01-01

    Medical students transitioning into professional practice feel underprepared to deal with the emotional complexities of real-life ethical situations. Simulation-based learning (SBL) may provide a safe environment for students to probe the boundaries of ethical encounters. Published studies of ethics simulation have not generated sufficiently deep accounts of student experience to inform pedagogy. The aim of this study was to understand students' lived experiences as they engaged with the emotional challenges of managing clinical ethical dilemmas within a SBL environment. This qualitative study was underpinned by an interpretivist epistemology. Eight senior medical students participated in an interprofessional ward-based SBL activity incorporating a series of ethically challenging encounters. Each student wore digital video glasses to capture point-of-view (PoV) film footage. Students were interviewed immediately after the simulation and the PoV footage played back to them. Interviews were transcribed verbatim. An interpretative phenomenological approach, using an established template analysis approach, was used to iteratively analyse the data. Four main themes emerged from the analysis: (1) 'Authentic on all levels?', (2) 'Letting the emotions flow', (3) 'Ethical alarm bells' and (4) 'Voices of children and ghosts'. Students recognised many explicit ethical dilemmas during the SBL activity but had difficulty navigating more subtle ethical and professional boundaries. In emotionally complex situations, instances of moral compromise were observed (such as telling an untruth). Some participants felt unable to raise concerns or challenge unethical behaviour within the scenarios due to prior negative undergraduate experiences. This study provided deep insights into medical students' immersive and embodied experiences of ethical reasoning during an authentic SBL activity. By layering on the human dimensions of ethical decision-making, students can understand their

  14. Analysis of control rod behavior based on numerical simulation

    International Nuclear Information System (INIS)

    Ha, D. G.; Park, J. K.; Park, N. G.; Suh, J. M.; Jeon, K. L.

    2010-01-01

    The main function of a control rod is to control core reactivity change during operation associated with changes in power, coolant temperature, and dissolved boron concentration by the insertion and withdrawal of control rods from the fuel assemblies. In a scram, the control rod assemblies are released from the CRDMs (Control Rod Drive Mechanisms) and, due to gravity, drop rapidly into the fuel assemblies. The control rod insertion time during a scram must be within the time limits established by the overall core safety analysis. To assure the control rod operational functions, the guide thimbles shall not obstruct the insertion and withdrawal of the control rods or cause any damage to the fuel assembly. When fuel assembly bow occurs, it can affect both the operating performance and the core safety. In this study, the drag forces of the control rod are estimated by a numerical simulation to evaluate the guide tube bow effect on control rod withdrawal. The contact condition effects are also considered. A full scale 3D model is developed for the evaluation, and ANSYS - commercial numerical analysis code - is used for this numerical simulation. (authors)

  15. The PandaRoot framework for simulation, reconstruction and analysis

    International Nuclear Information System (INIS)

    Spataro, Stefano

    2011-01-01

    The PANDA experiment at the future facility FAIR will study anti-proton proton and anti-proton nucleus collisions in a beam momentum range from 2 GeV/c up to 15 GeV/c. The PandaRoot framework is part of the FairRoot project, a common software framework for the future FAIR experiments, and is currently used to simulate detector performance and to evaluate different detector concepts. It is based on the ROOT and Virtual Monte Carlo packages with Geant3 and Geant4. Different reconstruction algorithms for tracking and particle identification are under development and optimization in order to achieve the performance requirements of the experiment. In the central tracker, a first track fit is performed using a conformal map transformation based on a helix assumption; the track is then used as input for a Kalman filter (package genfit), using GEANE as track follower. The track is then correlated with the PID detectors (e.g. Cherenkov detectors, EM calorimeter or muon chambers) to evaluate a global particle identification probability, using a Bayesian approach or multivariate methods. Further packages implemented in PandaRoot are the analysis tools framework Rho, the kinematic fitter package for vertex and mass-constraint fits, and a fast simulation code based upon parametrized detector responses. PandaRoot was also tested on an Alien-based GRID infrastructure. This contribution reports on the status of PandaRoot and shows example results for the analysis of physics benchmark channels.
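    In its simplest Bayesian reading, the global PID combination described above multiplies per-detector likelihoods into a prior and renormalizes. A minimal sketch, with made-up species priors and detector responses (not PandaRoot code):

```python
def combine_pid(priors, likelihoods):
    """Bayesian combination of per-detector likelihoods into a global
    particle-ID probability: posterior ~ prior * product of likelihoods.

    priors: {species: prior probability}
    likelihoods: list of {species: P(detector signal | species)}, one
    dict per detector (e.g. Cherenkov, EM calorimeter, muon chambers).
    """
    post = dict(priors)
    for det in likelihoods:
        for species in post:
            post[species] *= det[species]
    norm = sum(post.values())
    return {s: v / norm for s, v in post.items()}
```

    A species consistently favored by every detector dominates the normalized posterior, which is the intuition behind combining weak per-detector discriminants into one global probability.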

  16. LNG pool fire simulation for domino effect analysis

    International Nuclear Information System (INIS)

    Masum Jujuly, Muhammad; Rahman, Aziz; Ahmed, Salim; Khan, Faisal

    2015-01-01

    A three-dimensional computational fluid dynamics (CFD) simulation of a liquefied natural gas (LNG) pool fire has been performed using ANSYS CFX-14. The CFD model solves the fundamental governing equations of fluid dynamics, namely, the continuity, momentum and energy equations. Several built-in sub-models are used to capture the characteristics of the pool fire. The Reynolds-averaged Navier–Stokes (RANS) equations for turbulence and the eddy-dissipation model for non-premixed combustion are used. For thermal radiation, the Monte Carlo (MC) radiation model is used with the Magnussen soot model. The CFD results are compared with a set of experimental data for validation; the results are consistent with the experimental data. The CFD results show that the wind speed contributes significantly to the behavior of the pool fire and its domino effects. Radiation contours are also obtained from CFD post-processing, which can be applied in risk analysis. The outcome of this study will be helpful for a better understanding of the domino effects of pool fires in the complex geometrical settings of process industries. - Highlights: • Simulation of pool fire using computational fluid dynamics (CFD) model. • Integration of CFD based pool fire model with domino effect. • Application of the integrated CFD based domino effect analysis
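    Alongside full CFD, domino-effect screening in risk analysis often uses far simpler radiation models. The sketch below uses a textbook point-source model and an assumed escalation threshold of about 15 kW/m2; neither the model nor the numbers are taken from this study.

```python
import math

def point_source_flux(mass_rate, heat_of_combustion, distance,
                      radiative_fraction=0.25):
    """Radiative heat flux at a target using a point-source model.
    mass_rate in kg/s, heat_of_combustion in kJ/kg, distance in m;
    returns flux in kW/m^2. The 0.25 radiative fraction is a typical
    assumed value, not a measured one."""
    return (radiative_fraction * mass_rate * heat_of_combustion
            / (4.0 * math.pi * distance ** 2))

def domino_possible(flux, threshold=15.0):
    """Screen against an assumed ~15 kW/m^2 escalation threshold
    often cited for atmospheric equipment (illustrative)."""
    return flux >= threshold
```

    The inverse-square geometry makes the screening conservative close to the fire and optimistic far away, which is exactly where CFD with wind effects, as in this study, adds value.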

  17. LOOS: an extensible platform for the structural analysis of simulations.

    Science.gov (United States)

    Romo, Tod D; Grossfield, Alan

    2009-01-01

    We have developed LOOS (Lightweight Object-Oriented Structure-analysis library), an object-oriented library designed to facilitate the rapid development of tools for the structural analysis of simulations. LOOS supports the native file formats of most common simulation packages, including AMBER, CHARMM, CNS, Gromacs, NAMD, Tinker, and X-PLOR. Encapsulation and polymorphism are used to simultaneously provide a stable interface to the programmer and make LOOS easily extensible. A rich atom selection language based on the C expression syntax is included as part of the library. LOOS enables students and casual programmer-scientists to rapidly write their own analytical tools in a compact and expressive manner resembling scripting. LOOS is written in C++, makes extensive use of the Standard Template Library and Boost, and is freely available under the GNU General Public License (version 3). LOOS has been tested on Linux and Mac OS X, but is written to be portable and should work on most Unix-based platforms.

  18. Implementation of force distribution analysis for molecular dynamics simulations

    Directory of Open Access Journals (Sweden)

    Seifert Christian

    2011-04-01

    Background: The way mechanical stress is distributed inside and propagated by proteins and other biopolymers largely defines their function. Yet, determining the network of interactions propagating internal strain remains a challenge for both experiment and theory. Based on molecular dynamics simulations, we developed force distribution analysis (FDA), a method that allows visualizing strain propagation in macromolecules. Results: To be immediately applicable to a wide range of systems, FDA was implemented as an extension to Gromacs, a commonly used package for molecular simulations. The FDA code comes with an easy-to-use command line interface and can be applied directly to every system built using Gromacs. We provide an additional R package with functions for advanced statistical analysis and presentation of the FDA data. Conclusions: Using FDA, we were able to explain the origin of mechanical robustness in immunoglobulin domains and silk fibers. By elucidating the propagation of internal strain upon ligand binding, we previously also successfully revealed the functionality of a stiff allosteric protein. FDA thus has the potential to be a valuable tool in the investigation and rational design of mechanical properties in proteins and nano-materials.

  19. MDAnalysis: a toolkit for the analysis of molecular dynamics simulations.

    Science.gov (United States)

    Michaud-Agrawal, Naveen; Denning, Elizabeth J; Woolf, Thomas B; Beckstein, Oliver

    2011-07-30

    MDAnalysis is an object-oriented library for structural and temporal analysis of molecular dynamics (MD) simulation trajectories and individual protein structures. It is written in the Python language with some performance-critical code in C. It uses the powerful NumPy package to expose trajectory data as fast and efficient NumPy arrays. It has been tested on systems of millions of particles. Many common file formats of simulation packages including CHARMM, Gromacs, Amber, and NAMD and the Protein Data Bank format can be read and written. Atoms can be selected with a syntax similar to CHARMM's powerful selection commands. MDAnalysis enables both novice and experienced programmers to rapidly write their own analytical tools and access data stored in trajectories in an easily accessible manner that facilitates interactive explorative analysis. MDAnalysis has been tested on and works for most Unix-based platforms such as Linux and Mac OS X. It is freely available under the GNU General Public License from http://mdanalysis.googlecode.com. Copyright © 2011 Wiley Periodicals, Inc.
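    The kind of per-frame trajectory analysis MDAnalysis enables can be illustrated without the library itself. The sketch below computes an (unaligned) RMSD time series over a toy trajectory of coordinate tuples; in a real MDAnalysis script the coordinates would come from a Universe and an atom selection rather than hand-written lists.

```python
import math

def rmsd(ref, frame):
    """Root-mean-square deviation between two equal-length lists of
    (x, y, z) coordinate tuples. No superposition/alignment step is
    performed, for brevity."""
    n = len(ref)
    s = sum((a - b) ** 2
            for p, q in zip(ref, frame)
            for a, b in zip(p, q))
    return math.sqrt(s / n)

def rmsd_series(ref, trajectory):
    """Per-frame RMSD over a trajectory: the kind of loop an
    MDAnalysis script would run over u.trajectory."""
    return [rmsd(ref, fr) for fr in trajectory]
```

    In MDAnalysis itself the selection step ("protein and name CA"-style syntax, similar to CHARMM's selection commands) replaces the hand-built coordinate lists, and the trajectory reader streams frames efficiently as NumPy arrays.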

  20. Disability in Mexico: a comparative analysis between descriptive models and historical periods using a timeline

    Directory of Open Access Journals (Sweden)

    Hugo Sandoval

    2017-07-01

    Some interpretations frequently argue that the three Disability Models (DM) (Charity, Medical/Rehabilitation, and Social) correspond to historical periods in terms of chronological succession. These views permeate a priori the major official documents on the subject in Mexico. This paper tests whether this association is plausible by applying a timeline method. A document search was made with inclusion and exclusion criteria in databases to select representative studies with which to depict milestones in the timelines for each period. The following is demonstrated: (1) the models should be considered categories of analysis and not historical periods, in that elements of all three models remain prevalent to date; and (2) the association between disability models and historical periods results in teleological interpretations of the history of disability in Mexico.

  1. Sequencing, description and phylogenetic analysis of the mitochondrial genome of Sarcocheilichthys sinensis sinensis (Cypriniformes: Cyprinidae).

    Science.gov (United States)

    Li, Chen; He, Liping; Chen, Chong; Cai, Lingchao; Chen, Pingping; Yang, Shoubao

    2016-01-01

    Sarcocheilichthys sinensis sinensis (Bleeker, 1871) is a small benthopelagic freshwater species with high nutritional and ornamental value. In this study, the complete mitochondrial genome of S. sinensis sinensis was determined, and a phylogenetic analysis with another individual and closely related Sarcocheilichthys species was carried out. The complete mitogenome of S. sinensis sinensis was 16683 bp in length, consisting of 13 protein-coding genes, 2 rRNA genes, 22 tRNA genes and 2 non-coding regions (D-loop and OL). The analysis indicated that D-loop, ND2, and CytB may be appropriate molecular markers for studying the population genetics and conservation biology of Sarcocheilichthys fishes.

  2. Description and analysis of design and intended use for Epidemiologic Dynamic Data Collection Platform in China.

    Science.gov (United States)

    Qi, Xiaopeng; Egana, Nilva; Meng, Yujie; Chen, Qianqian; Peng, Zhiyong; Ma, Jiaqi

    2014-01-01

    Disease surveillance systems can be extremely valuable tools and a critical step in system implementation is data collection. In order to obtain quality data efficiently and align the public health business process, Epidemiologic Dynamic Data Collection platform (EDDC) was developed and applied in China. We describe the design of EDDC and assess the platform from six dimensions (service, system, information, use, users and benefit) under the DeLone and McLean Information System Success Model. Objective indicators were extracted from each dimension with the aim of describing the system in detail. The characteristics of functions, performances, usages and benefits of EDDC were reflected under the analysis framework. The limitations and future directions of EDDC are offered for wide use in public health data collection.

  3. Description and user's manual of light water reactor fuel analysis code FEMAXI-IV (Ver.2)

    International Nuclear Information System (INIS)

    Suzuki, Motoe; Saitou, Hiroaki.

    1997-03-01

    FEMAXI-IV is an advanced version of FEMAXI-III, the analysis code for light water reactor fuel behavior, into which various functions and improvements have been incorporated. The present report describes in detail the basic theories and structure, the models and numerical solutions applied, and the material properties adopted in version 2, an improved version of the first version of FEMAXI-IV. In FEMAXI-IV (Ver.2), bugs have been fixed, pellet thermal conductivity properties have been updated, and a thermal-stress-induced FP gas release model has been incorporated. In order to facilitate effective and wide-ranging application of the code, the types and methods of input/output of the code are also described, and a sample output in its actual form is included. (author)

  4. Late onset autosomal dominant cerebellar ataxia a family description and linkage analysis with the hla system

    Directory of Open Access Journals (Sweden)

    Walter O. Arruda

    1991-09-01

    A family suffering from an autosomal dominant form of late-onset hereditary cerebellar ataxia is described. Eight affected family members were studied in person, and data on another four were obtained through anamnesis. The mean age of onset was 37.1±5.4 years (27-47 years). The clinical picture consisted basically of a pure cerebellar ataxic syndrome. CT scan disclosed diffuse cerebellar atrophy with relative sparing of the brainstem and no involvement of supratentorial structures. Neurophysiological studies (nerve conduction, VEP and BAEP) were normal. Twenty-six individuals were typed for HLA histocompatibility antigens. Lod scores were calculated with the computer program LINKMAP. Close linkage of the ataxia gene with the HLA system in this family could be excluded (θ = 0.02, z = -2.17), and the overall analysis of the lod scores suggests a chromosomal location other than chromosome 6.

  5. MATADOR (Methods for the Analysis of Transport And Deposition Of Radionuclides) code description and User's Manual

    International Nuclear Information System (INIS)

    Avci, H.I.; Raghuram, S.; Baybutt, P.

    1985-04-01

    A new computer code called MATADOR (Methods for the Analysis of Transport And Deposition Of Radionuclides) has been developed to replace the CORRAL-2 computer code which was written for the Reactor Safety Study (WASH-1400). This report is a User's Manual for MATADOR. MATADOR is intended for use in system risk studies to analyze radionuclide transport and deposition in reactor containments. The principal output of the code is information on the timing and magnitude of radionuclide releases to the environment as a result of severely degraded core accidents. MATADOR considers the transport of radionuclides through the containment and their removal by natural deposition and by engineered safety systems such as sprays. It is capable of analyzing the behavior of radionuclides existing either as vapors or aerosols in the containment. The code requires input data on the source terms into the containment, the geometry of the containment, and thermal-hydraulic conditions in the containment.

  6. The time analysis of multiple internal reflections in tunneling description through the barrier

    International Nuclear Information System (INIS)

    Ol'khovskij, V.S.; Majdanyuk, S.P.

    2000-01-01

    The scattering of a nonrelativistic particle on a nucleus whose interaction potential is spherically symmetric is considered. As a further development of the time analysis of tunneling processes, a non-stationary method of solution is presented that makes it possible to describe the time-dependent propagation (tunneling) of a particle and to study this process in detail at a given instant of time or at a specific point of space. The method has proven simple and convenient for calculating the time parameters. For deriving expressions for the stationary wave functions of problems whose interaction potential has a radial part more complex than a rectangular barrier, this method is more effective than standard stationary approaches.

  7. The application of the SXF lattice description and the UAL software environment to the analysis of the LHC

    CERN Document Server

    Fischer, W; Ptitsyn, V I

    1999-01-01

    A software environment for accelerator modeling has been developed which includes the UAL (Unified Accelerator Library), a collection of accelerator physics libraries with a Perl interface for scripting, and the SXF (Standard eX-change Format), a format for accelerator description which extends the MAD sequence by including deviations from design values. SXF interfaces have been written for several programs, including MAD9 and MAD8 via the doom database, Cosy, TevLat and UAL itself, which includes Teapot++. After an overview of the software we describe the application of the tools to the analysis of the LHC lattice stability, in the presence of alignment and coupling errors, and to the correction of the first turn and closed orbit in the machine. (7 refs).

  8. Description, field test and data analysis of a controlled-source EM system (EM-60). [Leach Hot Springs, Grass Valley]

    Energy Technology Data Exchange (ETDEWEB)

    Morrison, H.F.; Goldstein, N.E.; Hoversten, M.; Oppliger, G.; Riveros, C.

    1978-10-01

    The three sections describe the transmitter, the receiver, and data interpretations and indicate the advances made toward the development of a large moment electromagnetic (EM) system employing a magnetic dipole source. A brief description is given of the EM-60 transmitter, its general design, and the consideration involved in the selection of a practical coil size and weight for routine field operations. A programmable, multichannel, multi-frequency, phase-sensitive receiver is described. A field test of the EM-60, the data analysis and interpretation procedures, and a comparison between the survey results and the results obtained using other electrical techniques are presented. The Leach Hot Springs area in Grass Valley, Pershing County, Nevada, was chosen for the first field site at which the entire system would be tested. The field tests showed the system capable of obtaining well-defined sounding curves (amplitude and phase of magnetic fields) from 1 kHz down to 0.1 Hz. (MHR)

  9. McMaster Mesonet soil moisture dataset: description and spatio-temporal variability analysis

    Directory of Open Access Journals (Sweden)

    K. C. Kornelsen

    2013-04-01

    This paper introduces and describes the hourly, high-resolution soil moisture dataset continuously recorded by the McMaster Mesonet, located in the Hamilton-Halton Watershed in Southern Ontario, Canada. The McMaster Mesonet consists of a network of time domain reflectometer (TDR) probes collecting hourly soil moisture data at six depths between 10 cm and 100 cm at nine locations per site, spread across four sites in the 1250 km2 watershed. The soil moisture arrays are designed to further improve understanding of soil moisture dynamics in a seasonal climate and to capture soil moisture transitions in areas that differ in topography, soil and land cover. The McMaster Mesonet soil moisture data constitute a unique database in Canada because of their high spatio-temporal resolution. In order to provide some insight into the dominant processes at the McMaster Mesonet sites, spatio-temporal and temporal stability analyses were conducted to identify spatio-temporal patterns in the data and to suggest some physical interpretation of soil moisture variability. It was found that the seasonal climate of the Great Lakes Basin causes a transition in soil moisture patterns at seasonal timescales. During winter and early spring months, and at the meadow sites, the soil moisture distribution is governed by topographic redistribution, whereas following efflorescence in the spring and summer, the spatial soil moisture distribution at the forested site was also controlled by the vegetation canopy. Analysis of short-term temporal stability revealed that the relative difference between sites was maintained unless there was significant rainfall (> 20 mm) or a priori wet conditions. Following a disturbance in the spatial soil moisture distribution due to wetting, the relative soil moisture pattern re-emerged in 18 to 24 h. Access to the McMaster Mesonet data can be obtained by visiting www.hydrology.mcmaster.ca/mesonet.
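    Temporal stability analyses of this kind are commonly based on the mean relative difference of each site from the spatial mean at each time step. A minimal sketch with hypothetical site series (not the McMaster Mesonet data):

```python
def mean_relative_difference(series_by_site):
    """Temporal-stability analysis: for each site, the time-averaged
    relative difference of its soil moisture from the spatial mean.

    series_by_site: {site_name: [theta at t0, t1, ...]}, all series of
    equal length. Returns {site_name: delta}; a site ranking by delta
    that persists through time indicates temporal stability.
    """
    sites = list(series_by_site)
    n_t = len(series_by_site[sites[0]])
    deltas = {s: 0.0 for s in sites}
    for t in range(n_t):
        spatial_mean = sum(series_by_site[s][t] for s in sites) / len(sites)
        for s in sites:
            deltas[s] += (series_by_site[s][t] - spatial_mean) / spatial_mean
    return {s: d / n_t for s, d in deltas.items()}
```

    By construction the deltas sum to (near) zero across sites; persistently wet sites carry positive deltas and persistently dry sites negative ones, which is the pattern the paper reports re-emerging 18 to 24 h after a wetting disturbance.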

  10. The Complex Epidemiology of Carbapenem-Resistant Enterobacter Infections: A Multicenter Descriptive Analysis.

    Science.gov (United States)

    Lazarovitch, Tsilia; Amity, Keren; Coyle, Joseph R; Ackerman, Benjamin; Tal-Jasper, Ruthy; Ofer-Friedman, Hadas; Hayakawa, Kayoko; Bogan, Christopher; Lephart, Paul R; Kaplansky, Tamir; Maskit, Moran; Azouri, Tal; Zaidenstein, Ronit; Perez, Federico; Bonomo, Robert A; Kaye, Keith S; Marchaim, Dror

    2015-11-01

    The pandemic of carbapenem-resistant Enterobacteriaceae (CRE) was primarily due to clonal spread of bla KPC producing Klebsiella pneumoniae. Thus, thoroughly studied CRE cohorts have consisted mostly of K. pneumoniae. To conduct an extensive epidemiologic analysis of carbapenem-resistant Enterobacter spp. (CREn) from 2 endemic and geographically distinct centers. CREn were investigated at an Israeli center (Assaf Harofeh Medical Center, January 2007 to July 2012) and at a US center (Detroit Medical Center, September 2008 to September 2009). bla KPC genes were queried by polymerase chain reaction. Repetitive extragenic palindromic polymerase chain reaction and pulsed-field gel electrophoresis were used to determine genetic relatedness. In this analysis, 68 unique patients with CREn were enrolled. Sixteen isolates (24%) were from wounds, and 33 (48%) represented colonization only. All isolates exhibited a positive Modified Hodge Test, but only 93% (27 of 29) contained bla KPC. Forty-three isolates (63%) were from elderly adults, and 5 (7.4%) were from neonates. Twenty-seven patients died in hospital (40.3% of infected patients). Enterobacter strains consisted of 4 separate clones from Assaf Harofeh Medical Center and of 4 distinct clones from Detroit Medical Center. In this study conducted at 2 distinct CRE endemic regions, there were unique epidemiologic features to CREn: (i) polyclonality, (ii) neonates accounting for more than 7% of cohort, and (iii) high rate of colonization (almost one-half of all cases represented colonization). Since false-positive Modified Hodge Tests in Enterobacter spp. are common, close monitoring of carbapenem resistance mechanisms (particularly carbapenemase production) among Enterobacter spp. is important.

  11. The record precipitation and flood event in Iberia in December 1876: description and synoptic analysis

    Directory of Open Access Journals (Sweden)

    Ricardo Machado Trigo

    2014-04-01

    The first week of December 1876 was marked by extreme weather conditions that affected the south-western sector of the Iberian Peninsula, leading to an all-time record flow in two large international rivers. As a direct consequence, several Portuguese and Spanish towns and villages located on the banks of both rivers suffered serious flood damage on 7 December 1876. These unusual floods were amplified by the particularly wet preceding autumn months, with October 1876 presenting extremely high precipitation anomalies at all western Iberia stations. Two recently digitised stations in Portugal (Lisbon and Evora) present a peak value on 5 December 1876. Furthermore, the values of precipitation registered between 28 November and 7 December were so remarkable that the episode of 1876 still corresponds to the maximum average daily precipitation values for temporal scales between 2 and 10 days. Using several different data sources, such as historical newspapers of the time, meteorological data recently digitised from several stations in Portugal and Spain, and the recently available 20th Century Reanalysis, we provide a detailed analysis of the socio-economic impacts, precipitation values and the atmospheric circulation conditions associated with this event. The atmospheric circulation during these months was assessed at the monthly, daily and sub-daily scales. All months considered present an intense negative NAO index value, with November 1876 corresponding to the lowest NAO value on record since 1865. We have also computed a multivariable analysis of surface and upper-air fields in order to shed some light on the evolution of the synoptic conditions in the week prior to the floods. These events resulted from the continuous pouring of precipitation registered between 28 November and 7 December, due to the consecutive passage of Atlantic low-pressure systems fuelled by the presence of an atmospheric-river tropical moisture flow over

  12. A generic framework for the description and analysis of energy security in an energy system

    International Nuclear Information System (INIS)

    Hughes, Larry

    2012-01-01

    While many energy security indicators and models have been developed for specific jurisdictions or types of energy, few can be considered sufficiently generic to be applicable to any energy system. This paper presents a framework that attempts to meet this objective by combining the International Energy Agency's definition of energy security with structured systems analysis techniques to create three energy security indicators and a process-flow energy systems model. The framework is applicable to those energy systems which can be described in terms of processes converting or transporting flows of energy to meet the energy-demand flows from downstream processes. Each process affects the environment and is subject to jurisdictional policies. The framework can be employed to capture the evolution of energy security in an energy system by analyzing the results of indicator-specific metrics applied to the energy, demand, and environment flows associated with the system's constituent processes. Energy security policies are treated as flows to processes and classified into one of three actions, affecting the process's energy demand, the process itself or its energy input, or both; the outcome is determined by monitoring changes to the indicators. The paper includes a detailed example of an application of the framework. - Highlights: ► The IEA's definition of energy security is parsed into three energy security indicators: availability, affordability, and acceptability. ► Data flow diagrams and other systems analysis tools can represent an energy system and its processes, flows, and chains. ► Indicator-specific metrics applied to a process's flows determine the state of energy security in an energy system, an energy chain, or a process. ► Energy policy is considered as a flow, and policy outcomes are obtained by measuring flows with indicator-specific metrics. ► The framework is applicable to most jurisdictions and energy types.
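
The indicator-based evaluation described above can be illustrated with a small sketch. The three indicators come from the abstract, but the process names, flow values and metric formulas below are invented for illustration, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class Process:
    name: str
    energy_in: float      # energy supplied to the process (e.g., PJ/yr)
    demand: float         # downstream demand the process must meet (PJ/yr)
    unit_cost: float      # cost per unit of energy delivered
    emissions: float      # environmental flow (e.g., Mt CO2/yr)

def availability(p: Process) -> float:
    """Fraction of downstream demand the process can actually meet."""
    return min(p.energy_in / p.demand, 1.0) if p.demand else 1.0

def affordability(p: Process, reference_cost: float) -> float:
    """Cost relative to a jurisdictional reference (lower is better)."""
    return p.unit_cost / reference_cost

def acceptability(p: Process, emissions_cap: float) -> float:
    """Emissions relative to an environmental cap (<= 1 is acceptable)."""
    return p.emissions / emissions_cap

# A two-process energy chain; metrics are applied to each process's flows.
chain = [Process("import terminal", 120, 100, 9.0, 2.5),
         Process("distribution", 100, 95, 11.0, 0.4)]
for p in chain:
    print(p.name, round(availability(p), 2),
          round(affordability(p, reference_cost=10.0), 2),
          round(acceptability(p, emissions_cap=2.0), 2))
```

Tracking these per-process values over time is the framework's way of observing how a policy, modelled as a flow into a process, changes the system's energy security.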

  13. Challenges in coupled thermal-hydraulics and neutronics simulations for LWR safety analysis

    International Nuclear Information System (INIS)

    Ivanov, Kostadin; Avramova, Maria

    2007-01-01

    The simulation of nuclear power plant accident conditions requires three-dimensional (3D) modeling of the reactor core to ensure a realistic description of physical phenomena. The operational flexibility of Light Water Reactor (LWR) plants can be improved by utilizing accurate 3D coupled neutronics/thermal-hydraulics calculations for safety margin evaluations. There are certain requirements for the coupling of thermal-hydraulic system codes and neutron-kinetics codes that ought to be considered. The objective of these requirements is to provide accurate solutions in a reasonable amount of CPU time in coupled simulations of detailed operational transient and accident scenarios. These requirements are met by the development and implementation of six basic components of the coupling methodologies: ways of coupling (internal or external coupling); coupling approach (integration algorithm or parallel processing); spatial mesh overlays; coupled time-step algorithms; coupling numerics (explicit, semi-implicit and implicit schemes); and coupled convergence schemes. These principles of the coupled simulations are discussed in detail, along with the scientific issues associated with the development of appropriate neutron cross-section libraries for coupled code transient modeling. The current trends in LWR nuclear power generation and regulation, as well as the design of next-generation LWR reactor concepts, together with continuing progress in computer technology, stimulate further development of these coupled code systems. These efforts have been focused on extending the analysis capabilities as well as refining the scale and level of detail of the coupling. This article analyses the coupled phenomena and modeling challenges on both global (assembly-wise) and local (pin-wise) levels. The issues related to the consistent qualification of coupled code systems, as well as their application to different types of LWR transients, are presented.
Finally, the advances in numerical
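
As a rough illustration of the explicit coupling numerics mentioned above, the sketch below staggers a toy neutronics update and a toy thermal-hydraulics update, each advancing one time step on the other's previous values. The physics and all coefficients are invented for demonstration and bear no relation to any production code:

```python
def neutronics_step(power, fuel_temp, dt):
    # Toy reactivity feedback: a negative fuel-temperature coefficient
    # pulls power down as the fuel heats up (all numbers illustrative).
    alpha = -2.0e-5                       # reactivity per K
    rho = alpha * (fuel_temp - 900.0)
    return power * (1.0 + rho * dt / 1.0e-4)  # crude prompt response

def thermal_hydraulics_step(power, fuel_temp, dt):
    # Toy lumped fuel energy balance: heated by power, cooled to coolant.
    heat_capacity, h, t_cool = 300.0, 2.0, 560.0
    dT = (power - h * (fuel_temp - t_cool)) / heat_capacity
    return fuel_temp + dT * dt

power, fuel_temp, dt = 1000.0, 900.0, 0.01
for step in range(100):
    # Explicit (staggered) coupling: each solver uses the other's
    # state from the previous time step.
    new_power = neutronics_step(power, fuel_temp, dt)
    fuel_temp = thermal_hydraulics_step(power, fuel_temp, dt)
    power = new_power
```

An implicit or semi-implicit scheme would instead iterate the two solvers within each time step until the exchanged fields converge, which is more stable for larger time steps at higher cost per step.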

  14. Comparative Analysis of Disruption Tolerant Network Routing Simulations in the One and NS-3

    Science.gov (United States)

    2017-12-01

    Naval Postgraduate School, Monterey, California, thesis, 03-23-2016 to 12-15-2017. The added levels of simulation increase the processing required by a simulation; ns-3's simulation of other layers of the network stack permits...

  15. Cross-cultural comparisons among the sensory characteristics of fermented soybean using Korean and Japanese descriptive analysis panels.

    Science.gov (United States)

    Chung, L; Chung, S-J

    2007-11-01

    One of the most important initial steps in exporting a food product to another country, from the R&D perspective, is to describe and translate the sensory characteristics of the product appropriately into the language of the target country. The objectives of this study were to describe and compare the sensory characteristics of Korean and Japanese style fermented soybean products, and to cross-culturally compare the lexicons for the identical products generated by the Korean and Japanese panelists. Four types of Korean and 4 types of Japanese style fermented soybean, consisting of whole bean and paste types, were analyzed. Ten Korean and 9 Japanese panelists were recruited in Korea. Two separate descriptive analyses were conducted, with the panelists differing in their country of origin. Each group was trained, developed a lexicon, and conducted descriptive analysis independently. Analysis of variance and various multivariate analyses were applied to delineate the sensory characteristics of the samples and to compare the cross-cultural differences in the usage of the lexicons. The Korean and Japanese panelists generated 48 and 36 sensory attributes, respectively. Cross-cultural consensus was shown in evaluating the whole bean type fermented soybean and white miso, which were relatively distinctive samples. However, for the less distinctive samples, the panelists tended to rate the fermented soybeans that originated from the other country higher in negative attributes. The Japanese panelists grouped the samples by their country of origin, and soy sauce flavor was the main attribute for cross-cultural differentiation. However, the Korean panelists did not make a cross-cultural distinction among the samples.

  16. Hardware description languages

    Science.gov (United States)

    Tucker, Jerry H.

    1994-01-01

    Hardware description languages are special-purpose programming languages. They are primarily used to specify the behavior of digital systems and are rapidly replacing traditional digital system design techniques. This is because they allow the designer to concentrate on how the system should operate rather than on implementation details. Hardware description languages allow a digital system to be described at a wide range of abstraction levels, and they support top-down design techniques. A key feature of any hardware description language environment is its ability to simulate the modeled system. The two most important hardware description languages are Verilog and VHDL. Verilog has been the dominant language for the design of application-specific integrated circuits (ASICs). However, VHDL is rapidly gaining in popularity.

  17. Deliverable 6.2 - Software: upgraded MC simulation tools capable of simulating a complete in-beam ET experiment, from the beam to the detected events. Report with the description of one (or few) reference clinical case(s), including the complete patient model and beam characteristics

    CERN Document Server

    The ENVISION Collaboration

    2014-01-01


  18. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R package and a Java online tool developed at the EC Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in such studies: the assignment of a factor to a source in factor-analytical models (source identification) and the evaluation of model performance. Source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation (punc) is the best choice for model performance evaluation when a conservative approach is adopted.
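
The source-identification step, matching a factor's chemical profile against reference source profiles, can be sketched as below. The profiles are invented, and Pearson correlation stands in for whatever similarity measures DeltaSA actually implements:

```python
import math

def pearson(x, y):
    # Pearson correlation between two equal-length profiles.
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

reference_profiles = {              # species mass fractions (illustrative)
    "traffic":      [0.40, 0.30, 0.20, 0.10],
    "biomass burn": [0.10, 0.15, 0.25, 0.50],
}

factor = [0.38, 0.32, 0.18, 0.12]   # factor profile from a receptor model

# Assign the factor to the most similar reference source.
best = max(reference_profiles,
           key=lambda s: pearson(factor, reference_profiles[s]))
print(best)  # → traffic
```

In practice the comparison would run over many species with measurement uncertainties, and a similarity threshold would decide whether any assignment is made at all.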

  19. Quantitative description of microstructure defects in hexagonal boron nitrides using X-ray diffraction analysis

    International Nuclear Information System (INIS)

    Schimpf, C.; Motylenko, M.; Rafaja, D.

    2013-01-01

    A routine for the simultaneous quantification of turbostratic disorder, the amount of puckering, and the dislocation and stacking fault densities in hexagonal materials was proposed and tested on boron nitride powder samples synthesised using different methods. The routine allows the individual microstructure defects to be recognised according to their effect on the anisotropy of the X-ray diffraction line broadening. For quantification of the microstructure defects, the total line broadening is regarded as a linear combination of the contributions from the particular defects. The total line broadening is obtained from line profile fitting. As testing material, graphitic boron nitride (h-BN) was employed in the form of hot-isostatically pressed h-BN, pyrolytic h-BN, or h-BN chemically vapour deposited at a low temperature. The type of dominant microstructure defect determined from the broadening of the X-ray diffraction lines was verified by high-resolution transmission electron microscopy. Attempts were made to verify the amount of the defects by alternative methods. - Highlights: • A reliable method for the quantification of microstructure defects in BN was suggested. • The method is based on the analysis of anisotropic XRD line broadening. • This XRD line broadening is unique and characteristic of the respective defect. • Thus, the quantification of coexistent microstructure defects is possible. • The method was tested on hexagonal BN, which was produced by different techniques
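
The "linear combination of contributions" idea can be sketched numerically: given per-defect broadening signatures over a few reflections, the defect amounts follow from a least-squares fit. All signature values and observed breadths below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Rows: reflections (e.g., 002, 100, 110); columns: per-unit broadening
# contributed by each defect type (turbostratic disorder, stacking faults,
# dislocations) for that reflection. Anisotropy shows up as different
# defects broadening different hkl lines by different amounts.
signatures = np.array([
    [0.80, 0.05, 0.10],
    [0.10, 0.60, 0.20],
    [0.05, 0.30, 0.50],
])

observed = np.array([0.45, 0.33, 0.29])  # fitted integral breadths (deg)

# Solve for the defect "amounts" that best reproduce the observed
# anisotropic broadening (non-negativity is ignored in this sketch).
amounts, *_ = np.linalg.lstsq(signatures, observed, rcond=None)
print(np.round(amounts, 3))
```

With more reflections than defect types the system is overdetermined, and the fit residual indicates how well the assumed set of defects explains the observed anisotropy.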

  20. Relevance of wide-field autofluorescence imaging in Birdshot retinochoroidopathy: descriptive analysis of 76 eyes.

    Science.gov (United States)

    Piffer, Anne-Laure Le; Boissonnot, Michèle; Gobert, Frédéric; Zenger, Anita; Wolf, Sebastian; Wolf, Ute; Korobelnik, Jean-François; Rougier, Marie-Bénédicte

    2014-09-01

    To study and classify retinal lesions in patients with birdshot disease using wide-field autofluorescence imaging, and to correlate them with patients' visual status. A multicentre study was carried out on 76 eyes of 39 patients with birdshot disease, analysing colour and autofluorescence images obtained with the wide-field Optomap® imaging system. This was combined with a complete clinical exam and analysis of the macula with OCT. In over 80% of the eyes, a chorioretinal lesion was observed under autofluorescence, with a direct correlation between the extent of the lesion and visual status. The presence of macular hypo-autofluorescence was correlated with decreased visual acuity, due to the presence of macular oedema, active clinical inflammation or an epiretinal membrane. The hypo-autofluorescence observed correlated with the duration of the disease and the degree of inflammation in the affected eye, indicating a secondary lesion of the pigment epithelium in relation to the choroid. The pigment epithelium was affected in a diffuse manner, as in almost 50% of the eyes the wider peripheral retina was affected. Wide-field autofluorescence imaging appears to be a useful examination when monitoring patients, to look for areas of macular hypo-autofluorescence responsible for an irreversible loss of vision. © 2013 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.

  1. CHP plant Legionowo Poland. Description of the electricity market in Poland/CHP-feasibility analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2003-04-01

    In 1997, a new energy law was passed in Poland. An important element of the law is that local energy planning is made obligatory. The law describes obligatory tasks and procedures for the Polish municipalities related to planning and organisation of the energy sector. With the objective of supporting the Polish municipalities in their obligations under the energy law, the project 'Energy Planning in Poland at Municipal Level - Support to Decision Makers' was launched. As part of the project, Municipal Guideline Reports have been elaborated for three model municipalities. These guidelines present the basis for energy supply plans in the three municipalities. For the city of Legionowo, the following was recommended: 1. The planning processes initiated during the project should be continued and followed up; 2. A Master Plan for the district heating system should be prepared; 3. The possibilities of establishing a major natural gas-fired CHP plant of the combined cycle type should be investigated. The present report describes the electricity market in Poland, the market in which a CHP plant in Legionowo will have to operate. Furthermore, the report presents the results of the feasibility analysis carried out for a new CHP plant in Legionowo. (BA)

  2. A Descriptive Analysis of Exercise Tolerance Test at Seremban Hospital : An Audit for the Year 2001

    Science.gov (United States)

    Mohamed, Abdul Latiff; Nee, Chan Chee; Azzad, Ahmed

    2004-01-01

    Our purpose is to report on the epidemiological variables and their association with the results of the exercise tolerance test (ETT) in a series of patients referred for standard diagnostic ETT at Seremban Hospital during the year 2001. ETT is widely performed, but, in Malaysia, an analysis of the associations between epidemiological data and ETT results has not been presented. All patients referred for ETT at Seremban Hospital who underwent exercise treadmill tests during 2001 were taken as the study population. Demographic details and patients with established heart disease (i.e. prior coronary bypass surgery, myocardial infarction, or congestive heart failure) were noted. Clinical and ETT variables were collected retrospectively from the hospital records. Testing and data management were performed in a standardized fashion with a computer-assisted protocol. This study showed that no epidemiological variable was significantly predictive of the ETT results. However, it was found that there was a statistically significant difference between the peak exercise times of males and females undergoing the ETT. PMID:22973128

  3. Health information systems in Africa: descriptive analysis of data sources, information products and health statistics.

    Science.gov (United States)

    Mbondji, Peter Ebongue; Kebede, Derege; Soumbey-Alley, Edoh William; Zielinski, Chris; Kouvividila, Wenceslas; Lusamba-Dikassa, Paul-Samson

    2014-05-01

    To identify key data sources of health information and describe their availability in countries of the World Health Organization (WHO) African Region. An analytical review of the availability and quality of health information data sources in countries, drawing on experience, observations, the literature and contributions from countries. Forty-six Member States of the WHO African Region. No participants. The state of data sources, including censuses, surveys, vital registration and health care facility-based sources. In almost all countries of the Region, there is a heavy reliance on household surveys for most indicators, with more than 121 household surveys having been conducted in the Region since 2000. Few countries have civil registration systems that permit adequate and regular tracking of mortality and causes of death. Demographic surveillance sites function in several countries, but the data generated are not integrated into the national health information system because of concerns about representativeness. Health management information systems generate considerable data, but the information is rarely used because of concerns about bias, quality and timeliness. To date, 43 countries in the Region have initiated Integrated Disease Surveillance and Response. A multitude of data sources are used to track progress towards health-related goals in the Region, with heavy reliance on household surveys for most indicators. Countries need to develop comprehensive national plans for health information that address the full range of data needs and data sources, and that include provision for building national capacities for data generation, analysis, dissemination and use. © The Royal Society of Medicine.

  4. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    Science.gov (United States)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  5. Safety in ready mixed concrete industry: descriptive analysis of injuries and development of preventive measures.

    Science.gov (United States)

    Akboğa, Özge; Baradan, Selim

    2017-02-07

    The ready mixed concrete (RMC) industry, one of the backbones of the construction sector, has its own distinctive occupational safety and health (OSH) risks. Employees face risks that emerge during the fabrication of concrete as well as during its delivery to the construction site. Statistics show that the usage of and demand for RMC have been increasing, along with the number of producers and workers. Unfortunately, adequate OSH measures to meet this rapid growth are not in place even in top RMC-producing countries, such as Turkey. Moreover, the lack of statistical data and academic research in this sector exacerbates the problem. This study aims to fill this gap by conducting data mining in the Turkish Social Security Institution archives and performing univariate frequency and cross-tabulation analysis on 71 incidents in which RMC truck drivers were involved. In addition, investigations and interviews were conducted in seven RMC plants in Turkey and the Netherlands from an OSH point of view. Based on the results of this research, problem areas were identified: for example, cleaning the truck mixer/pump is a hazardous activity during which operators are injured frequently, and being struck by falling objects is a major hazard in the RMC industry. Finally, Job Safety Analyses were performed on these areas to suggest mitigation methods.

  6. ODM Data Analysis-A tool for the automatic validation, monitoring and generation of generic descriptive statistics of patient data.

    Science.gov (United States)

    Brix, Tobias Johannes; Bruland, Philipp; Sarfraz, Saad; Ernsting, Jan; Neuhaus, Philipp; Storck, Michael; Doods, Justin; Ständer, Sonja; Dugas, Martin

    2018-01-01

    A required step in presenting the results of clinical studies is the declaration of participants' demographic and baseline characteristics, as required by FDAAA 801. The common workflow to accomplish this task is to export the clinical data from the electronic data capture system used and import it into statistical software such as SAS or IBM SPSS. This software requires trained users, who have to implement the analysis individually for each item. These expenditures may become an obstacle for small studies. The objective of this work is to design, implement and evaluate an open source application, called ODM Data Analysis, for the semi-automatic analysis of clinical study data. The system requires clinical data in the CDISC Operational Data Model format. After a file is uploaded, its syntax and the data-type conformity of the collected data are validated. The completeness of the study data is determined, and basic statistics, including illustrative charts for each item, are generated. Datasets from four clinical studies have been used to evaluate the application's performance and functionality. The system is implemented as an open source web application (available at https://odmanalysis.uni-muenster.de) and is also provided as a Docker image, which enables easy distribution and installation on local systems. Study data are only stored in the application while the calculations are performed, which is compliant with data protection requirements. Analysis times are below half an hour, even for larger studies with over 6000 subjects. Medical experts have confirmed the usefulness of this application for gaining an overview of their collected study data for monitoring purposes and for generating descriptive statistics without further user interaction. The semi-automatic analysis has its limitations and cannot replace the complex analysis of statisticians, but it can serve as a starting point for their examination and reporting.
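
The per-item descriptive statistics such a tool generates might look like the following sketch: completeness for every item, plus numeric summaries or frequency counts depending on data type. The item values are invented, and the real application parses CDISC ODM XML rather than Python lists:

```python
import statistics
from collections import Counter

def describe_item(values):
    # None marks a missing value for a study participant.
    present = [v for v in values if v is not None]
    stats = {"completeness": len(present) / len(values)}
    if present and all(isinstance(v, (int, float)) for v in present):
        # Numeric item: basic location statistics.
        stats.update(mean=statistics.mean(present),
                     median=statistics.median(present))
    else:
        # Categorical item: frequency table.
        stats["frequencies"] = dict(Counter(present))
    return stats

print(describe_item([68, 74, None, 81]))          # numeric item, e.g., weight
print(describe_item(["F", "M", "F", None, "F"]))  # categorical item, e.g., sex
```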

  7. Uncertainty analysis of NDA waste measurements using computer simulations

    International Nuclear Information System (INIS)

    Blackwood, L.G.; Harker, Y.D.; Yoon, W.Y.; Meachum, T.R.

    2000-01-01

    Uncertainty assessments for nondestructive radioassay (NDA) systems for nuclear waste are complicated by factors extraneous to the measurement systems themselves. Most notably, characteristics of the waste matrix (e.g., homogeneity) and radioactive source material (e.g., particle size distribution) can have great effects on measured mass values. Under these circumstances, characterizing the waste population is as important as understanding the measurement system in obtaining realistic uncertainty values. When extraneous waste characteristics affect measurement results, the uncertainty results are waste-type specific. The goal becomes to assess the expected bias and precision for the measurement of a randomly selected item from the waste population of interest. Standard propagation-of-errors methods for uncertainty analysis can be very difficult to implement in the presence of significant extraneous effects on the measurement system. An alternative approach that naturally includes the extraneous effects is as follows: (1) Draw a random sample of items from the population of interest; (2) Measure the items using the NDA system of interest; (3) Establish the true quantity being measured using a gold standard technique; and (4) Estimate bias by deriving a statistical regression model comparing the measurements on the system of interest to the gold standard values; similar regression techniques for modeling the standard deviation of the difference values gives the estimated precision. Actual implementation of this method is often impractical. For example, a true gold standard confirmation measurement may not exist. A more tractable implementation is obtained by developing numerical models for both the waste material and the measurement system. A random sample of simulated waste containers generated by the waste population model serves as input to the measurement system model. This approach has been developed and successfully applied to assessing the quantity of
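
The simulation-based alternative described above can be sketched as follows: simulate a random sample of containers from a waste-population model, "measure" them with a measurement-system model, then estimate bias and precision from the measured-versus-true comparison. The error model here is entirely invented for illustration:

```python
import random
import statistics

random.seed(7)  # reproducible sketch

def simulate_container():
    # Waste-population model: true source mass plus matrix effects.
    true_mass = random.lognormvariate(0.0, 0.8)    # true mass (g)
    matrix_factor = random.uniform(0.85, 1.10)     # matrix heterogeneity
    noise = random.gauss(0.0, 0.05 * true_mass)    # counting noise
    measured = true_mass * matrix_factor + noise   # measurement-system model
    return true_mass, measured

pairs = [simulate_container() for _ in range(5000)]
ratios = [m / t for t, m in pairs]

bias = statistics.mean(ratios) - 1.0    # average relative bias
precision = statistics.stdev(ratios)    # relative spread
print(f"bias {bias:+.3f}, precision {precision:.3f}")
```

A fuller implementation would replace the invented error model with physics-based container and detector models, and could regress measured on true values to capture mass-dependent bias, as the text describes.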

  8. Privacy Policies for Apps Targeted Toward Youth: Descriptive Analysis of Readability

    Science.gov (United States)

    Das, Gitanjali; Cheung, Cynthia; Nebeker, Camille; Bietz, Matthew

    2018-01-01

    Background Due to the growing availability of consumer information, the protection of personal data is of increasing concern. Objective We assessed readability metrics of privacy policies for apps that are either available to or targeted toward youth to inform strategies to educate and protect youth from unintentional sharing of personal data. Methods We reviewed the 1200 highest ranked apps from the Apple and Google Play Stores and systematically selected apps geared toward youth. After applying exclusion criteria, 99 highly ranked apps geared toward minors remained, 64 of which had a privacy policy. We obtained and analyzed these privacy policies using reading grade level (RGL) as a metric. Policies were further compared as a function of app category (free vs paid; entertainment vs social networking vs utility). Results Analysis of privacy policies for these 64 apps revealed an average RGL of 12.78, which is well above the average reading level (8.0) of adults in the United States. There was also a small but statistically significant difference in word count as a function of app category (entertainment: 2546 words, social networking: 3493 words, and utility: 1038 words; P=.02). Conclusions Although users must agree to privacy policies to access digital tools and products, readability analyses suggest that these agreements are not comprehensible to most adults, let alone youth. We propose that stakeholders, including pediatricians and other health care professionals, play a role in educating youth and their guardians about the use of Web-based services and potential privacy risks, including the unintentional sharing of personal data. PMID:29301737
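
A reading grade level can be computed in a few lines. The study does not specify its exact formula, so the sketch below uses the common Flesch-Kincaid grade with a crude heuristic syllable counter; the sample policy text is invented:

```python
import re

def count_syllables(word):
    # Heuristic: count contiguous vowel groups, minimum one per word.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_kincaid_grade(text):
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words) - 15.59)

policy = ("We may disclose aggregated demographic information to our "
          "affiliates. Your continued utilization constitutes acceptance "
          "of these provisions.")
print(round(flesch_kincaid_grade(policy), 1))  # well above 8th grade
```

Long sentences and polysyllabic legal vocabulary drive the grade level up, which is exactly the pattern the study reports for app privacy policies.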

  9. An analysis of the citation climate in neurosurgical literature and description of an interfield citation metric.

    Science.gov (United States)

    Madhugiri, Venkatesh S; Sasidharan, Gopalakrishnan M; Subeikshanan, Venkatesan; Dutt, Akshat; Ambekar, Sudheer; Strom, Shane F

    2015-05-01

    The citation climate in neurosurgical literature is largely undefined. To study the patterns of citation of articles in neurosurgery as a scientific field and to evaluate the performance of neurosurgery journals vis-à-vis journals in other fields. References cited in articles published in neurosurgery journals during a specified time period were analyzed to determine the age of articles cited in neurosurgical literature. In the next analysis, articles published in neurosurgical journals were followed up for 13 years after publication. The postpublication citation patterns were analyzed to determine the time taken to reach the maximally cited state and the time when articles stopped being cited. The final part of the study dealt with the evolution of a new interfield citation metric, which was then compared with other standardized citation indexes. The mean ± SD age of articles cited in neurosurgical literature was 11.6 ± 11.7 years (median, 8 years). Citations received by articles gradually increased to a peak (at 6.25 years after publication in neurosurgery) and then reached a steady state; articles were still cited well into the late postpublication period. Neurosurgical articles published in nonneurosurgical high-impact journals were cited more highly than those in neurosurgical journals, although they took approximately the same time to reach the maximally cited state (7.2 years). The most cited pure neurosurgery journal was Neurosurgery. The citation climate for neurosurgery was adequately described. The interfield citation metric was able to ensure cross-field comparability of journal performance. G1, group 1; G2, group 2; G3, group 3; G4, group 4; IFCM, interfield citation metric.

  10. Who attends recovery high schools after substance use treatment? A descriptive analysis of school aged youth.

    Science.gov (United States)

    Tanner-Smith, Emily E; Finch, Andrew J; Hennessy, Emily A; Moberg, D Paul

    2018-06-01

    Recovery high schools (RHSs) are an alternative high school option for adolescents with substance use disorders (SUDs), designed to provide a recovery-focused learning environment. The aims of this study were to examine the characteristics of youth who choose to attend RHSs, and to compare them with local and national comparison samples of youth in recovery from SUDs who were not enrolled in RHSs. We conducted secondary analysis of existing data to compare characteristics of youth in three samples: (1) adolescents with SUDs who enrolled in RHSs in Minnesota, Texas, and Wisconsin after discharge from treatment (RHSs; n = 171, 51% male, 86% White, 4% African American, 5% Hispanic); (2) a contemporaneously recruited local comparison sample of students with SUDs who did not enroll in RHSs (n = 123, 60% male, 77% White, 5% African American, 12% Hispanic); and (3) a national comparison sample of U.S. adolescents receiving SUD treatment (n = 12,967, 73% male, 37% White, 15% African American, 30% Hispanic). Students enrolled in RHSs had elevated levels of risk factors for substance use and relapse relative to both the local and national comparison samples. For instance, RHS students reported higher rates of pre-treatment drug use, past mental health treatment, and higher rates of post-treatment physical health problems than adolescents in the national comparison sample. We conclude that RHSs serve a population with greater co-occurring problem severity than the typical adolescent in SUD treatment; programming offered at RHSs should attend to these complex patterns of risk factors. SUD service delivery policy should consider RHSs as an intensive recovery support model for the most high-risk students with SUDs. Copyright © 2018 Elsevier Inc. All rights reserved.

  11. Description and evaluation of a peracetic acid air sampling and analysis method.

    Science.gov (United States)

    Nordling, John; Kinsky, Owen R; Osorio, Magdalena; Pechacek, Nathan

    2017-12-01

    Peracetic acid (PAA) is a corrosive chemical with a pungent odor that is extensively used in occupational settings and poses various health hazards to exposed workers. Currently, there is no US government agency recommended method that can be applied universally for the sampling and analysis of PAA. Legacy methods for determining airborne PAA vapor levels frequently suffered from cross-reactivity with other chemicals, particularly hydrogen peroxide (H2O2). Therefore, to remove this confounding factor, a new viable, sensitive method was developed for the assessment of PAA exposure levels, based on the differential reaction kinetics of PAA with methyl p-tolylsulfide (MTS), relative to H2O2, to preferentially form methyl p-tolylsulfoxide (MTSO). By quantifying the MTSO concentration produced in the liquid capture solution from an air sampler, using an internal standard, and applying the reaction stoichiometry of PAA and MTS, the original airborne concentration of PAA is determined. After refining this liquid-trap high-performance liquid chromatography (HPLC) method in the laboratory, it was tested in five workplace settings where PAA products were used. PAA levels ranged from the detection limit of 0.013 parts per million (ppm) to 0.4 ppm. The results indicate a viable and potentially dependable method for assessing concentrations of PAA vapors under occupational exposure scenarios, though only a small number of measurements were taken during field testing. The low limit of detection and precision offered by this method nevertheless make it a strong candidate for further testing and validation to expand the uses of this liquid-trap HPLC method.
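
The final back-calculation, from the MTSO quantified by HPLC in the capture solution to an airborne PAA concentration, can be sketched as below. The 1:1 PAA:MTS stoichiometry follows from the reaction described, but the function, its parameters and the example sampling values are illustrative, not part of the published method:

```python
def airborne_paa_ppm(mtso_umol_per_l, solution_volume_l,
                     sampled_air_l, molar_volume_l=24.45):
    """Convert MTSO found in the capture solution to airborne PAA (ppm v/v).

    molar_volume_l is the ideal-gas molar volume in L/mol at 25 C, 1 atm.
    """
    paa_umol = mtso_umol_per_l * solution_volume_l   # 1:1 stoichiometry
    paa_gas_ul = paa_umol * molar_volume_l           # umol -> uL of vapor
    return paa_gas_ul / sampled_air_l                # uL/L == ppm (v/v)

# Example: 2.5 umol/L MTSO in 10 mL capture solution, 60 L of air sampled.
print(round(airborne_paa_ppm(2.5, 0.010, 60.0), 4))  # → 0.0102
```

A complete calculation would also correct for sampler collection efficiency and any incomplete conversion of PAA to MTSO, both omitted here.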

  12. Retrospective analysis of spinal trauma in patients with ankylosing spondylitis: a descriptive study in Indian population.

    Science.gov (United States)

    Mahajan, R; Chhabra, H S; Srivastava, A; Venkatesh, R; Kanagaraju, V; Kaul, R; Tandon, V; Nanda, A; Sangondimath, G; Patel, N

    2015-05-01

    This study aims to understand the demographics, mode of trauma, hospital stay, complications, neurological improvement, mortality and expenditure incurred by Indian patients with spinal trauma and ankylosing spondylitis (AS). A retrospective analysis was carried out of data on patients admitted to a tertiary referral hospital from 2008 to 2013 with a diagnosis of AS and spinal trauma. The variables studied were demographics, mode of trauma, neurological status, neurological improvement, involved vertebral level, duration of hospital stay, comorbid factors, expenditure and complications during the stay. Forty-six patients with a diagnosis of AS with spine trauma were admitted over the last 5 years, with a total of 52 fractures. All were male patients; 58.6% had injury because of trivial trauma and 78.2% of patients presented with neurological injury. C5-C6, C6-C7, C7-D1 and D12 were the most commonly injured levels. Fractures through the intervertebral disc were most common in the cervical spine. Of the patients, 52.7% showed neurological improvement of at least grade 1 (AIS). The mean expenditure for a patient admitted with spinal cord injury (SCI) and AS was 7957 USD (United States dollars), around five times the per capita income in India (as of 2013). Males with AS are much more prone to spinal fractures than females, and the incidence may be higher than previously reported. Domestic falls are the most common mechanism of spinal trauma in this population. High-velocity injuries are associated with complete SCI. The study reinforces the need for the development of subsidized spinal care services for SCI management.

  13. Catastrophic antiphospholipid syndrome (CAPS): Descriptive analysis of 500 patients from the International CAPS Registry.

    Science.gov (United States)

    Rodríguez-Pintó, Ignasi; Moitinho, Marta; Santacreu, Irene; Shoenfeld, Yehuda; Erkan, Doruk; Espinosa, Gerard; Cervera, Ricard

    2016-12-01

    To analyze the clinical and immunologic manifestations of patients with catastrophic antiphospholipid syndrome (CAPS) from the "CAPS Registry". The demographic, clinical and serological features of 500 patients included in the website-based "CAPS Registry" were analyzed. Frequency distributions and measures of central tendency were used to describe the cohort. Groups were compared on qualitative variables using the chi-square or Fisher exact test, and on continuous variables using the t-test for independent samples. 500 patients (female: 343 [69%]; mean age 38±17) accounting for 522 episodes of CAPS were included in the analysis. Forty percent of patients had an associated autoimmune disease, mainly systemic lupus erythematosus (SLE) (75%). The majority of CAPS episodes were triggered by a precipitating factor (65%), mostly infections (49%). Clinically, CAPS was characterized by multiorgan involvement affecting the kidneys (73%), lungs (60%), brain (56%), heart (50%), and skin (47%). Lupus anticoagulant, IgG anticardiolipin and IgG anti-β2-glycoprotein antibodies were the most often implicated antiphospholipid antibodies (83%, 81% and 78%, respectively). Mortality accounted for 37% of episodes of CAPS. Several clinical differences could be observed based on the age at presentation and association with SLE. Cases triggered by a malignancy tended to occur in older patients, while CAPS episodes in young patients were associated with an infectious trigger and peripheral vessel involvement. Additionally, CAPS episodes associated with SLE were more likely to show severe cardiac and brain involvement, leading to higher mortality (48%). Although the presentation of CAPS is characterized by multiorgan thrombosis and failure, clinical differences among patients exist based on age and underlying chronic diseases, e.g. malignancy and SLE. Copyright © 2016 Elsevier B.V. All rights reserved.
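    The group comparison this abstract describes (chi-square or Fisher exact test for qualitative variables) can be illustrated with a minimal two-sided Fisher exact test for a 2×2 contingency table. This is a generic textbook sketch, not the registry's analysis code; the function name is hypothetical.

```python
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """Two-sided Fisher exact p-value for the 2x2 table [[a, b], [c, d]].

    Sums, over all tables with the same margins, the hypergeometric
    probabilities that are no larger than the observed table's probability.
    """
    n = a + b + c + d
    row1, col1 = a + b, a + c
    def prob(x):
        # Hypergeometric probability of x successes in the first row
        return comb(row1, x) * comb(n - row1, col1 - x) / comb(n, col1)
    p_obs = prob(a)
    lo = max(0, col1 - (n - row1))   # smallest feasible top-left cell
    hi = min(row1, col1)             # largest feasible top-left cell
    # Small tolerance so ties with p_obs are included despite float rounding
    return sum(prob(x) for x in range(lo, hi + 1) if prob(x) <= p_obs * (1 + 1e-9))
```

    For Fisher's classic tea-tasting table [[3, 1], [1, 3]], this returns the familiar two-sided p ≈ 0.486; in practice a library routine such as scipy.stats.fisher_exact would be used instead.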

  14. Docetaxel-related fatigue in men with metastatic prostate cancer: a descriptive analysis.

    Science.gov (United States)

    Bergin, A R T; Hovey, E; Lloyd, A; Marx, G; Parente, P; Rapke, T; de Souza, P

    2017-09-01

    Fatigue is a prevalent and debilitating side effect of docetaxel chemotherapy in metastatic prostate cancer. A better understanding of the kinetics and nature of docetaxel-related fatigue may provide a framework for intervention. This secondary analysis was performed using the MOTIF database, from a phase III, randomised, double-blind, placebo-controlled study of modafinil (200 mg/day for 15 days) for docetaxel-related fatigue in men with metastatic prostate cancer [1]. The pattern of fatigue was analysed using the MDASI (MD Anderson Symptom Inventory) score. The impact of modafinil, cumulative docetaxel exposure, age and smoking status on fatigue kinetics were explored. Fatigue-related symptoms were assessed using the SOMA6 (fatigue and related symptoms) subset of the SPHERE (Somatic and Psychological Health Report). Mood was tracked using the short form 36 health survey questionnaire (SF-36). Across four docetaxel cycles, fatigue scores were higher in the first week and decreased over weeks two and three. Whilst men randomised to modafinil had reduced fatigue scores, cumulative docetaxel had little impact. Younger men (55-68 years) had significantly reduced fatigue scores, whereas current and ex-smokers had higher scores. There was no significant change in mood status or haemoglobin across treatment cycles. Men described both 'somnolence' and 'muscle fatigue' contributing significantly to their symptom complex. Assessment and management of docetaxel-related fatigue remains an important challenge. Given the complex, multifactorial nature of fatigue, identification through structured interview and interventions targeted to specific 'at risk' groups may be the most beneficial. Understanding the temporal pattern (kinetics) and nature of fatigue is critical to guide this process.

  15. Dental Blogs, Podcasts, and Associated Social Media: Descriptive Mapping and Analysis.

    Science.gov (United States)

    Melkers, Julia; Hicks, Diana; Rosenblum, Simone; Isett, Kimberley R; Elliott, Jacqueline

    2017-07-26

    Studies of social media in both medicine and dentistry have largely focused on the value of social media for marketing to and communicating with patients and for clinical education. There is limited evidence of how dental clinicians contribute to and use social media to disseminate and access information relevant to clinical care. The purpose of this study was to inventory and assess the entry, growth, sources, and content of clinically relevant social media in dentistry. We developed an inventory of blogs, podcasts, videos, and associated social media disseminating clinical information to dentists. We assessed hosts' media activity in terms of their combinations of modalities, entry and exit dates, frequency of posting, types of content posted, and size of audience. Our study showed that clinically relevant information is posted by dentists and hygienists on social media. Clinically relevant information was provided in 89 blogs and podcasts, and topic analysis showed motives for blogging by host type: 55% (49 hosts) were practicing dentists or hygienists, followed by consultants (27 hosts, 30%), media including publishers and discussion board hosts (8 hosts, 9%), and professional organizations and corporations. We demonstrated the participation of and potential for practicing dentists and hygienists to use social media to share clinical and other information with practicing colleagues. There is a clear audience for these social media sites, suggesting a changing mode of information diffusion in dentistry. This study was a first effort to fill the gap in understanding the nature and potential role of social media in clinical dentistry. ©Julia Melkers, Diana Hicks, Simone Rosenblum, Kimberley R Isett, Jacqueline Elliott. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 26.07.2017.

  16. Fournier gangrene: description of 37 cases and analysis of associated health care costs.

    Science.gov (United States)

    Jiménez-Pacheco, A; Arrabal-Polo, M Á; Arias-Santiago, S; Arrabal-Martín, M; Nogueras-Ocaña, M; Zuluaga-Gómez, A

    2012-01-01

    Fournier gangrene is a urological emergency associated with a high mortality. It is a necrotizing fasciitis caused by polymicrobial infection originating in the anorectal or genitourinary area. The aim of this study was to analyze the epidemiological and clinical characteristics of Fournier gangrene along with the variables that influence disease course and mortality in patients treated in our department. We carried out a retrospective study of 37 patients diagnosed with Fournier gangrene between January 2001 and October 2010. All the patients were men, 43.2% had diabetes, and the mean age of the patients was 57.68 years. Statistically significant differences were observed between the age of surviving patients and that of patients who died (55.8 and 69.6 years, respectively). The mean hospital stay was 27.54 days and 32.4% of patients required admission to the intensive care unit. Etiology was unknown in 39.8% of cases. Polymicrobial infection was observed in 59.5% of cases. The mean health care cost associated with a patient diagnosed with Fournier gangrene admitted to intensive care and requiring at least 1 procedure in the operating room was €25,108.67. Mortality was 13.5%. Based on analysis of individual comorbid conditions, only ischemic heart disease displayed a statistically significant association with mortality due to Fournier gangrene; ischemic heart disease was also associated with longer hospital stay. Fournier gangrene is associated with high mortality despite appropriate early treatment. Although the condition is infrequent, the high associated health care costs suggest that primary and secondary prevention measures should be implemented. Copyright © 2011 Elsevier España, S.L. and AEDV. All rights reserved.

  18. The green bank northern celestial cap pulsar survey. I. Survey description, data analysis, and initial results

    Energy Technology Data Exchange (ETDEWEB)

    Stovall, K.; Dartez, L. P.; Ford, A. J.; Garcia, A.; Hinojosa, J.; Jenet, F. A.; Leake, S. [Center for Advanced Radio Astronomy, University of Texas at Brownsville, One West University Boulevard, Brownsville, TX 78520 (United States); Lynch, R. S.; Archibald, A. M.; Karako-Argaman, C.; Kaspi, V. M. [Department of Physics, McGill University, 3600 University Street, Montreal, QC H3A 2T8 (Canada); Ransom, S. M. [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22901 (United States); Banaszak, S.; Biwer, C. M.; Day, D.; Flanigan, J.; Kaplan, D. L. [Physics Department, University of Wisconsin-Milwaukee, Milwaukee, WI 53211 (United States); Boyles, J. [Department of Physics and Astronomy, Western Kentucky University, Bowling Green, KY 42101 (United States); Hessels, J. W. T.; Kondratiev, V. I., E-mail: stovall.kevin@gmail.com [ASTRON, the Netherlands Institute for Radio Astronomy, Postbus 2, 7990 AA Dwingeloo (Netherlands); and others

    2014-08-10

    We describe an ongoing search for pulsars and dispersed pulses of radio emission, such as those from rotating radio transients (RRATs) and fast radio bursts, at 350 MHz using the Green Bank Telescope. With the Green Bank Ultimate Pulsar Processing Instrument, we record 100 MHz of bandwidth divided into 4096 channels every 81.92 μs. This survey will cover the entire sky visible to the Green Bank Telescope (δ > –40°, or 82% of the sky) and outside of the Galactic Plane will be sensitive enough to detect slow pulsars and low dispersion measure (<30 pc cm⁻³) millisecond pulsars (MSPs) with a 0.08 duty cycle down to 1.1 mJy. For pulsars with a spectral index of –1.6, we will be 2.5 times more sensitive than previous and ongoing surveys over much of our survey region. Here we describe the survey, the data analysis pipeline, initial discovery parameters for 62 pulsars, and timing solutions for 5 new pulsars. PSR J0214+5222 is an MSP in a long-period (512 days) orbit and has an optical counterpart identified in archival data. PSR J0636+5129 is an MSP in a very short-period (96 minutes) orbit with a very low mass companion (8 M_J). PSR J0645+5158 is an isolated MSP with a timing residual RMS of 500 ns and has been added to pulsar timing array experiments. PSR J1434+7257 is an isolated, intermediate-period pulsar that has been partially recycled. PSR J1816+4510 is an eclipsing MSP in a short-period orbit (8.7 hr) and may have recently completed its spin-up phase.
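    The survey's channelized recording exists to correct the cold-plasma dispersion sweep across the 300-400 MHz band. A minimal sketch of the standard dispersion-delay relation follows; this is not code from the survey pipeline, and the conventional constant 4.149 ms GHz² cm³ pc⁻¹ is assumed.

```python
def dispersion_delay_ms(dm_pc_cm3, f_lo_mhz, f_hi_mhz):
    """Cold-plasma dispersion sweep (ms) between two frequencies for a given DM.

    Standard relation: dt = 4.149 ms * DM * (f_lo^-2 - f_hi^-2), with f in GHz
    and DM in pc cm^-3.
    """
    k_dm = 4.149  # ms GHz^2 cm^3 pc^-1, conventional dispersion constant
    return k_dm * dm_pc_cm3 * ((f_lo_mhz / 1e3) ** -2 - (f_hi_mhz / 1e3) ** -2)

# Sweep across the full 300-400 MHz band at the quoted DM limit of 30 pc cm^-3
sweep = dispersion_delay_ms(30, 300, 400)  # roughly 605 ms
```

    A sweep of this size, hundreds of times a millisecond pulsar's period, is why the band is split into 4096 narrow channels that can be individually delayed and re-aligned before folding.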

  19. Analysis of magnetic-dipole transitions in tungsten plasmas using detailed and configuration-average descriptions

    Science.gov (United States)

    Na, Xieyu; Poirier, Michel

    2017-06-01

    This paper is devoted to the analysis of transition arrays of magnetic-dipole (M1) type in highly charged ions. Such transitions play a significant role in highly ionized plasmas, for instance in the tungsten plasma present in tokamak devices. Using recently published formulas for M1-transition array shifts and widths, and their implementation in the Flexible Atomic Code, absorption and emission spectra arising from transitions inside the n = 3 complex of highly charged tungsten ions are analyzed. A comparison of magnetic-dipole transitions with electric-dipole (E1) transitions shows that, while the latter are better described by transition array formulas, M1 absorption and emission structures reveal some insufficiency of these formulas. It is demonstrated that the detailed spectra contain significantly richer structures than those predicted by the transition array formalism. This is because M1 transitions may occur between levels inside the same relativistic configuration, while such intra-configuration transitions are not accounted for by the currently available averaging expression. In addition, because of configuration interaction, transition processes involving more than one electron jump, such as 3p1/2 3d5/2 → 3p3/2 3d3/2, are possible but not accounted for in the transition array formulas. These missing transitions are collected in pseudo-arrays using a post-processing method described in this paper. The relative influence of intra- and inter-configuration transitions is carefully analyzed for tungsten ions with a net charge around 50. The need for further theoretical development is emphasized.

  20. Myasthenia gravis: descriptive analysis of life-threatening events in a recent nationwide registry.

    Science.gov (United States)

    Ramos-Fransi, A; Rojas-García, R; Segovia, S; Márquez-Infante, C; Pardo, J; Coll-Cantí, J; Jericó, I; Illa, I

    2015-07-01

    Myasthenia gravis (MG) may become life-threatening if patients have respiratory insufficiency or dysphagia. This study aimed to determine the incidence, demographic characteristics, risk factors, response to treatment and outcome of these life-threatening events (LTEs) in a recent, population-based sample of MG patients. A retrospective analysis of MG patients who presented with an LTE between 2000 and 2013 was performed. Participants were identified from a neuromuscular diseases registry in Spain that includes 648 patients with MG (NMD-ES). Sixty-two (9.56%) patients had an LTE. Thirty-two were classified as class V according to the MG Foundation of America classification, and 30 as class IVB. Fifty percent had previously been diagnosed with MG, and the median duration of the disease before the LTE was 24 months (range 3-406). The most common related factor was infection (n = 18). All patients received intravenous human immunoglobulin; 11 had a second infusion and six had plasma exchange. Median time to feeding tube removal was 13 days (range 1-434). Median time to weaning from ventilation was 12 days (range 3-176), and it was significantly shorter in late-onset MG (≥50 years) (P = 0.019). LTEs improved within 2 weeks in 55.8% of patients but did not improve until after 1 month in 20%. Four patients died. No other factors influenced mortality or duration of LTEs. The percentage of LTEs in MG patients was low, particularly amongst those previously diagnosed with and treated for the disease. The significant percentage of treatment-resistant LTEs indicates that more effective treatment approaches are needed for this vulnerable sub-population. © 2015 EAN.