WorldWideScience

Sample records for carlos apache indians

  1. San Carlos Apache Tribe - Energy Organizational Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, James; Albert, Steve

    2012-04-01

    The San Carlos Apache Tribe (SCAT) was awarded $164,000 in late 2011 by the U.S. Department of Energy (U.S. DOE) Tribal Energy Program's "First Steps Toward Developing Renewable Energy and Energy Efficiency on Tribal Lands" Grant Program. This grant funded: (1) the analysis and selection of preferred form(s) of tribal energy organization (this Energy Organization Analysis, hereinafter referred to as "EOA"); (2) start-up staffing and other costs associated with the Phase 1 SCAT energy organization; (3) an intern program; (4) staff training; and (5) tribal outreach and workshops regarding the new organization and SCAT energy programs and projects, including two annual tribal energy summits (2011 and 2012). This report documents the analysis and selection of preferred form(s) of a tribal energy organization.

  2. Solar Feasibility Study May 2013 - San Carlos Apache Tribe

    Energy Technology Data Exchange (ETDEWEB)

    Rapp, Jim [Parametrix]; Duncan, Ken [San Carlos Apache Tribe]; Albert, Steve [Parametrix]

    2013-05-01

    The San Carlos Apache Tribe (Tribe) in the interests of strengthening tribal sovereignty, becoming more energy self-sufficient, and providing improved services and economic opportunities to tribal members and San Carlos Apache Reservation (Reservation) residents and businesses, has explored a variety of options for renewable energy development. The development of renewable energy technologies and generation is consistent with the Tribe’s 2011 Strategic Plan. This Study assessed the possibilities for both commercial-scale and community-scale solar development within the southwestern portions of the Reservation around the communities of San Carlos, Peridot, and Cutter, and in the southeastern Reservation around the community of Bylas. Based on the lack of any commercial-scale electric power transmission between the Reservation and the regional transmission grid, Phase 2 of this Study greatly expanded consideration of community-scale options. Three smaller sites (Point of Pines, Dudleyville/Winkleman, and Seneca Lake) were also evaluated for community-scale solar potential. Three building complexes were identified within the Reservation where the development of site-specific facility-scale solar power would be the most beneficial and cost-effective: Apache Gold Casino/Resort, Tribal College/Skill Center, and the Dudleyville (Winkleman) Casino.

  3. Vegetative response to water availability on the San Carlos Apache Reservation

    Science.gov (United States)

    Petrakis, Roy; Wu, Zhuoting; McVay, Jason; Middleton, Barry R.; Dye, Dennis G.; Vogel, John M.

    2016-01-01

    On the San Carlos Apache Reservation in east-central Arizona, U.S.A., vegetation types such as ponderosa pine forests, pinyon-juniper woodlands, and grasslands have significant ecological, cultural, and economic value for the Tribe. This value extends beyond the tribal lands and across the Western United States. Vegetation across the Southwestern United States is susceptible to drought conditions and fluctuating water availability. Remotely sensed vegetation indices can be used to measure and monitor spatial and temporal vegetative response to fluctuating water availability conditions. We used the Moderate Resolution Imaging Spectroradiometer (MODIS)-derived Modified Soil Adjusted Vegetation Index II (MSAVI2) to measure the condition of three dominant vegetation types (ponderosa pine forest, woodland, and grassland) in response to two fluctuating environmental variables: precipitation and the Standardized Precipitation Evapotranspiration Index (SPEI). The study period covered 2002 through 2014 and focused on a region within the San Carlos Apache Reservation. We determined that grassland and woodland had a similar moderate to strong, year-round, positive relationship with precipitation as well as with summer SPEI. This suggests that these vegetation types respond negatively to drought conditions and are more susceptible to initial precipitation deficits. Ponderosa pine forest had a comparatively weaker relationship with monthly precipitation and summer SPEI, indicating that it is more buffered against short-term drought conditions. This research highlights the response of multiple, dominant vegetation types to seasonal and inter-annual water availability, and demonstrates that multi-temporal remote sensing imagery can be an effective tool for the large-scale detection of vegetation response to adverse impacts from climate change and can support potential management practices such as increased monitoring and management of drought-affected areas.
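
    The index at the heart of the study has a closed form; here is a minimal sketch of computing MSAVI2 from red and near-infrared reflectance with NumPy (the band choice, [0, 1] scaling, and sample values are illustrative assumptions, not the authors' MODIS pipeline):

        import numpy as np

        def msavi2(nir, red):
            """Modified Soil Adjusted Vegetation Index II.

            nir, red: near-infrared and red surface reflectance in [0, 1],
            e.g. from MODIS bands 2 and 1.
            """
            nir = np.asarray(nir, dtype=float)
            red = np.asarray(red, dtype=float)
            return (2.0 * nir + 1.0
                    - np.sqrt((2.0 * nir + 1.0) ** 2 - 8.0 * (nir - red))) / 2.0

        print(msavi2(0.45, 0.08))  # ~0.55: densely vegetated pixel
        print(msavi2(0.25, 0.20))  # ~0.07: sparse cover or bare soil

    Time series of such per-pixel values are the quantity the study correlates against precipitation and SPEI.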

  4. Nonsuicidal Self-Injury in an American Indian Reservation Community: Results from the White Mountain Apache Surveillance System, 2007-2008

    Science.gov (United States)

    Cwik, Mary F.; Barlow, Allison; Tingey, Lauren; Larzelere-Hinton, Francene; Goklish, Novalene; Walkup, John T.

    2011-01-01

    Objective: To describe characteristics and correlates of nonsuicidal self-injury (NSSI) among the White Mountain Apache Tribe. NSSI has not been studied before in American Indian samples despite associated risks for suicide, which disproportionately affect American Indian youth. Method: Apache case managers collected data through a tribally…

  5. Apache Kafka

    CERN Document Server

    Garg, Nishant

    2013-01-01

    The book will follow a step-by-step tutorial approach which will show the readers how to use Apache Kafka for messaging from scratch. Apache Kafka is for readers with software development experience, but no prior exposure to Apache Kafka or similar technologies is assumed. This book is also for enterprise application developers and big data enthusiasts who have worked with other publisher-subscriber based systems and now want to explore Apache Kafka as a futuristic scalable solution.
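
    Since the record describes Kafka only in prose, a minimal publish/subscribe round trip may help; this sketch uses the third-party kafka-python client, and the broker address and topic name are assumptions:

        from kafka import KafkaProducer, KafkaConsumer

        producer = KafkaProducer(bootstrap_servers="localhost:9092")
        producer.send("demo-topic", b"hello, kafka")
        producer.flush()  # block until the broker acknowledges the message

        consumer = KafkaConsumer(
            "demo-topic",
            bootstrap_servers="localhost:9092",
            auto_offset_reset="earliest",  # start from the oldest message
            consumer_timeout_ms=5000,      # give up after 5 s of silence
        )
        for record in consumer:
            print(record.value)  # b'hello, kafka'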

  6. Apache Maven cookbook

    CERN Document Server

    Bharathan, Raghuram

    2015-01-01

    If you are a Java developer or a manager who has experience with Apache Maven and want to extend your knowledge, then this is the ideal book for you. Apache Maven Cookbook is for those who want to learn how Apache Maven can be used for build automation. It is also meant for those familiar with Apache Maven but who want to understand the finer nuances of Maven and solve specific problems.

  7. Subsurface Analysis of the Mesaverde Group on and near the Jicarilla Apache Indian Reservation, New Mexico-its implication on Sites of Oil and Gas Accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie

    2001-08-21

    The purpose of the phase 2 Mesaverde study part of the Department of Energy funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico" was to define the facies of the oil-producing units within the subsurface units of the Mesaverde Group and integrate these results with outcrop studies that defined the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) integration of subsurface correlations with outcrop correlations of components of the Mesaverde, (2) application of the sequence stratigraphic model determined in the phase one study to these correlations, (3) determination of the facies distribution of the Mesaverde Group and their relationship to sites of oil and gas accumulation, (4) evaluation of the thermal maturity and potential source rocks for oil and gas in the Mesaverde Group, and (5) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  8. The piloting of a culturally centered American Indian family prevention program: a CBPR partnership between Mescalero Apache and the University of New Mexico.

    Science.gov (United States)

    Belone, Lorenda; Orosco, Ardena; Damon, Eloise; Smith-McNeal, Willymae; Rae, Rebecca; Sherpa, Mingma L; Myers, Orrin B; Omeh, Anslem O; Wallerstein, Nina

    2017-01-01

    The Mescalero Apache Family Listening Program (MAFLP) is a culturally centered family prevention program with third, fourth, and fifth graders; a parent/caregiver; and a family elder. The program follows a positive youth development model to develop stronger communication and shared cultural practices between elders, parents, and youth in the tribe to reduce substance initiation of use among the youth. The MAFLP was created using a community-based participatory research (CBPR) approach in partnership with the University of New Mexico. The research focus of MAFLP is centered on the adaptation of a family curriculum from a Navajo and Pueblo version of the Family Listening Program to an Apache version, the establishment of an Apache Tribal Research Team, and the piloting of the curriculum with Apache families. MAFLP was piloted twice, and evaluation measures were collected focused on formative and impact evaluation. This article provides background on Mescalero Apache, then introduces the Navajo and Pueblo versions of the Family Listening and Family Circle Programs; next, it describes the CBPR research partnership between Mescalero Apache and the University of New Mexico and the creation of a Mescalero Apache Tribal Research Team, followed by the development and adaptation of the Mescalero Apache Family Listening Program, including implementation and evaluation; it concludes with preliminary findings.

  9. Location, Reprocessing, and Analysis of Two Dimensional Seismic Reflection Data on the Jicarilla Apache Indian Reservation, New Mexico, Final Report, September 1, 1997-February 1, 2000

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Taylor, David J.; Huffman, Jr., A. Curtis

    2000-06-08

    Multichannel surface seismic reflection data recording is a standard industry tool used to examine various aspects of geology, especially the stratigraphic characteristics and structural style of sedimentary formations in the subsurface. With the help of the Jicarilla Apache Tribe and the Bureau of Indian Affairs we were able to locate over 800 kilometers (500 miles) of multichannel seismic reflection data located on the Jicarilla Apache Indian reservation. Most of the data was received in hardcopy form, but there were data sets where either the demultiplexed digital field data or the processed data accompanied the hardcopy sections. The seismic data was acquired from the mid-1960s to the early 1990s. The most extensive seismic coverage is in the southern part of the reservation, although there are two good surveys located on the northeastern and northwestern parts of the reservation. Most of the data show that subsurface formations are generally flat-lying in the southern and western portion of the reservation. There is, however, a significant amount of structure imaged on seismic data located over the San Juan Basin margin along the east-central and northern part of the reservation. Several west to east trending lines in these areas show a highly faulted monoclinal structure from the deep basin in the west up onto the basin margin to the east. Hydrocarbon exploration in flat-lying formations is mostly stratigraphic in nature. Where there is structure in the subsurface and indications are that rocks have been folded, faulted, and fractured, exploration has concentrated on structural traps and porosity/permeability "sweet spots" caused by fracturing. Therefore, an understanding of the tectonics influencing the entire section is critical in understanding mechanisms for generating faults and fractures in the Cretaceous. It is apparent that much of the hydrocarbon production on the reservation is from fracture porosity in either source or reservoir rocks.

  10. Learning Apache Kafka

    CERN Document Server

    Garg, Nishant

    2015-01-01

    This book is for readers who want to know more about Apache Kafka at a hands-on level; the key audience is those with software development experience but no prior exposure to Apache Kafka or similar technologies. It is also useful for enterprise application developers and big data enthusiasts who have worked with other publisher-subscriber-based systems and want to explore Apache Kafka as a futuristic solution.

  11. Apache The Definitive Guide

    CERN Document Server

    Laurie, Ben

    2003-01-01

    Apache is far and away the most widely used web server platform in the world. This versatile server runs more than half of the world's existing web sites. Apache is both free and rock-solid, running more than 21 million web sites ranging from huge e-commerce operations to corporate intranets and smaller hobby sites. With this new third edition of Apache: The Definitive Guide, web administrators new to Apache will come up to speed quickly, and experienced administrators will find the logically organized, concise reference sections indispensable, and system programmers interested in customizing…

  12. Learning Apache Karaf

    CERN Document Server

    Edstrom, Johan; Kesler, Heath

    2013-01-01

    The book is a fast-paced guide full of step-by-step instructions covering all aspects of application development using Apache Karaf. Learning Apache Karaf will benefit all Java developers and system administrators who need to develop for and/or operate Karaf's OSGi-based runtime. Basic knowledge of Java is assumed.

  13. The APACHE Project

    Directory of Open Access Journals (Sweden)

    Giacobbe P.

    2013-04-01

    First, we summarize the four-year-long efforts undertaken to build the final setup of the APACHE Project, a photometric transit search for small-size planets orbiting bright, low-mass M dwarfs. Next, we describe the present status of the APACHE survey, officially started in July 2012 at the site of the Astronomical Observatory of the Autonomous Region of the Aosta Valley, in the Western Italian Alps. Finally, we briefly discuss the potentially far-reaching consequences of a multi-technique characterization program of the (potentially planet-bearing) APACHE targets.

  14. Apache Solr beginner's guide

    CERN Document Server

    Serafini, Alfredo

    2013-01-01

    Written in a friendly, example-driven format, the book includes plenty of step-by-step instructions and examples that are designed to help you get started with Apache Solr. This book is an entry-level text into the wonderful world of Apache Solr. The book will center around a couple of simple projects such as setting up Solr and all the stuff that comes with customizing the Solr schema and configuration. This book is for developers looking to start using Apache Solr who are stuck or intimidated by the difficulty of setting it up and using it. For anyone wanting to embed a search engine in their…
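
    For a sense of what using Solr looks like once it is set up, here is a hedged sketch of querying a core over Solr's standard /select HTTP handler with the requests library; the core name "demo" and the field "title" are assumptions:

        import requests

        resp = requests.get(
            "http://localhost:8983/solr/demo/select",
            params={"q": "title:apache", "rows": 5, "wt": "json"},
        )
        # the JSON response nests matching documents under response.docs
        for doc in resp.json()["response"]["docs"]:
            print(doc.get("title"))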

  15. Instant Apache Wicket 6

    CERN Document Server

    Longo, João Sávio Ceregatti

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. This Starter style guide takes the reader through the basic workflow of Apache Wicket in a practical and friendly style. Instant Apache Wicket 6 is for people who want to learn the basics of Apache Wicket 6 and who already have some experience with Java and object-oriented programming. Basic knowledge of web concepts like HTTP and Ajax will be an added advantage.

  16. Apache Solr essentials

    CERN Document Server

    Gazzarini, Andrea

    2015-01-01

    If you are a competent developer with experience of working with technologies similar to Apache Solr and want to develop efficient search applications, then this book is for you. Familiarity with the Java programming language is required.

  17. Apache Mahout essentials

    CERN Document Server

    Withanawasam, Jayani

    2015-01-01

    If you are a Java developer or data scientist, haven't worked with Apache Mahout before, and want to get up to speed on implementing machine learning on big data, then this is the perfect guide for you.

  18. Apache Mahout cookbook

    CERN Document Server

    Giacomelli, Piero

    2013-01-01

    Apache Mahout Cookbook uses over 35 recipes packed with illustrations and real-world examples to help beginners as well as advanced programmers get acquainted with the features of Mahout. "Apache Mahout Cookbook" is great for developers who want to have a fresh and fast introduction to Mahout coding. No previous knowledge of Mahout is required, and even skilled developers or system administrators will benefit from the various recipes presented.

  19. Apache Tomcat 7 Essentials

    CERN Document Server

    Khare, Tanuj

    2012-01-01

    This book is a step-by-step tutorial for anyone wanting to learn Apache Tomcat 7 from scratch. There are plenty of illustrations and examples to escalate you from a novice to an expert with minimal strain. If you are a J2EE administrator, migration administrator, technical architect, or a project manager for a web hosting domain, and are interested in Apache Tomcat 7, then this book is for you. If you are someone responsible for installation, configuration, and management of Tomcat 7, this book will also be of help to you.

  20. Instant Apache Stanbol

    CERN Document Server

    Bachmann-Gmür, Reto

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant Apache Stanbol How-to will enable you to become an expert in content management with semantics at the core, with the help of practical recipes. Instant Apache Stanbol How-to is for Java developers who would like to extend Stanbol or would just like to use Stanbol without caring about its internals. A few recipes that show how to extend Stanbol require some familiarity with Java and JavaScript.

  1. Instant Apache Maven starter

    CERN Document Server

    Turatti, Maurizio

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. The book follows a starter approach for using Maven to create and build a new Java application or Web project from scratch. Instant Apache Maven Starter is great for Java developers new to Apache Maven, but also for experts looking for immediate information. Moreover, only 20% of the necessary information about Maven is used in 80% of the activities. This book aims to focus on the most important information, those pragmatic parts you actually use…

  2. The piloting of a culturally centered American Indian family prevention program: a CBPR partnership between Mescalero Apache and the University of New Mexico

    OpenAIRE

    Belone, Lorenda; Orosco, Ardena; Damon, Eloise; Smith-McNeal, Willymae; Rae, Rebecca; Sherpa, Mingma L.; Myers, Orrin B.; Omeh, Anslem O.; Wallerstein, Nina

    2017-01-01

    The Mescalero Apache Family Listening Program (MAFLP) is a culturally centered family prevention program with third, fourth, and fifth graders; a parent/caregiver; and a family elder. The program follows a positive youth development model to develop stronger communication and shared cultural practices between elders, parents, and youth in the tribe to reduce substance initiation of use among the youth. The MAFLP was created using a community-based participatory research (CBPR) approach in par...

  3. Apaches push privatization

    International Nuclear Information System (INIS)

    Daniels, S.

    1994-01-01

    Trying to drum up business for what would be the first private temporary storage facility for spent nuclear fuel rods, the Mescalero Apaches are inviting officials of 30 utilities to convene March 10 at the tribe's New Mexico reservation. The state public utilities commission will also attend the meeting, which grew from an agreement the tribe signed last month with Minneapolis-based Northern States Power Co.

  4. Mastering Apache Cassandra

    CERN Document Server

    Neeraj, Nishant

    2013-01-01

    Mastering Apache Cassandra is a practical, hands-on guide with step-by-step instructions. The smooth and easy tutorial approach focuses on showing people how to utilize Cassandra to its full potential. This book is aimed at intermediate Cassandra users. It is best suited for startups where developers have to wear multiple hats: programming, DevOps, release management, convincing clients, and handling failures. No prior knowledge of Cassandra is required.

  5. Apache 2 Pocket Reference For Apache Programmers & Administrators

    CERN Document Server

    Ford, Andrew

    2008-01-01

    Even if you know the Apache web server inside and out, you still need an occasional on-the-job reminder -- especially if you're moving to the newer Apache 2.x. Apache 2 Pocket Reference gives you exactly what you need to get the job done without forcing you to plow through a cumbersome, doorstop-sized reference. This book provides essential information to help you configure and maintain the server quickly, with brief explanations that get directly to the point. It covers Apache 2.x, giving web masters, web administrators, and programmers a quick and easy reference solution. This pocket reference…

  6. Instant Apache Sqoop

    CERN Document Server

    Jain, Ankit

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. Instant Apache Sqoop is full of step-by-step instructions and practical examples along with challenges to test and improve your knowledge.This book is great for developers who are looking to get a good grounding in how to effectively and efficiently move data between RDBMS and the Hadoop ecosystem. It's assumed that you will have some experience in Hadoop already as well as some familiarity with HBase and Hive.

  7. Apache Cordova 3 programming

    CERN Document Server

    Wargo, John M

    2013-01-01

    Written for experienced mobile developers, Apache Cordova 3 Programming is a complete introduction to Apache Cordova 3 and Adobe PhoneGap 3. It describes what makes Cordova important and shows how to install and use the tools, the new Cordova CLI, the native SDKs, and more. If you’re brand new to Cordova, this book will be just what you need to get started. If you’re familiar with an older version of Cordova, this book will show you in detail how to use all of the new stuff that’s in Cordova 3 plus stuff that has been around for a while (like the Cordova core APIs). After walking you through the process of downloading and setting up the framework, mobile expert John M. Wargo shows you how to install and use the command line tools to manage the Cordova application lifecycle and how to set up and use development environments for several of the more popular Cordova supported mobile device platforms. Of special interest to new developers are the chapters on the anatomy of a Cordova application, as well ...

  8. Interrupting White Mountain Apache Language Shift: An Insider's View.

    Science.gov (United States)

    Adley-SantaMaria, Bernadette

    1999-01-01

    A White Mountain Apache (WMA) doctoral student collaborating with a non-Indian linguist on a grammar book project discusses the status of the WMA language; causes of WMA language shift; aspects of insider-outsider collaboration; implications for revitalization and maintenance of indigenous languages; and the responsibilities of individuals,…

  9. Sequence Stratigraphic Analysis and Facies Architecture of the Cretaceous Mancos Shale on and Near the Jicarilla Apache Indian Reservation, New Mexico-their relation to Sites of Oil Accumulation

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie

    2001-08-21

    The purpose of phase 1 and phase 2 of the Department of Energy funded project "Analysis of oil-bearing Cretaceous Sandstone Hydrocarbon Reservoirs, exclusive of the Dakota Sandstone, on the Jicarilla Apache Indian Reservation, New Mexico" was to define the facies of the oil producing units within the Mancos Shale and interpret the depositional environments of these facies within a sequence stratigraphic context. The focus of this report will center on (1) redefinition of the area and vertical extent of the "Gallup sandstone" or El Vado Sandstone Member of the Mancos Shale, (2) determination of the facies distribution within the "Gallup sandstone" and other oil-producing sandstones within the lower Mancos, placing these facies within the overall depositional history of the San Juan Basin, (3) application of the principles of sequence stratigraphy to the depositional units that comprise the Mancos Shale, and (4) evaluation of the structural features on the Reservation as they may control sites of oil accumulation.

  10. Apache Flink: Distributed Stream Data Processing

    CERN Document Server

    Jacobs, Kevin; CERN. Geneva. IT Department

    2016-01-01

    The amount of data has grown significantly over the past few years, and the need for distributed data processing frameworks is growing with it. Currently, there are two well-known data processing frameworks that provide both an API for data batches and an API for data streams: Apache Flink and Apache Spark. Both Apache Spark and Apache Flink improve upon the MapReduce implementation of the Apache Hadoop framework; MapReduce is the first programming model for large-scale distributed processing available in Apache Hadoop. This report compares the Stream API and the Batch API of both frameworks.
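
    As a concrete point of comparison, the canonical batch job both frameworks express is a word count; a minimal PySpark version follows (the input path is an assumption; the Flink batch equivalent is structured similarly):

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("wordcount").getOrCreate()
        lines = spark.sparkContext.textFile("input.txt")
        counts = (lines.flatMap(lambda line: line.split())   # line -> words
                       .map(lambda word: (word, 1))          # word -> (word, 1)
                       .reduceByKey(lambda a, b: a + b))     # sum per word
        print(counts.take(10))
        spark.stop()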

  11. Apache ZooKeeper essentials

    CERN Document Server

    Haloi, Saurav

    2015-01-01

    Whether you are a novice to ZooKeeper or already have some experience, you will be able to master the concepts of ZooKeeper and its usage with ease. This book assumes some prior knowledge of distributed systems and high-level programming knowledge of C, Java, or Python, but no experience with Apache ZooKeeper is required.
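
    ZooKeeper's core abstraction is a small tree of "znodes" that clients create, read, and watch; here is a minimal sketch using the third-party kazoo Python client (the ensemble address and znode path are assumptions):

        from kazoo.client import KazooClient

        zk = KazooClient(hosts="127.0.0.1:2181")
        zk.start()
        zk.ensure_path("/app/config")       # create the path if it is missing
        zk.set("/app/config", b"v1")        # store a small piece of state
        data, stat = zk.get("/app/config")  # read it back with its metadata
        print(data, stat.version)
        zk.stop()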

  12. Instant Apache Camel message routing

    CERN Document Server

    Ibryam, Bilgin

    2013-01-01

    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. This short, instruction-based guide shows you how to perform application integration using the industry standard Enterprise Integration Patterns.This book is intended for Java developers who are new to Apache Camel and message- oriented applications.

  13. SEQUENCE STRATIGRAPHIC ANALYSIS AND FACIES ARCHITECTURE OF THE CRETACEOUS MANCOS SHALE ON AND NEAR THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO-THEIR RELATION TO SITES OF OIL ACCUMULATION

    Energy Technology Data Exchange (ETDEWEB)

    Jennie Ridgley

    2000-03-31

    Oil distribution in the lower part of the Mancos Shale seems to be mainly controlled by fractures and by sandier facies that are dolomite-cemented. Structure in the area of the Jicarilla Apache Indian Reservation consists of the broad northwest- to southeast-trending Chaco slope, the deep central basin, and the monocline that forms the eastern boundary of the San Juan Basin. Superimposed on the regional structure are broad low-amplitude folds. Fractures seem best developed in the areas of these folds. Using sequence stratigraphic principles, the lower part of the Mancos Shale has been subdivided into four main regressive and transgressive components. These include facies that are the basinal time equivalents to the Gallup Sandstone, an overlying interbedded sandstone and shale sequence time equivalent to the transgressive Mulatto Tongue of the Mancos Shale, the El Vado Sandstone Member which is time equivalent to part of the Dalton Sandstone, and an unnamed interbedded sandstone and shale succession time equivalent to the regressive Dalton Sandstone and transgressive Hosta Tongue of the Mesaverde Group. Facies time equivalent to the Gallup Sandstone underlie an unconformity of regional extent. These facies are gradually truncated from south to north across the Reservation. The best potential for additional oil resources in these facies is in the southern part of the Reservation where the top sandier part of these facies is preserved. The overlying unnamed wedge of transgressive rocks produces some oil but is underexplored, except for sandstones equivalent to the Tocito Sandstone. This wedge of rocks is divided into two to five units. The highest sand content in this wedge occurs where each of the four subdivisions above the Tocito terminates to the south and is overstepped by the next youngest unit. These terminal areas should offer the best targets for future oil exploration. The El Vado Sandstone Member overlies the transgressive wedge. It produces most of…

  14. SEQUENCE STRATIGRAPHIC ANALYSIS AND FACIES ARCHITECTURE OF THE CRETACEOUS MANCOS SHALE ON AND NEAR THE JICARILLA APACHE INDIAN RESERVATION, NEW MEXICO-THEIR RELATION TO SITES OF OIL ACCUMULATION

    International Nuclear Information System (INIS)

    Jennie Ridgley

    2000-01-01

    Oil distribution in the lower part of the Mancos Shale seems to be mainly controlled by fractures and by sandier facies that are dolomite-cemented. Structure in the area of the Jicarilla Apache Indian Reservation consists of the broad northwest- to southeast-trending Chaco slope, the deep central basin, and the monocline that forms the eastern boundary of the San Juan Basin. Superimposed on the regional structure are broad low-amplitude folds. Fractures seem best developed in the areas of these folds. Using sequence stratigraphic principles, the lower part of the Mancos Shale has been subdivided into four main regressive and transgressive components. These include facies that are the basinal time equivalents to the Gallup Sandstone, an overlying interbedded sandstone and shale sequence time equivalent to the transgressive Mulatto Tongue of the Mancos Shale, the El Vado Sandstone Member which is time equivalent to part of the Dalton Sandstone, and an unnamed interbedded sandstone and shale succession time equivalent to the regressive Dalton Sandstone and transgressive Hosta Tongue of the Mesaverde Group. Facies time equivalent to the Gallup Sandstone underlie an unconformity of regional extent. These facies are gradually truncated from south to north across the Reservation. The best potential for additional oil resources in these facies is in the southern part of the Reservation where the top sandier part of these facies is preserved. The overlying unnamed wedge of transgressive rocks produces some oil but is underexplored, except for sandstones equivalent to the Tocito Sandstone. This wedge of rocks is divided into two to five units. The highest sand content in this wedge occurs where each of the four subdivisions above the Tocito terminates to the south and is overstepped by the next youngest unit. These terminal areas should offer the best targets for future oil exploration. The El Vado Sandstone Member overlies the transgressive wedge. It produces most of…

  15. Nuclear data physics issues in Monte Carlo simulations of neutron and photon transport in the Indian context

    International Nuclear Information System (INIS)

    Ganesan, S.

    2009-01-01

    In this write-up, some of the basic issues of nuclear data physics in Monte Carlo simulation of neutron transport in the Indian context are dealt with. Some of the aspects associated with usage of the ENDF/B system, and of the PREPRO code system developed by D.E. Cullen and distributed by the IAEA Nuclear Data Section, are briefly touched upon. Some aspects of the SIGACE code system, which was developed by the author in collaboration with IPR, Ahmedabad and the IAEA Nuclear Data Section, are also briefly covered. The validation of the SIGACE package included investigations using the NJOY and the MCNP compatible ACE files. Appendix-1 of the paper provides some useful discussions pointing out that voluminous and high-quality nuclear physics data required for nuclear applications usually evolve from a national effort to provide state-of-the-art data that are based upon established needs and uncertainties. Appendix-2 deals with work that was carried out using the SIGACE code for generating high-temperature ACE files. Appendix-3 briefly mentions integral nuclear data validation studies and the use of Monte Carlo codes and nuclear data. Appendix-4 provides a brief summary report on selected Indian nuclear data physics activities for the interested reader, in light of BARC/DAE treating the subject area of nuclear data physics as a thrust area in our atomic energy programme.

  16. Learning Apache Solr high performance

    CERN Document Server

    Mohan, Surendra

    2014-01-01

    This book is an easy-to-follow guide, full of hands-on, real-world examples. Each topic is explained and demonstrated in a specific and user-friendly flow, from search optimization using Solr to deployment of ZooKeeper applications. This book is ideal for Apache Solr developers who want to learn different techniques to optimize Solr performance with utmost efficiency, along with effectively troubleshooting the problems that usually occur while trying to boost performance. Familiarity with search servers and database querying is expected.

  17. Instant Apache Camel messaging system

    CERN Document Server

    Sharapov, Evgeniy

    2013-01-01

    Get to grips with a new technology, understand what it is and what it can do for you, and then get to work with the most important features and tasks. A beginner's guide to Apache Camel that walks you through basic operations like installation and setup right through to developing simple applications.This book is a good starting point for Java developers who have to work on an application dealing with various systems and interfaces but who haven't yet started using Enterprise System Buses or Java Business Integration frameworks.

  18. Mescalero Apache Tribe Monitored Retrievable Storage (MRS)

    International Nuclear Information System (INIS)

    Peso, F.

    1992-01-01

    The Nuclear Waste Policy Act of 1982, as amended, authorizes the siting, construction and operation of a Monitored Retrievable Storage (MRS) facility. The MRS is intended to be used for the temporary storage of spent nuclear fuel from the nation's nuclear power plants beginning as early as 1998. Pursuant to the Nuclear Waste Policy Act, the Office of the Nuclear Waste Negotiator was created. On October 7, 1991, the Nuclear Waste Negotiator invited the governors of states and the Presidents of Indian tribes to apply for government grants in order to conduct a study to assess under what conditions, if any, they might consider hosting an MRS facility. Pursuant to this invitation, on October 11, 1991 the Mescalero Apache Indian Tribe of Mescalero, NM applied for a grant to conduct a phased, preliminary study of the safety, technical, political, environmental, social and economic feasibility of hosting an MRS. The preliminary study included: (1) An investigative education process to facilitate the Tribe's comprehensive understanding of the safety, environmental, technical, social, political, and economic aspects of hosting an MRS, and; (2) The development of an extensive program that is enabling the Tribe, in collaboration with the Negotiator, to reach an informed and carefully researched decision regarding the conditions, (if any), under which further pursuit of the MRS would be considered. The Phase 1 grant application enabled the Tribe to begin the initial activities necessary to determine whether further consideration is warranted for hosting the MRS facility. The Tribe intends to pursue continued study of the MRS in order to meet the following objectives: (1) Continuing the education process towards a comprehensive understanding of the safety, environmental, technical, social and economic aspects of the MRS; (2) Conducting an effective public participation and information program; (3) Participating in MRS meetings...

  19. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    D'Souza, Subas

    2013-01-01

    A starter guide that covers Apache Flume in detail. Apache Flume: Distributed Log Collection for Hadoop is intended for people who are responsible for moving datasets into Hadoop in a timely and reliable manner, like software engineers, database administrators, and data warehouse administrators.

  20. Random Decision Forests on Apache Spark

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    About the speaker: Tom White has been an Apache Hadoop committer since February 2007, and is a member of the Apache Software Foundation. He works for Cloudera, a company set up to offer Hadoop support and training. Previously he was an independent Hadoop consultant, work...

  1. What's New in Apache Web Server 2.2?

    CERN Document Server

    Bowen, Rich

    2007-01-01

    What's New in Apache Web Server 2.2? shows you all the new features you'll need to know to set up and administer the Apache 2.2 web server. Learn how to take advantage of its improved caching, proxying, authentication, and other improvements in your Web 2.0 applications.

  2. Apache Flume distributed log collection for Hadoop

    CERN Document Server

    Hoffman, Steve

    2015-01-01

    If you are a Hadoop programmer who wants to learn about Flume to be able to move datasets into Hadoop in a timely and replicable manner, then this book is ideal for you. No prior knowledge about Apache Flume is necessary, but a basic knowledge of Hadoop and the Hadoop File System (HDFS) is assumed.

  3. Conservation priorities in the Apache Highlands ecoregion

    Science.gov (United States)

    Dale Turner; Rob Marshall; Carolyn A. F. Enquist; Anne Gondor; David F. Gori; Eduardo Lopez; Gonzalo Luna; Rafaela Paredes Aguilar; Chris Watts; Sabra Schwartz

    2005-01-01

    The Apache Highlands ecoregion incorporates the entire Madrean Archipelago/Sky Island region. We analyzed the current distribution of 223 target species and 26 terrestrial ecological systems there, and compared them with constraints on ecosystem integrity (e.g., road density) to determine the most efficient set of areas needed to maintain current biodiversity. The...

  4. Optimizing CMS build infrastructure via Apache Mesos

    CERN Document Server

    Abduracmanov, David; Degano, Alessandro; Elmer, Peter; Eulisse, Giulio; Mendez, David; Muzaffar, Shahzad

    2015-12-23

    The Offline Software of the CMS Experiment at the Large Hadron Collider (LHC) at CERN consists of 6M lines of in-house code, developed over a decade by nearly 1000 physicists, as well as a comparable amount of general use open-source code. A critical ingredient to the success of the construction and early operation of the WLCG was the convergence, around the year 2000, on the use of a homogeneous environment of commodity x86-64 processors and Linux. Apache Mesos is a cluster manager that provides efficient resource isolation and sharing across distributed applications, or frameworks. It can run Hadoop, Jenkins, Spark, Aurora, and other applications on a dynamically shared pool of nodes. We present how we migrated our continuous integration system to schedule jobs on a relatively small Apache Mesos-enabled cluster and how this resulted in better resource usage, higher peak performance and lower latency thanks to the dynamic scheduling capabilities of Mesos.

  5. Network Intrusion Detection System using Apache Storm

    Directory of Open Access Journals (Sweden)

    Muhammad Asif Manzoor

    2017-06-01

    Network security implements various strategies for the identification and prevention of security breaches. Network intrusion detection is a critical component of network management for security, quality of service, and other purposes. These systems allow early detection of network intrusion and malicious activities, so that the network security infrastructure can react to mitigate these threats. Various systems have been proposed to enhance network security. In this work we propose to use an anomaly-based network intrusion detection system, since anomaly-based systems can identify new network threats. We also propose to use a real-time big data stream processing framework, Apache Storm, for the implementation of the network intrusion detection system. Apache Storm can help manage network traffic, which is generated at enormous and constantly increasing speed and size. We use a Support Vector Machine in this work, and we use the Knowledge Discovery and Data Mining 1999 (KDD'99) dataset to test and evaluate our proposed solution.
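
    The offline training step of such a system can be sketched in a few lines; this illustration uses scikit-learn's SVC on stand-in numeric features shaped like KDD'99 connection records (the feature extraction and data layout are assumptions; in the proposed system the trained model would score live traffic inside Apache Storm's streaming topology):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.svm import SVC

        X = np.random.rand(1000, 10)        # stand-in connection features
        y = np.random.randint(0, 2, 1000)   # 0 = normal, 1 = attack

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                                  random_state=0)
        clf = SVC(kernel="rbf").fit(X_tr, y_tr)
        print("held-out accuracy:", clf.score(X_te, y_te))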

  6. LHCbDIRAC as Apache Mesos microservices

    CERN Multimedia

    Couturier, Ben

    2016-01-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called "frameworks". The Marathon framework is suitable for long running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy allows the running containers to be exposed to the outside world while hiding their dynamic placements. Such an arc...

  7. LHCbDIRAC as Apache Mesos microservices

    Science.gov (United States)

    Haen, Christophe; Couturier, Benjamin

    2017-10-01

    The LHCb experiment relies on LHCbDIRAC, an extension of DIRAC, to drive its offline computing. This middleware provides a development framework and a complete set of components for building distributed computing systems. These components are currently installed and run on virtual machines (VM) or bare metal hardware. Due to the increased workload, high availability is becoming more and more important for the LHCbDIRAC services, and the current installation model is showing its limitations. Apache Mesos is a cluster manager which aims at abstracting heterogeneous physical resources on which various tasks can be distributed thanks to so-called "frameworks". The Marathon framework is suitable for long running tasks such as the DIRAC services, while the Chronos framework meets the needs of cron-like tasks like the DIRAC agents. A combination of the service discovery tool Consul together with HAProxy allows the running containers to be exposed to the outside world while hiding their dynamic placements. Such an architecture brings a greater flexibility in the deployment of LHCbDIRAC services, allowing for easier deployment maintenance and scaling of services on demand (e.g. LHCbDIRAC relies on 138 services and 116 agents). Higher reliability is also easier to achieve, as clustering is part of the toolset, which allows constraints on the location of the services. This paper describes the investigations carried out to package the LHCbDIRAC and DIRAC components into Docker containers and orchestrate them using the previously described set of tools.
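
    To make the Marathon part concrete: long-running services are submitted to Marathon as JSON app definitions over its REST API, and Marathon keeps the requested number of containers alive. A hedged sketch, in which the Marathon URL, image name, and resource figures are invented for illustration:

        import requests

        app = {
            "id": "/dirac/example-service",
            "container": {
                "type": "DOCKER",
                "docker": {"image": "example/dirac-service:latest"},
            },
            "cpus": 0.5,
            "mem": 512,
            "instances": 2,  # Marathon restarts containers to keep 2 alive
        }
        resp = requests.post("http://marathon.example.org:8080/v2/apps", json=app)
        print(resp.status_code)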

  8. A Modified APACHE II Score for Predicting Mortality of Variceal ...

    African Journals Online (AJOL)

    Conclusion: The modified APACHE II score is effective in predicting the outcome of patients with variceal bleeding. A score of ≥15 points and a long ICU stay are associated with high mortality. Keywords: liver cirrhosis, periportal fibrosis, portal hypertension, schistosomiasis. Sudan Journal of Medical Sciences Vol. 2 (2) 2007: pp. 105- ...

  9. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    DEFF Research Database (Denmark)

    Majewski, Steven R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the…

  10. Ergonomic and anthropometric issues of the forward Apache crew station

    NARCIS (Netherlands)

    Oudenhuijzen, A.J.K.

    1999-01-01

    This paper describes the anthropometric accommodation in the Apache crew systems. These activities are part of a comprehensive project, a cooperative effort between the Armstrong Laboratory at Wright-Patterson Air Force Base (Dayton, Ohio, USA) and the TNO Human Factors Research Institute (TNO HFRI) in…

  11. Performance evaluation of Apache Mahout for mining large datasets

    OpenAIRE

    Bogza, Adriana Maria

    2016-01-01

    The main purpose of this project is to evaluate the performance of the Apache Mahout library, which contains data mining algorithms for data processing, using a Twitter dataset. Performance is evaluated in terms of processing time, in-memory usage, I/O performance, and algorithmic accuracy.

  12. Geologic influences on Apache trout habitat in the White Mountains of Arizona

    Science.gov (United States)

    Jonathan W. Long; Alvin L. Medina

    2006-01-01

    Geologic variation has important influences on habitat quality for species of concern, but it can be difficult to evaluate due to subtle variations, complex terminology, and inadequate maps. To better understand habitat of the Apache trout (Oncorhynchus apache or O. gilae apache Miller), a threatened endemic species of the White...

  13. Markov Chain Monte Carlo Methods-Simple Monte Carlo

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 8, Issue 4. Markov Chain Monte Carlo ... Author affiliations: New York 14853, USA; Indian Statistical Institute, 8th Mile, Mysore Road, Bangalore 560 059, India; Systat Software Asia-Pacific (P) Ltd., Floor 5, 'C' Tower, Golden Enclave, Airport Road, Bangalore 560 017, India.
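
    "Simple Monte Carlo" in the sense of this series is direct sampling; the textbook example is estimating pi from uniform random points in the unit square, as in this short sketch:

        import random

        def estimate_pi(n=1_000_000):
            hits = sum(1 for _ in range(n)
                       if random.random() ** 2 + random.random() ** 2 <= 1.0)
            return 4.0 * hits / n  # 4 x fraction inside the quarter circle

        print(estimate_pi())  # ~3.1416, statistical error shrinking as 1/sqrt(n)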

  14. FEASIBILITY STUDY FOR A PETROLEUM REFINERY FOR THE JICARILLA APACHE TRIBE

    International Nuclear Information System (INIS)

    Jones, John D.

    2004-01-01

    A feasibility study for a proposed petroleum refinery for the Jicarilla Apache Indian Reservation was performed. The available crude oil production was identified and characterized. There are 6,000 barrels per day of crude oil production available for processing in the proposed refinery. The proposed refinery will utilize a lower temperature, smaller crude fractionation unit. It will have a Naphtha Hydrodesulfurizer and Reformer to produce high-octane gasoline. The surplus hydrogen from the reformer will be used in a specialized hydrocracker to convert the heavier crude oil fractions to ultra-low-sulfur gasoline and diesel fuel products. The proposed refinery will produce gasoline, jet fuel, diesel fuel, and a minimal amount of lube oil. The refinery will require about $86,700,000 to construct. It will have a net annual pre-tax profit of about $17,000,000. The estimated return on investment is 20%. The feasibility is positive, subject to confirmation of long-term crude supply. The study also identified procedures for evaluating processing options as a means for American Indian Tribes and Native American Corporations to maximize the value of their crude oil production.
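
    The quoted 20% return follows directly from the study's own figures:

        \mathrm{ROI} \approx \frac{\$17{,}000{,}000~\text{annual pre-tax profit}}{\$86{,}700{,}000~\text{construction cost}} \approx 0.196 \approx 20\%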

  15. AH-64E Apache Remanufacture (AH-64E Remanufacture)

    Science.gov (United States)

    2015-12-01

    The AH-64E is the heavy attack helicopter of the current and future force. It is a twin-engine, four-bladed, tandem-seat attack helicopter with 30... Digital Communications, Unmanned Aircraft Systems Data Links, and Joint networking waveforms. The AH-64E is an Apache attack helicopter modified as... increases power and provides substantial performance gains. The AH-64E is fully network-centric capable with current digitized forces and FMF-equipped...

  16. Satellite Imagery Production and Processing Using Apache Hadoop

    Science.gov (United States)

    Hill, D. V.; Werpy, J.

    2011-12-01

    The United States Geological Survey's (USGS) Earth Resources Observation and Science (EROS) Center Land Science Research and Development (LSRD) project has devised a method to fulfill its processing needs for Essential Climate Variable (ECV) production from the Landsat archive using Apache Hadoop. Apache Hadoop is the distributed processing technology at the heart of many large-scale processing solutions implemented at well-known companies such as Yahoo, Amazon, and Facebook. It is a proven framework and can be used to process petabytes of data on thousands of processors concurrently. It is a natural fit for producing satellite imagery and requires only a few simple modifications to serve the needs of science data processing. This presentation provides an invaluable learning opportunity and should be heard by anyone doing large-scale image processing today. The session will cover a description of the problem space, evaluation of alternatives, feature set overview, configuration of Hadoop for satellite image processing, real-world performance results, tuning recommendations, and finally challenges and ongoing activities. It will also present how the LSRD project built a 102-core processing cluster with no financial hardware investment and achieved ten times the initial daily throughput requirements with a full-time staff of only one engineer. Satellite Imagery Production and Processing Using Apache Hadoop is presented by David V. Hill, Principal Software Architect for USGS LSRD.
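
    One simple way to drive Hadoop from a scripting language is Hadoop Streaming, where each map task reads lines on stdin and emits tab-separated key/value pairs on stdout. Below is a sketch of a mapper that treats each input line as the path of one whole scene, mirroring the "one binary file per map task" adaptation the talk describes; process_scene is a hypothetical stand-in for the actual science code:

        import sys

        def process_scene(path):
            # hypothetical placeholder: run the ECV algorithm on the scene
            # and return the location of its output
            return path + ".ecv"

        for line in sys.stdin:
            scene = line.strip()
            if scene:
                print(f"{scene}\t{process_scene(scene)}")  # key<TAB>value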

  17. Mechanical characterization of densely welded Apache Leap tuff

    International Nuclear Information System (INIS)

    Fuenkajorn, K.; Daemen, J.J.K.

    1991-06-01

    An empirical criterion is formulated to describe the compressive strength of the densely welded Apache Leap tuff. The criterion incorporates the effects of size, L/D ratio, loading rate and density variations. The criterion improves the correlation between the test results and the failure envelope. Uniaxial and triaxial compressive strengths, Brazilian tensile strength and elastic properties of the densely welded brown unit of the Apache Leap tuff have been determined using the ASTM standard test methods. All tuff samples are tested dry at room temperature (22 ± 2 degrees C), and have the core axis normal to the flow layers. The uniaxial compressive strength is 73.2 ± 16.5 MPa. The Brazilian tensile strength is 5.12 ± 1.2 MPa. The Young's modulus and Poisson's ratio are 22.6 ± 5.7 GPa and 0.20 ± 0.03. Smoothness and perpendicularity do not fully meet the ASTM requirements for all samples, due to the presence of voids and inclusions on the sample surfaces and the sample preparation methods. The investigations of loading rate, L/D ratio and cyclic loading effects on the compressive strength and of the size effect on the tensile strength are not conclusive. The Coulomb strength criterion adequately represents the failure envelope of the tuff under confining pressures from 0 to 62 MPa. Cohesion and internal friction angle are 16 MPa and 43 degrees. The brown unit of the Apache Leap tuff is highly heterogeneous as suggested by large variations of the test results. The high intrinsic variability of the tuff is probably caused by the presence of flow layers and by nonuniform distributions of inclusions, voids and degree of welding. Similar variability of the properties has been found in publications on the Topopah Spring tuff at Yucca Mountain. 57 refs., 32 figs., 29 tabs.
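
    In standard form, the Coulomb envelope reported above relates shear strength to normal stress as

        \tau = c + \sigma_n \tan\phi, \qquad c = 16~\mathrm{MPa}, \quad \phi = 43^\circ

    so, for example, a normal stress of 30 MPa gives a shear strength of roughly 16 + 30 tan 43° ≈ 44 MPa.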

  18. Beginning PHP, Apache, MySQL web development

    CERN Document Server

    Glass, Michael K; Naramore, Elizabeth; Mailer, Gary; Stolz, Jeremy; Gerner, Jason

    2004-01-01

    An ideal introduction to the entire process of setting up a Web site using PHP (a scripting language), MySQL (a database management system), and Apache (a Web server). Programmers will be up and running in no time, whether they're using Linux or Windows servers. Shows readers step by step how to create several Web sites that share common themes, enabling readers to use these examples in real-world projects. Invaluable reading for even the experienced programmer whose current site has outgrown the traditional static structure and who is looking for a way to upgrade to a more efficient, user-friendly…

  19. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
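
    Giraph programs are written "think like a vertex": in each superstep a vertex combines its incoming messages, updates its value, and sends messages along its edges. A toy single-process emulation of that model for PageRank follows (Giraph itself is a Java/Hadoop system; this sketch shows only the abstraction, not its API):

        graph = {"a": ["b"], "b": ["c"], "c": ["a", "b"]}   # adjacency lists
        rank = {v: 1.0 / len(graph) for v in graph}

        for superstep in range(20):
            inbox = {v: 0.0 for v in graph}
            for v, neighbors in graph.items():              # compute phase
                share = rank[v] / len(neighbors)
                for n in neighbors:                         # message passing
                    inbox[n] += share
            rank = {v: 0.15 / len(graph) + 0.85 * inbox[v] for v in graph}

        print(rank)  # PageRank values, summing to ~1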

  20. Evaluation of Apache Hadoop for parallel data analysis with ROOT

    International Nuclear Information System (INIS)

    Lehrack, S; Duckeck, G; Ebke, J

    2014-01-01

    The Apache Hadoop software is a Java based framework for distributed processing of large data sets across clusters of computers, using the Hadoop file system (HDFS) for data storage and backup and MapReduce as a processing platform. Hadoop is primarily designed for processing large textual data sets which can be processed in arbitrary chunks, and must be adapted to the use case of processing binary data files which cannot be split automatically. However, Hadoop offers attractive features in terms of fault tolerance, task supervision and control, multi-user functionality and job management. For this reason, we evaluated Apache Hadoop as an alternative approach to PROOF for ROOT data analysis. Two alternatives in distributing analysis data were discussed: either the data was stored in HDFS and processed with MapReduce, or the data was accessed via a standard Grid storage system (dCache Tier-2) and MapReduce was used only as execution back-end. The focus of the measurements was on the one hand to safely store analysis data on HDFS with reasonable data rates and on the other hand to process data fast and reliably with MapReduce. In the evaluation of the HDFS, read/write data rates from the local Hadoop cluster have been measured and compared to standard data rates from the local NFS installation. In the evaluation of MapReduce, realistic ROOT analyses have been used and event rates have been compared to PROOF.

  1. Biology and distribution of Lutzomyia apache as it relates to VSV

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. Lutzomyia apache was incriminated as a vector of vesicular stomatitis viruses (VSV) due to overlapping ranges of the sand fly and outbreaks of VSV. I report on newly discovered populations of L. apache in Wyoming from Albany and ...

  2. Managing Variant Calling Files the Big Data Way: Using HDFS and Apache Parquet

    NARCIS (Netherlands)

    Boufea, Aikaterini; Finkers, H.J.; Kaauwen, van M.P.W.; Kramer, M.R.; Athanasiadis, I.N.

    2017-01-01

    Big Data has been seen as a remedy for the efficient management of the ever-increasing genomic data. In this paper, we investigate the use of Apache Spark to store and process Variant Calling Files (VCF) on a Hadoop cluster. We demonstrate Tomatula, a software tool for converting VCF files to Apache Parquet.
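
    The conversion idea in miniature: a VCF body is tab-separated text, so Spark can read it, name the fixed columns, and write Parquet. A hedged sketch assuming a sites-only VCF with exactly the eight fixed columns (the paths are illustrative, and Tomatula's actual handling may differ):

        from pyspark.sql import SparkSession

        spark = SparkSession.builder.appName("vcf2parquet").getOrCreate()
        df = (spark.read
              .option("sep", "\t")
              .option("comment", "#")   # skip ## meta and #CHROM header lines
              .csv("variants.vcf")
              .toDF("CHROM", "POS", "ID", "REF", "ALT",
                    "QUAL", "FILTER", "INFO"))
        df.write.mode("overwrite").parquet("variants.parquet")
        spark.stop()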

  3. 75 FR 68607 - BP Canada Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers

    Science.gov (United States)

    2010-11-08

    ... Energy Marketing Corp. Apache Corporation; Notice for Temporary Waivers November 1, 2010. Take notice that on October 29, 2010, BP Canada Energy Marketing Corp. and Apache Corporation filed with the... assistance with any FERC Online service, please e-mail [email protected], or call (866) 208-3676...

  4. Validating the use of the APACHE II score in a tertiary South African ...

    African Journals Online (AJOL)

    In order to evaluate both the outcome of intensive care unit (ICU) patients and ICU care, the risk-adjusted mortality can be calculated using the APACHE II equation. ... to our ICU; (ii) investigate the impact of such variation on outcome; and (iii) validate the use of the APACHE II risk prediction model in a developing country.

  5. A Photometricity and Extinction Monitor at the Apache Point Observatory

    Science.gov (United States)

    Hogg, David W.; Finkbeiner, Douglas P.; Schlegel, David J.; Gunn, James E.

    2001-10-01

    An unsupervised software "robot" that automatically and robustly reduces and analyzes CCD observations of photometric standard stars is described. The robot measures extinction coefficients and other photometric parameters in real time and, more carefully, on the next day. It also reduces and analyzes data from an all-sky 10 μm camera to detect clouds; photometric data taken during cloudy periods are automatically rejected. The robot reports its findings to observers and data analysts via the World Wide Web. It can be used to assess photometricity and to build data on site conditions. The robot's automated and uniform site monitoring represents a minimum standard for any observing site with queue scheduling, a public data archive, or likely participation in any future National Virtual Observatory. Based on observations obtained at the Apache Point Observatory, which is owned and operated by the Astrophysical Research Consortium.
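
    The core reduction step such a robot automates can be sketched as a straight-line fit of standard-star magnitudes against airmass, m_inst - m_cat = zp + k X, where k is the extinction coefficient and zp the zero point; the numbers below are invented for illustration.

      import numpy as np

      airmass = np.array([1.0, 1.2, 1.5, 1.9, 2.3])             # X
      m_inst = np.array([15.12, 15.16, 15.22, 15.30, 15.38])    # instrumental mags
      m_cat = 15.00                                             # catalog magnitude

      # Least-squares straight line: slope = extinction k, intercept = zero point.
      k, zp = np.polyfit(airmass, m_inst - m_cat, 1)
      print(f"extinction k = {k:.3f} mag/airmass, zero point = {zp:.3f} mag")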

  6. CMS Analysis and Data Reduction with Apache Spark

    Energy Technology Data Exchange (ETDEWEB)

    Gutsche, Oliver [Fermilab; Canali, Luca [CERN; Cremer, Illia [Magnetic Corp., Waltham; Cremonesi, Matteo [Fermilab; Elmer, Peter [Princeton U.; Fisk, Ian [Flatiron Inst., New York; Girone, Maria [CERN; Jayatilaka, Bo [Fermilab; Kowalkowski, Jim [Fermilab; Khristenko, Viktor [CERN; Motesnitsalis, Evangelos [CERN; Pivarski, Jim [Princeton U.; Sehrish, Saba [Fermilab; Surdy, Kacper [CERN; Svyatkovskiy, Alexey [Princeton U.

    2017-10-31

    Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was among the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems for distributed data processing, collectively called "Big Data" technologies, have emerged from industry and open source projects to support the analysis of Petabyte and Exabyte datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and tools, promising a fresh look at analysis of very large datasets that could potentially reduce the time-to-physics with increased interactivity. Moreover, these new tools are typically actively developed by large communities, often profiting from industry resources, and under open source licensing. These factors result in a boost for adoption and maturity of the tools and for the communities supporting them, at the same time helping to reduce the cost of ownership for the end-users. In this talk, we present studies of using Apache Spark for end user data analysis. We study the HEP analysis workflow separated into two thrusts: the reduction of centrally produced experiment datasets and the end-analysis up to the publication plot. For the first thrust, CMS is working together with CERN openlab and Intel on the CMS Big Data Reduction Facility. The goal is to reduce 1 PB of official CMS data to 1 TB of ntuple output for analysis. We present the progress of this 2-year project with first results of scaling up Spark-based HEP analysis. For the second thrust, we present studies on using Apache Spark for a CMS Dark Matter physics search, comparing Spark's feasibility, usability and performance to the ROOT-based analysis.
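
    Schematically, the reduction thrust amounts to column pruning plus an event selection in Spark. The sketch below assumes hypothetical paths, column names and cuts, not the actual CMS data formats:

      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("cms-reduction-sketch").getOrCreate()

      events = spark.read.parquet("hdfs:///cms/aod.parquet")    # hypothetical input
      ntuple = (events
                .where(F.col("muon_pt") > 30.0)                 # toy selection cut
                .select("run", "lumi", "event", "muon_pt", "muon_eta"))
      ntuple.write.mode("overwrite").parquet("hdfs:///cms/ntuple.parquet")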

  7. 78 FR 5197 - Notice of Intent To Repatriate a Cultural Item: Department of the Interior, Bureau of Land...

    Science.gov (United States)

    2013-01-24

    ... claimants come forward. Representatives of any Indian tribe that believes itself to be culturally affiliated.... Tribal cultural authorities of the Jicarilla Apache Nation, New Mexico; Mescalero Apache Tribe of the Mescalero Reservation, New Mexico; San Carlos Apache Tribe of the San Carlos Reservation, Arizona; Tonto...

  8. Apache Point Observatory Galactic Evolution Experiment (APOGEE) Spectrograph

    Science.gov (United States)

    Wilson, John C.; Hearty, F.; Skrutskie, M. F.; Majewski, S. R.; Schiavon, R.; Eisenstein, D.; Gunn, J.; Gillespie, B.; Weinberg, D.; Blank, B.; Henderson, C.; Smee, S.; Barkhouser, R.; Harding, A.; Hope, S.; Fitzgerald, G.; Stolberg, T.; Arns, J.; Nelson, M.; Brunner, S.; Burton, A.; Walker, E.; Lam, C.; Maseman, P.; Barr, J.; Leger, F.; Carey, L.; MacDonald, N.; Ebelke, G.; Beland, S.; Horne, T.; Young, E.; Rieke, G.; Rieke, M.; O'Brien, T.; Crane, J.; Carr, M.; Harrison, C.; Stoll, R.; Vernieri, M.; Holtzman, J.; Nidever, D.; Shetrone, M.; Allende-Prieto, C.; Johnson, J.; Frinchaboy, P.; Zasowski, G.; Garcia Perez, A.; Bizyaev, D.; Zhao, B.

    2012-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE) will observe approximately 100,000 giant stars in the Milky Way with a dedicated fiber-fed (300 fibers from the Sloan 2.5-m telescope) near-infrared (1.5-1.7 micron) high resolution (R ~ 22,500) spectrograph as part of the Sloan Digital Sky Survey III (SDSS-III). By observing in the near-infrared, APOGEE can uniformly sample all Milky Way stellar populations (bulge, thin/thick disks and halo) in the same survey to dramatically improve our understanding of the kinematical and chemical enrichment history of our galaxy. The instrument design includes several innovations: a novel fiber gang connector that allows simultaneous optical connection of 300 fibers from the instrument into swappable plug plate cartridges, the first deployed mosaic volume phase holographic (VPH) grating, and a very large (~0.4 m) aperture six-element refractive camera incorporating crystalline silicon elements to image 300 spectra onto three HAWAII-2RG detectors simultaneously.

  9. APACHE POINT OBSERVATORY 3.5M AGILE OBSERVATIONS OF LCROSS

    Data.gov (United States)

    National Aeronautics and Space Administration — This archive contains observations of the 2009-10-09 impact of the LCROSS spacecraft on the moon by the AGILE instrument on the Apache Point Observatory 3.5m...

  10. Apache Trail, Tonto National Forest : Observations, Considerations, and Recommendations from the Interagency Transportation Assistance Group (TAG)

    Science.gov (United States)

    2016-03-03

    This report summarizes the observations and findings of an interagency transportation assistance group (TAG) convened to discuss the long-term future of Arizona State Route 88, also known as the Apache Trail, a historic road on the Tonto Nation...

  11. Prediction of heart disease using apache spark analysing decision trees and gradient boosting algorithm

    Science.gov (United States)

    Chugh, Saryu; Arivu Selvan, K.; Nadesh, RK

    2017-11-01

    Numerous harmful factors influence the working of the human body, such as hypertension, smoking, obesity, and inappropriate medication use, causing many different diseases such as diabetes, thyroid disorders, strokes, and coronary disease. Adverse environmental conditions also contribute to coronary disease. Predicting such disease requires gathering large amounts of data, and Apache Spark is well suited to analyzing it: it is fast thanks to built-in in-memory processing, it runs in a distributed environment, and it splits the data into batches, yielding high throughput. The use of data mining techniques in the diagnosis of coronary disease has been examined thoroughly, showing acceptable levels of precision. Decision trees, neural networks, and the gradient boosting algorithm are among the techniques that can be run on Apache Spark to analyze the collected information.
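
    A compact Spark ML sketch of the gradient-boosting approach named above; the CSV file, the column names and the presence of a 0/1 "label" column are assumptions:

      from pyspark.ml.classification import GBTClassifier
      from pyspark.ml.feature import VectorAssembler
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("heart-gbt").getOrCreate()

      data = spark.read.csv("heart.csv", header=True, inferSchema=True)  # hypothetical file
      assembler = VectorAssembler(
          inputCols=["age", "bp", "chol", "max_hr"],   # illustrative risk factors
          outputCol="features")
      train, test = assembler.transform(data).randomSplit([0.8, 0.2], seed=42)

      model = GBTClassifier(labelCol="label", featuresCol="features", maxIter=50).fit(train)
      accuracy = model.transform(test).where("prediction = label").count() / test.count()
      print(f"holdout accuracy = {accuracy:.2f}")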

  12. Analyzing large data sets from XGC1 magnetic fusion simulations using apache spark

    Energy Technology Data Exchange (ETDEWEB)

    Churchill, R. Michael [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States)

    2016-11-21

    Apache Spark is explored as a tool for analyzing large data sets from the magnetic fusion simulation code XGC1. Implementation details of Apache Spark on the NERSC Edison supercomputer are discussed, including binary file reading and parameter setup. Here, an unsupervised machine learning algorithm, k-means clustering, is applied to XGC1 particle distribution function data, showing that highly turbulent spatial regions do not have common coherent structures, but rather broad, ring-like structures in velocity space.
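
    The clustering step can be sketched with Spark ML's k-means; the Parquet file name and the two velocity-space columns below are assumptions rather than the real XGC1 data layout:

      from pyspark.ml.clustering import KMeans
      from pyspark.ml.feature import VectorAssembler
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("xgc1-kmeans").getOrCreate()

      df = spark.read.parquet("hdfs:///xgc1/f_data.parquet")     # hypothetical input
      vec = VectorAssembler(inputCols=["v_par", "v_perp"], outputCol="features")
      model = KMeans(k=8, seed=1).fit(vec.transform(df))
      for center in model.clusterCenters():                      # velocity-space structures
          print(center)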

  13. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    International Nuclear Information System (INIS)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R.; Schiavon, Ricardo P.; Frinchaboy, Peter M.; Prieto, Carlos Allende; Carrera, Ricardo; Barkhouser, Robert; Bizyaev, Dmitry; Blank, Basil; Henderson, Chuck; Cunha, Kátia; Epstein, Courtney; Johnson, Jennifer A.; Fitzgerald, Greg; Holtzman, Jon A.

    2017-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  14. Generation of a Social Network Graph by Using Apache Spark

    Directory of Open Access Journals (Sweden)

    Y. A. Belov

    2016-01-01

    Full Text Available We plan to create a method for clustering a social network graph. To test the method, we need to generate a graph similar in structure to existing social networks. This article presents an algorithm for distributed graph generation. It takes into account basic properties such as the power-law distribution of the number of user communities, the dense intersections of social networks, and others. The algorithm also addresses problems present in similar work by other authors, for example, the multiple-edges problem in the generation process. A special feature of the algorithm is that it is parameterized by the number of communities rather than by the number of connected users, as is done in other work; this reflects how the structure of an existing social network develops, and the properties of its graph are given in the paper. We describe a table containing the variables needed for the algorithm, compile a step-by-step generation algorithm, and calculate the appropriate mathematical parameters for it. Generation is performed in a distributed way using the Apache Spark framework, and we describe in detail how the division of tasks runs with the help of this framework. The Erdos-Renyi model for random graphs is used in the algorithm, as it is the most suitable and the easiest to implement. The main advantages of the created method are the small amount of resources it needs in comparison with other similar generators, and its execution speed. Speed is achieved through distributed work and the fact that at any time network users have their own unique numbers and are ordered by these numbers, so there is no need to sort them. The designed algorithm will support not only the creation of an efficient clustering method; it can also be useful in other development areas connected, for example, with search engines for social networks.
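
    Since the generator builds on the Erdos-Renyi model, the sampling kernel can be sketched in PySpark as drawing G(n, p) edges in parallel, one source-vertex block per task (n and p are illustrative):

      import random
      from pyspark.sql import SparkSession

      spark = SparkSession.builder.appName("er-graph").getOrCreate()
      sc = spark.sparkContext

      n, p = 10_000, 0.001

      def edges_for(u):
          rng = random.Random(u)         # per-vertex seed keeps runs reproducible
          return [(u, v) for v in range(u + 1, n) if rng.random() < p]

      edges = sc.parallelize(range(n)).flatMap(edges_for)
      print("edges generated:", edges.count())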

  15. The Apache Point Observatory Galactic Evolution Experiment (APOGEE)

    Energy Technology Data Exchange (ETDEWEB)

    Majewski, Steven R.; Brunner, Sophia; Burton, Adam; Chojnowski, S. Drew; Pérez, Ana E. García; Hearty, Fred R.; Lam, Charles R. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Frinchaboy, Peter M. [Department of Physics and Astronomy, Texas Christian University, Fort Worth, TX 76129 (United States); Prieto, Carlos Allende; Carrera, Ricardo [Instituto de Astrofísica de Canarias, E-38200 La Laguna, Tenerife (Spain); Barkhouser, Robert [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, Sunspot, NM, 88349-0059 (United States); Blank, Basil; Henderson, Chuck [Pulse Ray Machining and Design, 4583 State Route 414, Beaver Dams, NY 14812 (United States); Cunha, Kátia [Observatório Nacional, Rio de Janeiro, RJ 20921-400 (Brazil); Epstein, Courtney; Johnson, Jennifer A. [The Ohio State University, Columbus, OH 43210 (United States); Fitzgerald, Greg [New England Optical Systems, 237 Cedar Hill Street, Marlborough, MA 01752 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); and others

    2017-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), one of the programs in the Sloan Digital Sky Survey III (SDSS-III), has now completed its systematic, homogeneous spectroscopic survey sampling all major populations of the Milky Way. After a three-year observing campaign on the Sloan 2.5 m Telescope, APOGEE has collected a half million high-resolution (R ∼ 22,500), high signal-to-noise ratio (>100), infrared (1.51–1.70 μm) spectra for 146,000 stars, with time series information via repeat visits to most of these stars. This paper describes the motivations for the survey and its overall design—hardware, field placement, target selection, operations—and gives an overview of these aspects as well as the data reduction, analysis, and products. An index is also given to the complement of technical papers that describe various critical survey components in detail. Finally, we discuss the achieved survey performance and illustrate the variety of potential uses of the data products by way of a number of science demonstrations, which span from time series analysis of stellar spectral variations and radial velocity variations from stellar companions, to spatial maps of kinematics, metallicity, and abundance patterns across the Galaxy and as a function of age, to new views of the interstellar medium, the chemistry of star clusters, and the discovery of rare stellar species. As part of SDSS-III Data Release 12 and later releases, all of the APOGEE data products are publicly available.

  16. Uncomfortable Experience: Lessons Lost in the Apache War

    Science.gov (United States)

    2015-03-01

    Cochise, the Chihenne (Ojo Caliente/Hot Springs) led by Victorio, the Bedonkohe led by Mangas Coloradas, and the Nednhi led by Juh are generally...Carlos and when Captain F.T. Bennett attempted to transfer Victorio's Warm Springs group from their home in New Mexico surrounding Ojo Caliente, the...page text. In contrast, the doctrine references French experiences in Spain and Algeria sixteen times, and British operations in Ireland and

  17. Mortality Prediction in Patients Admitted in Surgical Intensive Care Unit by Using APACHE IV.

    Science.gov (United States)

    Wetr, Wetwet Wetw; Shoukat, Hassan; Muhammad, Yar; Gondal, Khalid Masood; Aslam, Imran

    2016-11-01

    To predict mortality from the mean Acute Physiology and Chronic Health Evaluation (APACHE) IV score of all patients admitted to a Surgical Intensive Care Unit (ICU), and to compare the scores of survivors and non-survivors. Descriptive study. Surgical Intensive Care Unit, Mayo Hospital, Lahore, from June 2013 to November 2014. All adult patients admitted to the Surgical ICU were included in this study. The demographics and other data of the patients were recorded, and the APACHE IV scores of all patients were calculated at the time of admission. The scores of the survivors and the non-survivors were compared for prediction of survival and mortality. The age of these patients ranged from 13 to 70 (mean 38.39) years, with 86 (55.48%) males and 69 (44.52%) females. The mean APACHE IV score was 34.96 ± 14.93, ranging from 11 to 63. Eighty-three (53.55%) patients survived and 72 (46.45%) died. With respect to gender, 41 of 86 (47.67%) males and 31 of 69 (44.92%) females did not survive. Mortality increased with increasing APACHE IV score, and no patient with a score above 39 survived. Predicted mortality can be assessed by the APACHE IV score, so it is well suited for application among surgical ICU patients.

  18. 76 FR 72969 - Proclaiming Certain Lands as Reservation for the Fort Sill Apache Indian Tribe

    Science.gov (United States)

    2011-11-28

    ... Affairs, Division of Real Estate Services, Mail Stop-4639-MIB, 1849 C Street NW., Washington, DC 20240... right-of-way, Township Twenty-four (24) south, Range Six (6) west, N.M.P.M., Luna County, New Mexico... this tract and on the North boundary of the Interstate 10 right-of-way; Thence adjoining the North...

  19. Jicarilla Apache Utility Authority Renewable Energy and Energy Efficiency Strategic Planning

    Energy Technology Data Exchange (ETDEWEB)

    Rabago, K.R.

    2008-06-28

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use on the Jicarilla Apache Reservation in buildings. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation.

  20. Observations of the larval stages of Diceroprocta apache Davis (Homoptera: Tibicinidae)

    Science.gov (United States)

    Ellingson, A.R.; Andersen, D.C.; Kondratieff, B.C.

    2002-01-01

    Diceroprocta apache Davis is a locally abundant cicada in the riparian woodlands of the southwestern United States. While its ecological importance has often been hypothesized, very little is known of its specific life history. This paper presents preliminary information on life history of D. apache from larvae collected in the field at seasonal intervals as well as a smaller number of reared specimens. Morphological development of the fore-femoral comb closely parallels growth through distinct size classes. The data indicate the presence of five larval instars in D. apache. Development times from greenhouse-reared specimens suggest a 3-4 year life span and overlapping broods were present in the field. Sex ratios among pre-emergent larvae suggest the asynchronous emergence of sexes.

  1. Jicarilla Apache Utility Authority. Strategic Plan for Energy Efficiency and Renewable Energy Development

    International Nuclear Information System (INIS)

    Rabago, K.R.

    2008-01-01

    The purpose of this Strategic Plan Report is to provide an introduction and in-depth analysis of the issues and opportunities, resources, and technologies of energy efficiency and renewable energy that have potential beneficial application for the people of the Jicarilla Apache Nation and surrounding communities. The Report seeks to draw on the best available information that existed at the time of writing, and where necessary, draws on new research to assess this potential. This study provides a strategic assessment of opportunities for maximizing the potential for electrical energy efficiency and renewable energy development by the Jicarilla Apache Nation. The report analyzes electricity use on the Jicarilla Apache Reservation in buildings. The report also assesses particular resources and technologies in detail, including energy efficiency, solar, wind, geothermal, biomass, and small hydropower. The closing sections set out the elements of a multi-year, multi-phase strategy for development of resources to the maximum benefit of the Nation

  2. Feasibility of the Raspberry Pi as a Web Server: A Performance Comparison of Nginx, Apache, and Lighttpd on the Raspberry Pi Platform

    Directory of Open Access Journals (Sweden)

    Rahmad Dawood

    2014-04-01

    Full Text Available The Raspberry Pi is a small-sized computer, but it can function like an ordinary computer. Because it can function like a regular PC, it is also possible to run a web server application on it. This paper reports results from testing the feasibility and performance of running a web server on the Raspberry Pi. The test was conducted on the current top three most popular web servers: Apache, Nginx, and Lighttpd. The parameters used to evaluate their feasibility and performance were maximum request and reply time. The results showed that it is feasible to run all three web servers on the Raspberry Pi, but Nginx gave the best performance, followed by Lighttpd and Apache. Keywords: Raspberry Pi, web server, Apache, Lighttpd, Nginx, web server performance
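
    A rough Python sketch of the reply-time measurement (real benchmarks would normally use a dedicated load generator such as ApacheBench; the host name and ports here are assumptions):

      import time
      import urllib.request

      def mean_reply_time(url, n=100):
          """Average wall-clock time for n sequential GET requests."""
          times = []
          for _ in range(n):
              t0 = time.perf_counter()
              urllib.request.urlopen(url).read()
              times.append(time.perf_counter() - t0)
          return sum(times) / len(times)

      for name, port in [("nginx", 8080), ("apache", 8081), ("lighttpd", 8082)]:
          ms = mean_reply_time(f"http://raspberrypi.local:{port}/") * 1e3
          print(f"{name}: {ms:.1f} ms")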

  3. Comparison of APACHE II and SAPS II Scoring Systems in Prediction of Critically ill Patients’ Outcome

    Directory of Open Access Journals (Sweden)

    Hamed Aminiahidashti

    2017-01-01

    Full Text Available Introduction: Using physiologic scoring systems for identifying high-risk patients for mortality has been considered recently. This study was designed to evaluate the values of the Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiologic Score (SAPS II) models in prediction of 1-month mortality of critically ill patients. Methods: The present prospective cross-sectional study was performed on critically ill patients presented to the emergency department during 6 months. Data required for calculation of the scores were gathered, and performance of the models in prediction of 1-month mortality was assessed using STATA software 11.0. Results: 82 critically ill patients with a mean age of 53.45 ± 20.37 years were included (65.9% male). Their mortality rate was 48%. Mean SAPS II (p < 0.0001) and APACHE II (p = 0.0007) scores were significantly higher in dead patients. Areas under the ROC curve of SAPS II and APACHE II for prediction of mortality were 0.75 (95% CI: 0.64 - 0.86) and 0.72 (95% CI: 0.60 - 0.83), respectively (p = 0.24). The slope and intercept of SAPS II were 1.02 and 0.04, respectively. In addition, these values were 0.92 and 0.09 for APACHE II, respectively. Conclusion: The findings of the present study showed that APACHE II and SAPS II had similar value in predicting 1-month mortality of patients. Discriminatory powers of the mentioned models were acceptable, but their calibration showed some lack of fit, which reveals that APACHE II and SAPS II are only partially perfect.
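
    The two properties compared above, discrimination and calibration, can be reproduced on synthetic scores as follows (made-up data; a sketch of the AUC plus a decile-based Hosmer-Lemeshow statistic):

      import numpy as np
      from scipy.stats import chi2
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(0)
      died = rng.integers(0, 2, 200)                                 # toy outcomes
      prob = np.clip(died * 0.3 + rng.uniform(0.0, 0.7, 200), 0, 1)  # toy predicted risk

      print("AUC:", roc_auc_score(died, prob))                       # discrimination

      # Hosmer-Lemeshow: chi-square over deciles of predicted risk (df = 10 - 2).
      deciles = np.array_split(np.argsort(prob), 10)
      hl = sum((died[g].sum() - prob[g].sum()) ** 2
               / (prob[g].sum() * (1 - prob[g].mean())) for g in deciles)
      print("H-L statistic:", hl, "p =", chi2.sf(hl, df=8))          # calibration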

  4. APACHE II as an indicator of ventilator-associated pneumonia (VAP.

    Directory of Open Access Journals (Sweden)

    Kelser de Souza Kock

    2015-01-01

    Full Text Available Background and objectives: Strategies for risk stratification in severe pathologies are extremely important. The aim of this study was to analyze the accuracy of the APACHE II score as an indicator of Ventilator-Associated Pneumonia (VAP) in ICU patients at Hospital Nossa Senhora da Conceição (HNSC), Tubarão-SC. Methods: A prospective cohort study was conducted with 120 patients admitted between March and August 2013, with APACHE II calculated in the first 24 hours of mechanical ventilation (MV). Patients were followed until one of the following outcomes: discharge or death. The cause of ICU admission, age, gender, days of mechanical ventilation, length of ICU stay, and outcome were also analyzed. Results: The incidence of VAP was 31.8% (38/120). Two variables carried a relative risk for the development of VAP: APACHE II above average (RR = 1.62; 95% CI 1.03-2.55) and male gender (RR = 1.56; 95% CI 1.18-2.08). Duration of mechanical ventilation above average (18.4 ± 14.9 days, p = 0.001) and ICU stay above average (20.4 ± 15.3 days, p = 0.003) were associated with the development of VAP. The accuracy of APACHE II in predicting VAP at a score >23 showed a sensitivity of 84% and a specificity of 33%. In relation to death, two variables carried a relative risk: age above average (RR = 2.08; 95% CI 1.34-3.23) and ICU stay above average (RR = 2.05; 95% CI 1.28-3.28). Conclusion: An APACHE II score of 23 or above might indicate the risk of VAP. Keywords: Pneumonia, Ventilator-Associated; Intensive Care Units; APACHE; Prognosis.

  5. Research on Monte Carlo application based on Hadoop

    Directory of Open Access Journals (Sweden)

    Wu Minglei

    2018-01-01

    Full Text Available The Monte Carlo method is also known as the random simulation method: the more experiments are run, the more accurate the results obtained. A large number of random simulations is therefore required to reach a high degree of accuracy, but traditional single-machine algorithms have difficulty meeting the needs of such large simulation volumes. The Hadoop platform is a distributed computing platform built for big data and an open-source project of the Apache Software Foundation. As an open-source software platform, it makes it easier to write and run applications that process massive amounts of data. This paper therefore takes the calculation of π as an example, implements the Monte Carlo algorithm on the Hadoop platform, and obtains an accurate estimate of π by exploiting Hadoop's strength in distributed processing.
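
    The core computation the paper distributes is short: sample points in the unit square and count how many fall inside the quarter circle, giving pi ≈ 4 * hits / n. Plain single-process Python is shown here; on Hadoop each map task would run the same loop on its own share of the samples and a reduce step would sum the hit counts.

      import random

      def estimate_pi(n, seed=0):
          """Monte Carlo estimate of pi from n uniform points in [0, 1)^2."""
          rng = random.Random(seed)
          hits = sum(1 for _ in range(n)
                     if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
          return 4.0 * hits / n

      for n in (10**3, 10**5, 10**7):
          print(n, estimate_pi(n))      # accuracy improves with sample size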

  6. Validation of the APACHE IV model and its comparison with the APACHE II, SAPS 3, and Korean SAPS 3 models for the prediction of hospital mortality in a Korean surgical intensive care unit.

    Science.gov (United States)

    Lee, Hannah; Shon, Yoon-Jung; Kim, Hyerim; Paik, Hyesun; Park, Hee-Pyoung

    2014-08-01

    The Acute Physiology and Chronic Health Evaluation (APACHE) IV model has not yet been validated in Korea. The aim of this study was to compare the ability of the APACHE IV with those of APACHE II, Simplified Acute Physiology Score (SAPS) 3, and Korean SAPS 3 in predicting hospital mortality in a surgical intensive care unit (SICU) population. We retrospectively reviewed electronic medical records for patients admitted to the SICU from March 2011 to February 2012 in a university hospital. Measurements of discrimination and calibration were performed using the area under the receiver operating characteristic curve (AUC) and the Hosmer-Lemeshow test, respectively. We calculated the standardized mortality ratio (SMR, actual mortality/predicted mortality) for the four models. The study included 1,314 patients. The hospital mortality rate was 3.3%. The discriminative powers of all models were similar and very reliable. The AUCs were 0.80 for APACHE IV, 0.85 for APACHE II, 0.86 for SAPS 3, and 0.86 for Korean SAPS 3. Hosmer and Lemeshow C and H statistics showed poor calibration for all of the models (P < 0.05). The SMRs of APACHE IV, APACHE II, SAPS 3, and Korean SAPS 3 were 0.21, 0.11 0.23, 0.34, and 0.25, respectively. The APACHE IV revealed good discrimination but poor calibration. The overall discrimination and calibration of APACHE IV were similar to those of APACHE II, SAPS 3, and Korean SAPS 3 in this study. A high level of customization is required to improve calibration in this study setting.

  7. 77 FR 18997 - Rim Lakes Forest Restoration Project; Apache-Sitgreavese National Forest, Black Mesa Ranger...

    Science.gov (United States)

    2012-03-29

    ... DEPARTMENT OF AGRICULTURE Forest Service Rim Lakes Forest Restoration Project; Apache-Sitgreavese... intends to conserve and restore the Rim Lakes Project Area to make--over time--the forest ecosystem more... concerns and contentions. The proposed Rim Lakes Forest Restoration Project is subject to the HFRA pre...

  8. Lutzomyia (Helcocyrtomyia) Apache Young and Perkins (Diptera: Psychodidae) feeds on reptiles

    Science.gov (United States)

    Phlebotomine sand flies are vectors of bacteria, parasites, and viruses. In the western USA a sand fly, Lutzomyia apache Young and Perkins, was initially associated with epizootics of vesicular stomatitis virus (VSV), because sand flies were trapped at sites of an outbreak. Additional studies indica...

  9. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. Keywords: variational methods, Monte Carlo techniques, harmonic oscillators, quantum mechanical systems. Sukanta Deb is an Assistant Professor in the ...
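
    A compact runnable illustration of the technique the article teaches: Metropolis-sample |psi(x)|^2 for the trial wavefunction psi(x) = exp(-a x^2) of the 1-D harmonic oscillator (hbar = m = omega = 1) and average the local energy E_L(x) = a + x^2 (1/2 - 2 a^2); the exact ground-state energy 0.5 is recovered at a = 0.5.

      import math
      import random

      def vmc_energy(a, steps=200_000, delta=1.0, seed=1):
          """Metropolis sampling of |psi|^2 = exp(-2 a x^2), averaging E_L."""
          rng = random.Random(seed)
          x, e_sum = 0.0, 0.0
          for _ in range(steps):
              trial = x + rng.uniform(-delta, delta)
              if rng.random() < math.exp(-2.0 * a * (trial**2 - x**2)):
                  x = trial                              # accept the move
              e_sum += a + x * x * (0.5 - 2.0 * a * a)   # local energy
          return e_sum / steps

      for a in (0.3, 0.5, 0.8):
          print(a, round(vmc_energy(a), 4))   # variational minimum at a = 0.5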

  10. Markov Chain Monte Carlo

    Indian Academy of Sciences (India)

    Keywords: Gibbs sampling, Markov chain Monte Carlo, Bayesian inference, stationary distribution, convergence, image restoration. Arnab Chakraborty. We describe the mathematics behind the Markov chain Monte Carlo method of ...
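
    Matching the keywords above, a toy Gibbs sampler for a bivariate normal with correlation rho alternates between the two full conditionals, x | y ~ N(rho y, 1 - rho^2) and y | x ~ N(rho x, 1 - rho^2):

      import random

      def gibbs(rho, n=50_000, seed=0):
          """Gibbs sampling: update each coordinate from its full conditional."""
          rng = random.Random(seed)
          sd = (1.0 - rho * rho) ** 0.5
          x = y = 0.0
          draws = []
          for _ in range(n):
              x = rng.gauss(rho * y, sd)
              y = rng.gauss(rho * x, sd)
              draws.append((x, y))
          return draws

      draws = gibbs(0.8)
      mean_xy = sum(x * y for x, y in draws) / len(draws)
      print("empirical E[xy] =", round(mean_xy, 3))   # approaches rho = 0.8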

  11. Predictive value of SAPS II and APACHE II scoring systems for patient outcome in a medical intensive care unit

    Directory of Open Access Journals (Sweden)

    Amina Godinjak

    2016-11-01

    Full Text Available Objective. The aim was to determine SAPS II and APACHE II scores in medical intensive care unit (MICU) patients, to compare their ability to predict patient outcome, and to compare them with actual hospital mortality rates for different subgroups of patients. Methods. One hundred and seventy-four patients were included in this analysis over a one-year period in the MICU, Clinical Center, University of Sarajevo. The following patient data were obtained: demographics, admission diagnosis, SAPS II, APACHE II scores, and final outcome. Results. Out of 174 patients, 70 patients (40.2%) died. Mean SAPS II and APACHE II scores in all patients were 48.4 ± 17.0 and 21.6 ± 10.3 respectively, and they were significantly different between survivors and non-survivors. SAPS II >50.5 and APACHE II >27.5 can predict the risk of mortality in these patients. There was no statistically significant difference in the clinical values of SAPS II vs APACHE II (p=0.501). A statistically significant positive correlation was established between the values of SAPS II and APACHE II (r=0.708; p=0.001). Patients with an admission diagnosis of sepsis/septic shock had the highest values of both SAPS II and APACHE II scores, and also the highest hospital mortality rate of 55.1%. Conclusion. Both APACHE II and SAPS II had an excellent ability to discriminate between survivors and non-survivors. There was no significant difference in their clinical values, and a positive correlation was established between them. Sepsis/septic shock patients had the highest predicted and observed hospital mortality rates.

  12. Evaluation of a modified APACHE II Scoring System in the Intensive ...

    African Journals Online (AJOL)

    MAPA II format was designed from the APA II. The APA II score consists of 12 sets of acute physiological variables (A), age points (B) and chronic health points (C). Total APACHE II score of 71 was generated by adding A, B and C. (Appendix I). MAPA II score was generated by adding A, B and C but substituting PaO2 with ...
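
    The A + B + C assembly described above can be shown structurally in Python. The age-point cut-offs follow the published APACHE II table, but the snippet is an illustration only, not a clinical tool:

      def age_points(age):
          """B: age points (0, 2, 3, 5 or 6 by age band)."""
          for cutoff, pts in [(75, 6), (65, 5), (55, 3), (45, 2)]:
              if age >= cutoff:
                  return pts
          return 0

      def apache_ii(acute_physiology_points, age, chronic_health_points):
          """A = sum of the 12 acute physiology variable scores (0-4 each),
          B = age points, C = chronic health points; the maximum total is 71."""
          return acute_physiology_points + age_points(age) + chronic_health_points

      print(apache_ii(acute_physiology_points=18, age=68, chronic_health_points=5))  # 28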

  13. The Usefulness of the APACHE II Score in Obstetric Critical Care: A Structured Review.

    Science.gov (United States)

    Ryan, Helen M; Sharma, Sumedha; Magee, Laura A; Ansermino, J Mark; MacDonell, Karen; Payne, Beth A; Walley, Keith R; von Dadelszen, Peter

    2016-10-01

    To assess the performance of the Acute Physiology and Chronic Health Evaluation II (APACHE II) mortality prediction model in pregnant and recently pregnant women receiving critical care in low-, middle-, and high-income countries during the study period (1985-2015), using a structured literature review. Ovid MEDLINE, Embase, Web of Science, and Evidence-Based Medicine Reviews were searched for articles published between 1985 and 2015. Twenty-five studies (24 publications), of which two were prospective, were included in the analyses. Ten studies were from high-income countries (HICs), and 15 were from low- and middle-income countries (LMICs). Median study duration and size were six years and 124 women, respectively. ICU admission complicates 0.48% of deliveries, and pregnant and recently pregnant women account for 1.49% of ICU admissions. One quarter were admitted while pregnant, three quarters of these for an obstetric indication and for a median of three days. The median APACHE II score was 10.9, with a median APACHE II-predicted mortality of 16.6%. Observed mortality was 4.6%, and the median standardized mortality ratio was 0.36 (interquartile range 0.23 to 0.73), indicating that APACHE II consistently overestimates mortality in these women receiving critical care, whether they reside in HICs or LMICs. There is a need for a pregnancy-specific outcome prediction model for these women.

  14. Exploring Monte Carlo methods

    CERN Document Server

    Dunn, William L

    2012-01-01

    Exploring Monte Carlo Methods is a basic text that describes the numerical methods that have come to be known as "Monte Carlo." The book treats the subject generically through the first eight chapters and, thus, should be of use to anyone who wants to learn to use Monte Carlo. The next two chapters focus on applications in nuclear engineering, which are illustrative of uses in other fields. Five appendices are included, which provide useful information on probability distributions, general-purpose Monte Carlo codes for radiation transport, and other matters. The famous "Buffon's needle problem" ...

  15. Monte Carlo methods

    Directory of Open Access Journals (Sweden)

    Bardenet Rémi

    2013-07-01

    Full Text Available Bayesian inference often requires integrating some function with respect to a posterior distribution. Monte Carlo methods are sampling algorithms that allow one to compute these integrals numerically when they are not analytically tractable. We review here the basic principles and the most common Monte Carlo algorithms, among which are rejection sampling, importance sampling and Markov chain Monte Carlo (MCMC) methods. We give intuition on the theoretical justification of the algorithms as well as practical advice, trying to relate both. We discuss the application of Monte Carlo in experimental physics, and point to landmarks in the literature for the curious reader.
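
    As a one-screen illustration of the first algorithm listed, rejection sampling from Beta(2, 2) with a uniform proposal and envelope constant M = 1.5 (the maximum of the target density):

      import random

      def beta22_pdf(x):
          return 6.0 * x * (1.0 - x)        # Beta(2,2) density on [0,1], max 1.5

      def rejection_sample(n, seed=0):
          rng, out, M = random.Random(seed), [], 1.5
          while len(out) < n:
              x = rng.random()               # proposal ~ Uniform(0, 1)
              if rng.random() * M <= beta22_pdf(x):
                  out.append(x)              # accept with prob f(x) / (M g(x))
          return out

      samples = rejection_sample(10_000)
      print("sample mean =", sum(samples) / len(samples))   # Beta(2,2) mean is 0.5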

  16. APACHE III Outcome Prediction in Patients Admitted to the Intensive Care Unit with Sepsis Associated Acute Lung Injury.

    Science.gov (United States)

    Zhang, Zhongheng; Chen, Kun; Chen, Lin

    2015-01-01

    Acute Physiology and Chronic Health Evaluation (APACHE) III score has been widely used for prediction of clinical outcomes in mixed critically ill patients. However, it has not been validated in patients with sepsis-associated acute lung injury (ALI). The aim of the study was to explore the calibration and predictive value of APACHE III in patients with sepsis-associated ALI. The study was a secondary analysis of a prospective randomized controlled trial investigating the efficacy of rosuvastatin in sepsis-associated ALI (Statins for Acutely Injured Lungs from Sepsis, SAILS). The study population was sepsis-related ALI patients. The primary outcome of the current study was the same as in the original trial: 60-day in-hospital mortality, defined as death before hospital discharge, censored 60 days after enrollment. Discrimination of APACHE III was assessed by calculating the area under the receiver operating characteristic (ROC) curve (AUC) with its 95% CI. The Hosmer-Lemeshow goodness-of-fit statistic was used to assess the calibration of APACHE III. The Brier score was reported to represent the overall performance of APACHE III in predicting outcome. A total of 745 patients were included in the study, including 540 survivors and 205 non-survivors. Non-survivors were significantly older than survivors (59.71 ± 16.17 vs 52.00 ± 15.92 years). The ability of APACHE III to predict mortality in ALI patients was moderate, with an AUC of 0.68 (95% confidence interval: 0.64-0.73). This study for the first time validated the discrimination of APACHE III in sepsis-associated ALI patients. The result shows that the APACHE III score has moderate predictive value for in-hospital mortality among adults with sepsis-associated acute lung injury.

  17. MORSE Monte Carlo code

    Energy Technology Data Exchange (ETDEWEB)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described.

  18. MORSE Monte Carlo code

    International Nuclear Information System (INIS)

    Cramer, S.N.

    1984-01-01

    The MORSE code is a large general-use multigroup Monte Carlo code system. Although no claims can be made regarding its superiority in either theoretical details or Monte Carlo techniques, MORSE has been, since its inception at ORNL in the late 1960s, the most widely used Monte Carlo radiation transport code. The principal reason for this popularity is that MORSE is relatively easy to use, independent of any installation or distribution center, and it can be easily customized to fit almost any specific need. Features of the MORSE code are described

  19. Comparison of the APACHE II, GCS and MRC scores in predicting outcomes in patients with tuberculous meningitis.

    Science.gov (United States)

    Chou, C-H; Lin, G-M; Ku, C-H; Chang, F-Y

    2010-01-01

    To evaluate different scoring systems, including the Acute Physiology and Chronic Health Evaluation (APACHE) II, the Glasgow Coma Scale (GCS) and the Medical Research Council (MRC) staging system, as well as other prognostic factors, in predicting the discharge outcomes of adult patients with tuberculous meningitis (TBM). We conducted a retrospective analysis of patients admitted with a diagnosis of TBM to a tertiary hospital in northern Taiwan from March 1996 to February 2006. We used APACHE II, GCS, MRC and a variety of factors within 24 h of admission to predict discharge outcomes recorded by the Glasgow Outcome Scale (GOS). Among 43 TBM patients, 33 had a favourable outcome (GOS 4-5), and 10 had an unfavourable outcome (GOS 1-3). The severity of APACHE II, GCS, MRC and the presence of hydrocephalus correlated well with the neurological outcomes, and both APACHE II and GCS were superior to MRC in receiver operating characteristic analysis. Furthermore, in-hospital mortality could be predicted accurately with APACHE II and GCS. The APACHE II scoring system is at least as effective as GCS and superior to MRC in predicting the discharge outcomes of adult patients with TBM.

  20. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 19; Issue 8. Variational Monte Carlo Technique: Ground State Energies of Quantum Mechanical Systems. Sukanta Deb. General Article Volume 19 Issue 8 August 2014 pp 713-739 ...

  1. Resonance – Journal of Science Education | Indian Academy of ...

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 8; Issue 4. Markov Chain Monte Carlo Methods - Simple Monte Carlo. K B Athreya, Mohan Delampady, T Krishnan. General ... School of ORIE, Rhodes Hall, Cornell University, Ithaca, New York 14853, USA. Indian Statistical Institute, 8th Mile, Mysore Road ...

  2. Security in the Configuration of the Apache Web Server

    Directory of Open Access Journals (Sweden)

    Carlos Eduardo Gómez Montoya

    2013-07-01

    Full Text Available Apache is the web server with the largest presence in the world market. Although its configuration is relatively simple, hardening its security involves understanding and applying a set of general rules that are known, accepted, and available. Moreover, despite being an apparently solved issue, security in HTTP servers is a growing problem, and not all companies take it seriously. This article identifies and verifies a set of good information-security practices applied to the configuration of Apache. To achieve these objectives and to guarantee a suitable process, a methodology based on the Deming quality circle was chosen, which comprises four phases: plan, do, check, and act; its application guided the development of the project. This article consists of five sections: Introduction, Frame of Reference, Methodology, Results and Discussion, and Conclusions.

  3. Monte Carlo codes and Monte Carlo simulator program

    International Nuclear Information System (INIS)

    Higuchi, Kenji; Asai, Kiyoshi; Suganuma, Masayuki.

    1990-03-01

    Four typical Monte Carlo codes, KENO-IV, MORSE, MCNP and VIM, have been vectorized on the VP-100 at the Computing Center, JAERI. The problems in vector processing of Monte Carlo codes on vector processors have become clear through this work. As a result, it is recognized that there are difficulties in obtaining good performance in vector processing of Monte Carlo codes. A Monte Carlo computing machine, which processes Monte Carlo codes with high performance, has been under development at our Computing Center since 1987. The concept of the Monte Carlo computing machine and its performance have been investigated and estimated by using a software simulator. In this report, the problems in vectorization of Monte Carlo codes, the Monte Carlo pipelines proposed to mitigate these difficulties, and the results of the performance estimation of the Monte Carlo computing machine by the simulator are described. (author)

  4. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    Carlo Rubbia turned 75 on March 31, and CERN held a symposium to mark his birthday and pay tribute to his impressive contribution to both CERN and science. Carlo Rubbia, 4th from right, together with the speakers at the symposium. On 7 April CERN hosted a celebration marking Carlo Rubbia’s 75th birthday and 25 years since he was awarded the Nobel Prize for Physics. "Today we will celebrate 100 years of Carlo Rubbia," joked CERN’s Director-General, Rolf Heuer, in his opening speech, "75 years of his age and 25 years of the Nobel Prize." Rubbia received the Nobel Prize along with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. During the symposium, which was held in the Main Auditorium, several eminent speakers gave lectures on areas of science to which Carlo Rubbia made decisive contributions. Among those who spoke were Michel Spiro, Director of the French National Insti...

  5. An Indian tribal view of the back end of the nuclear fuel cycle: historical and cultural lessons

    International Nuclear Information System (INIS)

    Tano, M.L.; Powankee, D.; Lester, A.D.

    1995-01-01

    The Nez Perce Tribe, the Confederated Tribes of the Umatilla Indian Reservation and the Yakama Indian Nation have entered into cooperative agreements with the US Department of Energy to oversee the cleanup of the Hanford Reservation. The Mescalero Apache Tribe and the Meadow Lake Tribal Council have come under severe criticism from some "ideologically pure" Indians and non-Indians for aiding and abetting the violation of Mother Earth by permitting the land to be contaminated by radioactive wastes. This paper suggests that this view of the Indian relationship to nature and the environment is too narrow, and describes aspects of Indian religion that support tribal involvement in radioactive waste management. (O.M.)

  6. The Virtual Monte Carlo

    CERN Document Server

    Hrivnacova, I; Berejnov, V V; Brun, R; Carminati, F; Fassò, A; Futo, E; Gheata, A; Caballero, I G; Morsch, Andreas

    2003-01-01

    The concept of Virtual Monte Carlo (VMC) has been developed by the ALICE Software Project to allow different Monte Carlo simulation programs to run without changing the user code, such as the geometry definition, the detector response simulation or input and output formats. Recently, the VMC classes have been integrated into the ROOT framework, and the other relevant packages have been separated from the AliRoot framework and can be used individually by any other HEP project. The general concept of the VMC and its set of base classes provided in ROOT will be presented. Existing implementations for Geant3, Geant4 and FLUKA and simple examples of usage will be described.

  7. The Ability of the Acute Physiology and Chronic Health Evaluation (APACHE IV Score to Predict Mortality in a Single Tertiary Hospital

    Directory of Open Access Journals (Sweden)

    Jae Woo Choi

    2017-08-01

    Full Text Available Background: The Acute Physiology and Chronic Health Evaluation (APACHE) II model has been widely used in Korea. However, there have been few studies on the APACHE IV model in Korean intensive care units (ICUs). The aim of this study was to compare the ability of APACHE IV and APACHE II in predicting hospital mortality, and to investigate the ability of APACHE IV as a critical care triage criterion. Methods: The study was designed as a prospective cohort study. Measurements of discrimination and calibration were performed using the area under the receiver operating characteristic curve (AUROC) and the Hosmer-Lemeshow goodness-of-fit test, respectively. We also calculated the standardized mortality ratio (SMR). Results: The APACHE IV score, the Charlson Comorbidity Index (CCI) score, acute respiratory distress syndrome, and unplanned ICU admissions were independently associated with hospital mortality. The calibration, discrimination, and SMR of APACHE IV were good (H = 7.67, P = 0.465; C = 3.42, P = 0.905; AUROC = 0.759; SMR = 1.00). However, the explanatory power of an APACHE IV score >93 alone on hospital mortality was low, at 44.1%. The explanatory power increased to 53.8% when hospital mortality was predicted using a model that considered APACHE IV scores >93, medical admission, and risk factors for CCI >3 together. However, the discriminative ability of the prediction model was unsatisfactory (C index <0.70). Conclusions: The APACHE IV presented good discrimination, calibration, and SMR for hospital mortality.

  8. Outcrop Analysis of the Cretaceous Mesaverde Group: Jicarilla Apache Reservation, New Mexico

    Energy Technology Data Exchange (ETDEWEB)

    Ridgley, Jennie; Dunbar, Robin Wright

    2001-04-24

    Field work for this project was conducted during July and April 1998, at which time fourteen measured sections were described and correlated on or adjacent to Jicarilla Apache Reservation lands. A fifteenth section, described east of the main field area, is included in this report, although its distant location precluded its use in the correlations and cross sections presented herein. Ground-based photo mosaics were shot for much of the exposed Mesaverde outcrop belt and were used to assist in correlation. Outcrop gamma-ray surveys were conducted at six of the fifteen measured sections using a GAD-6 scintillometer. The raw gamma-ray data are included in this report; however, analysis of those data is part of the ongoing Phase Two of this project.

  9. Inequalities in Open Source Software Development: Analysis of Contributor's Commits in Apache Software Foundation Projects.

    Science.gov (United States)

    Chełkowski, Tadeusz; Gloor, Peter; Jemielniak, Dariusz

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still a small number of studies analyzing larger samples of projects to investigate the structure of activities among OSS developers. The significant amount of information gathered in publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from the repositories of 263 Apache projects (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution.
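
    The power-law claim can be checked with the standard continuous maximum-likelihood estimator alpha = 1 + n / sum(ln(x_i / x_min)) of Clauset et al.; the per-contributor commit counts below are fabricated for illustration.

      import math

      commits = [1, 1, 2, 2, 3, 5, 8, 13, 40, 120, 800]   # commits per contributor
      xmin = 1.0                                          # smallest count kept in the fit
      alpha = 1.0 + len(commits) / sum(math.log(c / xmin) for c in commits)
      print(f"estimated power-law exponent alpha = {alpha:.2f}")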

  10. Extraction of UMLS® Concepts Using Apache cTAKES™ for German Language.

    Science.gov (United States)

    Becker, Matthias; Böckmann, Britta

    2016-01-01

    Automatic extraction of medical concepts from medical reports, and their classification with semantic standards, is useful for standardization and for clinical research. This paper presents an approach to UMLS concept extraction with a customized natural language processing pipeline for German clinical notes using Apache cTAKES. The objective is to test whether the natural language processing tool is suitable for German-language text, identifying UMLS concepts and mapping them to SNOMED CT. The German UMLS database and German OpenNLP models extended the natural language processing pipeline, so the pipeline can normalize to domain ontologies such as SNOMED CT using the German concepts. For testing, the ShARe/CLEF eHealth 2013 training dataset translated into German was used. The implemented algorithms were tested with a set of 199 German reports, obtaining an average F1 measure of 0.36 without German stemming and pre- and post-processing of the reports.

  11. 76 FR 47441 - Safety Zone; Apache Pier Labor Day Weekend Fireworks Display, Atlantic Ocean, Myrtle Beach, SC

    Science.gov (United States)

    2011-08-05

    ... Display, Atlantic Ocean, Myrtle Beach, SC AGENCY: Coast Guard, DHS. ACTION: Temporary final rule. SUMMARY... vicinity of Apache Pier in Myrtle Beach, South Carolina during a Labor Day weekend fireworks display on... fireworks display is scheduled to take place in Myrtle Beach, South Carolina. The fireworks will be launched...

  12. Post-High School Careers of Apache Special Education and Regular Education Subjects: A Five Year Follow-Up Study.

    Science.gov (United States)

    Rangasamy, Ramasamy

    This study analyzed the employment situation of 106 Apache youth, of whom 52 had been special education students. All students had exited high school between 1987 and 1992. Findings indicate that 65 percent of the regular education students and 73 percent of special education students were unemployed, that 50 percent of working special education…

  13. Are cicadas (Diceroprocta apache) both a "keystone" and a "critical-link" species in lower Colorado River riparian communities?

    Science.gov (United States)

    Andersen, Douglas C.

    1994-01-01

    Apache cicada (Homoptera: Cicadidae: Diceroprocta apache Davis) densities were estimated to be 10 individuals/m2 within a closed-canopy stand of Fremont cottonwood (Populus fremontii) and Goodding willow (Salix gooddingii) in a revegetated site adjacent to the Colorado River near Parker, Arizona. Coupled with data drawn from the literature, I estimate that up to 1.3 cm (13 l/m2) of water may be added to the upper soil layers annually through the feeding activities of cicada nymphs. This is equivalent to 12% of the annual precipitation received in the study area. Apache cicadas may have significant effects on ecosystem functioning via effects on water transport and thus act as a critical-link species in this southwest desert riverine ecosystem. Cicadas emerged later within the cottonwood-willow stand than in relatively open saltcedar-mesquite stands; this difference in temporal dynamics would affect their availability to several insectivorous bird species and may help explain the birds' recent declines. Resource managers in this region should be sensitive to the multiple and strong effects that Apache cicadas may have on ecosystem structure and functioning.

  14. Genetic analysis of resistance to septoria tritici blotch in the French winter wheat cultivars Balance and Apache

    NARCIS (Netherlands)

    Tabib Ghaffary, M.S.; Robert, O.; Laurent, V.; Lonnet, P.; Margalé, E.; Lee, van der T.A.J.; Visser, R.G.F.; Kema, G.H.J.

    2011-01-01

    The ascomycete Mycosphaerella graminicola is the causal agent of septoria tritici blotch (STB), one of the most destructive foliar diseases of bread and durum wheat globally, particularly in temperate humid areas. A screening of the French bread wheat cultivars Apache and Balance with 30 M.

  15. Development of a helmet/helmet-display-unit alignment tool (HAT) for the Apache helmet and display unit

    Science.gov (United States)

    McLean, William; Statz, Jonathan; Estes, Victor; Booms, Shawn; Martin, John S.; Harding, Thomas

    2015-05-01

    Project Manager (PM) Apache Block III contacted the U.S. Army Aeromedical Research Laboratory (USAARL), Fort Rucker, Alabama, requesting assistance to evaluate and find solutions to a government-developed Helmet Display Unit (HDU) device called the Mock HDU for helmet alignment of the Apache Advanced Integrated Helmet (AAIH). The AAIH is a modified Head Gear Unit No. 56 for Personnel (HGU-56/P) to replace the current Integrated Helmet and Sighting System (IHADSS). The current flashlight-based HDU simulator for helmet/HDU alignment was no longer in production or available. Proper helmet/HDU alignment is critical to position the right eye in the small HDU eye box to obtain image alignment and full field of view (FOV). The initial approach of the PM to developing a helmet/HDU fitting device (Mock HDU) was to duplicate the optical characteristics of the current tactical HDU using less complex optics. However, the results produced questionable alignment, FOV, and distortion issues, with cost and development time overruns. After evaluating the Mock HDU, USAARL proposed a cost effective, less complex optical design called the Helmet/HDU Alignment Tool (HAT). This paper will show the development, components, and evaluations of the HAT compared to the current flashlight HDU simulator device. The laboratory evaluations included FOV measurements and alignment accuracies compared to tactical HDUs. The Apache helmet fitter technicians and Apache pilots compared the HAT to the current flashlight based HDU and ranked the HAT superior.

  16. Overview of the SDSS-IV MaNGA Survey: Mapping nearby Galaxies at Apache Point Observatory

    NARCIS (Netherlands)

    Bundy, Kevin; Bershady, Matthew A.; Law, David R.; Yan, Renbin; Drory, Niv; MacDonald, Nicholas; Wake, David A.; Cherinka, Brian; Sánchez-Gallego, José R.; Weijmans, Anne-Marie; Thomas, Daniel; Tremonti, Christy; Masters, Karen; Coccato, Lodovico; Diamond-Stanic, Aleksandar M.; Aragón-Salamanca, Alfonso; Avila-Reese, Vladimir; Badenes, Carles; Falcón-Barroso, Jésus; Belfiore, Francesco; Bizyaev, Dmitry; Blanc, Guillermo A.; Bland-Hawthorn, Joss; Blanton, Michael R.; Brownstein, Joel R.; Byler, Nell; Cappellari, Michele; Conroy, Charlie; Dutton, Aaron A.; Emsellem, Eric; Etherington, James; Frinchaboy, Peter M.; Fu, Hai; Gunn, James E.; Harding, Paul; Johnston, Evelyn J.; Kauffmann, Guinevere; Kinemuchi, Karen; Klaene, Mark A.; Knapen, Johan H.; Leauthaud, Alexie; Li, Cheng; Lin, Lihwai; Maiolino, Roberto; Malanushenko, Viktor; Malanushenko, Elena; Mao, Shude; Maraston, Claudia; McDermid, Richard M.; Merrifield, Michael R.; Nichol, Robert C.; Oravetz, Daniel; Pan, Kaike; Parejko, John K.; Sanchez, Sebastian F.; Schlegel, David; Simmons, Audrey; Steele, Oliver; Steinmetz, Matthias; Thanjavur, Karun; Thompson, Benjamin A.; Tinker, Jeremy L.; van den Bosch, Remco C. E.; Westfall, Kyle B.; Wilkinson, David; Wright, Shelley; Xiao, Ting; Zhang, Kai

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic

  17. Next Generation Astronomical Data Processing using Big Data Technologies from the Apache Software Foundation

    Science.gov (United States)

    Mattmann, Chris

    2014-04-01

    In this era of exascale instruments for astronomy, we must develop next-generation capabilities for the unprecedented data volume and velocity that will arrive from these ground-based sensors and observatories. Integrating scientific algorithms stewarded by scientific groups unobtrusively and rapidly; intelligently selecting data movement technologies; making use of cloud computing for storage and processing; and automatically extracting text, metadata, and science from any type of file are all needed capabilities in this exciting time. Our group at NASA JPL has promoted the use of open source data management technologies available from the Apache Software Foundation (ASF) in pursuit of constructing next generation data management and processing systems for astronomical instruments including the Expanded Very Large Array (EVLA) in Socorro, NM and the Atacama Large Millimetre/Submillimetre Array (ALMA), as well as for the KAT-7 project led by SKA South Africa as a precursor to the full MeerKAT telescope. In addition, we are currently funded by the National Science Foundation in the US to work with MIT Haystack Observatory and the University of Cambridge in the UK to construct a Radio Array of Portable Interferometric Devices (RAPID) that will undoubtedly draw from the rich technology advances underway. NASA JPL is investing in a strategic initiative for Big Data that is pulling in these capabilities and technologies for astronomical instruments and also for Earth science remote sensing. In this talk I will describe the above collaborative efforts underway and point to solutions in open source from the Apache Software Foundation that can be deployed and used today and that are already bringing our teams and projects benefits. I will describe how others can take advantage of our experience and point towards future application and contribution of these tools.
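
    One ASF capability the talk highlights is automatically extracting text and metadata from any type of file, which is what Apache Tika provides. Below is a minimal sketch assuming the tika Python bindings (pip install tika, which also require a local Java runtime); the file path is a placeholder, not taken from the talk.

      # Minimal sketch: text and metadata extraction with Apache Tika via the
      # tika Python bindings. The file path is a placeholder.
      from tika import parser

      parsed = parser.from_file("observation_log.pdf")  # any format Tika understands
      print(parsed["metadata"])                # e.g. Content-Type, author, dates
      print((parsed["content"] or "")[:500])   # first 500 chars of extracted text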

  18. Variational Monte Carlo Technique

    Indian Academy of Sciences (India)

    ias

    nonprobabilistic) problem [5]. ... In quantum mechanics, the MC methods are used to simulate many-particle systems using random ... D Ceperley, G V Chester and M H Kalos, Monte Carlo simulation of a many-fermion study, Physical Review Vol.

  19. Markov Chain Monte Carlo

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo - Examples. Arnab Chakraborty. General Article, Resonance – Journal of Science Education, Volume 7, Issue 3, March 2002, pp. 25-34. Permanent link: https://www.ias.ac.in/article/fulltext/reso/007/03/0025-0034

  20. Carlo Caso (1940 - 2007)

    CERN Multimedia

    Leonardo Rossi

    Carlo Caso (1940 - 2007) Our friend and colleague Carlo Caso passed away on July 7th, after several months of courageous fight against cancer. Carlo spent most of his scientific career at CERN, taking an active part in the experimental programme of the laboratory. His long and fruitful involvement in particle physics started in the sixties, in the Genoa group led by G. Tomasini. He then carried out several experiments using the CERN liquid hydrogen bubble chambers (first the 2000HBC and later BEBC) to study various facets of the production and decay of meson and baryon resonances. He later formed his own group and joined the NA27 Collaboration to exploit the EHS Spectrometer with a rapid cycling bubble chamber as vertex detector. Amongst their many achievements, they were the first to measure, with excellent precision, the lifetime of the charmed D mesons. At the start of the LEP era, Carlo and his group moved to the DELPHI experiment, participating in the construction and running of the HPC electromagnetic c...

  1. Monte Carlo methods

    CERN Document Server

    Kalos, Melvin H

    2008-01-01

    This introduction to Monte Carlo methods seeks to identify and study the unifying elements that underlie their effective application. Initial chapters provide a short treatment of the probability and statistics needed as background, enabling those without experience in Monte Carlo techniques to apply these ideas to their research. The book focuses on two basic themes: the first is the importance of random walks as they occur both in natural stochastic systems and in their relationship to integral and differential equations. The second theme is that of variance reduction in general, and importance sampling in particular, as a technique for efficient use of the methods. Random walks are introduced with an elementary example in which the modeling of radiation transport arises directly from a schematic probabilistic description of the interaction of radiation with matter. Building on this example, the relationship between random walks and integral equations is outlined
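
    To make the book's second theme concrete, here is a minimal sketch of importance sampling for a rare-event probability; the target, proposal, and sample size are illustrative choices, not taken from the book.

      # Importance sampling sketch: estimate the Gaussian tail probability
      # P(X > 3) for X ~ N(0, 1) by sampling from a proposal shifted into the
      # tail, N(3, 1), and reweighting by the density ratio.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 100_000

      # Naive Monte Carlo: almost every sample misses the rare event.
      x = rng.standard_normal(n)
      naive = np.mean(x > 3.0)

      # Importance sampling: weight = phi(y) / phi(y - 3) = exp(4.5 - 3y).
      y = rng.standard_normal(n) + 3.0
      weights = np.exp(4.5 - 3.0 * y)
      is_est = np.mean((y > 3.0) * weights)

      print(naive, is_est)  # true value is about 1.35e-3; IS is far less noisy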

  2. Carlos Vesga Duarte

    Directory of Open Access Journals (Sweden)

    Pedro Medina Avendaño

    1981-01-01

    Full Text Available Carlos Vega Duarte had the simplicity of elemental and pure beings. His heart was as clean as alluvial gold. His direct, colloquial manner revealed a Santanderean without contaminations, who loved the gleam of weapons and was dazzled by the sparkle of perfect phrases

  3. Fundamentals of Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Wollaber, Allan Benton [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a PowerPoint presentation which serves as lecture material for the Parallel Computing summer school. It goes over the fundamentals of the Monte Carlo calculation method. The material is presented according to the following outline: Introduction (background, a simple example: estimating π), Why does this even work? (The Law of Large Numbers, The Central Limit Theorem), How to sample (inverse transform sampling, rejection), and An example from particle transport.
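
    As a concrete companion to the lecture's opening example, here is a sketch of the standard pi-estimation experiment; the sample size and seed are arbitrary.

      # Estimating pi: sample points uniformly in the unit square and count the
      # fraction falling inside the quarter circle of radius 1.
      import numpy as np

      rng = np.random.default_rng(42)
      n = 1_000_000
      x, y = rng.random(n), rng.random(n)
      inside = x * x + y * y <= 1.0

      pi_hat = 4.0 * inside.mean()
      se = 4.0 * inside.std(ddof=1) / np.sqrt(n)  # CLT-based standard error
      print(pi_hat, se)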

  4. Wormhole Hamiltonian Monte Carlo

    OpenAIRE

    Lan, S; Streets, J; Shahbaba, B

    2014-01-01

    Copyright © 2014, Association for the Advancement of Artificial Intelligence. In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, espe...

  5. Microcanonical Monte Carlo

    International Nuclear Information System (INIS)

    Creutz, M.

    1986-01-01

    The author discusses a recently developed algorithm for simulating statistical systems. The procedure interpolates between molecular dynamics methods and canonical Monte Carlo. The primary advantages are extremely fast simulations of discrete systems such as the Ising model and a relative insensitivity to random number quality. A variation of the algorithm gives rise to a deterministic dynamics for Ising spins. This model may be useful for high speed simulation of non-equilibrium phenomena
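
    The procedure described here is often called the demon algorithm; as an illustration, the following is a minimal sketch of a demon-style microcanonical update on a one-dimensional Ising chain (the lattice size, sweep count, initial demon energy, and J = 1 are assumptions for the sketch, not details from the abstract).

      # Microcanonical (demon) Monte Carlo sketch for a 1D Ising chain, J = 1.
      # A demon with a non-negative energy reserve pays for or absorbs each
      # spin flip, so total energy is conserved exactly; random numbers are
      # needed only to pick sites, not for acceptance.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 1000
      spins = np.ones(N, dtype=int)   # cold start
      demon = 20                      # initial demon energy
      demon_trace = []

      for sweep in range(200):
          for _ in range(N):
              i = rng.integers(N)
              # Energy cost of flipping spin i (periodic boundaries).
              dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
              if dE <= demon:         # demon can pay; flip is accepted
                  spins[i] = -spins[i]
                  demon -= dE
              demon_trace.append(demon)

      # The demon's energies are exponentially distributed, exp(-E/kT), so
      # their histogram provides a thermometer for the lattice.
      print(np.mean(demon_trace))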

  6. Monte Carlo alpha calculation

    Energy Technology Data Exchange (ETDEWEB)

    Brockway, D.; Soran, P.; Whalen, P.

    1985-01-01

    A Monte Carlo algorithm to efficiently calculate static alpha eigenvalues, N = n e^(αt), for supercritical systems has been developed and tested. A direct Monte Carlo approach to calculating a static alpha is to simply follow the buildup in time of neutrons in a supercritical system and evaluate the logarithmic derivative of the neutron population with respect to time. This procedure is expensive, and the solution is very noisy and almost useless for a system near critical. The modified approach is to convert the time-dependent problem to a static α-eigenvalue problem and regress α on solutions of a k-eigenvalue problem. In practice, this procedure is much more efficient than the direct calculation, and produces much more accurate results. Because the Monte Carlo codes are intrinsically three-dimensional and use elaborate continuous-energy cross sections, this technique is now used as a standard for evaluating other calculational techniques in odd geometries or with group cross sections.

  7. Who Writes Carlos Bulosan?

    Directory of Open Access Journals (Sweden)

    Charlie Samuya Veric

    2001-12-01

    Full Text Available The importance of Carlos Bulosan in Filipino and Filipino-American radical history and literature is indisputable. His eminence spans the Pacific, and he is known, diversely, as a radical poet, fictionist, novelist, and labor organizer. Author of the canonical America Is in the Heart, Bulosan is celebrated for chronicling the conditions in America in his time, such as racism and unemployment. In the history of criticism on Bulosan's life and work, however, there is an undeclared general consensus that views Bulosan and his work as coherent permanent texts of radicalism and anti-imperialism. Central to the existence of such a tradition of critical reception are the generations of critics who, in more ways than one, control the discourse on and of Carlos Bulosan. This essay inquires into the sphere of the critical reception that orders, for our time and for the time ahead, the reading and interpretation of Bulosan. What eye and seeing, the essay asks, determine the perception of Bulosan as the angel of radicalism? What is obscured in constructing Bulosan as an immutable figure of the political? What light does the reader conceive when the personal is brought into the open and situated against the political? The essay explores the answers to these questions in Bulosan's loving letters to various friends, strangers, and white American women. The presence of these interrogations, the essay believes, will ultimately secure the continuing importance of Carlos Bulosan to radical literature and history.

  8. CERN honours Carlo Rubbia

    CERN Multimedia

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency and Professor at the IUSS School for Advanced Studies in Pavia will speak about his work with Carlo Rubbia. Finally, Hans Joachim Sch...

  9. CERN honours Carlo Rubbia

    CERN Document Server

    2009-01-01

    On 7 April CERN will be holding a symposium to mark the 75th birthday of Carlo Rubbia, who shared the 1984 Nobel Prize for Physics with Simon van der Meer for contributions to the discovery of the W and Z bosons, carriers of the weak interaction. Following a presentation by Rolf Heuer, lectures will be given by eminent speakers on areas of science to which Carlo Rubbia has made decisive contributions. Michel Spiro, Director of the French National Institute of Nuclear and Particle Physics (IN2P3) of the CNRS, Lyn Evans, sLHC Project Leader, and Alan Astbury of the TRIUMF Laboratory will talk about the physics of the weak interaction and the discovery of the W and Z bosons. Former CERN Director-General Herwig Schopper will lecture on CERN’s accelerators from LEP to the LHC. Giovanni Bignami, former President of the Italian Space Agency, will speak about his work with Carlo Rubbia. Finally, Hans Joachim Schellnhuber of the Potsdam Institute for Climate Research and Sven Kul...

  10. The APACHE survey hardware and software design: Tools for an automatic search of small-size transiting exoplanets

    Directory of Open Access Journals (Sweden)

    Lattanzi M.G.

    2013-04-01

    Full Text Available Small-size ground-based telescopes can effectively be used to look for transiting rocky planets around nearby low-mass M stars using the photometric transit method, as recently demonstrated for example by the MEarth project. Since 2008, at the Astronomical Observatory of the Autonomous Region of Aosta Valley (OAVdA), we have been preparing for the long-term photometric survey APACHE, aimed at finding transiting small-size planets around thousands of nearby early and mid-M dwarfs. APACHE (A PAthway toward the Characterization of Habitable Earths) is designed to use an array of five dedicated and identical 40-cm Ritchey-Chretien telescopes and its observations started at the beginning of summer 2012. The main characteristics of the survey's final set-up and the preliminary results from the first weeks of observations will be discussed.

  11. Detection of attack-targeted scans from the Apache HTTP Server access logs

    Directory of Open Access Journals (Sweden)

    Merve Baş Seyyar

    2018-01-01

    Full Text Available A web application can be visited for different purposes. It is possible for a web site to be visited by a regular user as a normal (natural) visit, to be viewed by crawlers, bots, spiders, etc. for indexing purposes, and lastly to be exploratorily scanned by malicious users prior to an attack. An attack-targeted web scan can be viewed as a phase of a potential attack, and detecting it can provide earlier warning than traditional detection methods. In this work, we propose a method to detect attack-oriented scans and to distinguish them from other types of visits. In this context, we use access log files of Apache (or IIS) web servers and try to determine attack situations through examination of the past data. In addition to web scan detection, we add a rule set to detect SQL injection and XSS attacks. Our approach has been applied on sample data sets, and results have been analyzed in terms of performance measures to compare our method with other commonly used detection techniques. Furthermore, various tests have been made on log samples from real systems. Lastly, several suggestions for further development are also discussed.
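
    In the spirit of the rule set described above, a minimal sketch is shown below that parses Apache combined-log lines and flags requests matching simple SQL injection or XSS signatures; the regular expressions and sample line are illustrative stand-ins, not the authors' actual rules or data.

      # Rule-based scan of Apache access log lines for SQLi/XSS signatures.
      import re

      LOG_RE = re.compile(r'(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] "(?P<req>[^"]*)"')
      SQLI = re.compile(r"(union\s+select|or\s+1=1|'--|sleep\()", re.IGNORECASE)
      XSS = re.compile(r"(<script|javascript:|onerror\s*=)", re.IGNORECASE)

      def classify(line):
          """Return a warning string for a suspicious request, else None."""
          m = LOG_RE.match(line)
          if not m:
              return None
          request = m.group("req")
          if SQLI.search(request):
              return f"SQLi suspect from {m.group('ip')}: {request}"
          if XSS.search(request):
              return f"XSS suspect from {m.group('ip')}: {request}"
          return None

      sample = ('10.0.0.5 - - [10/Jan/2018:13:55:36 +0000] '
                '"GET /item?id=1 UNION SELECT password FROM users HTTP/1.1" 200 512')
      print(classify(sample))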

  12. Cultural Foundations for Ecological Restoration on the White Mountain Apache Reservation

    Directory of Open Access Journals (Sweden)

    Jonathan Long

    2003-12-01

    Full Text Available Myths, metaphors, and social norms that facilitate collective action and understanding of restoration dynamics serve as foundations for ecological restoration. The experience of the White Mountain Apache Tribe demonstrates how such cultural foundations can permeate and motivate ecological restoration efforts. Through interviews with tribal cultural advisors and restoration practitioners, we examined how various traditions inform their understanding of restoration processes. Creation stories reveal the time-honored importance and functions of water bodies within the landscape, while place names yield insights into their historical and present conditions. Traditional healing principles and agricultural traditions help guide modern restoration techniques. A metaphor of stability illustrates how restoration practitioners see links among ecological, social, and personal dimensions of health. These views inspire reciprocal relationships focused on caretaking of sites, learning from elders, and passing knowledge on to youths. Woven together, these cultural traditions uphold a system of adaptive management that has withstood the imposition of non-indigenous management schemes in the 20th century, and now provides hope for restoring health and productivity of ecosystems through individual and collective efforts. Although these traditions are adapted to the particular ecosystems of the Tribe, they demonstrate the value of understanding and promoting the diverse cultural foundations of restoration.

  13. The Goddard Integral Field Spectrograph at Apache Point Observatory: Current Status and Progress Towards Photon Counting

    Science.gov (United States)

    McElwain, Michael W.; Grady, Carol A.; Bally, John; Brinkmann, Jonathan V.; Bubeck, James; Gong, Qian; Hilton, George M.; Ketzeback, William F.; Lindler, Don; Llop Sayson, Jorge; Malatesta, Michael A.; Norton, Timothy; Rauscher, Bernard J.; Rothe, Johannes; Straka, Lorrie; Wilkins, Ashlee N.; Wisniewski, John P.; Woodgate, Bruce E.; York, Donald G.

    2015-01-01

    We present the current status and progress towards photon counting with the Goddard Integral Field Spectrograph (GIFS), a new instrument at the Apache Point Observatory's ARC 3.5m telescope. GIFS is a visible light imager and integral field spectrograph operating from 400-1000 nm over a 2.8' x 2.8' and 14' x 14' field of view, respectively. As an IFS, GIFS obtains over 1000 spectra simultaneously and its data reduction pipeline reconstructs them into an image cube that has 32 x 32 spatial elements and more than 200 spectral channels. The IFS mode can be applied to a wide variety of science programs including exoplanet transit spectroscopy, protostellar jets, the galactic interstellar medium probed by background quasars, Lyman-alpha emission line objects, and spectral imaging of galactic winds. An electron-multiplying CCD (EMCCD) detector enables photon counting in the high spectral resolution mode to be demonstrated at the ARC 3.5m in early 2015. The EMCCD work builds upon successful operational and characterization tests that have been conducted in the IFS laboratory at NASA Goddard. GIFS sets out to demonstrate an IFS photon-counting capability on-sky in preparation for future exoplanet direct imaging missions such as the AFTA-Coronagraph, Exo-C, and ATLAST mission concepts. This work is supported by the NASA APRA program under RTOP 10-APRA10-0103.

  14. High performance Spark best practices for scaling and optimizing Apache Spark

    CERN Document Server

    Karau, Holden

    2017-01-01

    Apache Spark is amazing when everything clicks. But if you haven’t seen the performance improvements you expected, or still don’t feel confident enough to use Spark in production, this practical book is for you. Authors Holden Karau and Rachel Warren demonstrate performance optimizations to help your Spark queries run faster and handle larger data sizes, while using fewer resources. Ideal for software engineers, data engineers, developers, and system administrators working with large-scale data applications, this book describes techniques that can reduce data infrastructure costs and developer hours. Not only will you gain a more comprehensive understanding of Spark, you’ll also learn how to make it sing. With this book, you’ll explore: How Spark SQL’s new interfaces improve performance over SQL’s RDD data structure The choice between data joins in Core Spark and Spark SQL Techniques for getting the most out of standard RDD transformations How to work around performance issues i...
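
    One recurring recommendation in the book's blurb is preferring Spark SQL's DataFrame interfaces over raw RDD transformations where possible; the sketch below shows the same aggregation both ways (the column names and data are illustrative, and running it assumes a local PySpark installation).

      # DataFrame vs. RDD aggregation in PySpark. The DataFrame version goes
      # through the Catalyst optimizer; the RDD version uses opaque lambdas.
      from pyspark.sql import SparkSession
      from pyspark.sql import functions as F

      spark = SparkSession.builder.appName("sketch").getOrCreate()
      df = spark.createDataFrame([("a", 1), ("b", 5), ("a", 3)], ["key", "value"])

      # Optimizable, plan-based aggregation.
      df.groupBy("key").agg(F.sum("value").alias("total")).show()

      # Equivalent RDD transformation, invisible to the optimizer.
      totals = df.rdd.map(lambda r: (r["key"], r["value"])).reduceByKey(lambda a, b: a + b)
      print(totals.collect())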

  15. The dimensionality of stellar chemical space using spectra from the Apache Point Observatory Galactic Evolution Experiment

    Science.gov (United States)

    Price-Jones, Natalie; Bovy, Jo

    2018-03-01

    Chemical tagging of stars based on their similar compositions can offer new insights about the star formation and dynamical history of the Milky Way. We investigate the feasibility of identifying groups of stars in chemical space by forgoing the use of model-derived abundances in favour of direct analysis of spectra. This facilitates the propagation of measurement uncertainties and does not pre-suppose knowledge of which elements are important for distinguishing stars in chemical space. We use ∼16,000 red giant and red clump H-band spectra from the Apache Point Observatory Galactic Evolution Experiment (APOGEE) and perform polynomial fits to remove trends not due to abundance-ratio variations. Using expectation maximized principal component analysis, we find principal components with high signal in the wavelength regions most important for distinguishing between stars. Different subsamples of red giant and red clump stars are all consistent with needing about 10 principal components to accurately model the spectra above the level of the measurement uncertainties. The dimensionality of stellar chemical space that can be investigated in the H band is therefore ≲10. For APOGEE observations with typical signal-to-noise ratios of 100, the number of chemical space cells within which stars cannot be distinguished is approximately 10^(10±2) × (5 ± 2)^(n−10), with n the number of principal components. This high dimensionality and the fine-grained sampling of chemical space are a promising first step towards chemical tagging based on spectra alone.

  16. The Rotation of M Dwarfs Observed by the Apache Point Galactic Evolution Experiment

    Science.gov (United States)

    Gilhool, Steven H.; Blake, Cullen H.; Terrien, Ryan C.; Bender, Chad; Mahadevan, Suvrath; Deshpande, Rohit

    2018-01-01

    We present the results of a spectroscopic analysis of rotational velocities in 714 M-dwarf stars observed by the SDSS-III Apache Point Galactic Evolution Experiment (APOGEE) survey. We use a template-fitting technique to estimate v sin i while simultaneously estimating log g, [M/H], and T_eff. We conservatively estimate that our detection limit is 8 km s^-1. We compare our results to M-dwarf rotation studies in the literature based on both spectroscopic and photometric measurements. Like other authors, we find an increase in the fraction of rapid rotators with decreasing stellar temperature, exemplified by a sharp increase in rotation near the M4 transition to fully convective stellar interiors, which is consistent with the hypothesis that fully convective stars are unable to shed angular momentum as efficiently as those with radiative cores. We compare a sample of targets observed both by APOGEE and the MEarth transiting planet survey and find no cases where the measured v sin i and rotation period are physically inconsistent, requiring sin i > 1. We compare our spectroscopic results to the fraction of rotators inferred from photometric surveys and find that while the results are broadly consistent, the photometric surveys exhibit a smaller fraction of rotators beyond the M4 transition by a factor of ∼2. We discuss possible reasons for this discrepancy. Given our detection limit, our results are consistent with a bimodal distribution in rotation that is seen in photometric surveys.

  17. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    International Nuclear Information System (INIS)

    Nidever, David L.; Holtzman, Jon A.; Prieto, Carlos Allende; Mészáros, Szabolcs; Beland, Stephane; Bender, Chad; Deshpande, Rohit; Bizyaev, Dmitry; Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C.; Fleming, Scott W.; Muna, Demitri; Nguyen, Duy; Schiavon, Ricardo P.; Shetrone, Matthew

    2015-01-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s^−1) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  18. THE DATA REDUCTION PIPELINE FOR THE APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Nidever, David L. [Department of Astronomy, University of Michigan, Ann Arbor, MI 48109 (United States); Holtzman, Jon A. [New Mexico State University, Las Cruces, NM 88003 (United States); Prieto, Carlos Allende; Mészáros, Szabolcs [Instituto de Astrofísica de Canarias, Via Láctea s/n, E-38205 La Laguna, Tenerife (Spain); Beland, Stephane [Laboratory for Atmospheric and Space Sciences, University of Colorado at Boulder, Boulder, CO (United States); Bender, Chad; Deshpande, Rohit [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory and New Mexico State University, P.O. Box 59, Sunspot, NM 88349-0059 (United States); Burton, Adam; García Pérez, Ana E.; Hearty, Fred R.; Majewski, Steven R.; Skrutskie, Michael F.; Sobeck, Jennifer S.; Wilson, John C. [Department of Astronomy, University of Virginia, Charlottesville, VA 22904-4325 (United States); Fleming, Scott W. [Computer Sciences Corporation, 3700 San Martin Dr, Baltimore, MD 21218 (United States); Muna, Demitri [Department of Astronomy and the Center for Cosmology and Astro-Particle Physics, The Ohio State University, Columbus, OH 43210 (United States); Nguyen, Duy [Department of Astronomy and Astrophysics, University of Toronto, Toronto, Ontario, M5S 3H4 (Canada); Schiavon, Ricardo P. [Gemini Observatory, 670 N. A’Ohoku Place, Hilo, HI 96720 (United States); Shetrone, Matthew, E-mail: dnidever@umich.edu [University of Texas at Austin, McDonald Observatory, Fort Davis, TX 79734 (United States)

    2015-12-15

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE), part of the Sloan Digital Sky Survey III, explores the stellar populations of the Milky Way using the Sloan 2.5-m telescope linked to a high resolution (R ∼ 22,500), near-infrared (1.51–1.70 μm) spectrograph with 300 optical fibers. For over 150,000 predominantly red giant branch stars that APOGEE targeted across the Galactic bulge, disks and halo, the collected high signal-to-noise ratio (>100 per half-resolution element) spectra provide accurate (∼0.1 km s^−1) RVs, stellar atmospheric parameters, and precise (≲0.1 dex) chemical abundances for about 15 chemical species. Here we describe the basic APOGEE data reduction software that reduces multiple 3D raw data cubes into calibrated, well-sampled, combined 1D spectra, as implemented for the SDSS-III/APOGEE data releases (DR10, DR11 and DR12). The processing of the near-IR spectral data of APOGEE presents some challenges for reduction, including automated sky subtraction and telluric correction over a 3°-diameter field and the combination of spectrally dithered spectra. We also discuss areas for future improvement.

  19. APACHE II SCORING SYSTEM AND ITS MODIFICATION FOR THE ASSESSMENT OF DISEASE SEVERITY IN CHILDREN WHO UNDERWENT POLYCHEMOTHERAPY

    Directory of Open Access Journals (Sweden)

    А. V. Sotnikov

    2014-01-01

    Full Text Available Short-term disease prognosis, based on an assessment of disease severity, should be considered when choosing the appropriate treatment policy for patients with acute disease. Adequate assessment of disease severity and prognosis allows the indications for transferring patients to the resuscitation and intensive care department to be defined more precisely. The disease severity of patients who underwent polychemotherapy was assessed using the APACHE II scoring system.

  20. Efficient Streaming Mass Spatio-Temporal Vehicle Data Access in Urban Sensor Networks Based on Apache Storm.

    Science.gov (United States)

    Zhou, Lianjie; Chen, Nengcheng; Chen, Zeqiang

    2017-04-10

    The efficient data access of streaming vehicle data is the foundation of analyzing, using and mining vehicle data in smart cities, which is an approach to understanding traffic environments. However, the number of vehicles in urban cities has grown rapidly, reaching hundreds of thousands in number. Accessing the mass streaming data of vehicles is hard and takes a long time due to limited computation capability and backward modes. We propose an efficient streaming spatio-temporal data access based on Apache Storm (ESDAS) to achieve real-time streaming data access and data cleaning. As a popular streaming data processing tool, Apache Storm can be applied to streaming mass data access and real-time data cleaning. By designing the spout/bolt workflow of the topology in ESDAS and by developing the speeding bolt and other bolts, Apache Storm can achieve the prospective aim. In our experiments, Taiyuan BeiDou bus location data is selected as the mass spatio-temporal data source, the data access results with different bolts are shown in map form, and the aggregation patterns of the filtered buses differ. In terms of performance evaluation, the consumption time in ESDAS for ten thousand records per second with a speeding bolt is approximately 300 milliseconds, while that for MongoDB is approximately 1300 milliseconds. The efficiency of ESDAS is therefore approximately three times higher than that of MongoDB.
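
    ESDAS implements its filters as bolts inside an Apache Storm topology; as a language-agnostic illustration of what such a "speeding bolt" does, here is a plain-Python sketch of the filtering step. The field names and the speed threshold are hypothetical, and in ESDAS itself this logic would run inside a Storm bolt rather than a generator.

      # Sketch of a "speeding bolt": pass through only vehicle records whose
      # reported speed exceeds a threshold, so downstream bolts can clean or
      # flag them.
      def speeding_bolt(records, limit_kmh=80.0):
          for rec in records:
              if rec["speed_kmh"] > limit_kmh:
                  yield rec

      stream = [
          {"bus_id": 12, "speed_kmh": 35.0, "ts": "2016-05-01T08:00:00"},
          {"bus_id": 47, "speed_kmh": 92.5, "ts": "2016-05-01T08:00:01"},
      ]
      for flagged in speeding_bolt(stream):
          print(flagged)   # only bus 47 is emitted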

  1. Predictive value of the APACHE II, SAPS II, SOFA and GCS scoring systems in patients with severe purulent bacterial meningitis.

    Science.gov (United States)

    Pietraszek-Grzywaczewska, Iwona; Bernas, Szymon; Łojko, Piotr; Piechota, Anna; Piechota, Mariusz

    2016-01-01

    Scoring systems in critical care patients are essential for predicting of the patient outcome and evaluating the therapy. In this study, we determined the value of the Acute Physiology and Chronic Health Evaluation II (APACHE II), Simplified Acute Physiology Score II (SAPS II), Sequential Organ Failure Assessment (SOFA) and Glasgow Coma Scale (GCS) scoring systems in the prediction of mortality in adult patients admitted to the intensive care unit (ICU) with severe purulent bacterial meningitis. We retrospectively analysed data from 98 adult patients with severe purulent bacterial meningitis who were admitted to the single ICU between March 2006 and September 2015. Univariate logistic regression identified the following risk factors of death in patients with severe purulent bacterial meningitis: APACHE II, SAPS II, SOFA, and GCS scores, and the lengths of ICU stay and hospital stay. The independent risk factors of patient death in multivariate analysis were the SAPS II score, the length of ICU stay and the length of hospital stay. In the prediction of mortality according to the area under the curve, the SAPS II score had the highest accuracy followed by the APACHE II, GCS and SOFA scores. For the prediction of mortality in a patient with severe purulent bacterial meningitis, SAPS II had the highest accuracy.
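
    The analysis pattern used here (logistic regression of mortality on a severity score, evaluated by the area under the ROC curve) can be sketched as follows; the data are synthetic stand-ins with an assumed score-mortality relationship, not the study's patients.

      # Logistic regression + ROC AUC on synthetic severity scores.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.metrics import roc_auc_score

      rng = np.random.default_rng(7)
      n = 98
      saps2 = rng.normal(50, 15, n)                  # synthetic SAPS II scores
      p_death = 1 / (1 + np.exp(-(saps2 - 55) / 8))  # assumed true relationship
      died = rng.random(n) < p_death

      X = saps2.reshape(-1, 1)
      model = LogisticRegression().fit(X, died)
      auc = roc_auc_score(died, model.predict_proba(X)[:, 1])
      print(f"AUC = {auc:.2f}")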

  2. Monte Carlo alpha deposition

    International Nuclear Information System (INIS)

    Talley, T.L.; Evans, F.

    1988-01-01

    Prior work demonstrated the importance of nuclear scattering to fusion product energy deposition in hot plasmas. This suggests careful examination of nuclear physics details in burning plasma simulations. An existing Monte Carlo fast ion transport code is being expanded to be a test bed for this examination. An initial extension, the energy deposition of fast alpha particles in a hot deuterium plasma, is reported. The deposition times and deposition ranges are modified by allowing nuclear scattering. Up to 10% of the initial alpha particle energy is carried to greater ranges and times by the more mobile recoil deuterons. 4 refs., 5 figs., 2 tabs

  3. Field studies at the Apache Leap Research Site in support of alternative conceptual models

    International Nuclear Information System (INIS)

    Woodhouse, E.G.; Davidson, G.R.; Theis, C.

    1997-08-01

    This is a final technical report for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-51) with the University of Arizona. The contract was an optional extension that was initiated on July 21, 1994 and that expired on May 31, 1995. The project manager was Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this contract were to examine hypotheses and conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. The results discussed here are products of specific tasks that address a broad spectrum of issues related to flow and transport through fractures. Each chapter in this final report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. The tasks include detection and characterization of historical rapid fluid flow through fractured rock and the relationship to perched water systems using environmental isotopic tracers of 3H and 14C, fluid- and rock-derived 234U/238U measurements, and geophysical data. The water balance in a small watershed at the ALRS demonstrates the methods of accounting for ET and estimating the quantity of water available for infiltration through fracture networks. Grain density measurements were made for core-sized samples using a newly designed gas pycnometer. The distribution and magnitude of air permeability have been measured in a three-dimensional setting; the subsequent geostatistical analysis is presented. Electronic versions of the data presented here are available from the authors; more detailed discussions and analyses are available in technical publications referenced herein, or soon to appear in the professional literature

  4. Field studies at the Apache Leap Research Site in support of alternative conceptual models

    Energy Technology Data Exchange (ETDEWEB)

    Woodhouse, E.G.; Davidson, G.R.; Theis, C. [eds.] [and others

    1997-08-01

    This is a final technical report for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-51) with the University of Arizona. The contract was an optional extension that was initiated on July 21, 1994 and that expired on May 31, 1995. The project manager was Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this contract were to examine hypotheses and conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. The results discussed here are products of specific tasks that address a broad spectrum of issues related to flow and transport through fractures. Each chapter in this final report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. The tasks include detection and characterization of historical rapid fluid flow through fractured rock and the relationship to perched water systems using environmental isotopic tracers of 3H and 14C, fluid- and rock-derived 234U/238U measurements, and geophysical data. The water balance in a small watershed at the ALRS demonstrates the methods of accounting for ET and estimating the quantity of water available for infiltration through fracture networks. Grain density measurements were made for core-sized samples using a newly designed gas pycnometer. The distribution and magnitude of air permeability have been measured in a three-dimensional setting; the subsequent geostatistical analysis is presented. Electronic versions of the data presented here are available from the authors; more detailed discussions and analyses are available in technical publications referenced herein, or soon to appear in the professional literature.

  5. Temporal Variations of Telluric Water Vapor Absorption at Apache Point Observatory

    Science.gov (United States)

    Li, Dan; Blake, Cullen H.; Nidever, David; Halverson, Samuel P.

    2018-01-01

    Time-variable absorption by water vapor in Earth’s atmosphere presents an important source of systematic error for a wide range of ground-based astronomical measurements, particularly at near-infrared wavelengths. We present results from the first study on the temporal and spatial variability of water vapor absorption at Apache Point Observatory (APO). We analyze ∼400,000 high-resolution, near-infrared (H-band) spectra of hot stars collected as calibration data for the APO Galactic Evolution Experiment (APOGEE) survey. We fit for the optical depths of telluric water vapor absorption features in APOGEE spectra and convert these optical depths to Precipitable Water Vapor (PWV) using contemporaneous data from a GPS-based PWV monitoring station at APO. Based on simultaneous measurements obtained over a 3° field of view, we estimate that our PWV measurement precision is ±0.11 mm. We explore the statistics of PWV variations over a range of timescales from less than an hour to days. We find that the amplitude of PWV variations within an hour is less than 1 mm for most (96.5%) APOGEE field visits. By considering APOGEE observations that are close in time but separated by large distances on the sky, we find that PWV is homogeneous across the sky at a given epoch, with 90% of measurements taken up to 70° apart within 1.5 hr having ΔPWV < 1.0 mm. Our results can be used to help simulate the impact of water vapor absorption on upcoming surveys at continental observing sites like APO, and also to help plan for simultaneous water vapor metrology that may be carried out in support of upcoming photometric and spectroscopic surveys.

  6. Indian Legends.

    Science.gov (United States)

    Gurnoe, Katherine J.; Skjervold, Christian, Ed.

    Presenting American Indian legends, this material provides insight into the cultural background of the Dakota, Ojibwa, and Winnebago people. Written in a straightforward manner, each of the eight legends is associated with an Indian group. The legends included here are titled as follows: Minnesota is Minabozho's Land (Ojibwa); How We Got the…

  7. Monte Carlo Methods in Physics

    International Nuclear Information System (INIS)

    Santoso, B.

    1997-01-01

    The method of Monte Carlo integration is reviewed briefly and some of its applications in physics are explained. A numerical experiment on the random generators used in Monte Carlo techniques is carried out to show the behavior of the randomness of the various methods of generating them. To account for the weight function involved in the Monte Carlo, the Metropolis method is used. From the results of the experiment, one can see that there are no regular patterns in the numbers generated, showing that the program generators are reasonably good, while the experimental results show a statistical distribution obeying the statistical distribution law. Further, some applications of the Monte Carlo methods in physics are given. The physical problems are chosen such that the models have solutions available, either exact or approximate, with which comparisons can be made against the calculations using the Monte Carlo method. The comparisons show that, for the models considered, good agreement has been obtained
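
    The Metropolis step mentioned above can be written down in a few lines; this is a generic sketch targeting an unnormalized weight function, with the proposal scale, step count, and example target chosen arbitrarily rather than taken from the review.

      # Random-walk Metropolis sampling of an unnormalized weight function w(x).
      import numpy as np

      def metropolis(w, x0=0.0, steps=50_000, scale=1.0, seed=0):
          rng = np.random.default_rng(seed)
          x, samples = x0, []
          for _ in range(steps):
              proposal = x + scale * rng.standard_normal()
              # Accept with probability min(1, w(proposal) / w(x)).
              if rng.random() < w(proposal) / w(x):
                  x = proposal
              samples.append(x)
          return np.array(samples)

      # Example target: unnormalized standard normal weight.
      chain = metropolis(lambda x: np.exp(-0.5 * x * x))
      print(chain.mean(), chain.std())   # approximately 0 and 1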

  8. Metropolis Methods for Quantum Monte Carlo Simulations

    OpenAIRE

    Ceperley, D. M.

    2003-01-01

    Since its first description fifty years ago, the Metropolis Monte Carlo method has been used in a variety of different ways for the simulation of continuum quantum many-body systems. This paper will consider some of the generalizations of the Metropolis algorithm employed in quantum Monte Carlo: variational Monte Carlo, dynamical methods for projector Monte Carlo (i.e., diffusion Monte Carlo with rejection), multilevel sampling in path integral Monte Carlo, the sampling of permutations, ...

  9. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    listening to Indian classical music. Mohan Delampady is at the Indian Statistical Institute, Bangalore. His research interests include robustness, nonparametric inference and computing in Bayesian statistics. T Krishnan is now a full-time Technical Consultant at Systat Software Asia-Pacific Ltd., in Bangalore, where the.

  10. Parallelizing Monte Carlo with PMC

    International Nuclear Information System (INIS)

    Rathkopf, J.A.; Jones, T.R.; Nessett, D.M.; Stanberry, L.C.

    1994-11-01

    PMC (Parallel Monte Carlo) is a system of generic interface routines that allows easy porting of Monte Carlo packages of large-scale physics simulation codes to Massively Parallel Processor (MPP) computers. By loading various versions of PMC, simulation code developers can configure their codes to run in several modes: serial, Monte Carlo runs on the same processor as the rest of the code; parallel, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on other MPP processor(s); distributed, Monte Carlo runs in parallel across many processors of the MPP with the rest of the code running on a different machine. This multi-mode approach allows maintenance of a single simulation code source regardless of the target machine. PMC handles passing of messages between nodes on the MPP, passing of messages between a different machine and the MPP, distributing work between nodes, and providing independent, reproducible sequences of random numbers. Several production codes have been parallelized under the PMC system. Excellent parallel efficiency in both the distributed and parallel modes results if sufficient workload is available per processor. Experiences with a Monte Carlo photonics demonstration code and a Monte Carlo neutronics package are described
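
    The pattern PMC generalizes (independent workers, each with its own reproducible random stream, whose partial results are combined) can be sketched with Python multiprocessing standing in for PMC's MPP message passing; the batch sizes, seeds, and the pi-estimation workload are illustrative, not PMC's actual interface.

      # Distribute Monte Carlo batches across workers, each with its own
      # reproducible random stream, then combine the partial estimates.
      import numpy as np
      from multiprocessing import Pool

      def batch_estimate(args):
          seed, n = args
          rng = np.random.default_rng(seed)    # independent, reproducible stream
          x, y = rng.random(n), rng.random(n)
          return 4.0 * np.mean(x * x + y * y <= 1.0)   # per-worker pi estimate

      if __name__ == "__main__":
          with Pool(4) as pool:
              parts = pool.map(batch_estimate, [(seed, 250_000) for seed in range(4)])
          print(np.mean(parts))   # combined estimate over all workers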

  11. Predictive ability of the ISS, NISS, and APACHE II score for SIRS and sepsis in polytrauma patients.

    Science.gov (United States)

    Mica, L; Furrer, E; Keel, M; Trentz, O

    2012-12-01

    Systemic inflammatory response syndrome (SIRS) and sepsis as causes of multiple organ dysfunction syndrome (MODS) remain challenging to treat in polytrauma patients. In this study, the focus was set on widely used scoring systems to assess their diagnostic quality. A total of 512 patients (mean age: 39.2 ± 16.2, range: 16-88 years) who had an Injury Severity Score (ISS) ≥17 were included in this retrospective study. The patients were subdivided into four groups: no SIRS, slight SIRS, severe SIRS, and sepsis. The ISS, New Injury Severity Score (NISS), Acute Physiology and Chronic Health Evaluation II (APACHE II) scores, and prothrombin time were collected at admission. The Kruskal-Wallis test and χ(2)-test, multinomial regression analysis, and kernel density estimates were performed. Receiver operating characteristic (ROC) analysis is reported as the area under the curve (AUC). Data were considered significant if p < 0.05. Odds ratios increased with SIRS severity for NISS (slight vs. no SIRS, 1.06, p = 0.07; severe vs. no SIRS, 1.07, p = 0.04; and sepsis vs. no SIRS, 1.11, p = 0.0028) and APACHE II score (slight vs. no SIRS, 0.97, p = 0.44; severe vs. no SIRS, 1.08, p = 0.02; and sepsis vs. no SIRS, 1.12, p = 0.0028). ROC analysis revealed that the NISS (slight vs. no SIRS, AUC 0.61; severe vs. no SIRS, AUC 0.67; and sepsis vs. no SIRS, AUC 0.77) and APACHE II score (slight vs. no SIRS, AUC 0.60; severe vs. no SIRS, AUC 0.74; and sepsis vs. no SIRS, AUC 0.82) had the best predictive ability for SIRS and sepsis. Quick assessment with the NISS or APACHE II score could preselect possible candidates for sepsis following polytrauma and provide guidance in trauma surgeons' decision-making.

  12. Lectures on Monte Carlo methods

    CERN Document Server

    Madras, Neal

    2001-01-01

    Monte Carlo methods form an experimental branch of mathematics that employs simulations driven by random number generators. These methods are often used when others fail, since they are much less sensitive to the "curse of dimensionality", which plagues deterministic methods in problems with a large number of variables. Monte Carlo methods are used in many fields: mathematics, statistics, physics, chemistry, finance, computer science, and biology, for instance. This book is an introduction to Monte Carlo methods for anyone who would like to use these methods to study various kinds of mathemati

  13. Wormhole Hamiltonian Monte Carlo

    Science.gov (United States)

    Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak

    2015-01-01

    In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function. PMID:25861551

  14. Wormhole Hamiltonian Monte Carlo.

    Science.gov (United States)

    Lan, Shiwei; Streets, Jeffrey; Shahbaba, Babak

    2014-07-31

    In machine learning and statistics, probabilistic inference involving multimodal distributions is quite difficult. This is especially true in high dimensional problems, where most existing algorithms cannot easily move from one mode to another. To address this issue, we propose a novel Bayesian inference approach based on Markov Chain Monte Carlo. Our method can effectively sample from multimodal distributions, especially when the dimension is high and the modes are isolated. To this end, it exploits and modifies the Riemannian geometric properties of the target distribution to create wormholes connecting modes in order to facilitate moving between them. Further, our proposed method uses the regeneration technique in order to adapt the algorithm by identifying new modes and updating the network of wormholes without affecting the stationary distribution. To find new modes, as opposed to rediscovering those previously identified, we employ a novel mode searching algorithm that explores a residual energy function obtained by subtracting an approximate Gaussian mixture density (based on previously discovered modes) from the target density function.

  15. Initial dosing regimen of vancomycin to achieve early therapeutic plasma concentration in critically ill patients with MRSA infection based on APACHE II score.

    Science.gov (United States)

    Imaura, Masaharu; Yokoyama, Haruko; Kohata, Yuji; Kanai, Riichiro; Kohyama, Tomoki; Idemitsu, Wataru; Maki, Yuichi; Igarashi, Takashi; Takahashi, Hiroyuki; Kanno, Hiroshi; Yamada, Yasuhiko

    2016-06-01

    It is essential to assure the efficacy of antimicrobials at the initial phase of therapy. However, the increased volume of distribution (Vd) of hydrophilic antimicrobials in critically ill patients leads to reduced antimicrobial concentrations in plasma and tissue, which may adversely affect the efficacy of that therapy. The aim of the present study was to establish a theoretical methodology for setting an appropriate level for initial vancomycin therapy in individual patients based on the Acute Physiology and Chronic Health Evaluation (APACHE) II score. We obtained data from patients who received intravenous vancomycin for a suspected or definitively diagnosed Gram-positive bacterial infection within 72 h after admission to the intensive care unit. The Vd and elimination half-life (t 1/2) values of vancomycin were calculated using the Bayesian method, and we investigated the relationship between them and the APACHE II score. There were significant correlations between APACHE II scores and Vd/actual body weight (ABW), as well as t 1/2 (r = 0.58, p < 0.05). The Vd and t 1/2 of vancomycin could be estimated using the following regression equations based on the APACHE II score: [Formula: see text] [Formula: see text] We found that the APACHE II score was a useful index for predicting the Vd and t 1/2 of vancomycin, and used it to establish an initial vancomycin dosing regimen, comprising the initial dose and administration interval, for individual patients.

  16. Effectively Engaging in Tribal Consultation to protect Traditional Cultural Properties while navigating the 1872 Mining Law - Tonto National Forest, Western Apache Tribes, & Resolution Copper Mine

    Science.gov (United States)

    Nez, N.

    2017-12-01

    By effectively engaging in government-to-government consultation, the Tonto National Forest is able to consider oral histories and tribal cultural knowledge in decision making. These conversations often have the potential to lead to the protection and preservation of public lands. Discussed here is one example of successful tribal consultation and how it led to the protection of Traditional Cultural Properties (TCPs). One hour east of Phoenix, Arizona, on the Tonto National Forest, Resolution Copper Mine is working to access a rich copper vein more than 7,000 feet deep. As part of the mining plan of operation, they are investigating viable locations to store the earth removed from the mine site. One proposed storage location required hydrologic and geotechnical studies to determine viability. This constituted a significant amount of ground disturbance in an area of known importance to local Indian tribes. To ensure proper consideration of tribal concerns, the Forest engaged nine local tribes in government-to-government consultation. Consultation resulted in the identification of five springs in the project area considered TCPs by the Western Apache tribes. Due to the presence of identified TCPs, the Forest asked the tribes to assist in the development of mitigation measures to minimize the effects of this project on the TCPs identified. The goal of this partnership was to find a way for the Mine to still be able to gather data while protecting TCPs. During field visits and consultations, a wide range of concerns were shared, which were recorded and considered by the Tonto National Forest. The Forest developed a proposed mitigation approach to protect springs, which would not permit the installation of water monitoring wells, geotechnical borings, or trench excavations within 1,200 feet of perennial springs in the project area. As an added mitigation measure, a cultural resources specialist would be on-site during all ground-disturbing activities. Diligent work on

  17. Advanced Multilevel Monte Carlo Methods

    KAUST Repository

    Jasra, Ajay

    2017-04-24

    This article reviews the application of advanced Monte Carlo techniques in the context of Multilevel Monte Carlo (MLMC). MLMC is a strategy employed to compute expectations which can be biased in some sense, for instance, by using the discretization of an associated probability law. The MLMC approach works with a hierarchy of biased approximations which become progressively more accurate and more expensive. Using a telescoping representation of the most accurate approximation, the method is able to reduce the computational cost for a given level of error versus i.i.d. sampling from this latter approximation. All of these ideas originated for cases where exact sampling from couples in the hierarchy is possible. This article considers the case where such exact sampling is not currently possible. We consider Markov chain Monte Carlo and sequential Monte Carlo methods which have been introduced in the literature, and we describe different strategies which facilitate the application of MLMC within these methods.
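
    The plain MLMC telescope that this article builds on can be sketched for a simple Euler-discretized diffusion, where each correction term couples a fine and a coarse path through shared Brownian increments; the model, level count, and per-level sample sizes below are illustrative choices, not the article's examples.

      # Multilevel Monte Carlo sketch: estimate E[X_T] for geometric Brownian
      # motion under Euler discretization. Level l uses 2**l time steps; the
      # level-l correction couples fine and coarse paths via shared increments.
      import numpy as np

      rng = np.random.default_rng(3)

      def level_estimator(level, n_samples, T=1.0, mu=0.05, sigma=0.2, x0=1.0):
          """Mean of P_level - P_{level-1} on coupled paths (P_{-1} := 0)."""
          n_fine = 2 ** level
          dt = T / n_fine
          dW = rng.standard_normal((n_samples, n_fine)) * np.sqrt(dt)
          xf = np.full(n_samples, x0)
          for k in range(n_fine):                  # fine Euler path
              xf = xf + mu * xf * dt + sigma * xf * dW[:, k]
          if level == 0:
              return np.mean(xf)
          xc = np.full(n_samples, x0)
          for k in range(n_fine // 2):             # coarse path, summed increments
              dWc = dW[:, 2 * k] + dW[:, 2 * k + 1]
              xc = xc + mu * xc * (2 * dt) + sigma * xc * dWc
          return np.mean(xf - xc)

      # Telescoping sum: cheap-but-biased level 0 plus ever-finer corrections,
      # with fewer samples spent on the more expensive levels.
      estimate = sum(level_estimator(lev, n) for lev, n in enumerate([40_000, 10_000, 2_500]))
      print(estimate)   # close to x0 * exp(mu * T), about 1.0513 here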

  18. Handbook of Monte Carlo methods

    National Research Council Canada - National Science Library

    Kroese, Dirk P; Taimre, Thomas; Botev, Zdravko I

    2011-01-01

    ... in rapid succession, the staggering number of related techniques, ideas, concepts and algorithms makes it difficult to maintain an overall picture of the Monte Carlo approach. This book attempts to encapsulate the emerging dynamics of this field of study"--

  19. TARC: Carlo Rubbia's Energy Amplifier

    CERN Multimedia

    Laurent Guiraud

    1997-01-01

    Transmutation by Adiabatic Resonance Crossing (TARC) is Carlo Rubbia's energy amplifier. This CERN experiment demonstrated that long-lived fission fragments, such as 99Tc, can be efficiently destroyed.

  20. Monte Carlo simulation for IRRMA

    International Nuclear Information System (INIS)

    Gardner, R.P.; Liu Lianyan

    2000-01-01

    Monte Carlo simulation is fast becoming a standard approach for many radiation applications that were previously treated almost entirely by experimental techniques. This is certainly true for Industrial Radiation and Radioisotope Measurement Applications - IRRMA. The reasons for this include: (1) the increased cost and inadequacy of experimentation for design and interpretation purposes; (2) the availability of low cost, large memory, and fast personal computers; and (3) the general availability of general purpose Monte Carlo codes that are increasingly user-friendly, efficient, and accurate. This paper discusses the history and present status of Monte Carlo simulation for IRRMA including the general purpose (GP) and specific purpose (SP) Monte Carlo codes and future needs - primarily from the experience of the authors

  1. Carlos Chagas: biographical sketch.

    Science.gov (United States)

    Moncayo, Alvaro

    2010-01-01

    Carlos Chagas was born on 9 July 1878 on the farm "Bon Retiro", located close to the City of Oliveira in the interior of the State of Minas Gerais, Brazil. He started his medical studies in 1897 at the School of Medicine of Rio de Janeiro. In the late 19th century, the works of Louis Pasteur and Robert Koch induced a change in the medical paradigm, with emphasis on experimental demonstrations of the causal link between microbes and disease. During the same years in Germany appeared the pathological concept of disease, linking organic lesions with symptoms. All these innovations were adopted by the reforms of the medical schools in Brazil and influenced the scientific formation of Chagas. Chagas completed his medical studies between 1897 and 1903, and his examinations during these years were always ranked with high grades. Oswaldo Cruz accepted Chagas as a doctoral candidate and directed his thesis on "Hematological studies of Malaria", which was received with honors by the examiners. In 1903 the director appointed Chagas as a research assistant at the Institute. In those years, the Institute of Manguinhos, under the direction of Oswaldo Cruz, initiated a process of institutional growth and gathered a distinguished group of Brazilian and foreign scientists. In 1907, he was requested to investigate and control a malaria outbreak in Lassance, Minas Gerais. At this moment Chagas could not have imagined that this field research was the beginning of one of the most notable medical discoveries. Chagas was, at the age of 28, a research assistant at the Institute of Manguinhos, studying a new flagellate parasite isolated from triatomine insects captured in the State of Minas Gerais. Chagas made his discoveries in this order: first the causal agent, then the vector, and finally the human cases. These notable discoveries were carried out by Chagas in twenty months. At the age of 33 Chagas had completed his discoveries and published the scientific articles that gave him world

  2. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Helicopter Overflight

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, Rebecca Ann [ORNL; Hargrove, William Walter [ORNL; Suter, Glenn [U.S. Environmental Protection Agency

    2008-01-01

    A multi-stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus of the assessment was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper focuses on the wildlife risk assessment for the helicopter overflight. The primary stressors were sound and the view of the aircraft. Exposure to desert mule deer (Odocoileus hemionus crooki) was quantified using Air Force sound contour programs NOISEMAP and MR_NMAP, which gave very different results. Slant distance from helicopters to deer was also used as a measure of exposure that integrated risk from sound and view of the aircraft. Exposure-response models for the characterization of effects consisted of behavioral thresholds in sound exposure level or maximum sound level units or slant distance. Available sound thresholds were limited for desert mule deer, but a distribution of slant-distance thresholds was available for ungulates. The risk characterization used a weight-of-evidence approach and concluded that risk to mule deer behavior from the Apache overflight is uncertain, but that no risk to mule deer abundance and reproduction is expected.

  3. Adjoint electron Monte Carlo calculations

    International Nuclear Information System (INIS)

    Jordan, T.M.

    1986-01-01

    Adjoint Monte Carlo is the most efficient method for accurate analysis of space systems exposed to natural and artificially enhanced electron environments. Recent adjoint calculations for isotropic electron environments include: comparative data for experimental measurements on electronics boxes; benchmark problem solutions for comparing total dose prediction methodologies; preliminary assessment of sectoring methods used during space system design; and total dose predictions on an electronics package. Adjoint Monte Carlo, forward Monte Carlo, and experiment are in excellent agreement for electron sources that simulate space environments. For electron space environments, adjoint Monte Carlo is clearly superior to forward Monte Carlo, requiring one to two orders of magnitude less computer time for relatively simple geometries. The solid-angle sectoring approximations used for routine design calculations can err by more than a factor of 2 on dose in simple shield geometries. For critical space systems exposed to severe electron environments, these potential sectoring errors demand the establishment of large design margins and/or verification of shield design by adjoint Monte Carlo/experiment

  4. Indian Summer

    Energy Technology Data Exchange (ETDEWEB)

    Galindo, E. [Sho-Ban High School, Fort Hall, ID (United States)

    1997-08-01

    This paper focuses on preserving and strengthening two resources culturally and socially important to the Shoshone-Bannock Indian Tribe on the Fort Hall Reservation in Idaho; their young people and the Pacific-Northwest Salmon. After learning that salmon were not returning in significant numbers to ancestral fishing waters at headwater spawning sites, tribal youth wanted to know why. As a result, the Indian Summer project was conceived to give Shoshone-Bannock High School students the opportunity to develop hands-on, workable solutions to improve future Indian fishing and help make the river healthy again. The project goals were to increase the number of fry introduced into the streams, teach the Shoshone-Bannock students how to use scientific methodologies, and get students, parents, community members, and Indian and non-Indian mentors excited about learning. The students chose an egg incubation experiment to help increase self-sustaining, natural production of steelhead trout, and formulated and carried out a three step plan to increase the hatch-rate of steelhead trout in Idaho waters. With the help of local companies, governmental agencies, scientists, and mentors students have been able to meet their project goals, and at the same time, have learned how to use scientific methods to solve real life problems, how to return what they have used to the water and land, and how to have fun and enjoy life while learning.

  5. Multilevel sequential Monte Carlo samplers

    KAUST Repository

    Beskos, Alexandros

    2016-08-29

    In this article we consider the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods which depend on the step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the introduction of the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretization levels ∞ > h_0 > h_1 > ⋯ > h_L. In many practical problems of interest, one cannot achieve an i.i.d. sampling of the associated sequence and a sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. It is shown that under appropriate assumptions, the attractive property of a reduction of the amount of computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context. That is, relative to exact sampling and Monte Carlo for the distribution at the finest level h_L. The approach is numerically illustrated on a Bayesian inverse problem. © 2016 Elsevier B.V.
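
    As a rough illustration, here is a minimal sketch of the plain MLMC telescoping identity that the SMC variant builds on: it estimates E[X_T] for a geometric Brownian motion using coupled Euler discretizations. The SDE, its parameters, and the level count are illustrative assumptions, not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

def coupled_euler(level, n_paths, T=1.0, x0=1.0, mu=0.05, sigma=0.2, M=2):
    """Euler paths of dX = mu*X dt + sigma*X dW on a fine grid and on the
    next-coarser grid, driven by the same Brownian increments (the MLMC
    coupling that makes the correction terms low-variance)."""
    nf = M ** level                      # number of fine steps
    dtf = T / nf
    xf = np.full(n_paths, x0)            # fine-level paths
    xc = np.full(n_paths, x0)            # coarse-level paths
    dWc = np.zeros(n_paths)
    for n in range(nf):
        dW = rng.normal(0.0, np.sqrt(dtf), n_paths)
        xf += mu * xf * dtf + sigma * xf * dW
        dWc += dW
        if (n + 1) % M == 0:             # coarse step uses summed increments
            xc += mu * xc * (M * dtf) + sigma * xc * dWc
            dWc[:] = 0.0
    return xf, xc

# Telescoping identity: E[g(X_L)] = E[g(X_0)] + sum_l E[g(X_l) - g(X_{l-1})]
L, N = 5, 20_000
estimate = coupled_euler(0, N)[0].mean()         # coarsest level alone
for level in range(1, L + 1):
    xf, xc = coupled_euler(level, N)
    estimate += (xf - xc).mean()                 # correction term per level
print("MLMC estimate of E[X_T]:", estimate)      # exact: x0*exp(mu*T) ≈ 1.0513
```

    Because both paths in each correction term share the same Brownian increments, the differences have small variance, so most of the sampling effort can be spent on the cheap coarse levels.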

  6. Inequalities in Open Source Software Development: Analysis of Contributor’s Commits in Apache Software Foundation Projects

    Science.gov (United States)

    2016-01-01

    While researchers are becoming increasingly interested in studying the OSS phenomenon, there is still a small number of studies analyzing larger samples of projects and investigating the structure of activities among OSS developers. The significant amount of information that has been gathered in the publicly available open-source software repositories and mailing-list archives offers an opportunity to analyze project structures and participant involvement. In this article, using commit data from the repositories of 263 Apache projects (nearly all of them), we show that although OSS development is often described as collaborative, it in fact predominantly relies on radically solitary input and individual, non-collaborative contributions. We also show, in the first published study of this magnitude, that the engagement of contributors follows a power-law distribution. PMID:27096157

  7. The 13th Data Release of the Sloan Digital Sky Survey: First Spectroscopic Data from the SDSS-IV Survey Mapping Nearby Galaxies at Apache Point Observatory

    DEFF Research Database (Denmark)

    Albareti, Franco D.; Allende Prieto, Carlos; Almeida, Andres

    2017-01-01

    The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in 2014 July. It pursues three core programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), Mapping Nearby Galaxies at APO (MaNGA), and the Extended Baryon Oscillation Spectroscopic Surve...

  8. APOGEE-2: The Second Phase of the Apache Point Observatory Galactic Evolution Experiment in SDSS-IV

    Science.gov (United States)

    Sobeck, Jennifer; Majewski, S.; Hearty, F.; Schiavon, R. P.; Holtzman, J. A.; Johnson, J.; Frinchaboy, P. M.; Skrutskie, M. F.; Munoz, R.; Pinsonneault, M. H.; Nidever, D. L.; Zasowski, G.; Garcia Perez, A.; Fabbian, D.; Meza Cofre, A.; Cunha, K. M.; Smith, V. V.; Chiappini, C.; Beers, T. C.; Steinmetz, M.; Anders, F.; Bizyaev, D.; Roman, A.; Fleming, S. W.; Crane, J. D.; SDSS-IV/APOGEE-2 Collaboration

    2014-01-01

    The second phase of the Apache Point Observatory Galactic Evolution Experiment (APOGEE-2), a part of the Sloan Digital Sky Survey IV (SDSS-IV), will commence operations in 2014. APOGEE-2 represents a significant expansion over APOGEE-1, not only in the size of the stellar sample, but also in the coverage of the sky through observations in both the Northern and Southern Hemispheres. Observations on the 2.5m Sloan Foundation Telescope of the Apache Point Observatory (APOGEE-2N) will continue immediately after the conclusion of APOGEE-1, to be followed by observations with the 2.5m du Pont Telescope of the Las Campanas Observatory (APOGEE-2S) within three years. Over the six-year lifetime of the project, high resolution (R ≈ 22,500), high signal-to-noise (≥100) spectroscopic data in the H-band wavelength regime (1.51-1.69 μm) will be obtained for several hundred thousand stars, more than tripling the total APOGEE-1 sample. Accurate radial velocities and detailed chemical compositions will be generated for target stars in the main Galactic components (bulge, disk, and halo), open/globular clusters, and satellite dwarf galaxies. The spectroscopic follow-up program of Kepler targets with the APOGEE-2N instrument will be continued and expanded. APOGEE-2 will significantly extend and enhance the APOGEE-1 legacy of scientific contributions to understanding the origin and evolution of the elements, the assembly and formation history of galaxies like the Milky Way, and fundamental stellar astrophysics.

  9. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    Energy Technology Data Exchange (ETDEWEB)

    Efroymson, R.A.

    2002-05-09

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  10. Demonstration of the Military Ecological Risk Assessment Framework (MERAF): Apache Longbow - Hellfire Missile Test at Yuma Proving Ground

    International Nuclear Information System (INIS)

    Efroymson, R.A.

    2002-01-01

    This ecological risk assessment for a testing program at Yuma Proving Ground, Arizona, is a demonstration of the Military Ecological Risk Assessment Framework (MERAF; Suter et al. 2001). The demonstration is intended to illustrate how risk assessment guidance concerning generic military training and testing activities and guidance concerning a specific type of activity (e.g., low-altitude aircraft overflights) may be implemented at a military installation. MERAF was developed with funding from the Strategic Environmental Research and Development Program (SERDP) of the Department of Defense. Novel aspects of MERAF include: (1) the assessment of risks from physical stressors using an ecological risk assessment framework, (2) the consideration of contingent or indirect effects of stressors (e.g., population-level effects that are derived from habitat or hydrological changes), (3) the integration of risks associated with different component activities or stressors, (4) the emphasis on quantitative risk estimates and estimates of uncertainty, and (5) the modularity of design, permitting components of the framework to be used in various military risk assessments that include similar activities. The particular subject of this report is the assessment of ecological risks associated with a testing program at Cibola Range of Yuma Proving Ground, Arizona. The program involves an Apache Longbow helicopter firing Hellfire missiles at moving targets, i.e., M60-A1 tanks. Thus, the three component activities of the Apache-Hellfire test were: (1) helicopter overflight, (2) missile firing, and (3) tracked vehicle movement. The demonstration was limited to two ecological endpoint entities (i.e., potentially susceptible and valued populations or communities): woody desert wash communities and mule deer populations. The core assessment area is composed of about 126 km² between the Chocolate and Middle Mountains. The core time of the program is a three-week period, including fourteen days of

  11. Exact Monte Carlo for molecules

    Energy Technology Data Exchange (ETDEWEB)

    Lester, W.A. Jr.; Reynolds, P.J.

    1985-03-01

    A brief summary of the fixed-node quantum Monte Carlo method is presented. Results obtained for binding energies, the classical barrier height for H + H2, and the singlet-triplet splitting in methylene are presented and discussed. 17 refs.

  12. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    time Technical Consultant to Systat Software Asia-Pacific (P) Ltd., in Bangalore, where the technical work for the development of the statistical software Systat takes place. His research interests have been in statistical pattern recognition and biostatistics. Keywords: Markov chain, Monte Carlo sampling, Markov chain Monte Carlo.

  13. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo Methods. 2. The Markov Chain Case. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at Cornell University. His research interests include mathematical analysis, probability theory and its application and statistics. He enjoys writing for Resonance. His spare time is ...

  14. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Markov Chain Monte Carlo Methods. 3. Statistical Concepts. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at Cornell University. His research interests include mathematical analysis, probability theory and its application and statistics. He enjoys writing for Resonance.

  15. Monte Carlo calculations of nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Pieper, S.C. [Argonne National Lab., IL (United States). Physics Div.

    1997-10-01

    Nuclear many-body calculations have the complication of strong spin- and isospin-dependent potentials. In these lectures the author discusses the variational and Green's function Monte Carlo techniques that have been developed to address this complication, and presents a few results.

  16. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    ter of the 20th century, due to rapid developments in computing technology ... early part of this development saw a host of Monte ... These iterative Monte Carlo procedures typically generate a random sequence with the Markov property such that the Markov chain is ergodic with a limiting distribution coinciding with the ...

  17. Is Monte Carlo embarrassingly parallel?

    International Nuclear Information System (INIS)

    Hoogenboom, J. E.

    2012-01-01

    Monte Carlo is often stated as being embarrassingly parallel. However, running a Monte Carlo calculation, especially a reactor criticality calculation, in parallel using tens of processors shows a serious limitation in speedup, and the execution time may even increase beyond a certain number of processors. In this paper the main causes of the loss of efficiency when using many processors are analyzed using a simple Monte Carlo program for criticality. The basic mechanism for parallel execution is MPI. One of the bottlenecks turns out to be the rendez-vous points in the parallel calculation used for synchronization and exchange of data between processors. This happens at least at the end of each cycle for fission source generation in order to collect the full fission source distribution for the next cycle and to estimate the effective multiplication factor, which is not only part of the requested results, but also input to the next cycle for population control. Basic improvements to overcome this limitation are suggested and tested. Also other time losses in the parallel calculation are identified. Moreover, the threading mechanism, which allows the parallel execution of tasks based on shared memory using OpenMP, is analyzed in detail. Recommendations are given to get the maximum efficiency out of a parallel Monte Carlo calculation. (authors)
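
    A toy sketch of the rendez-vous point the paper identifies, using mpi4py and a trivial pi-estimation tally in place of a fission source iteration (the cycle count, batch size, and tallied quantity are illustrative assumptions):

```python
# Run with, e.g.: mpiexec -n 8 python parallel_mc.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rng = np.random.default_rng(comm.Get_rank())     # independent stream per rank

local_hits, local_n = 0, 0
for cycle in range(100):
    x, y = rng.random(10_000), rng.random(10_000)
    local_hits += int(((x * x + y * y) < 1.0).sum())
    local_n += 10_000
    # Rendez-vous point: every rank must arrive here before the global
    # tally (here pi; in a criticality code, k_eff and the fission source)
    # can be formed. This synchronization is where speedup is lost.
    hits = comm.allreduce(local_hits, op=MPI.SUM)
    n = comm.allreduce(local_n, op=MPI.SUM)

if comm.Get_rank() == 0:
    print("pi estimate:", 4.0 * hits / n)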

  18. Monte Carlo - Advances and Challenges

    International Nuclear Information System (INIS)

    Brown, Forrest B.; Mosteller, Russell D.; Martin, William R.

    2008-01-01

    Abstract only, full text follows: With ever-faster computers and mature Monte Carlo production codes, there has been tremendous growth in the application of Monte Carlo methods to the analysis of reactor physics and reactor systems. In the past, Monte Carlo methods were used primarily for calculating k_eff of a critical system. More recently, Monte Carlo methods have been increasingly used for determining reactor power distributions and many design parameters, such as β_eff, l_eff, τ, reactivity coefficients, Doppler defect, dominance ratio, etc. These advanced applications of Monte Carlo methods are now becoming common, not just feasible, but bring new challenges to both developers and users: Convergence of 3D power distributions must be assured; confidence interval bias must be eliminated; iterated fission probabilities are required, rather than single-generation probabilities; temperature effects including Doppler and feedback must be represented; isotopic depletion and fission product buildup must be modeled. This workshop focuses on recent advances in Monte Carlo methods and their application to reactor physics problems, and on the resulting challenges faced by code developers and users. The workshop is partly tutorial, partly a review of the current state-of-the-art, and partly a discussion of future work that is needed. It should benefit both novice and expert Monte Carlo developers and users. In each of the topic areas, we provide an overview of needs, perspective on past and current methods, a review of recent work, and discussion of further research and capabilities that are required. Electronic copies of all workshop presentations and material will be available. The workshop is structured as 2 morning and 2 afternoon segments: - Criticality Calculations I - convergence diagnostics, acceleration methods, confidence intervals, and the iterated fission probability, - Criticality Calculations II - reactor kinetics parameters, dominance ratio, temperature

  19. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    ... Dirichlet prior; Metropolis-Hastings algorithm; rejection sampling; Gibbs sampler; proposal density; Rao-Blackwellisation; binomial; multinomial; Gamma; uniform. ... School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA; Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560059, India.
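
    One of the keyword techniques, the random-walk Metropolis-Hastings algorithm, reduces to a few lines; the target density, proposal step, and sample counts below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def metropolis_hastings(log_target, x0, n_samples, step=1.0):
    """Random-walk Metropolis: propose x' ~ N(x, step^2) and accept with
    probability min(1, pi(x')/pi(x)); rejected proposals repeat x."""
    x = x0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + step * rng.normal()
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

# Target: standard normal, specified only up to a normalizing constant.
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, n_samples=50_000)
print(draws[10_000:].mean(), draws[10_000:].std())  # ≈ 0 and ≈ 1 after burn-in
```

    The chain is ergodic with the target as its limiting distribution, which is exactly the Markov-property construction the snippets above describe.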

  20. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA. Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560 059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, India.

  1. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    2. The Markov Chain Case. K B Athreya, Mohan Delampady and T Krishnan. K B Athreya is a Professor at Cornell University. His research interests include mathematical analysis, probability theory and its application and statistics. He enjoys writing for Resonance. His spare time is spent listening to Indian classical music.

  2. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Author Affiliations. K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA; Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, ...

  3. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Author Affiliations. K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA; Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560 059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, ...

  4. Markov Chain Monte Carlo Methods

    Indian Academy of Sciences (India)

    Author Affiliations. K B Athreya1 Mohan Delampady2 T Krishnan3. School of ORIE Rhodes Hall Cornell University, Ithaca New York 14853, USA. Indian Statistical Institute 8th Mile, Mysore Road Bangalore 560 059, India. Systat Software Asia-Pacific Ltd. Floor 5, 'C' Tower Golden Enclave, Airport Road Bangalore 560 017, ...

  5. (U) Introduction to Monte Carlo Methods

    Energy Technology Data Exchange (ETDEWEB)

    Hungerford, Aimee L. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-03-20

    Monte Carlo methods are very valuable for representing solutions to particle transport problems. Here we describe a “cook book” approach to handling the terms in a transport equation using Monte Carlo methods. Focus is on the mechanics of a numerical Monte Carlo code, rather than the mathematical foundations of the method.
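
    In the same cook-book spirit, the mechanics reduce to sampling flight distances, deciding collision outcomes, and tallying; the slab geometry and cross sections below are illustrative assumptions, not from the report:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical slab problem: thickness L_SLAB (in mean free paths set by
# SIGMA_T), absorption probability per collision P_ABS, isotropic scatter.
SIGMA_T, P_ABS, L_SLAB, N = 1.0, 0.5, 2.0, 100_000

transmitted = 0
for _ in range(N):
    x, mu = 0.0, 1.0                              # position, direction cosine
    while True:
        x += mu * rng.exponential(1.0 / SIGMA_T)  # sample flight distance
        if x >= L_SLAB:                           # escaped through the slab
            transmitted += 1
            break
        if x < 0.0 or rng.random() < P_ABS:       # leaked back, or absorbed
            break
        mu = rng.uniform(-1.0, 1.0)               # isotropic new direction

print("transmission probability ≈", transmitted / N)
```

    Each term of a transport equation maps onto one of these sampling steps: the streaming term onto the exponential flight distance, and the collision term onto the absorption/scattering decision.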

  6. APACHE score, Severity Index of Paraquat Poisoning, and serum lactic acid concentration in the prognosis of paraquat poisoning of Chinese Patients.

    Science.gov (United States)

    Xu, Shuyun; Hu, Hai; Jiang, Zhen; Tang, Shiyuan; Zhou, Yuangao; Sheng, Jie; Chen, Jinggang; Cao, Yu

    2015-02-01

    Many prognostic indicators have been studied to evaluate the prognosis of paraquat poisoning. However, the optimal indicator remains unclear. To determine the value of the Acute Physiology and Chronic Health Evaluation II (APACHE II) score, the Severity Index of Paraquat Poisoning (SIPP), and serum lactate levels in the prognosis of paraquat poisoning, we performed a prospective study that enrolled 143 paraquat patients. Data were collected from patients (161) at West China Hospital in Chengdu, China, including details about the patients' general conditions, laboratory examinations, and treatment. Receiver operating characteristic (ROC) curves for predicting inpatient mortality based on APACHE II score, SIPP, and lactate levels were generated. To analyze the best cutoff values for lactate levels, APACHE II scores, and SIPP in predicting the prognosis of paraquat poisoning, the initial parameters on admission and 7-day survival curves of patients with lactate levels greater than or equal to 2.95 mmol/L, APACHE II scores greater than or equal to 15.22, and SIPP greater than or equal to 5.50 h · mg/L at the time of arrival at West China Hospital were compared using 1-way analysis of variance and the log-rank test. The APACHE II score (5.45 [3.67] vs 11.29 [4.31]), SIPP (2.78 [1.89] vs 7.63 [2.46] h · mg/L), and lactate level (2.78 [1.89] vs 7.63 [2.46] mmol/L) were significantly lower in survivors (77) after oral ingestion of paraquat, compared with nonsurvivors (66). The APACHE II score, SIPP, and lactate level had different areas under the curve (0.847, 0.789, and 0.916, respectively) and accuracy (0.64, 0.84, and 0.89, respectively). Respiratory rate, serum creatinine level, PaCO2, and mortality rate at 7 days after admission in patients with lactate levels greater than or equal to 2.95 mmol/L were markedly different compared with those of other patients (P paraquat poisoning.
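
    The ROC analysis used here is straightforward to reproduce; the sketch below generates synthetic lactate data loosely mirroring the reported group means (it is not the study's data) and computes the AUC and a Youden-index cutoff:

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(3)

# Synthetic stand-in data shaped like the reported group means/SDs
# (77 survivors, 66 nonsurvivors); this is NOT the study's dataset.
lactate = np.r_[rng.normal(2.78, 1.89, 77), rng.normal(7.63, 2.46, 66)]
died = np.r_[np.zeros(77), np.ones(66)]

auc = roc_auc_score(died, lactate)
fpr, tpr, thresholds = roc_curve(died, lactate)
best = np.argmax(tpr - fpr)                 # Youden index picks the cutoff
print(f"AUC = {auc:.3f}, cutoff = {thresholds[best]:.2f} mmol/L")
```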

  7. Shell model Monte Carlo methods

    International Nuclear Information System (INIS)

    Koonin, S.E.

    1996-01-01

    We review quantum Monte Carlo methods for dealing with large shell model problems. These methods reduce the imaginary-time many-body evolution operator to a coherent superposition of one-body evolutions in fluctuating one-body fields; the resultant path integral is evaluated stochastically. We first discuss the motivation, formalism, and implementation of such Shell Model Monte Carlo methods. There then follows a sampler of results and insights obtained from a number of applications. These include the ground state and thermal properties of pf-shell nuclei, thermal behavior of γ-soft nuclei, and calculation of double beta-decay matrix elements. Finally, prospects for further progress in such calculations are discussed. 87 refs

  8. Monte Carlo Methods in ICF

    Science.gov (United States)

    Zimmerman, George B.

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials.

  9. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, G.B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials. copyright 1997 American Institute of Physics

  10. Monte Carlo methods in ICF

    International Nuclear Information System (INIS)

    Zimmerman, George B.

    1997-01-01

    Monte Carlo methods appropriate to simulate the transport of x-rays, neutrons, ions and electrons in Inertial Confinement Fusion targets are described and analyzed. The Implicit Monte Carlo method of x-ray transport handles symmetry within indirect drive ICF hohlraums well, but can be improved 50X in efficiency by angular biasing the x-rays towards the fuel capsule. Accurate simulation of thermonuclear burn and burn diagnostics involves detailed particle source spectra, charged particle ranges, inflight reaction kinematics, corrections for bulk and thermal Doppler effects and variance reduction to obtain adequate statistics for rare events. It is found that the effects of angular Coulomb scattering must be included in models of charged particle transport through heterogeneous materials

  11. Assessment of performance and utility of mortality prediction models in a single Indian mixed tertiary intensive care unit.

    Science.gov (United States)

    Sathe, Prachee M; Bapat, Sharda N

    2014-01-01

    To assess the performance and utility of two mortality prediction models, viz. Acute Physiology and Chronic Health Evaluation II (APACHE II) and Simplified Acute Physiology Score II (SAPS II), in a single Indian mixed tertiary intensive care unit (ICU). Secondary objectives were bench-marking and setting a baseline for research. In this observational cohort, data needed for calculation of both scores were prospectively collected for all consecutive admissions to the 28-bedded ICU in the year 2011. After excluding readmissions, discharges within 24 h and age < 18 years, predicted mortality had strong association with true mortality (R² = 0.98 for APACHE II and R² = 0.99 for SAPS II). Both models performed poorly in formal Hosmer-Lemeshow goodness-of-fit testing (Chi-square = 12.8 (P = 0.03) for APACHE II, Chi-square = 26.6 (P = 0.001) for SAPS II) but showed good discrimination (area under receiver operating characteristic curve 0.86 ± 0.013 SE (P care and comparing performances of different units without customization. Considering comparable performance and simplicity of use, efforts should be made to adapt SAPS II.

  12. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Michael Giles. Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path dependent, time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL⁻³) using a single level version of the adaptive algorithm to O((TOL⁻¹ log(TOL))²).

  13. Extending canonical Monte Carlo methods

    International Nuclear Information System (INIS)

    Velazquez, L; Curilef, S

    2010-01-01

    In this paper, we discuss the implications of a recently obtained equilibrium fluctuation-dissipation relation for the extension of the available Monte Carlo methods on the basis of the consideration of the Gibbs canonical ensemble to account for the existence of an anomalous regime with negative heat capacities C α with α≈0.2 for the particular case of the 2D ten-state Potts model

  14. Parallel Monte Carlo reactor neutronics

    International Nuclear Information System (INIS)

    Blomquist, R.N.; Brown, F.B.

    1994-01-01

    The issues affecting implementation of parallel algorithms for large-scale engineering Monte Carlo neutron transport simulations are discussed. For nuclear reactor calculations, these include load balancing, recoding effort, reproducibility, domain decomposition techniques, I/O minimization, and strategies for different parallel architectures. Two codes were parallelized and tested for performance. The architectures employed include SIMD, MIMD-distributed memory, and workstation network with uneven interactive load. Speedups linear with the number of nodes were achieved

  15. Comparison of Charlson comorbidity index with SAPS and APACHE scores for prediction of mortality following intensive care

    Directory of Open Access Journals (Sweden)

    Christensen S

    2011-06-01

    Steffen Christensen1, Martin Berg Johansen1, Christian Fynbo Christiansen1, Reinhold Jensen2, Stanley Lemeshow1,3; 1Department of Clinical Epidemiology, Aarhus University Hospital, Aarhus, Denmark; 2Department of Intensive Care, Skejby Hospital, Aarhus University Hospital, Aarhus, Denmark; 3Division of Biostatistics, College of Public Health, Ohio State University, Columbus, OH, USA. Background: Physiology-based severity of illness scores are often used for risk adjustment in observational studies of intensive care unit (ICU) outcome. However, the complexity and time constraints of these scoring systems may limit their use in administrative databases. Comorbidity is a main determinant of ICU outcome, and comorbidity scores can be computed based on data from most administrative databases. However, limited data exist on the performance of comorbidity scores in predicting mortality of ICU patients. Objectives: To examine the performance of the Charlson comorbidity index (CCI) alone and in combination with other readily available administrative data and three physiology-based scores (acute physiology and chronic health evaluation [APACHE] II, simplified acute physiology score [SAPS] II, and SAPS III) in predicting short- and long-term mortality following intensive care. Methods: For all adult patients (n = 469) admitted to a tertiary university-affiliated ICU in 2007, we computed APACHE II, SAPS II, and SAPS III scores based on data from medical records. Data on CCI score, age and gender, surgical/medical status, social factors, mechanical ventilation and renal replacement therapy, primary diagnosis, and complete follow-up for 1-year mortality were obtained from administrative databases. We computed goodness-of-fit statistics and c-statistics (area under the ROC [receiver operating characteristic] curve) as measures of model calibration (ability to predict mortality proportions over classes of risk) and discrimination (ability to discriminate among the patients

  16. Indian Ledger Art.

    Science.gov (United States)

    Chilcoat, George W.

    1990-01-01

    Offers an innovative way to teach mid-nineteenth century North American Indian history by having students create their own Indian Ledger art. Purposes of the project are: to understand the role played by American Indians, to reveal American Indian stereotypes, and to identify relationships between cultures and environments. Background and…

  17. Jim Crow, Indian Style.

    Science.gov (United States)

    Svingen, Orlan J.

    1987-01-01

    Reviews history of voting rights for Indians and discusses a 1986 decision calling for election reform in Big Horn County, Montana, to eliminate violations of the voting rights of the county's Indian citizens. Notes that positive effects--such as election of the county's first Indian commissioner--co-exist with enduring anti-Indian sentiment. (JHZ)

  18. Development and Comparison of Open Source based Web GIS Frameworks on WAMP and Apache Tomcat Web Servers

    Science.gov (United States)

    Agrawal, S.; Gupta, R. D.

    2014-04-01

    Geographic Information System (GIS) is a tool used for the capture, storage, manipulation, query and presentation of spatial data, with applicability in diverse fields. Web GIS has put GIS on the Web, making available to the general public what was earlier used by a few specialist users. In the present paper, the development of Web GIS frameworks is explained, providing the requisite knowledge for creating Web based GIS applications. Open Source Software (OSS) has been used to develop two Web GIS frameworks. In the first Web GIS framework, the WAMP server, ALOV, Quantum GIS and MySQL have been used, while in the second Web GIS framework, the Apache Tomcat server, GeoServer, Quantum GIS, PostgreSQL and PostGIS have been used. These two Web GIS frameworks have been critically compared to bring out the suitability of each for a particular application as well as their performance. This will assist users in selecting the most suitable one for a particular Web GIS application.

  19. Data collection and field experiments at the Apache Leap research site. Annual report, May 1995--1996

    Energy Technology Data Exchange (ETDEWEB)

    Woodhouse, E.G. [ed.; Bassett, R.L.; Neuman, S.P.; Chen, G. [and others

    1997-08-01

    This report documents the research performed during the period May 1995-May 1996 for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-051) by the University of Arizona. The project manager for this research is Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this research were to examine hypotheses and test alternative conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. Each chapter in this report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. Topics include: crosshole pneumatic and gaseous tracer field and modeling experiments designed to help validate the applicability of continuum geostatistical and stochastic concepts, theories, models, and scaling relations relevant to unsaturated flow and transport in fractured porous tuffs; use of geochemistry and aquifer testing to evaluate fracture flow and perching mechanisms; investigations of ²³⁴U/²³⁸U fractionation to evaluate leaching selectivity; and transport and modeling of both conservative and non-conservative tracers.

  20. Cloud Computing: A model Construct of Real-Time Monitoring for Big Dataset Analytics Using Apache Spark

    Science.gov (United States)

    Alkasem, Ameen; Liu, Hongwei; Zuo, Decheng; Algarash, Basheer

    2018-01-01

    The volume of data being collected, analyzed, and stored has exploded in recent years, particularly in relation to activity on the cloud, and large-scale platforms for data processing, analysis, and storage such as cloud computing continue to grow. Today, the major challenge is to address how to monitor and control these massive amounts of data and perform analysis in real-time at scale. Traditional methods and model systems are unable to cope with these quantities of data in real-time. Here we present a new methodology for constructing a model for optimizing the performance of real-time monitoring of big datasets, which combines machine learning algorithms with Apache Spark Streaming to accomplish fine-grained fault diagnosis and repair of a big dataset. As a case study, we use the failure of Virtual Machines (VMs) to start up. The methodology ensures that the most sensible action is carried out during the procedure of fine-grained monitoring and generates the highest efficacy and cost savings in fault repair through three construction control steps: (I) data collection; (II) an analysis engine; and (III) a decision engine. We found that running this novel methodology can save a considerable amount of time compared to the Hadoop model, without sacrificing classification accuracy or performance. The accuracy of the proposed method (92.13%) is an improvement on traditional approaches.
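
    A minimal Structured Streaming sketch of the collection, analysis, and decision pipeline described above; the socket source, message format, and alert threshold are illustrative assumptions, not the paper's implementation:

```python
from pyspark.sql import SparkSession
import pyspark.sql.functions as F

# Hypothetical pipeline: lines of "vm_id,cpu_load" arrive on a socket
# (collection); the analysis step parses them; the decision step flags
# VMs whose load exceeds an assumed threshold.
spark = SparkSession.builder.appName("vm-monitor").getOrCreate()

lines = (spark.readStream.format("socket")
         .option("host", "localhost")
         .option("port", 9999)
         .load())

metrics = lines.select(F.split("value", ",").alias("f")).select(
    F.col("f").getItem(0).alias("vm_id"),
    F.col("f").getItem(1).cast("double").alias("cpu"))

alerts = metrics.where(F.col("cpu") > 0.9)      # decision threshold (assumed)

query = (alerts.writeStream
         .outputMode("append")
         .format("console")
         .start())
query.awaitTermination()
```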

  1. Data collection and field experiments at the Apache Leap research site. Annual report, May 1995--1996

    International Nuclear Information System (INIS)

    Woodhouse, E.G.; Bassett, R.L.; Neuman, S.P.; Chen, G.

    1997-08-01

    This report documents the research performed during the period May 1995-May 1996 for a project of the U.S. Nuclear Regulatory Commission (sponsored contract NRC-04-090-051) by the University of Arizona. The project manager for this research is Thomas J. Nicholson, Office of Nuclear Regulatory Research. The objectives of this research were to examine hypotheses and test alternative conceptual models concerning unsaturated flow and transport through fractured rock, and to design and execute confirmatory field and laboratory experiments to test these hypotheses and conceptual models at the Apache Leap Research Site near Superior, Arizona. Each chapter in this report summarizes research related to a specific set of objectives and can be read and interpreted as a separate entity. Topics include: crosshole pneumatic and gaseous tracer field and modeling experiments designed to help validate the applicability of continuum geostatistical and stochastic concepts, theories, models, and scaling relations relevant to unsaturated flow and transport in fractured porous tuffs; use of geochemistry and aquifer testing to evaluate fracture flow and perching mechanisms; investigations of ²³⁴U/²³⁸U fractionation to evaluate leaching selectivity; and transport and modeling of both conservative and non-conservative tracers

  2. Effect of Climate Conditions on Land Surface Productivity Across the Mojave, Sonoran, and Chihuahua Deserts and Apache Highlands

    Science.gov (United States)

    K. C., Pratima

    Understanding the patterns of, and relationships between, land surface productivity and climatic conditions is essential to predict the impact of climate change. This study aims to understand the spatiotemporal variability and relationships of land surface productivity using the Normalized Difference Vegetation Index (NDVI) and drought indices, mainly the Standardized Precipitation Index (SPI) and the Standardized Precipitation Evapotranspiration Index (SPEI), across four ecoregions of the Southwest United States: the Mojave, Sonoran, Apache Highlands and Chihuahua. Moderate Resolution Imaging Spectroradiometer (MODIS) NDVI and land cover data, and Parameter-elevation Regressions on Independent Slopes Model (PRISM) precipitation and temperature data, were used for analysis. Using the Mann-Kendall test, I calculated the trends in the annual and seasonal NDVI, SPI and SPEI datasets. I used Pearson correlation coefficients to examine the response of integrated and monthly NDVI values to SPI and SPEI values. Both positive and negative trends in NDVI, SPI and SPEI values were found during the annual and monsoon seasons, whereas only negative trends were found during the spring season. The relationship between NDVI and coincident and antecedent SPEI values changed significantly by area and season for each of the ecoregions across the east-west seasonal precipitation gradient.

  3. Comparative Study of Load Testing Tools: Apache JMeter, HP LoadRunner, Microsoft Visual Studio (TFS, Siege

    Directory of Open Access Journals (Sweden)

    Rabiya Abbas

    2017-12-01

    Software testing is the process of verifying and validating the user's requirements, and it is an ongoing process throughout software development. Software testing is characterized into three main types. In black box testing, the tester has no knowledge of the internal logic and design of the system. In white box testing, the tester knows the internal logic of the code. In grey box testing, the tester has partial knowledge of the internal structure and working of the system; it is commonly used in integration testing. Load testing helps us to analyze the performance of the system under heavy load or under zero load, and it is achieved with the help of a load testing tool. The intention of this research is to carry out a comparison of four load testing tools, i.e., Apache JMeter, LoadRunner, Microsoft Visual Studio (TFS) and Siege, based on certain criteria, i.e., test script generation, result reports, application support, plug-in support, and cost. The main focus is to study these load testing tools and identify which tool is better and more efficient. We assume this comparison can help in selecting the most appropriate tool and motivates the use of open source load testing tools.
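
    At their core, such tools automate concurrent request generation and latency measurement, as in this bare-bones sketch (the URL, user count, and request count are illustrative assumptions; real tools add ramp-up profiles, assertions, and rich reporting):

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Assumed target and load shape for illustration only.
URL, USERS, REQUESTS = "http://localhost:8080/", 10, 100

def one_request(_):
    """Time a single GET request against the target URL."""
    t0 = time.perf_counter()
    with urlopen(URL) as response:
        response.read()
    return time.perf_counter() - t0

# USERS concurrent "virtual users" issue REQUESTS requests in total.
with ThreadPoolExecutor(max_workers=USERS) as pool:
    latencies = list(pool.map(one_request, range(REQUESTS)))

print(f"mean latency: {sum(latencies) / len(latencies) * 1000:.1f} ms")
```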

  4. Developing Online Communities with LAMP (Linux, Apache, MySQL, PHP) - the IMIA OSNI and CHIRAD Experiences.

    Science.gov (United States)

    Murray, Peter J; Oyri, Karl

    2005-01-01

    Many health informatics organisations do not seem to make practical use, for the benefit of their activities and interaction with their members, of the very technologies that they often promote for use within healthcare environments. In particular, many organisations seem to be slow to take up the benefits of interactive web technologies. This paper presents an introduction to some of the many free/libre and open source (FLOSS) applications currently available that use the LAMP (Linux, Apache, MySQL, PHP) architecture as a way of cheaply deploying reliable, scalable, and secure web applications. The experience of moving to applications using the LAMP architecture, in particular that of the Open Source Nursing Informatics (OSNI) Working Group of the Special Interest Group in Nursing Informatics of the International Medical Informatics Association (IMIA-NI), in using PostNuke, a FLOSS Content Management System (CMS), illustrates many of the benefits of such applications. The experiences of the authors in installing and maintaining a large number of websites using FLOSS CMS to develop dynamic, interactive websites that facilitate real engagement with the members of IMIA-NI OSNI, the IMIA Open Source Working Group, and the Centre for Health Informatics Research and Development (CHIRAD), as well as other organisations, are used as the basis for discussing the potential benefits that could be realised by others within the health informatics community.

  5. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan [Chi-Mei Foundation Medical Center, Tainan (China)

    2009-10-15

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.

  6. Validation of APACHE II scoring system at 24 hours after admission as a prognostic tool in urosepsis: A prospective observational study

    Directory of Open Access Journals (Sweden)

    Sundaramoorthy VijayGanapathy

    2017-11-01

    Purpose: Urosepsis implies clinically evident severe infection of the urinary tract with features of systemic inflammatory response syndrome (SIRS). We validate the role of a single Acute Physiology and Chronic Health Evaluation II (APACHE II) score at 24 hours after admission in predicting mortality in urosepsis. Materials and Methods: A prospective observational study was done in 178 patients admitted with urosepsis in the Department of Urology of a tertiary care institute from January 2015 to August 2016. Patients >18 years diagnosed as urosepsis using SIRS criteria, with positive urine or blood culture for bacteria, were included. At 24 hours after admission to the intensive care unit, the APACHE II score was calculated using 12 physiological variables, age and chronic health. Results: The mean±standard deviation (SD) APACHE II score was 26.03±7.03. It was 24.31±6.48 in survivors and 32.39±5.09 in those who expired (p<0.001). Among patients undergoing surgery, the mean±SD score was higher in those who expired (30.74±4.85) than in survivors (24.30±6.54) (p<0.001). Receiver operating characteristic (ROC) analysis revealed an area under the curve (AUC) of 0.825, with a cutoff of 25.5 being 94.7% sensitive and 56.4% specific to predict mortality. The mean±SD score in those undergoing surgery was 25.22±6.70, lower than in those who did not undergo surgery (28.44±7.49) (p=0.007). ROC analysis revealed an AUC of 0.760, with a cutoff of 25.5 being 94.7% sensitive and 45.6% specific to predict mortality even after surgery. Conclusions: A single APACHE II score assessed at 24 hours after admission was able to predict morbidity, mortality, need for surgical intervention, length of hospitalization, treatment success and outcome in urosepsis patients.

  7. Prediction of Mortality after Emergent Transjugular Intrahepatic Portosystemic Shunt Placement: Use of APACHE II, Child-Pugh and MELD Scores in Asian Patients with Refractory Variceal Hemorrhage

    International Nuclear Information System (INIS)

    Tzeng, Wen Sheng; Wu, Reng Hong; Lin, Ching Yih; Chen, Jyh Jou; Sheu, Ming Juen; Koay, Lok Beng; Lee, Chuan

    2009-01-01

    This study was designed to determine if existing methods of grading liver function that have been developed in non-Asian patients with cirrhosis can be used to predict mortality in Asian patients treated for refractory variceal hemorrhage by the use of the transjugular intrahepatic portosystemic shunt (TIPS) procedure. Data for 107 consecutive patients who underwent an emergency TIPS procedure were retrospectively analyzed. Acute physiology and chronic health evaluation (APACHE II), Child-Pugh and model for end-stage liver disease (MELD) scores were calculated. Survival analyses were performed to evaluate the ability of the various models to predict 30-day, 60-day and 360-day mortality. The ability of stratified APACHE II, Child-Pugh, and MELD scores to predict survival was assessed by the use of Kaplan-Meier analysis with the log-rank test. No patient died during the TIPS procedure, but 82 patients died during the follow-up period. Thirty patients died within 30 days after the TIPS procedure; 37 patients died within 60 days and 53 patients died within 360 days. Univariate analysis indicated that hepatorenal syndrome, use of inotropic agents and mechanical ventilation were associated with elevated 30-day mortality (p < 0.05). Multivariate analysis showed that a Child-Pugh score > 11 or an MELD score > 20 predicted increased risk of death at 30, 60 and 360 days (p < 0.05). APACHE II scores could only predict mortality at 360 days (p < 0.05). A Child-Pugh score > 11 or an MELD score > 20 are predictive of mortality in Asian patients with refractory variceal hemorrhage treated with the TIPS procedure. An APACHE II score is not predictive of early mortality in this patient population.

  8. Cuartel San Carlos. Yacimiento veterano

    Directory of Open Access Journals (Sweden)

    Mariana Flores

    2007-01-01

    The Cuartel San Carlos is a national historic monument (1986) dating from the late 18th century (1785-1790), notable for having suffered various adversities during its construction and for withstanding the earthquakes of 1812 and 1900. In 2006, the body responsible for its custody, the Instituto de Patrimonio Cultural of the Ministry of Culture, carried out three stages of archaeological exploration, covering the rear courtyard (Traspatio), the central courtyard (Patio Central), and the East and West wings of the building. This paper reviews the analysis of the archaeological documentation obtained at the site through this project, called EACUSAC (Estudio Arqueológico del Cuartel San Carlos), which also represents the third campaign conducted at the site. The importance of this historic site lies in its role in the events that gave rise to power struggles during the emergence of the Republic, and in the political events of the 20th century. A broad sample of archaeological materials was also found at the site, documenting everyday military life as well as the internal social dynamics that took place in the San Carlos as a strategic location for the defense of the different regimes the country passed through, from the era of Spanish imperialism to the present day.

  9. Carlos Battilana: Profesor, Gestor, Amigo

    Directory of Open Access Journals (Sweden)

    José Pacheco

    2009-12-01

    The Editorial Committee of Anales has lost one of its most distinguished members. A brilliant teacher at our Faculty, Carlos Alberto Battilana Guanilo (1945-2009) knew how to transmit knowledge and hold the attention of his audiences, whether young students or not-so-young contemporaries. He drew his students onto the path of continuous training and research. He also engaged distinguished physicians to form and lead groups devoted to science and friendship. His teaching vocation linked him to medical schools, academies and scientific societies, where he coordinated fondly remembered courses and congresses. His scientific output was devoted to nephrology, immunology, cancer, and the costs of medical treatment. His managerial and leadership abilities, evident since his student days, allowed him to become regional director of a highly prestigious pharmaceutical laboratory, to organize a school of medicine, and later to serve as dean of the faculty of health sciences of that private university. Carlos was instrumental in Anales attaining a place of privilege among Peruvian biomedical journals. In the profile we publish, we attempt to summarize briefly the career of Carlos Battilana, weeks after his departure without return.

  10. Luis Carlos López

    Directory of Open Access Journals (Sweden)

    Rafael Maya

    1979-04-01

    Among the poets of the Centenario, Luis Carlos López enjoyed great popularity abroad from the publication of his first book onward. I believe his work drew the attention of philosophers such as Unamuno and, if I am not mistaken, Darío referred to it in laudatory terms. In Colombia it has been praised hyperbolically by some, while others grant it no great merit.

  11. Antitwilight II: Monte Carlo simulations.

    Science.gov (United States)

    Richtsmeier, Steven C; Lynch, David K; Dearborn, David S P

    2017-07-01

    For this paper, we employ the Monte Carlo scene (MCScene) radiative transfer code to elucidate the underlying physics giving rise to the structure and colors of the antitwilight, i.e., twilight opposite the Sun. MCScene calculations successfully reproduce colors and spatial features observed in videos and still photos of the antitwilight taken under clear, aerosol-free sky conditions. Through simulations, we examine the effects of solar elevation angle, Rayleigh scattering, molecular absorption, aerosol scattering, multiple scattering, and surface reflectance on the appearance of the antitwilight. We also compare MCScene calculations with predictions made by the MODTRAN radiative transfer code for a solar elevation angle of +1°.

  12. Carlos Restrepo. Un verdadero Maestro

    OpenAIRE

    Pelayo Correa

    2009-01-01

    Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceful society that enjoyed the generosity of its surroundings, with no desire to break with centuries-old traditions...

  13. Monte Carlo techniques in radiation therapy

    CERN Document Server

    Verhaegen, Frank

    2013-01-01

    Modern cancer treatment relies on Monte Carlo simulations to help radiotherapists and clinical physicists better understand and compute radiation dose from imaging devices as well as exploit four-dimensional imaging data. With Monte Carlo-based treatment planning tools now available from commercial vendors, a complete transition to Monte Carlo-based dose calculation methods in radiotherapy could likely take place in the next decade. Monte Carlo Techniques in Radiation Therapy explores the use of Monte Carlo methods for modeling various features of internal and external radiation sources, including light ion beams. The book, the first of its kind, addresses applications of the Monte Carlo particle transport simulation technique in radiation therapy, mainly focusing on external beam radiotherapy and brachytherapy. It presents the mathematical and technical aspects of the methods in particle transport simulations. The book also discusses the modeling of medical linacs and other irradiation devices; issues specific...

  14. Application of Advanced Exploration Technologies for the Development of Mancos Formation Oil Reservoirs, Jicarilla Apache Indian Nation, San Juan Basin, New Mexico

    International Nuclear Information System (INIS)

    Reeves, Scott; Billingsley, Randy

    2002-01-01

    The objectives of this project are to: (1) develop an exploration rationale for the Mancos shale in the north-eastern San Juan basin; (2) assess the regional prospectivity of the Mancos in the northern Nation lands based on that rationale; (3) identify specific leads in the northern Nation as appropriate; (4) forecast pro-forma production, reserves and economics for any leads identified; and (5) package and disseminate the results to attract investment in Mancos development on the Nation lands

  15. Mean field simulation for Monte Carlo integration

    CERN Document Server

    Del Moral, Pierre

    2013-01-01

    In the last three decades, there has been a dramatic increase in the use of interacting particle methods as a powerful tool in real-world applications of Monte Carlo simulation in computational physics, population biology, computer sciences, and statistical machine learning. Ideally suited to parallel and distributed computation, these advanced particle algorithms include nonlinear interacting jump diffusions; quantum, diffusion, and resampled Monte Carlo methods; Feynman-Kac particle models; genetic and evolutionary algorithms; sequential Monte Carlo methods; adaptive and interacting Marko
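
    The record above concerns advanced interacting-particle estimators; for orientation, a minimal sketch of the plain Monte Carlo integration they build on is given below (pure Python, no external dependencies), showing the characteristic 1/sqrt(n) standard error.

        import math, random

        def mc_integrate(f, a, b, n=100_000):
            # Plain Monte Carlo estimate of the integral of f over [a, b];
            # the standard error shrinks like 1/sqrt(n).
            total = total_sq = 0.0
            for _ in range(n):
                y = f(random.uniform(a, b))
                total += y
                total_sq += y * y
            mean = total / n
            var = max(total_sq / n - mean * mean, 0.0)
            width = b - a
            return width * mean, width * math.sqrt(var / n)

        est, err = mc_integrate(math.sin, 0.0, math.pi)  # exact value is 2
        print(est, "+/-", err)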

  16. Monte Carlo simulations of neutron scattering instruments

    International Nuclear Information System (INIS)

    Åstrand, Per-Olof; Copenhagen Univ.; Lefmann, K.; Nielsen, K.

    2001-01-01

    A Monte Carlo simulation is an important computational tool used in many areas of science and engineering. The use of Monte Carlo techniques for simulating neutron scattering instruments is discussed. The basic ideas, techniques and approximations are presented. Since the construction of a neutron scattering instrument is very expensive, Monte Carlo software used for the design of instruments has to be validated and tested extensively. The McStas software was designed with these aspects in mind, and some of the basic principles of the McStas software are discussed. Finally, some future prospects are discussed for using Monte Carlo simulations in optimizing neutron scattering experiments. (R.P.)

  17. Status of Monte Carlo dose planning

    International Nuclear Information System (INIS)

    Mackie, T.R.

    1995-01-01

    Monte Carlo simulation will become increasingly important for treatment planning in radiotherapy. The EGS4 Monte Carlo system, a general particle transport system, has been used most often for simulation tasks in radiotherapy, although ETRAN/ITS and MCNP have also been used. Monte Carlo treatment planning requires that the beam characteristics, such as the energy spectrum and angular distribution of particles emerging from clinical accelerators, be accurately represented. An EGS4 Monte Carlo code, called BEAM, was developed by the OMEGA Project (a collaboration between the University of Wisconsin and the National Research Council of Canada) to transport particles through linear accelerator heads. This information was used as input to simulate the passage of particles through CT-based representations of phantoms or patients using both an EGS4 code (DOSXYZ) and the macro Monte Carlo (MMC) method. Monte Carlo-computed 3-D electron beam dose distributions compare well to measurements obtained in simple and complex heterogeneous phantoms. The present drawback with most Monte Carlo codes is that simulation times are longer than those of most non-stochastic dose computation algorithms. This is especially true for photon dose planning. In the future, dedicated Monte Carlo treatment planning systems like Peregrine (from Lawrence Livermore National Laboratory), which will be capable of computing the dose from all beam types, or the Macro Monte Carlo (MMC) system, which is an order of magnitude faster than other algorithms, may dominate the field

  18. OVERVIEW OF THE SDSS-IV MaNGA SURVEY: MAPPING NEARBY GALAXIES AT APACHE POINT OBSERVATORY

    International Nuclear Information System (INIS)

    Bundy, Kevin; Bershady, Matthew A.; Wake, David A.; Tremonti, Christy; Diamond-Stanic, Aleksandar M.; Law, David R.; Cherinka, Brian; Yan, Renbin; Sánchez-Gallego, José R.; Drory, Niv; MacDonald, Nicholas; Weijmans, Anne-Marie; Thomas, Daniel; Masters, Karen; Coccato, Lodovico; Aragón-Salamanca, Alfonso; Avila-Reese, Vladimir; Badenes, Carles; Falcón-Barroso, Jésus; Belfiore, Francesco

    2015-01-01

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic structure and composition of gas and stars in an unprecedented sample of 10,000 nearby galaxies. We summarize essential characteristics of the instrument and survey design in the context of MaNGA's key science goals and present prototype observations to demonstrate MaNGA's scientific potential. MaNGA employs dithered observations with 17 fiber-bundle integral field units that vary in diameter from 12'' (19 fibers) to 32'' (127 fibers). Two dual-channel spectrographs provide simultaneous wavelength coverage over 3600-10300 Å at R ∼ 2000. With a typical integration time of 3 hr, MaNGA reaches a target r-band signal-to-noise ratio of 4-8 (Å⁻¹ per 2'' fiber) at 23 AB mag arcsec⁻², which is typical for the outskirts of MaNGA galaxies. Targets are selected with M* ≳ 10⁹ M☉ using SDSS-I redshifts and i-band luminosity to achieve uniform radial coverage in terms of the effective radius, an approximately flat distribution in stellar mass, and a sample spanning a wide range of environments. Analysis of our prototype observations demonstrates MaNGA's ability to probe gas ionization, shed light on recent star formation and quenching, enable dynamical modeling, decompose constituent components, and map the composition of stellar populations. MaNGA's spatially resolved spectra will enable an unprecedented study of the astrophysics of nearby galaxies in the coming 6 yr

  19. OVERVIEW OF THE SDSS-IV MaNGA SURVEY: MAPPING NEARBY GALAXIES AT APACHE POINT OBSERVATORY

    Energy Technology Data Exchange (ETDEWEB)

    Bundy, Kevin [Kavli Institute for the Physics and Mathematics of the Universe (Kavli IPMU, WPI), Todai Institutes for Advanced Study, the University of Tokyo, Kashiwa 277-8583 (Japan); Bershady, Matthew A.; Wake, David A.; Tremonti, Christy; Diamond-Stanic, Aleksandar M. [Department of Astronomy, University of Wisconsin-Madison, 475 North Charter Street, Madison, WI 53706 (United States); Law, David R.; Cherinka, Brian [Dunlap Institute for Astronomy and Astrophysics, University of Toronto, 50 St. George Street, Toronto, Ontario M5S 3H4 (Canada); Yan, Renbin; Sánchez-Gallego, José R. [Department of Physics and Astronomy, University of Kentucky, 505 Rose Street, Lexington, KY 40506-0055 (United States); Drory, Niv [McDonald Observatory, Department of Astronomy, University of Texas at Austin, 1 University Station, Austin, TX 78712-0259 (United States); MacDonald, Nicholas [Department of Astronomy, Box 351580, University of Washington, Seattle, WA 98195 (United States); Weijmans, Anne-Marie [School of Physics and Astronomy, University of St Andrews, North Haugh, St Andrews KY16 9SS (United Kingdom); Thomas, Daniel; Masters, Karen; Coccato, Lodovico [Institute of Cosmology and Gravitation, University of Portsmouth, Portsmouth (United Kingdom); Aragón-Salamanca, Alfonso [School of Physics and Astronomy, University of Nottingham, University Park, Nottingham NG7 2RD (United Kingdom); Avila-Reese, Vladimir [Instituto de Astronomia, Universidad Nacional Autonoma de Mexico, A.P. 70-264, 04510 Mexico D.F. (Mexico); Badenes, Carles [Department of Physics and Astronomy and Pittsburgh Particle Physics, Astrophysics and Cosmology Center (PITT PACC), University of Pittsburgh, 3941 OHara St, Pittsburgh, PA 15260 (United States); Falcón-Barroso, Jésus [Instituto de Astrofísica de Canarias, E-38200 La Laguna, Tenerife (Spain); Belfiore, Francesco [Cavendish Laboratory, University of Cambridge, 19 J. J. Thomson Avenue, Cambridge CB3 0HE (United Kingdom); and others

    2015-01-01

    We present an overview of a new integral field spectroscopic survey called MaNGA (Mapping Nearby Galaxies at Apache Point Observatory), one of three core programs in the fourth-generation Sloan Digital Sky Survey (SDSS-IV) that began on 2014 July 1. MaNGA will investigate the internal kinematic structure and composition of gas and stars in an unprecedented sample of 10,000 nearby galaxies. We summarize essential characteristics of the instrument and survey design in the context of MaNGA's key science goals and present prototype observations to demonstrate MaNGA's scientific potential. MaNGA employs dithered observations with 17 fiber-bundle integral field units that vary in diameter from 12'' (19 fibers) to 32'' (127 fibers). Two dual-channel spectrographs provide simultaneous wavelength coverage over 3600-10300 Å at R ∼ 2000. With a typical integration time of 3 hr, MaNGA reaches a target r-band signal-to-noise ratio of 4-8 (Å⁻¹ per 2'' fiber) at 23 AB mag arcsec⁻², which is typical for the outskirts of MaNGA galaxies. Targets are selected with M* ≳ 10⁹ M☉ using SDSS-I redshifts and i-band luminosity to achieve uniform radial coverage in terms of the effective radius, an approximately flat distribution in stellar mass, and a sample spanning a wide range of environments. Analysis of our prototype observations demonstrates MaNGA's ability to probe gas ionization, shed light on recent star formation and quenching, enable dynamical modeling, decompose constituent components, and map the composition of stellar populations. MaNGA's spatially resolved spectra will enable an unprecedented study of the astrophysics of nearby galaxies in the coming 6 yr.

  20. Luis Carlos López

    Directory of Open Access Journals (Sweden)

    Fernando Garavito

    1981-06-01

    Full Text Available The literary criticism of recent years has grown accustomed to seeing in Guillermo Valencia the emblem of an era, to which it is necessary to refer, for better or for worse, whenever one tries to set limits to the poetic activity of any other of his contemporaries. And although the claim does not hold entirely for those who consider themselves his disciples, because in their case the august pride of the master of Popayán sets them apart, it does hold, and to a high degree, for Luis Carlos López, who by his tone, his themes and his "breath" has become an author everyone feels entitled to handle.

  1. Monte Carlo lattice program KIM

    International Nuclear Information System (INIS)

    Cupini, E.; De Matteis, A.; Simonini, R.

    1980-01-01

    The Monte Carlo program KIM solves the steady-state linear neutron transport equation for a fixed-source problem or, by successive fixed-source runs, for the eigenvalue problem, in a two-dimensional thermal reactor lattice. Fluxes and reaction rates are the main quantities computed by the program, from which power distribution and few-group averaged cross sections are derived. The simulation ranges from 10 MeV to zero and includes anisotropic and inelastic scattering in the fast energy region, the epithermal Doppler broadening of the resonances of some nuclides, and the thermalization phenomenon by taking into account the thermal velocity distribution of some molecules. Besides the well known combinatorial geometry, the program allows complex configurations to be represented by a discrete set of points, an approach greatly improving calculation speed

  2. EL LENGUAJE DE CARLOS ALONSO

    Directory of Open Access Journals (Sweden)

    Bárbara Bustamante

    2005-01-01

    Full Text Available The talent of Carlos Alonso (Argentina, 1929) succeeded in forging a language with a style of its own. His drawings, paintings, pastels and inks, collages and prints fixed the projection of his subjectivity in the visual field. Both image and word make explicit a critical vision of reality that places the viewer under tension, forcing on him a reflective attitude committed to the message; this is the aspect most emphasized by art historians. The present study, however, aims to focus on the iconic and plastic aspects of his work.

  3. El lenguaje de Carlos Alonso

    Directory of Open Access Journals (Sweden)

    Bárbara Bustamante

    2005-10-01

    Full Text Available The talent of Carlos Alonso (Argentina, 1929) succeeded in forging a language with a style of its own. His drawings, paintings, pastels and inks, collages and prints fixed the projection of his subjectivity in the visual field. Both image and word make explicit a critical vision of reality that places the viewer under tension, forcing on him a reflective attitude committed to the message; this is the aspect most emphasized by art historians. The present study, however, aims to focus on the iconic and plastic aspects of his work.

  4. Monte Carlo simulation of experiments

    International Nuclear Information System (INIS)

    Opat, G.I.

    1977-07-01

    An outline of the technique of computer simulation of particle physics experiments by the Monte Carlo method is presented. Useful special-purpose subprograms are listed and described. At each stage the discussion is made concrete by direct reference to the program SIMUL8 and its variant MONTE-PION, written to assist in the analysis of the radiative decay experiments μ⁺ → e⁺ νₑ ν̄ γ and π⁺ → e⁺ νₑ γ, respectively. These experiments were based on the use of two large sodium iodide crystals, TINA and MINA, as e and γ detectors. Instructions for the use of SIMUL8 and MONTE-PION are given. (author)

  5. Monte Carlo Simulation of Phase Transitions

    OpenAIRE

    村井, 信行; N., MURAI; 中京大学教養部

    1983-01-01

    In the Monte Carlo simulation of phase transitions, a simple heat bath method is applied to the classical Heisenberg model in two dimensions. It reproduces the correlation length predicted by the Monte Carlo renormalization group and also computed in the non-linear σ model
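
    A minimal sketch of the kind of simulation this record describes is given below. It uses single-spin Metropolis updates on the two-dimensional classical Heisenberg model (rather than the heat-bath rule the paper itself applies), with lattice size, coupling and temperature chosen arbitrarily for illustration.

        import math, random

        L, J, T = 16, 1.0, 1.0  # lattice size, coupling, temperature (k_B = 1); arbitrary

        def rand_unit_vector():
            # Marsaglia's method: uniform point on the unit sphere.
            while True:
                u, v = random.uniform(-1, 1), random.uniform(-1, 1)
                s = u * u + v * v
                if s < 1.0:
                    f = 2.0 * math.sqrt(1.0 - s)
                    return (u * f, v * f, 1.0 - 2.0 * s)

        spins = [[rand_unit_vector() for _ in range(L)] for _ in range(L)]

        def metropolis_sweep():
            for _ in range(L * L):
                i, j = random.randrange(L), random.randrange(L)
                # Sum of the four nearest-neighbour spins (periodic boundaries).
                h = [0.0, 0.0, 0.0]
                for ni, nj in ((i + 1, j), (i - 1, j), (i, j + 1), (i, j - 1)):
                    for k in range(3):
                        h[k] += spins[ni % L][nj % L][k]
                old, new = spins[i][j], rand_unit_vector()
                # Energy is E = -J * sum over bonds of S_a . S_b, so locally:
                dE = -J * sum((new[k] - old[k]) * h[k] for k in range(3))
                if dE <= 0.0 or random.random() < math.exp(-dE / T):
                    spins[i][j] = new

        for _ in range(100):  # equilibration sweeps
            metropolis_sweep()
        mag = [sum(s[k] for row in spins for s in row) / L**2 for k in range(3)]
        print(mag)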

  6. Advanced Computational Methods for Monte Carlo Calculations

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2018-01-12

    This course is intended for graduate students who already have a basic understanding of Monte Carlo methods. It focuses on advanced topics that may be needed for thesis research, for developing new state-of-the-art methods, or for working with modern production Monte Carlo codes.

  7. The MC21 Monte Carlo Transport Code

    International Nuclear Information System (INIS)

    Sutton TM; Donovan TJ; Trumbull TH; Dobreff PS; Caro E; Griesheimer DP; Tyburski LJ; Carpenter DC; Joo H

    2007-01-01

    MC21 is a new Monte Carlo neutron and photon transport code currently under joint development at the Knolls Atomic Power Laboratory and the Bettis Atomic Power Laboratory. MC21 is the Monte Carlo transport kernel of the broader Common Monte Carlo Design Tool (CMCDT), which is also currently under development. The vision for CMCDT is to provide an automated, computer-aided modeling and post-processing environment integrated with a Monte Carlo solver that is optimized for reactor analysis. CMCDT represents a strategy to push the Monte Carlo method beyond its traditional role as a benchmarking tool or ''tool of last resort'' and into a dominant design role. This paper describes various aspects of the code, including the neutron physics and nuclear data treatments, the geometry representation, and the tally and depletion capabilities

  8. Monte Carlo simulation in nuclear medicine

    International Nuclear Information System (INIS)

    Morel, Ch.

    2007-01-01

    The Monte Carlo method allows for simulating random processes by using series of pseudo-random numbers. It became an important tool in nuclear medicine to assist in the design of new medical imaging devices, optimise their use and analyse their data. Presently, the sophistication of the simulation tools allows the introduction of Monte Carlo predictions in data correction and image reconstruction processes. The ability to simulate time-dependent processes opens up new horizons for Monte Carlo simulation in nuclear medicine. In the near future, these developments will allow imaging and dosimetry issues to be tackled simultaneously, and case-specific Monte Carlo simulations may soon become part of the nuclear medicine diagnostic process. This paper describes some Monte Carlo method basics and the sampling methods that were developed for it. It gives a referenced list of different simulation software used in nuclear medicine and enumerates some of their present and prospective applications. (author)
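
    One of the basic sampling methods referred to above is inverse-transform sampling. A minimal sketch for the exponential free-path distribution of a photon is shown below; the attenuation coefficient is invented, purely for illustration.

        import math, random

        MU = 0.096  # invented linear attenuation coefficient (1/cm)

        def sample_free_path(mu=MU):
            # Inverse-transform sampling: if U ~ Uniform(0,1), then
            # -ln(1-U)/mu follows the free-path law p(x) = mu * exp(-mu * x).
            return -math.log(1.0 - random.random()) / mu

        paths = [sample_free_path() for _ in range(100_000)]
        # Sample mean should approach the mean free path 1/MU.
        print(sum(paths) / len(paths), 1.0 / MU)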

  9. Western Indian Ocean Journal of Marine Science - Vol 11, No 1 (2012)

    African Journals Online (AJOL)

    Using an ecosystem model to evaluate fisheries management options to mitigate climate change impacts in western Indian Ocean coral reefs. Carlos Ruiz Sebastián, Tim R. McClanahan, 77-86 ...

  10. Leadership Preferences of Indian and Non-Indian Athletes.

    Science.gov (United States)

    Malloy, D. C.; Nilson, R. N.

    1991-01-01

    Among 86 Indian and non-Indian volleyball competitors, non-Indian players indicated significantly greater preferences for leadership that involved democratic behavior, autocratic behavior, or social support. Indians may adapt their behavior by participating in non-Indian games, without changing their traditional value orientations. Contains 22…

  11. Apache hive essentials

    CERN Document Server

    Du, Dayong

    2015-01-01

    If you are a data analyst, developer, or simply someone who wants to use Hive to explore and analyze data in Hadoop, this is the book for you. Whether you are new to big data or an expert, with this book, you will be able to master both the basic and the advanced features of Hive. Since Hive is an SQL-like language, some previous experience with the SQL language and databases is useful to have a better understanding of this book.
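
    The book itself works in HiveQL; as a rough, hedged illustration of the "SQL-like" point, the sketch below queries Hive from Python via the third-party PyHive client. The host, port, database, table and column names are placeholders, not examples from the book, and a reachable HiveServer2 instance is assumed.

        from pyhive import hive  # third-party client, assumed installed

        # Placeholder connection details and table name.
        conn = hive.Connection(host="localhost", port=10000, database="default")
        cur = conn.cursor()
        cur.execute("SELECT visitor_id, COUNT(*) AS hits "
                    "FROM weblogs GROUP BY visitor_id LIMIT 10")
        for visitor_id, hits in cur.fetchall():
            print(visitor_id, hits)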

  12. Mastering Apache Maven 3

    CERN Document Server

    Siriwardena, Prabath

    2014-01-01

    If you are working with Java or Java EE projects and you want to take full advantage of Maven in designing, executing, and maintaining your build system for optimal developer productivity, then this book is ideal for you. You should be well versed with Maven and its basic functionality if you wish to get the most out of the book.

  13. Apache SMART Briefing

    Science.gov (United States)

    2002-06-01

    [Slide-extraction fragments; the recoverable content references missile state, fuzing and structural parameters, MSC NASTRAN, the Integrated Missile Design (IMD) System, Boeing (McDonnell Douglas Helicopter Systems), the Rotary Wing Structures Technology Demonstration Program (RWSTDP), and the DMAPS process flow from conceptual to assembly layouts.]

  14. Learning Apache Mahout

    CERN Document Server

    Tiwary, Chandramani

    2015-01-01

    If you are a Java developer and want to use Mahout and machine learning to solve Big Data Analytics use cases then this book is for you. Familiarity with shell scripts is assumed but no prior experience is required.

  15. Apache Accumulo for developers

    CERN Document Server

    Halldórsson, Guðmundur Jón

    2013-01-01

    The book has a tutorial-based approach that shows readers how to build an Accumulo cluster from scratch, monitor the system, and implement aspects such as security. This book is great for developers new to Accumulo who are looking to get a good grounding in how to use it. It is assumed that you have an understanding of how Hadoop works, both HDFS and MapReduce. No prior knowledge of ZooKeeper is assumed.

  16. Learning Apache Cassandra

    CERN Document Server

    Brown, Mat

    2015-01-01

    If you're an application developer familiar with SQL databases such as MySQL or Postgres, and you want to explore distributed databases such as Cassandra, this is the perfect guide for you. Even if you've never worked with a distributed database before, Cassandra's intuitive programming interface coupled with the step-by-step examples in this book will have you building highly scalable persistence layers for your applications in no time.
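
    As a hedged sketch of the "intuitive programming interface" claim, the example below uses the DataStax Python driver (cassandra-driver) rather than the book's own examples; the contact point, keyspace and table are placeholders, and a local Cassandra node is assumed.

        from cassandra.cluster import Cluster  # DataStax driver, assumed installed

        # Placeholder contact point and schema for illustration.
        cluster = Cluster(["127.0.0.1"])
        session = cluster.connect()
        session.execute("""
            CREATE KEYSPACE IF NOT EXISTS demo
            WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
        """)
        session.set_keyspace("demo")
        session.execute("CREATE TABLE IF NOT EXISTS users (id int PRIMARY KEY, name text)")
        session.execute("INSERT INTO users (id, name) VALUES (%s, %s)", (1, "Ada"))
        for row in session.execute("SELECT id, name FROM users"):
            print(row.id, row.name)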

  17. Apache Maven dependency management

    CERN Document Server

    Lalou, Jonathan

    2013-01-01

    An easy-to-follow, tutorial-based guide with chapters progressing from basic to advanced dependency management.If you are working with Java or Java EE projects and you want to take advantage of Maven dependency management, then this book is ideal for you. This book is also particularly useful if you are a developer or an architect. You should be well versed with Maven and its basic functionalities if you wish to get the most out of this book.

  18. Learning Apache Mahout classification

    CERN Document Server

    Gupta, Ashish

    2015-01-01

    If you are a data scientist who has some experience with the Hadoop ecosystem and machine learning methods and want to try out classification on large datasets using Mahout, this book is ideal for you. Knowledge of Java is essential.

  19. Apache Solr PHP integration

    CERN Document Server

    Kumar, Jayant

    2013-01-01

    This book is full of step-by-step example-oriented tutorials which will show readers how to integrate Solr in PHP applications using the available libraries, and boost the inherent search facilities that Solr offers.If you are a developer who knows PHP and is interested in integrating search into your applications, this is the book for you. No advanced knowledge of Solr is required. Very basic knowledge of system commands and the command-line interface on both Linux and Windows is required. You should also be familiar with the concept of Web servers.
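
    The book's integration examples are in PHP; as a language-neutral illustration of the same idea, the sketch below exercises Solr's HTTP select API from Python with the requests library. The core name, field names and query are placeholders, and a local Solr instance is assumed.

        import requests  # any HTTP client works; Solr exposes a REST-style API

        # Placeholder core name, query and fields for illustration.
        resp = requests.get(
            "http://localhost:8983/solr/products/select",
            params={"q": "title:laptop", "rows": 5, "wt": "json"},
        )
        for doc in resp.json()["response"]["docs"]:
            print(doc.get("id"), doc.get("title"))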

  20. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig

    Science.gov (United States)

    Sahoo, Satya S.; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A.; Lhatoo, Samden D.

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This “neuroscience Big data” represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability—the ability to efficiently process increasing volumes of data; (b) Adaptability—the toolkit can be deployed across different computing configurations; and (c) Ease of programming—the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that

  1. NeuroPigPen: A Scalable Toolkit for Processing Electrophysiological Signal Data in Neuroscience Applications Using Apache Pig.

    Science.gov (United States)

    Sahoo, Satya S; Wei, Annan; Valdez, Joshua; Wang, Li; Zonjy, Bilal; Tatsuoka, Curtis; Loparo, Kenneth A; Lhatoo, Samden D

    2016-01-01

    The recent advances in neurological imaging and sensing technologies have led to rapid increase in the volume, rate of data generation, and variety of neuroscience data. This "neuroscience Big data" represents a significant opportunity for the biomedical research community to design experiments using data with greater timescale, large number of attributes, and statistically significant data size. The results from these new data-driven research techniques can advance our understanding of complex neurological disorders, help model long-term effects of brain injuries, and provide new insights into dynamics of brain networks. However, many existing neuroinformatics data processing and analysis tools were not built to manage large volume of data, which makes it difficult for researchers to effectively leverage this available data to advance their research. We introduce a new toolkit called NeuroPigPen that was developed using Apache Hadoop and Pig data flow language to address the challenges posed by large-scale electrophysiological signal data. NeuroPigPen is a modular toolkit that can process large volumes of electrophysiological signal data, such as Electroencephalogram (EEG), Electrocardiogram (ECG), and blood oxygen levels (SpO2), using a new distributed storage model called Cloudwave Signal Format (CSF) that supports easy partitioning and storage of signal data on commodity hardware. NeuroPigPen was developed with three design principles: (a) Scalability-the ability to efficiently process increasing volumes of data; (b) Adaptability-the toolkit can be deployed across different computing configurations; and (c) Ease of programming-the toolkit can be easily used to compose multi-step data processing pipelines using high-level programming constructs. The NeuroPigPen toolkit was evaluated using 750 GB of electrophysiological signal data over a variety of Hadoop cluster configurations ranging from 3 to 30 Data nodes. The evaluation results demonstrate that the toolkit

  2. Test of an 0.8-Scale Model of the AH-64 Apache in the NASA Langley Full- Scale Wind Tunnel

    Science.gov (United States)

    1988-05-01

    [OCR fragments from report USAAVSCOM TM-87-D-5 (US Army Aviation Systems Command); the recoverable content references the available test rigs, the NASA Generalized Rotor Model System (GRMS) and the Navy Hub and Pylon Evaluation Rig (HPER), with the remainder being DTIC accession-form residue.]

  3. Predictive values of urine paraquat concentration, dose of poison, arterial blood lactate and APACHE II score in the prognosis of patients with acute paraquat poisoning.

    Science.gov (United States)

    Liu, Xiao-Wei; Ma, Tao; Li, Lu-Lu; Qu, Bo; Liu, Zhi

    2017-07-01

    The present study investigated the predictive values of urine paraquat (PQ) concentration, dose of poison, arterial blood lactate and Acute Physiology and Chronic Health Evaluation (APACHE) II score in the prognosis of patients with acute PQ poisoning. A total of 194 patients with acute PQ poisoning, hospitalized between April 2012 and January 2014 at the First Affiliated Hospital of China Medical University (Shenyang, China), were selected and divided into survival and mortality groups. Logistic regression analysis, receiver operator characteristic (ROC) curve analysis and Kaplan-Meier curves were applied to evaluate the values of urine PQ concentration, dose of poison, arterial blood lactate and APACHE II score for predicting the prognosis of patients with acute PQ poisoning. Initial urine PQ concentration (C0), dose of poison, arterial blood lactate and APACHE II score of patients in the mortality group were significantly higher compared with the survival group (all P<0.05). Logistic regression analysis indicated that C0, dose of poison and arterial blood lactate correlated with the mortality risk of acute PQ poisoning (all P<0.05). The areas under the ROC curve (AUC) of C0, dose of poison, arterial blood lactate and APACHE II score in predicting the mortality of patients within 28 days were 0.921, 0.887, 0.808 and 0.648, respectively. The AUC values of C0 for predicting early and delayed mortality were 0.890 and 0.764, respectively. The AUC values of the urine PQ concentration on the day after poisoning (Csec) and of the rebound rate of urine PQ concentration in predicting the mortality of patients within 28 days were 0.919 and 0.805, respectively. The 28-day survival rate of patients with C0 ≤32.2 µg/ml (42/71; 59.2%) was significantly higher when compared with patients with C0 >32.2 µg/ml (38/123; 30.9%). These results suggest that the initial urine PQ concentration may be the optimal index for predicting the prognosis of patients with acute PQ poisoning. Additionally, dose of poison, arterial blood lactate, Csec and rebound rate also have referential significance.
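
    The AUC comparisons above can be reproduced mechanically. The sketch below shows the pattern with scikit-learn on invented stand-in data (not the study's data), including reading a cutoff off the ROC curve by maximizing Youden's J, which is one common way a threshold such as 32.2 µg/ml could be chosen.

        import numpy as np
        from sklearn.metrics import roc_auc_score, roc_curve

        # Invented stand-ins: 1 = died within 28 days, 0 = survived.
        died = np.array([0, 0, 1, 1, 0, 1, 0, 1])
        urine_pq_c0 = np.array([5.1, 12.0, 88.4, 40.2, 18.3, 150.9, 9.7, 35.5])

        print("AUC:", roc_auc_score(died, urine_pq_c0))

        # Read a cutoff off the ROC curve by maximizing Youden's J
        # (sensitivity + specificity - 1).
        fpr, tpr, thresholds = roc_curve(died, urine_pq_c0)
        best = np.argmax(tpr - fpr)
        print("candidate cutoff:", thresholds[best])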

  4. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Ananth Ramaswamy, Indian Institute of Science, Bengaluru; Anil Kulkarni, Indian Institute of Technology Bombay, Mumbai; Anup Bhattacharjee, Bhabha Atomic Research Centre, Mumbai; Anupam Dewan, Indian Institute of Technology, New Delhi; Asif Ekbal, Indian Institute of Technology, Patna; Chandra Kishen J M, Indian ...

  5. 76 FR 42722 - Indian Gaming

    Science.gov (United States)

    2011-07-19

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Assistant... of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100-497, 25 U.S.C. 2710, the Secretary...

  6. 76 FR 165 - Indian Gaming

    Science.gov (United States)

    2011-01-03

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of... the Menominee Indian Tribe of Wisconsin (``Tribe'') and the State of Wisconsin Gaming Compact of 1992... CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy Assistant Secretary...

  7. 75 FR 61511 - Indian Gaming

    Science.gov (United States)

    2010-10-05

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of... CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy Assistant Secretary... section 11 of the Indian Gaming Regulatory Act of 1988 (IGRA), Public Law 100-497, 25 U.S.C. 2710, the...

  8. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Editorial Board. Sadhana. Editor: N Viswanadham, Indian Institute of Science, Bengaluru. Senior Associate Editors: Arakeri J H, Indian Institute of Science, Bengaluru; Hari K V S, Indian Institute of Science, Bengaluru; Mujumdar P P, Indian Institute of Science, Bengaluru; Manoj Kumar Tiwari, Indian Institute of Technology, ...

  9. Monte carlo simulation for soot dynamics

    KAUST Repository

    Zhou, Kun

    2012-01-01

    A new Monte Carlo method, termed Comb-like frame Monte Carlo, is developed to simulate soot dynamics. A detailed stochastic error analysis is provided. Comb-like frame Monte Carlo is coupled with the gas-phase solver Chemkin II to simulate soot formation in a 1-D premixed burner-stabilized flame. The simulated soot number density, volume fraction, and particle size distribution all agree well with the measurements available in the literature. The origin of the bimodal particle size distribution is revealed with quantitative proof.

  10. Monte Carlo approaches to light nuclei

    International Nuclear Information System (INIS)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs

  11. Monte Carlo approaches to light nuclei

    Energy Technology Data Exchange (ETDEWEB)

    Carlson, J.

    1990-01-01

    Significant progress has been made recently in the application of Monte Carlo methods to the study of light nuclei. We review new Green's function Monte Carlo results for the alpha particle, Variational Monte Carlo studies of ¹⁶O, and methods for low-energy scattering and transitions. Through these calculations, a coherent picture of the structure and electromagnetic properties of light nuclei has arisen. In particular, we examine the effect of the three-nucleon interaction and the importance of exchange currents in a variety of experimentally measured properties, including form factors and capture cross sections. 29 refs., 7 figs.

  12. Importance iteration in MORSE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Kloosterman, J.L.; Hoogenboom, J.E.

    1994-02-01

    An expression to calculate point values (the expected detector response of a particle emerging from a collision or the source) is derived and implemented in the MORSE-SGC/S Monte Carlo code. It is outlined how these point values can be smoothed as a function of energy and as a function of the optical thickness between the detector and the source. The smoothed point values are subsequently used to calculate the biasing parameters of the Monte Carlo runs to follow. The method is illustrated by an example, which shows that the obtained biasing parameters lead to a more efficient Monte Carlo calculation. (orig.)
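
    The point values described above serve as importance functions for biasing. The self-contained sketch below illustrates the underlying idea, importance sampling with density-ratio weights, on a toy tail-probability problem; it is not the MORSE-SGC/S scheme itself.

        import math, random

        # Target: E[f(X)] with X ~ Exp(1) and f concentrated in the tail (x > 5),
        # so the exact answer is exp(-5) ~ 0.00674.
        def f(x):
            return 1.0 if x > 5.0 else 0.0

        N = 100_000

        # Naive sampling: almost every sample contributes zero.
        naive = sum(f(-math.log(1 - random.random())) for _ in range(N)) / N

        # Importance sampling from Exp(lam) with lam < 1 pushes samples into
        # the tail; each sample is reweighted by the density ratio p(x)/q(x).
        lam = 0.2
        total = 0.0
        for _ in range(N):
            x = -math.log(1 - random.random()) / lam            # draw from Exp(lam)
            weight = math.exp(-x) / (lam * math.exp(-lam * x))  # p(x) / q(x)
            total += f(x) * weight

        print(naive, total / N, math.exp(-5))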

  13. Adaptive Markov Chain Monte Carlo

    KAUST Repository

    Jadoon, Khan

    2016-08-08

    A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying optimal model parameters and uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In the MCMC simulations, the posterior distribution was computed using Bayes' rule. The electromagnetic forward model, based on the full solution of Maxwell's equations, was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD mini-Explorer. The model parameters and uncertainty for the three-layered earth model are investigated by using synthetic data. Our results show that in the scenario of non-saline soil, the parameters of layer thickness are not as well estimated as the layers' electrical conductivity, because layer thickness in the model exhibits a low sensitivity to the EMI measurements and is hence difficult to resolve. Application of the proposed MCMC-based inversion to the field measurements in a drip irrigation system demonstrates that the parameters of the model can be better estimated for the saline soil than for the non-saline soil, and provides useful insight about parameter uncertainty for the assessment of the model outputs.
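
    A toy, one-dimensional sketch of an adaptive random-walk Metropolis sampler is given below. The log-posterior is a placeholder standard normal rather than the paper's three-layer electromagnetic forward model, and the adaptation rule (nudging the proposal width toward a target acceptance rate) is a crude stand-in for the published adaptive scheme.

        import math, random

        def log_post(theta):
            # Placeholder log-posterior (standard normal); a real application
            # would evaluate the forward model and data misfit here.
            return -0.5 * theta * theta

        def adaptive_metropolis(n=20_000, target_accept=0.3):
            theta, step, chain, accepted = 0.0, 1.0, [], 0
            for k in range(1, n + 1):
                prop = theta + random.gauss(0.0, step)
                if math.log(random.random()) < log_post(prop) - log_post(theta):
                    theta, accepted = prop, accepted + 1
                chain.append(theta)
                # Crude adaptation: nudge the step size toward the target rate.
                if k % 100 == 0:
                    rate = accepted / k
                    step *= 1.1 if rate > target_accept else 0.9
            return chain

        chain = adaptive_metropolis()
        burn = chain[len(chain) // 2:]        # discard first half as burn-in
        print(sum(burn) / len(burn))          # posterior mean, should be near 0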

  14. Advanced computers and Monte Carlo

    International Nuclear Information System (INIS)

    Jordan, T.L.

    1979-01-01

    High-performance parallelism that is currently available is synchronous in nature. It is manifested in such architectures as the Burroughs ILLIAC-IV, CDC STAR-100, TI ASC, CRI CRAY-1, ICL DAP, and many special-purpose array processors designed for signal processing. This form of parallelism has apparently not been of significant value to many important Monte Carlo calculations. Nevertheless, there is much asynchronous parallelism in many of these calculations. A model of a production code that requires up to 20 hours per problem on a CDC 7600 is studied for suitability on some asynchronous architectures that are on the drawing board. The code is described, and some of its properties and resource requirements are identified for comparison with the corresponding properties and resources of some asynchronous multiprocessor architectures. Arguments are made for programmer aids and special syntax to identify and support important asynchronous parallelism. 2 figures, 5 tables

  15. 11th International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing

    CERN Document Server

    Nuyens, Dirk

    2016-01-01

    This book presents the refereed proceedings of the Eleventh International Conference on Monte Carlo and Quasi-Monte Carlo Methods in Scientific Computing that was held at the University of Leuven (Belgium) in April 2014. These biennial conferences are major events for Monte Carlo and quasi-Monte Carlo researchers. The proceedings include articles based on invited lectures as well as carefully selected contributed papers on all theoretical aspects and applications of Monte Carlo and quasi-Monte Carlo methods. Offering information on the latest developments in these very active areas, this book is an excellent reference resource for theoreticians and practitioners interested in solving high-dimensional computational problems, arising, in particular, in finance, statistics and computer graphics.

  16. Monte Carlo simulations for plasma physics

    International Nuclear Information System (INIS)

    Okamoto, M.; Murakami, S.; Nakajima, N.; Wang, W.X.

    2000-07-01

    Plasma behaviours are very complicated and the analyses are generally difficult. However, when collisional processes play an important role in the plasma behaviour, the Monte Carlo method is often employed as a useful tool. For example, in neutral beam injection (NBI) heating, electron or ion cyclotron heating, and alpha heating, Coulomb collisions slow down highly energetic particles and pitch-angle scatter them. These processes are often studied by the Monte Carlo technique, and good agreement can be obtained with experimental results. Recently, the Monte Carlo method has been developed to study fast-particle transport associated with heating and the generation of the radial electric field. It is further applied to investigating neoclassical transport in plasmas with steep gradients of density and temperature, which is beyond the conventional neoclassical theory. In this report, we briefly summarize the research done by the present authors utilizing the Monte Carlo method. (author)

  17. Hybrid Monte Carlo methods in computational finance

    NARCIS (Netherlands)

    Leitao Rodriguez, A.

    2017-01-01

    Monte Carlo methods are highly appreciated and intensively employed in computational finance in the context of financial derivatives valuation or risk management. The method offers valuable advantages like flexibility, easy interpretation and straightforward implementation. Furthermore, the
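
    A minimal example of the kind of valuation task mentioned above is Monte Carlo pricing of a European call under geometric Brownian motion. The sketch below uses invented market parameters and plain (non-hybrid) sampling.

        import math, random

        def mc_call_price(s0, k, r, sigma, t, n=200_000):
            # Monte Carlo price of a European call under geometric Brownian
            # motion, discounted at the risk-free rate r.
            disc, total = math.exp(-r * t), 0.0
            for _ in range(n):
                z = random.gauss(0.0, 1.0)
                st = s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
                total += max(st - k, 0.0)
            return disc * total / n

        # Invented market parameters for illustration.
        print(mc_call_price(s0=100.0, k=105.0, r=0.01, sigma=0.2, t=1.0))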

  18. Simulation and the Monte Carlo method

    CERN Document Server

    Rubinstein, Reuven Y

    2016-01-01

    Simulation and the Monte Carlo Method, Third Edition reflects the latest developments in the field and presents a fully updated and comprehensive account of the major topics that have emerged in Monte Carlo simulation since the publication of the classic First Edition over more than a quarter of a century ago. While maintaining its accessible and intuitive approach, this revised edition features a wealth of up-to-date information that facilitates a deeper understanding of problem solving across a wide array of subject areas, such as engineering, statistics, computer science, mathematics, and the physical and life sciences. The book begins with a modernized introduction that addresses the basic concepts of probability, Markov processes, and convex optimization. Subsequent chapters discuss the dramatic changes that have occurred in the field of the Monte Carlo method, with coverage of many modern topics including: Markov Chain Monte Carlo, variance reduction techniques such as the transform likelihood ratio...

  19. "Shaakal" Carlos kaebas arreteerija kohtusse / Margo Pajuste

    Index Scriptorium Estoniae

    Pajuste, Margo

    2006-01-01

    Also published in: Postimees : na russkom jazõke, 3 July, p. 11. The imprisoned notorious terrorist Carlos "the Jackal" has started legal proceedings against his one-time arrester. He accuses the former head of the French intelligence service of kidnapping

  20. Monte Carlo methods for particle transport

    CERN Document Server

    Haghighat, Alireza

    2015-01-01

    The Monte Carlo method has become the de facto standard in radiation transport. Although powerful, if not understood and used appropriately, the method can give misleading results. Monte Carlo Methods for Particle Transport teaches appropriate use of the Monte Carlo method, explaining the method's fundamental concepts as well as its limitations. Concise yet comprehensive, this well-organized text: * Introduces the particle importance equation and its use for variance reduction * Describes general and particle-transport-specific variance reduction techniques * Presents particle transport eigenvalue issues and methodologies to address these issues * Explores advanced formulations based on the author's research activities * Discusses parallel processing concepts and factors affecting parallel performance Featuring illustrative examples, mathematical derivations, computer algorithms, and homework problems, Monte Carlo Methods for Particle Transport provides nuclear engineers and scientists with a practical guide ...

  1. Monte Carlo code development in Los Alamos

    International Nuclear Information System (INIS)

    Carter, L.L.; Cashwell, E.D.; Everett, C.J.; Forest, C.A.; Schrandt, R.G.; Taylor, W.M.; Thompson, W.L.; Turner, G.D.

    1974-01-01

    The present status of Monte Carlo code development at Los Alamos Scientific Laboratory is discussed. A brief summary is given of several of the most important neutron, photon, and electron transport codes. 17 references. (U.S.)

  2. Quantum Monte Carlo approaches for correlated systems

    CERN Document Server

    Becca, Federico

    2017-01-01

    Over the past several decades, computational approaches to studying strongly-interacting systems have become increasingly varied and sophisticated. This book provides a comprehensive introduction to state-of-the-art quantum Monte Carlo techniques relevant for applications in correlated systems. It offers a clear overview of variational wave functions and a detailed presentation of stochastic sampling techniques, including Markov chains and Langevin dynamics, which are developed into a discussion of Monte Carlo methods. The variational technique is described, from foundations to a detailed description of its algorithms. Further topics discussed include optimisation techniques, real-time dynamics and projection methods, including Green's function, reptation and auxiliary-field Monte Carlo, from basic definitions to advanced algorithms for efficient codes, and the book concludes with recent developments on the continuum space. Quantum Monte Carlo Approaches for Correlated Systems provides an extensive reference ...

  3. The APACHE II measured on patients' discharge from the Intensive Care Unit in the prediction of mortality

    Directory of Open Access Journals (Sweden)

    Luciana Gonzaga dos Santos Cardoso

    2013-06-01

    Full Text Available OBJECTIVE: to analyze the performance of the Acute Physiology and Chronic Health Evaluation (APACHE) II, measured based on the data from the last 24 hours of hospitalization in the ICU, for patients transferred to the wards. METHOD: an observational, prospective and quantitative study using the data from 355 patients admitted to the ICU between January and July 2010, who were transferred to the wards. RESULTS: the discriminatory power of the AII-OUT prognostic index showed a statistically significant area beneath the ROC curve. The mortality observed in the sample was slightly greater than that predicted by the AII-OUT, with a Standardized Mortality Ratio of 1.12. In the calibration curve the linear regression analysis showed the R2 value to be statistically significant. CONCLUSION: the AII-OUT could predict mortality after discharge from the ICU, with the observed mortality being slightly greater than that predicted, which shows good discrimination and good calibration. This system was shown to be useful for stratifying the patients at greater risk of death after discharge from the ICU. This fact deserves special attention from health professionals, particularly nurses, in managing human and technological resources for this group of patients.

  4. Monte Carlo Algorithms for Linear Problems

    OpenAIRE

    Dimov, Ivan

    2000-01-01

    MSC Subject Classification: 65C05, 65U05. Monte Carlo methods are a powerful tool in many fields of mathematics, physics and engineering. It is known, that these methods give statistical estimates for the functional of the solution by performing random sampling of a certain chance variable whose mathematical expectation is the desired functional. Monte Carlo methods are methods for solving problems using random variables. In the book [16] edited by Yu. A. Shreider one can find the followin...

  5. Multilevel Monte Carlo in Approximate Bayesian Computation

    KAUST Repository

    Jasra, Ajay

    2017-02-13

    In the following article we consider approximate Bayesian computation (ABC) inference. We introduce a method for numerically approximating ABC posteriors using the multilevel Monte Carlo (MLMC). A sequential Monte Carlo version of the approach is developed and it is shown under some assumptions that for a given level of mean square error, this method for ABC has a lower cost than i.i.d. sampling from the most accurate ABC approximation. Several numerical examples are given.
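
    For orientation, the sketch below implements the plain single-level rejection-ABC baseline that multilevel and sequential variants improve upon, on a toy Bernoulli problem with an invented observation; acceptance is based on a distance between simulated and observed summaries, not on likelihood evaluation.

        import random

        # Toy problem: infer the success probability of a Bernoulli model from
        # an observed count, without ever evaluating the likelihood.
        observed_successes, trials = 62, 100   # invented observation
        tolerance, n_draws = 2, 100_000

        posterior = []
        for _ in range(n_draws):
            p = random.random()                 # prior: Uniform(0, 1)
            simulated = sum(random.random() < p for _ in range(trials))
            if abs(simulated - observed_successes) <= tolerance:   # ABC accept
                posterior.append(p)

        print(len(posterior), sum(posterior) / len(posterior))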

  6. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    ... considerable difference between procedural programming and object-oriented PHP on the middle layer of the three-tier web architecture. The research also compares, on the database layer, the relational database system MySQL with the NoSQL key-value store system Apache Cassandra.

  7. Petite Guerre: Brigadier General George Cook, Commander of the Department of Arizona, Application of Small War Doctrine Against the Apache 1870-1873

    Science.gov (United States)

    2014-05-22

    increased tensions between agents and military commanders. Congressional appropriations directly to the Indian agents enabled corruption and inefficiency. In...system. The hasty practice of coercion and broken agreements by the government representatives represented the apathy and greed that the Indian was

  8. Bayesian statistics and Monte Carlo methods

    Science.gov (United States)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to point estimation, by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; they are fixed quantities in traditional statistics, which is not founded on Bayes' theorem. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived where the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves a considerable amount of derivatives to be computed, and errors of the linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
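
    The error-propagation application described above is easy to sketch: draw samples of the input vector, push them through the nonlinear transformation, and read the expectation and covariance matrix off the sample, avoiding the derivatives a linearized propagation would need. The Gaussian input and the particular transformation below are assumptions for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Input vector with an assumed mean and covariance (Gaussian here).
        mean = np.array([2.0, 0.5])
        cov = np.array([[0.04, 0.01],
                        [0.01, 0.09]])

        def transform(x):
            # Nonlinear function of the measurements (polar-style quantities).
            return np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])

        samples = rng.multivariate_normal(mean, cov, size=100_000)
        y = np.apply_along_axis(transform, 1, samples)

        print("expectation:", y.mean(axis=0))
        print("covariance:\n", np.cov(y, rowvar=False))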

  9. Successful vectorization - reactor physics Monte Carlo code

    International Nuclear Information System (INIS)

    Martin, W.R.

    1989-01-01

    Most particle transport Monte Carlo codes in use today are based on the ''history-based'' algorithm, wherein one particle history at a time is simulated. Unfortunately, the ''history-based'' approach (present in all Monte Carlo codes until recent years) is inherently scalar and cannot be vectorized. In particular, the history-based algorithm cannot take advantage of vector architectures, which characterize the largest and fastest computers at the current time, vector supercomputers such as the Cray X/MP or IBM 3090/600. However, substantial progress has been made in recent years in developing and implementing a vectorized Monte Carlo algorithm. This algorithm follows portions of many particle histories at the same time and forms the basis for all successful vectorized Monte Carlo codes in use today. This paper describes the basic vectorized algorithm along with several variations that have been developed by different researchers for specific applications. These applications have been mainly in the areas of neutron transport in nuclear reactor and shielding analysis and photon transport in fusion plasmas. The relative merits of the various approaches are discussed, and the present status of known vectorization efforts is summarized along with available timing results, including results from the successful vectorization of 3-D general geometry, continuous energy Monte Carlo. (orig.)
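
    The contrast with the history-based algorithm can be sketched in a few lines: the vectorized (event-based) idea processes one event for a whole batch of particles at once, which maps naturally onto vector hardware, or, below, onto NumPy array operations. The one-dimensional forward-streaming slab model and the 30% absorption probability are toy assumptions, not a faithful transport model.

        import numpy as np

        rng = np.random.default_rng(1)

        # Process one event for a whole batch of particles at once with array
        # operations, instead of one history at a time.
        n, mu_t, slab = 1_000_000, 1.0, 3.0  # particles, total cross-section, slab width
        x = np.zeros(n)                       # particle depths
        alive = np.ones(n, dtype=bool)
        absorbed = 0

        while alive.any():
            x[alive] += rng.exponential(1.0 / mu_t, alive.sum())  # free flights
            leaked = alive & (x > slab)
            collided = alive & ~leaked
            # Toy physics: 30% absorption per collision; scattered particles
            # keep streaming forward (no direction change), for illustration.
            absorbed_now = collided & (rng.random(n) < 0.3)
            absorbed += int(absorbed_now.sum())
            alive = collided & ~absorbed_now

        print("absorbed fraction:", absorbed / n)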

  10. Indian Arts in Canada

    Science.gov (United States)

    Tawow, 1974

    1974-01-01

    A recent publication, "Indian Arts in Canada", examines some of the forces, both past and present, which are not only affecting American Indian artists today, but which will also profoundly influence their future. The review presents a few of the illustrations used in the book, along with the Introduction and the Foreword. (KM)

  11. Indian Inuit Pottery '73

    Science.gov (United States)

    Tawow, 1974

    1974-01-01

    A unique exhibit of Canadian Native Ceramics which began touring various art galleries in September 1973 is described both verbally and photographically. The Indian Inuit Pottery '73 display, part of the 1973 International Ceramics Exhibition, includes 110 samples of craftsmanship from Indian and Inuit artists across Canada. (KM)

  12. Indian Ocean Rim Cooperation

    DEFF Research Database (Denmark)

    Wippel, Steffen

    Since the mid-1990s, the Indian Ocean has been experiencing increasing economic cooperation among its rim states. Middle Eastern countries, too, participate in the work of the Indian Ocean Rim Association, which received new impetus in the course of the current decade. Notably Oman is a very active...

  13. Indian Summer Arts Festival

    OpenAIRE

    Martel, Yann; Tabu; Tejpal, Tarun; Kunzru, Hari

    2011-01-01

    The SFU Woodward's Cultural Unit partnered with the Indian Summer Festival Society to kick off the inaugural Indian Summer Festival. Held at the Goldcorp Centre for the Arts, it included an interactive Literature Series with notable authors from both India and Canada, including special guests Yann Martel, Bollywood superstar Tabu, journalist Tarun Tejpal, writer Hari Kunzru, and many others.

  14. American Indian Community Colleges.

    Science.gov (United States)

    One Feather, Gerald

    With the emergence of reservation based community colleges (the Navajo Community College and the Dakota Community Colleges), the American Indian people, as decision makers in these institutions, are providing Indians with the technical skills and cultural knowledge necessary for self-determination. Confronted with limited numbers of accredited…

  15. The Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Naqvi, S.W.A.

    There are two unique aspects of geography of the Indian Ocean that profoundly influence its climate and circulation: (a) The Indian Ocean’s northern expanse is curtailed by the Eurasian landmass around the Tropic of Cancer (making it the only ocean...

  16. Becoming an Indian

    Indian Academy of Sciences (India)

    Ramachandra Guha

    2017-11-25

    Nov 25, 2017 ... ern education, and other accoutrements of civilization. In the competing version, associated with the ruled, the white man's Raj was always illegitimate, marked by coercion and backed by force, its central aim the economic exploitation of Indian labour and Indian raw materials. Thus, British bookshops and ...

  17. Writing American Indian History

    Science.gov (United States)

    Noley, Grayson B.

    2008-01-01

    The purpose of this paper is to critique the manner in which history about American Indians has been written and propose a rationale for the rethinking of what we know about this subject. In particular, histories of education as regards the participation of American Indians are a subject that has been given scant attention over the years, and when…

  18. 76 FR 65208 - Indian Gaming

    Science.gov (United States)

    2011-10-20

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an Approval of the Gaming..., Office of Indian Gaming, Office of the Deputy Assistant Secretary--Policy and Economic Development...

  19. 77 FR 45371 - Indian Gaming

    Science.gov (United States)

    2012-07-31

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  20. 78 FR 33435 - Indian Gaming

    Science.gov (United States)

    2013-06-04

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendments. SUMMARY: This notice publishes approval of an Agreement to Amend the Class III Tribal-State Gaming Compact between the Salt River Pima- Maricopa Indian...

  1. 75 FR 38833 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ... Doc No: 2010-16213] DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact..., Office of Indian Gaming, Office of the Deputy Assistant Secretary--Policy and Economic Development...

  2. 76 FR 8375 - Indian Gaming

    Science.gov (United States)

    2011-02-14

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the Gaming..., 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  3. 78 FR 26801 - Indian Gaming

    Science.gov (United States)

    2013-05-08

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of an amendment to the Class III Tribal-State Gaming Compact between the Menominee Indian Tribe of Wisconsin and the...

  4. 75 FR 8108 - Indian Gaming

    Science.gov (United States)

    2010-02-23

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes approval of the Tribal-State Compact between the Pyramid Lake Paiute Indian Tribe and the State of Nevada Governing Class III Gaming...

  5. 77 FR 30550 - Indian Gaming

    Science.gov (United States)

    2012-05-23

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the Pyramid Lake Paiute Indian Tribe and the...

  6. 77 FR 59641 - Indian Gaming

    Science.gov (United States)

    2012-09-28

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming.... FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  7. 78 FR 11221 - Indian Gaming

    Science.gov (United States)

    2013-02-15

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming..., 2013. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  8. 75 FR 55823 - Indian Gaming

    Science.gov (United States)

    2010-09-14

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming.... FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  9. 75 FR 38834 - Indian Gaming

    Science.gov (United States)

    2010-07-06

    ...: 2010-16214] DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian... Gaming, Office of the Deputy Assistant Secretary--Policy and Economic Development, Washington, DC 20240, (202) 219-4066. SUPPLEMENTARY INFORMATION: Under Section 11 of the Indian Gaming Regulatory Act of 1988...

  10. 77 FR 43110 - Indian Gaming

    Science.gov (United States)

    2012-07-23

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  11. 76 FR 52968 - Indian Gaming

    Science.gov (United States)

    2011-08-24

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  12. 76 FR 56466 - Indian Gaming

    Science.gov (United States)

    2011-09-13

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an approval of the gaming...: September 13, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming...

  13. 78 FR 15738 - Indian Gaming

    Science.gov (United States)

    2013-03-12

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of the gaming..., 2013. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  14. 78 FR 10203 - Indian Gaming

    Science.gov (United States)

    2013-02-13

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal State Class III Gaming Compact. SUMMARY: This notice publishes the Approval of the Class III Tribal- State Gaming Compact between the Chippewa-Cree Tribe of the Rocky Boy's Indian Reservation...

  15. 75 FR 68618 - Indian Gaming

    Science.gov (United States)

    2010-11-08

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of... the Red Cliff Band of Lake Superior Chippewas (``Tribe'') and the State of Wisconsin Gaming Compact of... CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy Assistant Secretary...

  16. 76 FR 33341 - Indian Gaming

    Science.gov (United States)

    2011-06-08

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal--State Class III Gaming Compact. SUMMARY: This notice publishes an extension of Gaming... FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the Deputy...

  17. 77 FR 76513 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Amended Tribal-State Class III Gaming Compact taking effect. SUMMARY..., 2012. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director, Office of Indian Gaming, Office of the...

  18. Correlation between electrical bioimpedance indices and the APACHE II score in patients with septic shock

    OpenAIRE

    Díaz-De Los Santos, Manuel; Cieza, Javier; Valenzuela, Raúl

    2011-01-01

    Objective: To determine the correlation between various electrical bioimpedance indices (EBI) and the APACHE II score (sAII) in patients with septic shock. Material and methods: Thirty patients >14 years old with septic shock from the adult intensive care unit (ICU) of the Hospital Nacional Cayetano Heredia, Peru, were included; for each, the sAII was calculated and the phase angle, impedance index and intracellular-to-extracellular fluid (LIC/LEC) ratio were measured, and these were subsequently correlated using Pearson's coefficient and multiple linear regression...

  19. Monte Carlo simulation of Markov unreliability models

    International Nuclear Information System (INIS)

    Lewis, E.E.; Boehm, F.

    1984-01-01

    A Monte Carlo method is formulated for the evaluation of the unreliability of complex systems with known component failure and repair rates. The formulation is in terms of a Markov process, allowing dependences between components to be modeled and computational efficiencies to be achieved in the Monte Carlo simulation. Two variance reduction techniques, forced transition and failure biasing, are employed to increase the computational efficiency of the random walk procedure. For an example problem these result in improved computational efficiency by more than three orders of magnitude over analog Monte Carlo. The method is generalized to treat problems with distributed failure and repair rate data, and a batching technique is introduced and shown to result in substantial increases in computational efficiency for an example problem. A method for separating the variance due to the data uncertainty from that due to the finite number of random walks is presented. (orig.)
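
    As a point of reference for the variance-reduction discussion, an analog (unbiased but unaccelerated) Monte Carlo estimate of Markov-system unreliability can be sketched as follows in Python; the two-component parallel system and its rates are hypothetical, and the forced-transition and failure-biasing techniques of the paper are deliberately omitted.

        import random

        LAM, MU = 1e-3, 1e-1   # component failure and repair rates (1/h)
        T_MISSION = 1000.0     # mission time (h)
        N_HIST = 100_000

        def one_history():
            # Two redundant components; the system fails when both are down.
            t, up = 0.0, [True, True]
            while True:
                rates = [LAM if s else MU for s in up]
                t += random.expovariate(sum(rates))   # holding time
                if t >= T_MISSION:
                    return 0.0
                # Choose which component transitions, proportional to its rate
                r, acc = random.random() * sum(rates), 0.0
                for i, rate in enumerate(rates):
                    acc += rate
                    if r <= acc:
                        up[i] = not up[i]
                        break
                if not any(up):
                    return 1.0   # system failure before the end of the mission

        unrel = sum(one_history() for _ in range(N_HIST)) / N_HIST
        print(f"estimated unreliability over {T_MISSION} h: {unrel:.3e}")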

  20. Adiabatic optimization versus diffusion Monte Carlo methods

    Science.gov (United States)

    Jarret, Michael; Jordan, Stephen P.; Lackey, Brad

    2016-10-01

    Most experimental and theoretical studies of adiabatic optimization use stoquastic Hamiltonians, whose ground states are expressible using only real nonnegative amplitudes. This raises a question as to whether classical Monte Carlo methods can simulate stoquastic adiabatic algorithms with polynomial overhead. Here we analyze diffusion Monte Carlo algorithms. We argue that, based on differences between L1 and L2 normalized states, these algorithms suffer from certain obstructions preventing them from efficiently simulating stoquastic adiabatic evolution in generality. In practice however, we obtain good performance by introducing a method that we call Substochastic Monte Carlo. In fact, our simulations are good classical optimization algorithms in their own right, competitive with the best previously known heuristic solvers for MAX-k-SAT at k = 2, 3, 4.

  1. Shell model the Monte Carlo way

    International Nuclear Information System (INIS)

    Ormand, W.E.

    1995-01-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined

  2. Shell model the Monte Carlo way

    Energy Technology Data Exchange (ETDEWEB)

    Ormand, W.E.

    1995-03-01

    The formalism for the auxiliary-field Monte Carlo approach to the nuclear shell model is presented. The method is based on a linearization of the two-body part of the Hamiltonian in an imaginary-time propagator using the Hubbard-Stratonovich transformation. The foundation of the method, as applied to the nuclear many-body problem, is discussed. Topics presented in detail include: (1) the density-density formulation of the method, (2) computation of the overlaps, (3) the sign of the Monte Carlo weight function, (4) techniques for performing Monte Carlo sampling, and (5) the reconstruction of response functions from an imaginary-time auto-correlation function using MaxEnt techniques. Results obtained using schematic interactions, which have no sign problem, are presented to demonstrate the feasibility of the method, while an extrapolation method for realistic Hamiltonians is presented. In addition, applications at finite temperature are outlined.

  3. Off-diagonal expansion quantum Monte Carlo.

    Science.gov (United States)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  4. Off-diagonal expansion quantum Monte Carlo

    Science.gov (United States)

    Albash, Tameem; Wagenbreth, Gene; Hen, Itay

    2017-12-01

    We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.

  5. Random Numbers and Monte Carlo Methods

    Science.gov (United States)

    Scherer, Philipp O. J.

    Many-body problems often involve the calculation of integrals of very high dimension which cannot be treated by standard methods. For the calculation of thermodynamic averages Monte Carlo methods are very useful which sample the integration volume at randomly chosen points. After summarizing some basic statistics, we discuss algorithms for the generation of pseudo-random numbers with given probability distribution which are essential for all Monte Carlo methods. We show how the efficiency of Monte Carlo integration can be improved by sampling preferentially the important configurations. Finally the famous Metropolis algorithm is applied to classical many-particle systems. Computer experiments visualize the central limit theorem and apply the Metropolis method to the traveling salesman problem.
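
    The Metropolis step mentioned above fits in a few lines; a minimal sketch in Python that samples a standard normal density and estimates an average over the generated chain (the target density, step size and sample count are arbitrary illustrative choices):

        import math, random

        def metropolis(n, step=1.0):
            # Sample exp(-x^2/2) with a symmetric uniform proposal
            x, xs = 0.0, []
            for _ in range(n):
                y = x + random.uniform(-step, step)
                if random.random() < math.exp(0.5 * (x * x - y * y)):
                    x = y          # accept; otherwise keep the old state
                xs.append(x)
            return xs

        xs = metropolis(200_000)
        print("E[x^2] =", sum(v * v for v in xs) / len(xs))   # should be near 1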

  6. Monte Carlo strategies in scientific computing

    CERN Document Server

    Liu, Jun S

    2008-01-01

    This paperback edition is a reprint of the 2001 Springer edition. This book provides a self-contained and up-to-date treatment of the Monte Carlo method and develops a common framework under which various Monte Carlo techniques can be "standardized" and compared. Given the interdisciplinary nature of the topics and a moderate prerequisite for the reader, this book should be of interest to a broad audience of quantitative researchers such as computational biologists, computer scientists, econometricians, engineers, probabilists, and statisticians. It can also be used as the textbook for a graduate-level course on Monte Carlo methods. Many problems discussed in the later chapters can be potential thesis topics for masters’ or PhD students in statistics or computer science departments. Jun Liu is Professor of Statistics at Harvard University, with a courtesy Professor appointment at Harvard Biostatistics Department. Professor Liu was the recipient of the 2002 COPSS Presidents' Award, the most prestigious one for sta...

  7. Evaluation of the APACHE II score and the oncologic history as predictors of mortality in the intensive care unit of the INC, September 1996 - December 1997

    International Nuclear Information System (INIS)

    Camargo, David O; Gomez, Clara; Martinez, Teresa

    1999-01-01

    Multiple severity indexes have been developed to assess the prognosis and the quality of life of a patient, especially on admission to the intensive care unit (ICU); however, the oncologic patient presents particularities in morbidity that imply a different behavior in the results of these indexes. In the present work, the APACHE scale and the oncologic history are compared as predictors of morbidity and mortality in the ICU. 207 patients admitted to the ICU between September 1996 and December 1997 were included. Mortality was 29%, and the stay of most of this group of patients was either shorter than 24 hours or longer than 8 days. On admission, 50% of the patients presented scores above 15 on the APACHE scale, and at 48 hours only 30.4% remained at this value. Among patients with hematologic neoplasia, 87% presented scores above 15, with a mortality of 63.3%; with scores between 15 and 24 on admission, the risk of dying was 9.8 times that of patients with lower scores. For the hematologic patient, the risk of dying was 5.7 times that observed for solid tumors. The most compromised system was the respiratory one, with an increase in the risk of dying of 2.8 times for each unit increment on the scale. Contrary to what is described in the literature, the oncologic diagnosis and the stage of the neoplasia did not influence patient mortality.

  8. Simulation of transport equations with Monte Carlo

    International Nuclear Information System (INIS)

    Matthes, W.

    1975-09-01

    The main purpose of the report is to explain the relation between the transport equation and the Monte Carlo game used for its solution. The introduction of artificial particles carrying a weight provides one with high flexibility in constructing many different games for the solution of the same equation. This flexibility opens a way to construct a Monte Carlo game for the solution of the adjoint transport equation. Emphasis is laid mostly on giving a clear understanding of what to do and not on the details of how to do a specific game
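
    How an artificial weight changes the game can be made concrete with a small Python example for a 1-D slab (hypothetical cross sections; one of many possible games for the same problem, not the specific games of the report): analog absorption is replaced by a deterministic weight reduction, with Russian roulette terminating low-weight histories without introducing bias.

        import math, random

        SIGMA_T, P_ABSORB, THICKNESS = 0.8, 0.3, 5.0
        W_CUT = 0.01                        # weight threshold for roulette

        def transmission_weight():
            # One weighted history; it scores its surviving weight on escape.
            x, mu, w = 0.0, 1.0, 1.0
            while True:
                x += mu * (-math.log(random.random()) / SIGMA_T)
                if x >= THICKNESS: return w
                if x < 0.0:        return 0.0
                w *= 1.0 - P_ABSORB         # implicit capture: no history is killed
                if w < W_CUT:               # Russian roulette
                    if random.random() < 0.5: return 0.0
                    w *= 2.0
                mu = 2.0 * random.random() - 1.0   # isotropic scatter

        n = 100_000
        print(sum(transmission_weight() for _ in range(n)) / n)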

  9. Self-learning Monte Carlo (dynamical biasing)

    International Nuclear Information System (INIS)

    Matthes, W.

    1981-01-01

    In many applications the histories of a normal Monte Carlo game rarely reach the target region. An approximate knowledge of the importance (with respect to the target) may be used to guide the particles more frequently into the target region. A Monte Carlo method is presented in which each history contributes to update the importance field such that eventually most target histories are sampled. It is a self-learning method in the sense that the procedure itself: (a) learns which histories are important (reach the target) and increases their probability; (b) reduces the probabilities of unimportant histories; (c) concentrates gradually on the more important target histories. (U.K.)
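
    A toy Python illustration of such a self-learning game (a gambler's-ruin walk with a learned importance table; all parameters are hypothetical): each step is sampled with probabilities biased by the current importance estimates, weights restore unbiasedness, and every finished history feeds its outcome back into the table so that target histories gradually dominate the sampling.

        import random

        N, FLOOR, LR = 10, 1e-3, 0.1
        imp = [0.5] * (N + 1)          # learned importance of each site
        imp[0], imp[N] = FLOOR, 1.0    # loss boundary and target

        def history():
            # One biased walk from site 1; returns its weighted score.
            i, w, visited = 1, 1.0, []
            while 0 < i < N:
                visited.append(i)
                up = imp[i + 1] / (imp[i + 1] + imp[i - 1])
                if random.random() < up:
                    w *= 0.5 / up              # analog prob / biased prob
                    i += 1
                else:
                    w *= 0.5 / (1.0 - up)
                    i -= 1
            hit = 1.0 if i == N else 0.0
            for j in visited:                  # update the importance field
                imp[j] = max(FLOOR, (1 - LR) * imp[j] + LR * hit)
            return w * hit

        n = 100_000
        est = sum(history() for _ in range(n)) / n
        print(f"P(reach {N} before 0) = {est:.4f}  (exact {1 / N:.4f})")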

  10. Monte Carlo electron/photon transport

    International Nuclear Information System (INIS)

    Mack, J.M.; Morel, J.E.; Hughes, H.G.

    1985-01-01

    A review of nonplasma coupled electron/photon transport using the Monte Carlo method is presented. Remarks are mainly restricted to linearized formalisms at electron energies from 1 keV to 1000 MeV. Applications involving pulse-height estimation, transport in external magnetic fields, and optical Cerenkov production are discussed to underscore the importance of this branch of computational physics. Advances in electron multigroup cross-section generation are reported, and their impact on future code development is assessed. Progress toward the transformation of MCNP into a generalized neutral/charged-particle Monte Carlo code is described. 48 refs

  11. A keff calculation method by Monte Carlo

    International Nuclear Information System (INIS)

    Shen, H; Wang, K.

    2008-01-01

    The effective multiplication factor (keff) is defined as the ratio between the numbers of neutrons in successive generations, a definition adopted by most Monte Carlo codes (e.g. MCNP). It can also be thought of as the ratio of the neutron generation rate to the sum of the leakage rate and the absorption rate, which should exclude the effect of neutron reactions such as (n, 2n) and (n, 3n). This article discusses the Monte Carlo method for keff calculation based on the second definition. A new code has been developed and the results are presented. (author)
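
    The second definition lends itself to a very small Python sketch (dimensionless, hypothetical probabilities standing in for a real transport problem): every source neutron ends in leakage or absorption, fission produces nu new neutrons, and keff is tallied as the production rate over the sum of the leakage and absorption rates.

        import random

        NU = 2.43          # neutrons per fission (toy value)
        P_LEAK = 0.30      # probability that a neutron leaks out
        P_FISSION = 0.55   # probability that an absorption is a fission
        N = 1_000_000

        produced, leaked, absorbed = 0.0, 0, 0
        for _ in range(N):
            if random.random() < P_LEAK:
                leaked += 1               # lost from the system
            else:
                absorbed += 1             # capture or fission
                if random.random() < P_FISSION:
                    produced += NU        # fission neutrons born

        k_eff = produced / (leaked + absorbed)   # generation / (leakage + absorption)
        print(f"keff = {k_eff:.4f}")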

  12. Monte Carlo Treatment Planning for Advanced Radiotherapy

    DEFF Research Database (Denmark)

    Cronholm, Rickard

    This Ph.D. project describes the development of a workflow for Monte Carlo Treatment Planning for clinical radiotherapy plans. The workflow may be utilized to perform an independent dose verification of treatment plans. Modern radiotherapy treatment delivery is often conducted by dynamically...... modulating the intensity of the field during the irradiation. The workflow described has the potential to fully model the dynamic delivery, including gantry rotation during irradiation, of modern radiotherapy. Three cornerstones of Monte Carlo Treatment Planning are identified: Building, commissioning......

  13. Monte Carlo dose distributions for radiosurgery

    International Nuclear Information System (INIS)

    Perucha, M.; Leal, A.; Rincon, M.; Carrasco, E.

    2001-01-01

    The precision of Radiosurgery Treatment planning systems is limited by the approximations of their algorithms and by their dosimetrical input data. This fact is especially important in small fields. However, the Monte Carlo method is an accurate alternative as it considers every aspect of particle transport. In this work an acoustic neurinoma is studied by comparing the dose distribution of both a planning system and Monte Carlo. Relative shifts have been measured and, furthermore, Dose-Volume Histograms have been calculated for the target and adjacent organs at risk. (orig.)

  14. Monte Carlo applications to radiation shielding problems

    International Nuclear Information System (INIS)

    Subbaiah, K.V.

    2009-01-01

    Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling of physical and mathematical systems to compute their results. However, basic concepts of MC are both simple and straightforward and can be learned by using a personal computer. Uses of Monte Carlo methods require large amounts of random numbers, and it was their use that spurred the development of pseudorandom number generators, which were far quicker to use than the tables of random numbers which had been previously used for statistical sampling. In Monte Carlo simulation of radiation transport, the history (track) of a particle is viewed as a random sequence of free flights that end with an interaction event where the particle changes its direction of movement, loses energy and, occasionally, produces secondary particles. The Monte Carlo simulation of a given experimental arrangement (e.g., an electron beam, coming from an accelerator and impinging on a water phantom) consists of the numerical generation of random histories. To simulate these histories we need an interaction model, i.e., a set of differential cross sections (DCS) for the relevant interaction mechanisms. The DCSs determine the probability distribution functions (pdf) of the random variables that characterize a track: 1) free path between successive interaction events, 2) type of interaction taking place and 3) energy loss and angular deflection in a particular event (and initial state of emitted secondary particles, if any). Once these pdfs are known, random histories can be generated by using appropriate sampling methods. If the number of generated histories is large enough, quantitative information on the transport process may be obtained by simply averaging over the simulated histories. The Monte Carlo method yields the same information as the solution of the Boltzmann transport equation, with the same interaction model, but is easier to implement. In particular, the simulation of radiation
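
    The three sampled quantities listed above translate almost line by line into code; a toy analog photon history through a 1-D slab in Python (all cross sections hypothetical): the free path is drawn by inverting the exponential distribution, the interaction type is chosen by comparing a random number against the absorption probability, and scattering redraws the direction.

        import math, random

        SIGMA_T = 0.8          # total macroscopic cross section (1/cm)
        P_ABSORB = 0.3         # probability that an interaction absorbs
        THICKNESS = 5.0        # slab thickness (cm)

        def transmitted():
            x, mu = 0.0, 1.0                     # position, direction cosine
            while True:
                s = -math.log(random.random()) / SIGMA_T   # 1) free path
                x += mu * s
                if x >= THICKNESS: return True   # leaked through the far face
                if x < 0.0:        return False  # leaked back out the near face
                if random.random() < P_ABSORB:   # 2) type of interaction
                    return False
                mu = 2.0 * random.random() - 1.0 # 3) new direction (isotropic)

        n = 100_000
        print("transmission:", sum(transmitted() for _ in range(n)) / n)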

  15. Fast sequential Monte Carlo methods for counting and optimization

    CERN Document Server

    Rubinstein, Reuven Y; Vaisman, Radislav

    2013-01-01

    A comprehensive account of the theory and application of Monte Carlo methods Based on years of research in efficient Monte Carlo methods for estimation of rare-event probabilities, counting problems, and combinatorial optimization, Fast Sequential Monte Carlo Methods for Counting and Optimization is a complete illustration of fast sequential Monte Carlo techniques. The book provides an accessible overview of current work in the field of Monte Carlo methods, specifically sequential Monte Carlo techniques, for solving abstract counting and optimization problems. Written by authorities in the

  16. Use of Monte Carlo Methods in brachytherapy; Uso del metodo de Monte Carlo en braquiterapia

    Energy Technology Data Exchange (ETDEWEB)

    Granero Cabanero, D.

    2015-07-01

    The Monte Carlo method has become a fundamental tool for brachytherapy dosimetry, mainly because it avoids the difficulties associated with experimental dosimetry. In brachytherapy, the main handicap of experimental dosimetry is the high dose gradient near the sources, where small uncertainties in the positioning of the detectors lead to large uncertainties in the dose. This presentation will mainly review the procedure for calculating dose distributions around a source using the Monte Carlo method, showing the difficulties inherent in these calculations. In addition, we will briefly review other applications of the Monte Carlo method in brachytherapy dosimetry, such as its use in advanced calculation algorithms, the calculation of shielding barriers, or obtaining dose distributions around applicators. (Author)

  17. Specialized Monte Carlo codes versus general-purpose Monte Carlo codes

    International Nuclear Information System (INIS)

    Moskvin, Vadim; DesRosiers, Colleen; Papiez, Lech; Lu, Xiaoyi

    2002-01-01

    The possibilities of Monte Carlo modeling for dose calculations and treatment optimization are quite limited in radiation oncology applications. The main reason is that the Monte Carlo technique for dose calculations is time consuming while treatment planning may require hundreds of possible cases of dose simulations to be evaluated for dose optimization. The second reason is that general-purpose codes widely used in practice require an experienced user to customize them for calculations. This paper discusses the concept of Monte Carlo code design that can avoid the main problems that are preventing widespread use of this simulation technique in medical physics. (authors)

  18. On the use of stochastic approximation Monte Carlo for Monte Carlo integration

    KAUST Repository

    Liang, Faming

    2009-03-01

    The stochastic approximation Monte Carlo (SAMC) algorithm has recently been proposed as a dynamic optimization algorithm in the literature. In this paper, we show in theory that the samples generated by SAMC can be used for Monte Carlo integration via a dynamically weighted estimator by calling some results from the literature of nonhomogeneous Markov chains. Our numerical results indicate that SAMC can yield significant savings over conventional Monte Carlo algorithms, such as the Metropolis-Hastings algorithm, for the problems for which the energy landscape is rugged. © 2008 Elsevier B.V. All rights reserved.

  19. Monte Carlo methods in AB initio quantum chemistry quantum Monte Carlo for molecules

    CERN Document Server

    Lester, William A; Reynolds, PJ

    1994-01-01

    This book presents the basic theory and application of the Monte Carlo method to the electronic structure of atoms and molecules. It assumes no previous knowledge of the subject, only a knowledge of molecular quantum mechanics at the first-year graduate level. A working knowledge of traditional ab initio quantum chemistry is helpful, but not essential.Some distinguishing features of this book are: Clear exposition of the basic theory at a level to facilitate independent study. Discussion of the various versions of the theory: diffusion Monte Carlo, Green's function Monte Carlo, and release n

  20. Monte Carlo method in neutron activation analysis

    International Nuclear Information System (INIS)

    Majerle, M.; Krasa, A.; Svoboda, O.; Wagner, V.; Adam, J.; Peetermans, S.; Slama, O.; Stegajlov, V.I.; Tsupko-Sitnikov, V.M.

    2009-01-01

    Neutron activation detectors are a useful technique for the neutron flux measurements in spallation experiments. The study of the usefulness and the accuracy of this method at similar experiments was performed with the help of Monte Carlo codes MCNPX and FLUKA

  1. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-12-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ``fixed-source`` case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (``replicated``) over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.

  2. Biases in Monte Carlo eigenvalue calculations

    Energy Technology Data Exchange (ETDEWEB)

    Gelbard, E.M.

    1992-01-01

    The Monte Carlo method has been used for many years to analyze the neutronics of nuclear reactors. In fact, as the power of computers has increased the importance of Monte Carlo in neutronics has also increased, until today this method plays a central role in reactor analysis and design. Monte Carlo is used in neutronics for two somewhat different purposes, i.e., (a) to compute the distribution of neutrons in a given medium when the neutron source-density is specified, and (b) to compute the neutron distribution in a self-sustaining chain reaction, in which case the source is determined as the eigenvector of a certain linear operator. In (b), then, the source is not given, but must be computed. In the first case (the ''fixed-source'' case) the Monte Carlo calculation is unbiased. That is to say that, if the calculation is repeated (''replicated'') over and over, with independent random number sequences for each replica, then averages over all replicas will approach the correct neutron distribution as the number of replicas goes to infinity. Unfortunately, the computation is not unbiased in the second case, which we discuss here.

  3. Monte Carlo method for random surfaces

    International Nuclear Information System (INIS)

    Berg, B.

    1985-01-01

    Previously two of the authors proposed a Monte Carlo method for sampling statistical ensembles of random walks and surfaces with a Boltzmann probabilistic weight. In the present paper we work out the details for several models of random surfaces, defined on d-dimensional hypercubic lattices. (orig.)

  4. Computer system for Monte Carlo experimentation

    International Nuclear Information System (INIS)

    Grier, D.A.

    1986-01-01

    A new computer system for Monte Carlo Experimentation is presented. The new system speeds and simplifies the process of coding and preparing a Monte Carlo Experiment; it also encourages the proper design of Monte Carlo Experiments, and the careful analysis of the experimental results. A new functional language is the core of this system. Monte Carlo Experiments, and their experimental designs, are programmed in this new language; those programs are compiled into Fortran output. The Fortran output is then compiled and executed. The experimental results are analyzed with a standard statistics package such as Si, Isp, or Minitab or with a user-supplied program. Both the experimental results and the experimental design may be directly loaded into the workspace of those packages. The new functional language frees programmers from many of the details of programming an experiment. Experimental designs such as factorial, fractional factorial, or latin square are easily described by the control structures and expressions of the language. Specific mathematical models are generated by the routines of the language

  5. Monte Carlo simulation of the microcanonical ensemble

    International Nuclear Information System (INIS)

    Creutz, M.

    1984-01-01

    We consider simulating statistical systems with a random walk on a constant energy surface. This combines features of deterministic molecular dynamics techniques and conventional Monte Carlo simulations. For discrete systems the method can be programmed to run an order of magnitude faster than other approaches. It does not require high quality random numbers and may also be useful for nonequilibrium studies. 10 references
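
    A minimal sketch of such a constant-energy random walk in the spirit of this method, for a 1-D Ising chain in Python (all parameter values are illustrative): a single 'demon' degree of freedom exchanges energy with the lattice so that the total energy stays fixed, and only cheap random numbers are needed.

        import math, random

        N, SWEEPS = 1000, 500
        spins = [1] * N           # ground-state start
        demon = 40                # demon energy; fixes the total energy
        demon_sum = samples = 0

        for _ in range(SWEEPS):
            for i in range(N):
                # Energy change of the two bonds if spin i flips (J = 1, periodic)
                dE = 2 * spins[i] * (spins[i - 1] + spins[(i + 1) % N])
                if dE <= demon:   # demon pays for uphill moves, banks downhill ones
                    spins[i] *= -1
                    demon -= dE
                demon_sum += demon
                samples += 1

        # Effective temperature from the mean demon energy (dE comes in units of 4)
        mean_d = demon_sum / samples
        print("T =", 4.0 / math.log(1.0 + 4.0 / mean_d))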

  6. Workshop: Monte Carlo computational performance benchmark - Contributions

    International Nuclear Information System (INIS)

    Hoogenboom, J.E.; Petrovic, B.; Martin, W.R.; Sutton, T.; Leppaenen, J.; Forget, B.; Romano, P.; Siegel, A.; Hoogenboom, E.; Wang, K.; Li, Z.; She, D.; Liang, J.; Xu, Q.; Qiu, Y.; Yu, J.; Sun, J.; Fan, X.; Yu, G.; Bernard, F.; Cochet, B.; Jinaphanh, A.; Jacquet, O.; Van der Marck, S.; Tramm, J.; Felker, K.; Smith, K.; Horelik, N.; Capellan, N.; Herman, B.

    2013-01-01

    This series of slides is divided into 3 parts. The first part is dedicated to the presentation of the Monte-Carlo computational performance benchmark (aims, specifications and results). This benchmark aims at performing a full-size Monte Carlo simulation of a PWR core with axial and pin-power distribution. Many different Monte Carlo codes have been used and their results have been compared in terms of computed values and processing speeds. It appears that local power values mostly agree quite well. The first part also includes the presentations of about 10 participants in which they detail their calculations. In the second part, an extension of the benchmark is proposed in order to simulate a more realistic reactor core (for instance non-uniform temperature) and to assess feedback coefficients due to change of some parameters. The third part deals with another benchmark, the BEAVRS benchmark (Benchmark for Evaluation And Validation of Reactor Simulations). BEAVRS is also a full-core PWR benchmark for Monte Carlo simulations

  7. Monte Carlo determination of heteroepitaxial misfit structures

    DEFF Research Database (Denmark)

    Baker, J.; Lindgård, Per-Anker

    1996-01-01

    We use Monte Carlo simulations to determine the structure of KBr overlayers on a NaCl(001) substrate, a system with large (17%) heteroepitaxial misfit. The equilibrium relaxation structure is determined for films of 2-6 ML, for which extensive helium-atom scattering data exist for comparison...

  8. Dynamic bounds coupled with Monte Carlo simulations

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Meester, L.E.; van Gelder, P.H.A.J.M.; Vrijling, J.K.

    2011-01-01

    For the reliability analysis of engineering structures a variety of methods is known, of which Monte Carlo (MC) simulation is widely considered to be among the most robust and most generally applicable. To reduce simulation cost of the MC method, variance reduction methods are applied. This paper

  9. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules, the analysis of which demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction...... of local-move MC methods in combination with molecular dynamics simulations, for example, for studying multi-component lipid membranes containing cholesterol....

  10. Design and analysis of Monte Carlo experiments

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.; Gentle, J.E.; Haerdle, W.; Mori, Y.

    2012-01-01

    By definition, computer simulation or Monte Carlo models are not solved by mathematical analysis (such as differential calculus), but are used for numerical experimentation. The goal of these experiments is to answer questions about the real world; i.e., the experimenters may use their models to

  11. Juan Carlos D'Olivo: A portrait

    Science.gov (United States)

    Aguilar-Arévalo, Alexis A.

    2013-06-01

    This report attempts to give a brief biographical sketch of the academic life of Juan Carlos D'Olivo, researcher and teacher at the Instituto de Ciencias Nucleares of UNAM, devoted to advancing the fields of High Energy Physics and Astroparticle Physics in Mexico and Latin America.

  12. Scalable Domain Decomposed Monte Carlo Particle Transport

    Energy Technology Data Exchange (ETDEWEB)

    O' Brien, Matthew Joseph [Univ. of California, Davis, CA (United States)

    2013-12-05

    In this dissertation, we present the parallel algorithms necessary to run domain decomposed Monte Carlo particle transport on large numbers of processors (millions of processors). Previous algorithms were not scalable, and the parallel overhead became more computationally costly than the numerical simulation.

  13. An analysis of Monte Carlo tree search

    CSIR Research Space (South Africa)

    James, S

    2017-02-01

    Full Text Available Monte Carlo Tree Search (MCTS) is a family of directed search algorithms that has gained widespread attention in recent years. Despite the vast amount of research into MCTS, the effect of modifications on the algorithm, as well as the manner...
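
    One representative member of the family is the UCT variant, whose selection step balances exploitation against exploration with the UCB1 score; a small self-contained sketch in Python (the dictionary-based node representation is purely illustrative):

        import math

        def uct_select(children, c=1.4):
            # Return the child maximizing the UCB1 score used in MCTS selection
            total = sum(ch["visits"] for ch in children)

            def score(ch):
                if ch["visits"] == 0:
                    return float("inf")    # always try unvisited children first
                exploit = ch["wins"] / ch["visits"]
                explore = c * math.sqrt(math.log(total) / ch["visits"])
                return exploit + explore

            return max(children, key=score)

        children = [{"wins": 6, "visits": 10},
                    {"wins": 2, "visits": 3},
                    {"wins": 0, "visits": 0}]
        print(uct_select(children))        # picks the unvisited child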

  14. Parallel processing Monte Carlo radiation transport codes

    International Nuclear Information System (INIS)

    McKinney, G.W.

    1994-01-01

    Issues related to distributed-memory multiprocessing as applied to Monte Carlo radiation transport are discussed. Measurements of communication overhead are presented for the radiation transport code MCNP which employs the communication software package PVM, and average efficiency curves are provided for a homogeneous virtual machine

  15. Monte Carlo studies of uranium calorimetry

    International Nuclear Information System (INIS)

    Brau, J.; Hargis, H.J.; Gabriel, T.A.; Bishop, B.L.

    1985-01-01

    Detailed Monte Carlo calculations of uranium calorimetry are presented which reveal a significant difference in the responses of liquid argon and plastic scintillator in uranium calorimeters. Due to saturation effects, neutrons from the uranium are found to contribute only weakly to the liquid argon signal. Electromagnetic sampling inefficiencies are significant and contribute substantially to compensation in both systems. 17 references

  16. Coded aperture optimization using Monte Carlo simulations

    International Nuclear Information System (INIS)

    Martineau, A.; Rocchisani, J.M.; Moretti, J.L.

    2010-01-01

    Coded apertures using Uniformly Redundant Arrays (URA) have been unsuccessfully evaluated for two-dimensional and three-dimensional imaging in Nuclear Medicine. The images reconstructed from coded projections contain artifacts and suffer from poor spatial resolution in the longitudinal direction. We introduce a Maximum-Likelihood Expectation-Maximization (MLEM) algorithm for three-dimensional coded aperture imaging which uses a projection matrix calculated by Monte Carlo simulations. The aim of the algorithm is to reduce artifacts and improve the three-dimensional spatial resolution in the reconstructed images. Firstly, we present the validation of GATE (Geant4 Application for Emission Tomography) for Monte Carlo simulations of a coded mask installed on a clinical gamma camera. The coded mask modelling was validated by comparison between experimental and simulated data in terms of energy spectra, sensitivity and spatial resolution. In the second part of the study, we use the validated model to calculate the projection matrix with Monte Carlo simulations. A three-dimensional thyroid phantom study was performed to compare the performance of the three-dimensional MLEM reconstruction with conventional correlation method. The results indicate that the artifacts are reduced and three-dimensional spatial resolution is improved with the Monte Carlo-based MLEM reconstruction.
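
    The MLEM update itself is compact; a sketch in Python with NumPy, where the projection matrix A stands in for the one the authors compute with Monte Carlo simulations (here it is random, purely to exercise the multiplicative update):

        import numpy as np

        rng = np.random.default_rng(0)
        n_pix, n_det = 64, 128
        A = rng.random((n_det, n_pix))       # system (projection) matrix
        x_true = rng.random(n_pix)
        y = rng.poisson(A @ x_true * 50)     # noisy coded projections

        x = np.ones(n_pix)                   # MLEM needs a positive start
        sens = A.sum(axis=0)                 # sensitivity image, A^T 1
        for _ in range(100):
            ratio = y / np.clip(A @ x, 1e-12, None)
            x *= (A.T @ ratio) / sens        # multiplicative MLEM update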

  17. Uncertainty analysis in Monte Carlo criticality computations

    International Nuclear Information System (INIS)

    Qi Ao

    2011-01-01

    Highlights: ► Two types of uncertainty methods for keff Monte Carlo computations are examined. ► The sampling method has the fewest restrictions on perturbations but demands more computing resources. ► The analytical method is limited to small perturbations of material properties. ► Practicality relies on efficiency, multiparameter applicability and data availability. - Abstract: Uncertainty analysis is imperative for nuclear criticality risk assessments when using Monte Carlo neutron transport methods to predict the effective neutron multiplication factor (keff) for fissionable material systems. For the validation of Monte Carlo codes for criticality computations against benchmark experiments, code accuracy and precision are measured by both the computational bias and uncertainty in the bias. The uncertainty in the bias accounts for known or quantified experimental, computational and model uncertainties. For the application of Monte Carlo codes for criticality analysis of fissionable material systems, an administrative margin of subcriticality must be imposed to provide additional assurance of subcriticality for any unknown or unquantified uncertainties. Because of a substantial impact of the administrative margin of subcriticality on economics and safety of nuclear fuel cycle operations, recently increasing interests in reducing the administrative margin of subcriticality make the uncertainty analysis in criticality safety computations more risk-significant. This paper provides an overview of the two most popular keff uncertainty analysis methods for Monte Carlo criticality computations: (1) sampling-based methods, and (2) analytical methods. Examples are given to demonstrate their usage in the keff uncertainty analysis due to uncertainties in both neutronic and non-neutronic parameters of fissionable material systems.

  18. Indian refining industry

    International Nuclear Information System (INIS)

    Singh, I.J.

    2002-01-01

    The author discusses the history of the Indian refining industry and ongoing developments under the headings: the present state; refinery configuration; Indian capabilities for refinery projects; and reforms in the refining industry. Tables list India's petroleum refineries, giving location and capacity; new refinery projects, together with location and capacity; and expansion projects of Indian petroleum refineries. The Indian refinery industry has undergone substantial expansion as well as technological changes over the past years. There has been progressive technology upgrading, energy efficiency, better environmental control and improved capacity utilisation. Major reform processes have been set in motion by the government of India: converting the refining industry from a centrally controlled public sector dominated industry to a delicensed regime in a competitive market economy with the introduction of a liberal exploration policy; dismantling the administered price mechanism; and a 25 year hydrocarbon vision. (UK)

  19. Indian Ocean margins

    Digital Repository Service at National Institute of Oceanography (India)

    Naqvi, S.W.A.

    The most important biogeochemical transformations and boundary exchanges in the Indian Ocean seem to occur in the northern region, where the processes originating at the land-ocean boundary extend far beyond the continental margins. Exchanges across...

  20. Tourism and Indian Exploitation

    Science.gov (United States)

    French, Lawrence

    1977-01-01

    A cursory review of Federal support to the Eastern Cherokees shows that the Cherokee Historical Association and not the Cherokee Indians are the recipients and beneficiaries of many Federal grants. (JC)

  1. minimum thresholds of monte carlo cycles for nigerian empirical

    African Journals Online (AJOL)

    2012-11-03

    Nov 3, 2012 ... Abstract. Monte Carlo simulation has proven to be an effective means of incorporating reliability analysis into the ... A Monte Carlo simulation cycle of 2,500 thresholds was enough to provide sufficient repeatability for ... parameters using the Monte Carlo method with the aid of MATrixLABoratory.

  2. Postglacial Indian Ocean

    Digital Repository Service at National Institute of Oceanography (India)

    Naidu, P.D.

    of El Niño-Southern Oscillation (ENSO) (Krishnamurthy and Goswami, 2000). The decrease in the Indian monsoon rainfall associated with the warm phases of ENSO is due to an anomalous regional Hadley circulation with descending motion over... down in recent decades (Kumar et al, 1999). A southeastward shift in the Walker circulation anomalies associated with ENSO events may lead to a reduced subsidence over the Indian region, thus favoring normal monsoon conditions. Additionally...

  3. Correlation between electrical bioimpedance indices and the APACHE II score in patients with septic shock

    Directory of Open Access Journals (Sweden)

    Manuel Díaz-De Los Santos

    2010-07-01

    Full Text Available Objective: To determine the correlation between various electrical bioimpedance indices (EBI and the APACHE II score (sAII in patients with septic shock. Material and methods: Thirty patients >14 years old with septic shock from the adult intensive care unit (ICU of the Hospital Nacional Cayetano Heredia, Peru, were included; for each, the sAII was calculated and the phase angle, impedance index and intracellular-to-extracellular fluid (LIC/LEC ratio were measured, and these were subsequently correlated using Pearson's coefficient and multiple linear regression. Results: 60% were male; the mean age was 60 ± 20.92 years, height 1.61 ± 0.06 m, weight 65.46 ± 8.7 kg, and time to ICU admission 8.4 ± 5.99 hours. Mechanical ventilation was required by 86.6%, and the most frequent infectious focus was respiratory (63.3%. The mean sAII was 18.83 ± 9.23, the ICU stay 8.4 ± 5.99 days, and the one-month case fatality 50%. A statistically significant negative correlation (r = -0.46; p = 0.01 was found only between the phase angle (PA and the sAII. Neither the impedance index nor the LIC/LEC ratio showed a significant correlation. The best predictor of mortality was the PA: all patients who died had a PA <6 degrees (mean 3.67 ± 0.63, p<0.05. Conclusions: Only the PA correlated with the sAII and was the index that best predicted mortality in patients with septic shock, outperforming the APACHE II score (sAII. (Rev Med Hered 2010;21:111-117.

  4. Multilevel sequential Monte-Carlo samplers

    KAUST Repository

    Jasra, Ajay

    2016-01-05

    Multilevel Monte-Carlo methods provide a powerful computational technique for reducing the computational cost of estimating expectations for a given computational effort. They are particularly relevant for computational problems when approximate distributions are determined via a resolution parameter h, with h=0 giving the theoretical exact distribution (e.g. SDEs or inverse problems with PDEs). The method provides a benefit by coupling samples from successive resolutions, and estimating differences of successive expectations. We develop a methodology that brings Sequential Monte-Carlo (SMC) algorithms within the framework of the Multilevel idea, as SMC provides a natural set-up for coupling samples over different resolutions. We prove that the new algorithm indeed preserves the benefits of the multilevel principle, even if samples at all resolutions are now correlated.
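
    A minimal sketch of the multilevel idea in Python (plain multilevel Monte Carlo rather than the SMC samplers of the paper; the geometric-Brownian-motion example and the sample sizes are arbitrary): successive resolutions are coupled by reusing the same Brownian increments, and the expectation is assembled from the telescoping sum of level differences.

        import numpy as np

        rng = np.random.default_rng(42)
        T, X0, MU, SIG = 1.0, 1.0, 0.05, 0.2

        def level_pair(l, n):
            # Coupled fine/coarse payoffs for n paths; level l uses 2**l steps
            nf, dt = 2 ** l, T / 2 ** l
            dW = rng.normal(0.0, np.sqrt(dt), size=(n, nf))
            xf = np.full(n, X0)
            for k in range(nf):                    # fine Euler path
                xf = xf + MU * xf * dt + SIG * xf * dW[:, k]
            if l == 0:
                return xf, np.zeros(n)
            xc = np.full(n, X0)
            dWc = dW[:, 0::2] + dW[:, 1::2]        # same Brownian path, coarser
            for k in range(nf // 2):               # coarse Euler path
                xc = xc + MU * xc * 2 * dt + SIG * xc * dWc[:, k]
            return xf, xc

        L, N = 5, [40_000, 20_000, 10_000, 5_000, 2_500, 1_250]
        est = sum(np.mean(np.subtract(*level_pair(l, N[l]))) for l in range(L + 1))
        print(f"MLMC estimate of E[X_T]: {est:.4f} (exact {X0 * np.exp(MU * T):.4f})")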

  5. Status of Monte Carlo at Los Alamos

    International Nuclear Information System (INIS)

    Thompson, W.L.; Cashwell, E.D.

    1980-01-01

    At Los Alamos the early work of Fermi, von Neumann, and Ulam has been developed and supplemented by many followers, notably Cashwell and Everett, and the main product today is the continuous-energy, general-purpose, generalized-geometry, time-dependent, coupled neutron-photon transport code called MCNP. The Los Alamos Monte Carlo research and development effort is concentrated in Group X-6. MCNP treats an arbitrary three-dimensional configuration of arbitrary materials in geometric cells bounded by first- and second-degree surfaces and some fourth-degree surfaces (elliptical tori). Monte Carlo has evolved into perhaps the main method for radiation transport calculations at Los Alamos. MCNP is used in every technical division at the Laboratory by over 130 users about 600 times a month accounting for nearly 200 hours of CDC-7600 time

  6. Monte Carlo simulation of gas Cerenkov detectors

    International Nuclear Information System (INIS)

    Mack, J.M.; Jain, M.; Jordan, T.M.

    1984-01-01

    Theoretical study of selected gamma-ray and electron diagnostics necessitates coupling Cerenkov radiation to electron/photon cascades. A Cerenkov production model and its incorporation into a general geometry Monte Carlo coupled electron/photon transport code is discussed. A special optical photon ray-trace is implemented using bulk optical properties assigned to each Monte Carlo zone. Good agreement exists between experimental and calculated Cerenkov data in the case of a carbon-dioxide gas Cerenkov detector experiment. Cerenkov production and threshold data are presented for a typical carbon-dioxide gas detector that converts a 16.7 MeV photon source to Cerenkov light, which is collected by optics and detected by a photomultiplier

  7. No-compromise reptation quantum Monte Carlo

    International Nuclear Information System (INIS)

    Yuen, W K; Farrar, Thomas J; Rothstein, Stuart M

    2007-01-01

    Since its publication, the reptation quantum Monte Carlo algorithm of Baroni and Moroni (1999 Phys. Rev. Lett. 82 4745) has been applied to several important problems in physics, but its mathematical foundations are not well understood. We show that their algorithm is not of typical Metropolis-Hastings type, and we specify conditions required for the generated Markov chain to be stationary and to converge to the intended distribution. The time-step bias may add up, and in many applications it is only the middle of a reptile that is the most important. Therefore, we propose an alternative, 'no-compromise reptation quantum Monte Carlo' to stabilize the middle of the reptile. (fast track communication)

  8. Multilevel Monte Carlo Approaches for Numerical Homogenization

    KAUST Repository

    Efendiev, Yalchin R.

    2015-10-01

    In this article, we study the application of multilevel Monte Carlo (MLMC) approaches to numerical random homogenization. Our objective is to compute the expectation of some functionals of the homogenized coefficients, or of the homogenized solutions. This is accomplished within MLMC by considering different sizes of representative volumes (RVEs). Many inexpensive computations with the smallest RVE size are combined with fewer expensive computations performed on larger RVEs. Likewise, when it comes to homogenized solutions, different levels of coarse-grid meshes are used to solve the homogenized equation. We show that, by carefully selecting the number of realizations at each level, we can achieve a speed-up in the computations in comparison to a standard Monte Carlo method. Numerical results are presented for both one-dimensional and two-dimensional test-cases that illustrate the efficiency of the approach.

  9. EU Commissioner Carlos Moedas visits SESAME

    CERN Multimedia

    CERN Bulletin

    2015-01-01

    The European Commissioner for research, science and innovation, Carlos Moedas, visited the SESAME laboratory in Jordan on Monday 13 April. When it begins operation in 2016, SESAME, a synchrotron light source, will be the Middle East’s first major international science centre, carrying out experiments ranging from the physical sciences to environmental science and archaeology.   CERN Director-General Rolf Heuer (left) and European Commissioner Carlos Moedas with the model SESAME magnet. © European Union, 2015.   Commissioner Moedas was accompanied by a European Commission delegation led by Robert-Jan Smits, Director-General of DG Research and Innovation, as well as Rolf Heuer, CERN Director-General, Jean-Pierre Koutchouk, coordinator of the CERN-EC Support for SESAME Magnets (CESSAMag) project and Princess Sumaya bint El Hassan of Jordan, a leading advocate of science in the region. They toured the SESAME facility together with SESAME Director, Khaled Tou...

  10. Status of Monte Carlo at Los Alamos

    International Nuclear Information System (INIS)

    Thompson, W.L.; Cashwell, E.D.; Godfrey, T.N.K.; Schrandt, R.G.; Deutsch, O.L.; Booth, T.E.

    1980-05-01

    Four papers were presented by Group X-6 on April 22, 1980, at the Oak Ridge Radiation Shielding Information Center (RSIC) Seminar-Workshop on Theory and Applications of Monte Carlo Methods. These papers are combined into one report for convenience and because they are related to each other. The first paper (by Thompson and Cashwell) is a general survey about X-6 and MCNP and is an introduction to the other three papers. It can also serve as a resume of X-6. The second paper (by Godfrey) explains some of the details of geometry specification in MCNP. The third paper (by Cashwell and Schrandt) illustrates calculating flux at a point with MCNP; in particular, the once-more-collided flux estimator is demonstrated. Finally, the fourth paper (by Thompson, Deutsch, and Booth) is a tutorial on some variance-reduction techniques. It should be required reading for a fledgling Monte Carlo practitioner.

  11. Monte Carlo Particle Transport: Algorithm and Performance Overview

    International Nuclear Information System (INIS)

    Gentile, N.; Procassini, R.; Scott, H.

    2005-01-01

    Monte Carlo methods are frequently used for neutron and radiation transport. These methods have several advantages, such as relative ease of programming and dealing with complex meshes. Disadvantages include long run times and statistical noise. Monte Carlo photon transport calculations also often suffer from inaccuracies in matter temperature due to the lack of implicitness. In this paper we discuss the Monte Carlo algorithm as it is applied to neutron and photon transport, detail the differences between neutron and photon Monte Carlo, and give an overview of the ways the numerical method has been modified to deal with issues that arise in photon Monte Carlo simulations

  12. Young Once, Indian Forever: Youth Gangs in Indian Country

    Science.gov (United States)

    Bell, James; Lim, Nicole

    2005-01-01

    Not unlike mainstream society of the United States, Indian Country faces new challenges regarding the values, mores, and behavior of its young people. Since their first encounters with European explorers, American Indians have fought to preserve their culture and traditions. Federal policies that addressed the "Indian problem" by…

  13. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

  14. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

  15. Introduction to the Monte Carlo methods

    International Nuclear Information System (INIS)

    Uzhinskij, V.V.

    1993-01-01

    Codes illustrating the use of Monte Carlo methods in high energy physics, such as the inverse transformation method, the rejection method, particle propagation through the nucleus, particle interaction with the nucleus, etc., are presented. A set of useful random number generator algorithms is given (the binomial distribution, the Poisson distribution, β-distribution, γ-distribution and normal distribution). 5 figs., 1 tab
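
    As an illustration of the first two sampling techniques mentioned, the following minimal Python sketch (not from the lecture codes themselves) shows the same ideas:

        import math
        import random

        def sample_exponential(lam):
            # Inverse transformation method: solve F(x) = u for u ~ U(0,1);
            # for F(x) = 1 - exp(-lam*x) this gives x = -ln(1-u)/lam.
            u = random.random()
            return -math.log(1.0 - u) / lam

        def sample_by_rejection(pdf, pdf_max, lo, hi):
            # Rejection method: propose x ~ U(lo, hi), accept with
            # probability pdf(x)/pdf_max, repeat until accepted.
            while True:
                x = lo + (hi - lo) * random.random()
                if random.random() * pdf_max <= pdf(x):
                    return x

        # example: truncated standard normal on [-4, 4] by rejection
        normal_pdf = lambda x: math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)
        print(sample_exponential(2.0),
              sample_by_rejection(normal_pdf, normal_pdf(0.0), -4.0, 4.0))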

  16. Monte Carlo modeling of eye iris color

    Science.gov (United States)

    Koblova, Ekaterina V.; Bashkatov, Alexey N.; Dolotov, Leonid E.; Sinichkin, Yuri P.; Kamenskikh, Tatyana G.; Genina, Elina A.; Tuchin, Valery V.

    2007-05-01

    Based on the presented two-layer eye iris model, the iris diffuse reflectance has been calculated by the Monte Carlo technique in the spectral range 400-800 nm. The diffuse reflectance spectra have been recalculated in the L*a*b* color coordinate system. The results demonstrate that the iris color coordinates (hue and chroma) can be used for estimation of melanin content in the range of small melanin concentrations, i.e., for estimation of melanin content in blue and green eyes.

  17. Handbook of Markov chain Monte Carlo

    CERN Document Server

    Brooks, Steve

    2011-01-01

    ""Handbook of Markov Chain Monte Carlo"" brings together the major advances that have occurred in recent years while incorporating enough introductory material for new users of MCMC. Along with thorough coverage of the theoretical foundations and algorithmic and computational methodology, this comprehensive handbook includes substantial realistic case studies from a variety of disciplines. These case studies demonstrate the application of MCMC methods and serve as a series of templates for the construction, implementation, and choice of MCMC methodology.

  18. Monte Carlo methods for shield design calculations

    International Nuclear Information System (INIS)

    Grimstone, M.J.

    1974-01-01

    A suite of Monte Carlo codes is being developed for use on a routine basis in commercial reactor shield design. The methods adopted for this purpose include the modular construction of codes, simplified geometries, automatic variance reduction techniques, continuous energy treatment of cross section data, and albedo methods for streaming. Descriptions are given of the implementation of these methods and of their use in practical calculations. 26 references. (U.S.)

  19. Replica Exchange for Reactive Monte Carlo Simulations

    Czech Academy of Sciences Publication Activity Database

    Turner, C.H.; Brennan, J.K.; Lísal, Martin

    2007-01-01

    Vol. 111, No. 43 (2007), pp. 15706-15715. ISSN 1932-7447. R&D Projects: GA ČR GA203/05/0725; GA AV ČR 1ET400720409; GA AV ČR 1ET400720507. Institutional research plan: CEZ:AV0Z40720504. Keywords: Monte Carlo; simulation; reactive system. Subject RIV: CF - Physical; Theoretical Chemistry

  20. Applications of Maxent to quantum Monte Carlo

    Energy Technology Data Exchange (ETDEWEB)

    Silver, R.N.; Sivia, D.S.; Gubernatis, J.E. (Los Alamos National Lab., NM (USA)); Jarrell, M. (Ohio State Univ., Columbus, OH (USA). Dept. of Physics)

    1990-01-01

    We consider the application of maximum entropy methods to the analysis of data produced by computer simulations. The focus is the calculation of the dynamical properties of quantum many-body systems by Monte Carlo methods, which is termed the "Analytical Continuation Problem." For the Anderson model of dilute magnetic impurities in metals, we obtain spectral functions and transport coefficients which obey "Kondo Universality." 24 refs., 7 figs.

  1. Monte Carlo methods for preference learning

    DEFF Research Database (Denmark)

    Viappiani, P.

    2012-01-01

    Utility elicitation is an important component of many applications, such as decision support systems and recommender systems. Such systems query the users about their preferences and give recommendations based on the system's belief about the utility function. Critical to these applications is the acquisition of a prior distribution over the utility parameters and the possibility of real-time Bayesian inference. In this paper we consider Monte Carlo methods for these problems.

  2. General purpose code for Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wilcke, W.W.

    1983-01-01

    A general-purpose computer program called MONTHY has been written to perform Monte Carlo simulations of physical systems. To achieve a high degree of flexibility, the code is organized like a general-purpose computer, operating on a vector describing the time-dependent state of the system under simulation. The instruction set of this 'computer' is defined by the user and is therefore adaptable to the particular problem studied. The organization of MONTHY allows iterative and conditional execution of operations.
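
    The organization described above can be imitated in a few lines. The sketch below is a hypothetical miniature, not MONTHY itself: a user-defined 'instruction set' of (condition, operation) pairs acts on a dictionary playing the role of the state vector, with the driver loop providing iterative and conditional execution.

        import random

        def decay_step(state):
            # hypothetical user operation: each of state["n"] nuclei decays
            # in this step with probability state["p"]
            state["n"] = sum(1 for _ in range(state["n"])
                             if random.random() > state["p"])

        def step_clock(state):
            state["t"] += 1

        # the user-defined "instruction set": (condition, operation) pairs
        program = [
            (lambda s: s["n"] > 0, decay_step),     # conditional execution
            (lambda s: True, step_clock),
        ]

        state = {"t": 0, "n": 1000, "p": 0.1}       # the state vector
        while state["n"] > 0 and state["t"] < 100:  # iterative execution
            for condition, operation in program:
                if condition(state):
                    operation(state)
        print(state)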

  3. The Lund Monte Carlo for jet fragmentation

    International Nuclear Information System (INIS)

    Sjoestrand, T.

    1982-03-01

    We present a Monte Carlo program based on the Lund model for jet fragmentation. Quark, gluon, diquark and hadron jets are considered. Special emphasis is put on the fragmentation of colour-singlet jet systems, for which energy, momentum and flavour are conserved explicitly. The model for decays of unstable particles, in particular the weak decay of heavy hadrons, is described. The central part of the paper is a detailed description of how to use the FORTRAN 77 program. (Author)

  4. Carlo Rosselli e il socialismo delle autonomie

    OpenAIRE

    Calabrò, Carmelo

    2008-01-01

    Carlo Rosselli's theoretical work can be traced back to the several minority currents (at least at the continental level) that, in the 1920s, aimed to move beyond the doctrinal framework of Marxist socialism. In both its reformist and its maximalist variants, classism, holism and collectivism were principles broadly common to the culture of Marxism; principles dichotomous with respect to liberalism and problematic with regard to democracy. Rosselli, against this tra...

  5. Autocorrelations in hybrid Monte Carlo simulations

    International Nuclear Information System (INIS)

    Schaefer, Stefan; Virotta, Francesco

    2010-11-01

    Simulations of QCD suffer from severe critical slowing down towards the continuum limit. This problem is known to be most prominent in the topological charge; however, all observables are affected to varying degrees by these slow modes in the Monte Carlo evolution. We investigate the slowing down in high-statistics simulations and propose a new error analysis method, which gives a realistic estimate of the contribution of the slow modes to the errors. (orig.)

  6. Topological zero modes in Monte Carlo simulations

    International Nuclear Information System (INIS)

    Dilger, H.

    1994-08-01

    We present an improvement of global Metropolis updating steps, the instanton hits, used in a hybrid Monte Carlo simulation of the two-flavor Schwinger model with staggered fermions. These hits are designed to change the topological sector of the gauge field. In order to match these hits to an unquenched simulation with pseudofermions, the approximate zero mode structure of the lattice Dirac operator has to be considered explicitly. (orig.)

  7. Monte Carlo simulation of Touschek effect

    Directory of Open Access Journals (Sweden)

    Aimin Xiao

    2010-07-01

    We present a Monte Carlo method implementation in the code elegant for simulating Touschek scattering effects in a linac beam. The local scattering rate and the distribution of scattered electrons can be obtained from the code, either for a Gaussian-distributed beam or for a general beam whose distribution function is given. In addition, scattered electrons can be tracked through the beam line, and the local beam-loss rate and beam-halo information recorded.

  8. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. Soumen Bag1 Gaurav Harit2. Department of Computer Science and Engineering, Indian Institute of Technology Kharagpur, Kharagpur 721 302, India; Information and Communication Technology, Indian Institute of Technology Rajasthan, Jodhpur 342 011, India ...

  9. 76 FR 49505 - Indian Gaming

    Science.gov (United States)

    2011-08-10

    ... activities on Indian lands. This amendment allows for the extension of the current Tribal-State Class III.... Dated: August 2, 2011. Donald E. Laverdure, Principal Deputy Assistant Secretary, Indian Affairs...

  10. Gallery | Indian Academy of Sciences

    Indian Academy of Sciences (India)

  11. Associateship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Specialization: AIMD Simulation, X-ray Science, Ultrafast Science, Surface Science, Molecular Beam Experiments Address: IPC Department, Indian Institute of Science, .... Specialization: Game Theory & Optimisation, Stochastic Control, Information Theory Address: Systems & Control Engineering, Indian Institute of ...

  12. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Sequential Bayesian technique: An alternative approach for software reliability estimation ... Software reliability; Bayesian sequential estimation; Kalman filter. ... Department of Mathematics, Indian Institute of Technology, Kharagpur 721 302; Reliability Engineering Centre, Indian Institute of Technology, Kharagpur 721 302 ...

  13. Gallery | Indian Academy of Sciences

    Indian Academy of Sciences (India)

  14. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Author Affiliations. A Salih1 S Ghosh Moulic2. Department of Aerospace Engineering, Indian Institute of Space Science and Technology, Thiruvananthapuram 695 022; Department of Mechanical Engineering, Indian Institute of Technology, Kharagpur 721 302 ...

  15. Biased Monte Carlo optimization: the basic approach

    International Nuclear Information System (INIS)

    Campioni, Luca; Scardovelli, Ruben; Vestrucci, Paolo

    2005-01-01

    It is well known that the Monte Carlo method is very successful in tackling several kinds of system simulations. It often happens that one has to deal with rare events, and the use of a variance reduction technique is then almost mandatory for an efficient Monte Carlo application. The main issue associated with variance reduction techniques is the choice of the value of the biasing parameter. In practice, this task is typically left to the experience of the Monte Carlo user, who has to make many attempts before achieving an advantageous biasing. A valuable result is provided: a methodology and a practical rule for establishing a priori guidance for the choice of the optimal value of the biasing parameter. This result, which has been obtained for a single-component system, has the notable property of being valid for any multicomponent system. In particular, in this paper the exponential and uniform biases of exponentially distributed phenomena are investigated thoroughly.
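
    The exponential biasing analyzed here is, in essence, importance sampling of an exponentially distributed time with a modified rate. A minimal Python sketch of the mechanics (the parameter values are illustrative, not the paper's rule):

        import math
        import random

        def biased_tail_probability(lam, t0, lam_b, n=100_000):
            # Estimate P(T > t0) for T ~ Exp(lam) by sampling from a biased
            # density Exp(lam_b) and correcting each score with the weight
            # w(t) = lam*exp(-lam*t) / (lam_b*exp(-lam_b*t)); choosing
            # lam_b < lam pushes the samples into the rare tail.
            total = 0.0
            for _ in range(n):
                t = -math.log(1.0 - random.random()) / lam_b
                if t > t0:
                    total += (lam / lam_b) * math.exp((lam_b - lam) * t)
            return total / n

        # rare event: P(T > 10) = exp(-10) ~ 4.5e-5 for lam = 1; an analog
        # estimator with n = 1e5 would see only a handful of scoring samples
        print(biased_tail_probability(1.0, 10.0, lam_b=0.1))

    The choice of lam_b trades off how many samples score against how variable the weights become, which is exactly the biasing-parameter question the abstract addresses.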

  16. Generalized hybrid Monte Carlo - CMFD methods for fission source convergence

    International Nuclear Information System (INIS)

    Wolters, Emily R.; Larsen, Edward W.; Martin, William R.

    2011-01-01

    In this paper, we generalize the recently published 'CMFD-Accelerated Monte Carlo' method and present two new methods that reduce the statistical error in CMFD-Accelerated Monte Carlo. The CMFD-Accelerated Monte Carlo method uses Monte Carlo to estimate nonlinear functionals used in low-order CMFD equations for the eigenfunction and eigenvalue. The Monte Carlo fission source is then modified to match the resulting CMFD fission source in a 'feedback' procedure. The two proposed methods differ from CMFD-Accelerated Monte Carlo in the definition of the required nonlinear functionals, but they have identical CMFD equations. The proposed methods are compared with CMFD-Accelerated Monte Carlo on a high dominance ratio test problem. All hybrid methods converge the Monte Carlo fission source almost immediately, leading to a large reduction in the number of inactive cycles required. The proposed methods stabilize the fission source more efficiently than CMFD-Accelerated Monte Carlo, leading to a reduction in the number of active cycles required. Finally, as in CMFD-Accelerated Monte Carlo, the apparent variance of the eigenfunction is approximately equal to the real variance, so the real error is well-estimated from a single calculation. This is an advantage over standard Monte Carlo, in which the real error can be underestimated due to inter-cycle correlation. (author)

  17. Monte Carlo methods and models in finance and insurance

    CERN Document Server

    Korn, Ralf; Kroisandt, Gerald

    2010-01-01

    Offering a unique balance between applications and calculations, Monte Carlo Methods and Models in Finance and Insurance incorporates the application background of finance and insurance with the theory and applications of Monte Carlo methods. It presents recent methods and algorithms, including the multilevel Monte Carlo method, the statistical Romberg method, and the Heath-Platen estimator, as well as recent financial and actuarial models, such as the Cheyette and dynamic mortality models. The authors separately discuss Monte Carlo techniques, stochastic process basics, and the theoretical background and intuition behind financial and actuarial mathematics, before bringing the topics together to apply the Monte Carlo methods to areas of finance and insurance. This allows for the easy identification of standard Monte Carlo tools and for a detailed focus on the main principles of financial and insurance mathematics. The book describes high-level Monte Carlo methods for standard simulation and the simulation of...

  19. About | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    About. Agenda—82nd Annual Meeting - Indian Academy of Sciences. The 82nd Annual Meeting of the Indian Academy of Sciences is being held at Bhopal, hosted by the Indian Institute of Science Education and Research, during 4th – 6th November 2016. The two and a half days' deliberation will see the ...

  20. 78 FR 78377 - Indian Gaming

    Science.gov (United States)

    2013-12-26

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal-State Class III Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between the Yankton Sioux Tribe and the State of South Dakota. DATES: Effective...

  1. 78 FR 54670 - Indian Gaming

    Science.gov (United States)

    2013-09-05

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal-State Class III Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between the Yankton Sioux Tribe and the State of South Dakota. DATES: Effective...

  2. 78 FR 17428 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the Pyramid Lake Paiute Tribe and the State of Nevada...

  3. 78 FR 62650 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of extension of Tribal-State Class III Gaming Compact. SUMMARY: This publishes notice of the extension of the Class III gaming compact between the Rosebud Sioux Tribe and the State of South Dakota. DATES: Effective...

  4. 78 FR 17427 - Indian Gaming

    Science.gov (United States)

    2013-03-21

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes approval of the agreement between the Northern Cheyenne Tribe and the State of Montana concerning Class III Gaming (Compact). DATES...

  5. 77 FR 41200 - Indian Gaming

    Science.gov (United States)

    2012-07-12

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes approval by the Department of an extension to the Class III Gaming Compact between the State of California and the Federated...

  6. 75 FR 68823 - Indian Gaming

    Science.gov (United States)

    2010-11-09

    ... Doc No: 2010-28267] DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Amendment. SUMMARY: This notice publishes approval of the Amendments to the Class III Gaming Compact (Amendment...

  7. 78 FR 54908 - Indian Gaming

    Science.gov (United States)

    2013-09-06

    ... Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of approved Tribal-State Class III Gaming Compact. SUMMARY: This notice publishes the approval of the Class III Tribal- State Gaming Compact between the Wiyot Tribe and the State of California. DATES: Effective...

  8. 76 FR 49505 - Indian Gaming

    Science.gov (United States)

    2011-08-10

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This publishes... taking effect. DATES: Effective Date: August 10, 2011. FOR FURTHER INFORMATION CONTACT: Paula L. Hart...

  9. 78 FR 62649 - Indian Gaming

    Science.gov (United States)

    2013-10-22

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs [DR.5B711.IA000813] Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: This notice publishes the Class III Gaming Compact between the North Fork Rancheria of Mono...

  10. 77 FR 76514 - Indian Gaming

    Science.gov (United States)

    2012-12-28

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Approved Tribal-State Class III Gaming Compact taking effect. SUMMARY: This... Regulation of Class III Gaming between the Confederated Tribes of the Grand Ronde Community of Oregon and the...

  11. 76 FR 11258 - Indian Gaming

    Science.gov (United States)

    2011-03-01

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact taking effect. SUMMARY: Notice is given that the Tribal-State Compact for Regulation of Class III Gaming between the Confederated Tribes of the...

  12. 77 FR 5566 - Indian Gaming

    Science.gov (United States)

    2012-02-03

    ... DEPARTMENT OF THE INTERIOR Bureau of Indian Affairs Indian Gaming AGENCY: Bureau of Indian Affairs, Interior. ACTION: Notice of Tribal-State Class III Gaming Compact Taking Effect. SUMMARY: This publishes... Effect. DATES: Effective Date: February 3, 2012. FOR FURTHER INFORMATION CONTACT: Paula L. Hart, Director...

  13. Leadership Challenges in Indian Country.

    Science.gov (United States)

    Horse, Perry

    2002-01-01

    American Indian leaders must meld the holistic and cyclical world view of Indian peoples with the linear, rational world view of mainstream society. Tribal leaders need to be statesmen and ethical politicians. Economic and educational development must be based on disciplined long-range planning and a strong, Indian-controlled educational base.…

  14. Mesospheric Temperatures over Apache Point Observatory (32°N, 105°W) Derived from Sloan Digital Sky Survey Spectra

    Directory of Open Access Journals (Sweden)

    Gawon Kim

    2017-06-01

    We retrieved rotational temperatures from emission lines of the OH airglow (8-3) band in the sky spectra of the Sloan Digital Sky Survey (SDSS) for the period 2000-2014, as part of the astronomical observation project conducted at the Apache Point Observatory (32°N, 105°W). The SDSS temperatures show a typical seasonal variation of mesospheric temperature: low in summer and high in winter. We find that the temperatures respond to solar activity by as much as 1.2 ± 0.8 K per 100 solar flux units, which is consistent with other studies in mid-latitude regions. After the seasonal variation and solar response were subtracted, the SDSS temperature is fairly constant over the 15-year period, unlike the cooling trends suggested by some studies. This temperature analysis using SDSS spectra is a unique contribution to the global monitoring of climate change because the SDSS project was established for astronomical purposes and is independent of climate studies. The SDSS temperatures are also compared with mesospheric temperatures measured by the microwave limb sounder (MLS) instrument on board the Aura satellite, and the differences are discussed.

  15. THE TENTH DATA RELEASE OF THE SLOAN DIGITAL SKY SURVEY: FIRST SPECTROSCOPIC DATA FROM THE SDSS-III APACHE POINT OBSERVATORY GALACTIC EVOLUTION EXPERIMENT

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Christopher P.; Anderton, Timothy [Department of Physics and Astronomy, University of Utah, Salt Lake City, UT 84112 (United States); Alexandroff, Rachael [Center for Astrophysical Sciences, Department of Physics and Astronomy, Johns Hopkins University, 3400 North Charles Street, Baltimore, MD 21218 (United States); Allende Prieto, Carlos [Instituto de Astrofísica de Canarias (IAC), C/Vía Láctea, s/n, E-38200, La Laguna, Tenerife (Spain); Anders, Friedrich [Leibniz-Institut für Astrophysik Potsdam (AIP), An der Sternwarte 16, D-14482 Potsdam (Germany); Anderson, Scott F.; Bhardwaj, Vaishali [Department of Astronomy, University of Washington, Box 351580, Seattle, WA 98195 (United States); Andrews, Brett H. [Department of Astronomy, Ohio State University, 140 West 18th Avenue, Columbus, OH 43210 (United States); Aubourg, Éric; Bautista, Julian E. [APC, University of Paris Diderot, CNRS/IN2P3, CEA/IRFU, Observatoire de Paris, Sorbonne Paris Cité, F-75205 Paris (France); Bailey, Stephen; Beutler, Florian [Lawrence Berkeley National Laboratory, One Cyclotron Road, Berkeley, CA 94720 (United States); Bastien, Fabienne A.; Berlind, Andreas A.; Bird, Jonathan C. [Department of Physics and Astronomy, Vanderbilt University, VU Station 1807, Nashville, TN 37235 (United States); Beers, Timothy C. [National Optical Astronomy Observatory, 950 North Cherry Avenue, Tucson, AZ 85719 (United States); Beifiori, Alessandra [Max-Planck-Institut für Extraterrestrische Physik, Giessenbachstraße, D-85748 Garching (Germany); Bender, Chad F. [Department of Astronomy and Astrophysics, 525 Davey Laboratory, The Pennsylvania State University, University Park, PA 16802 (United States); Bizyaev, Dmitry [Apache Point Observatory, P.O. Box 59, Sunspot, NM 88349 (United States); Blake, Cullen H. [Department of Physics and Astronomy, University of Pennsylvania, 219 S. 33rd St., Philadelphia, PA 19104 (United States); and others

    2014-04-01

    The Sloan Digital Sky Survey (SDSS) has been in operation since 2000 April. This paper presents the Tenth Public Data Release (DR10) from its current incarnation, SDSS-III. This data release includes the first spectroscopic data from the Apache Point Observatory Galaxy Evolution Experiment (APOGEE), along with spectroscopic data from the Baryon Oscillation Spectroscopic Survey (BOSS) taken through 2012 July. The APOGEE instrument is a near-infrared, R ∼ 22,500, 300-fiber spectrograph covering 1.514-1.696 μm. The APOGEE survey is studying the chemical abundances and radial velocities of roughly 100,000 red giant star candidates in the bulge, bar, disk, and halo of the Milky Way. DR10 includes 178,397 spectra of 57,454 stars, each typically observed three or more times, from APOGEE. Derived quantities from these spectra (radial velocities, effective temperatures, surface gravities, and metallicities) are also included. DR10 also roughly doubles the number of BOSS spectra over those included in the Ninth Data Release. DR10 includes a total of 1,507,954 BOSS spectra comprising 927,844 galaxy spectra, 182,009 quasar spectra, and 159,327 stellar spectra selected over 6373.2 deg².

  16. Evaluation of organ doses in brachytherapy treatment of uterus cancer using mathematical reference Indian adult phantom

    International Nuclear Information System (INIS)

    Biju, K.

    2012-01-01

    Quantifying the dose to healthy organs during radiotherapy is essential to estimate the radiation risk. Dose factors are generated by simulating radiation transport through an anthropomorphic mathematical phantom representing a reference Indian adult using the Monte Carlo method. The mean organ dose factors (in mGy min⁻¹ GBq⁻¹) are obtained for the microSelectron ¹⁹²Ir source and the BEBIG ⁶⁰Co source in the uterus of a reference Indian adult female phantom. The present study provides the factors for mean absorbed dose to organs applicable to the Indian female patient population undergoing brachytherapy treatment of uterus cancer. This study also includes a comparison of the organ dimensions in the phantom model with values measured in the various investigated patients. (author)

  17. Indian Ocean Traffic: Introduction

    Directory of Open Access Journals (Sweden)

    Lola Sharon Davidson

    2012-06-01

    Like the Mediterranean, the Indian Ocean has been a privileged site of cross-cultural contact since ancient times. In this special issue, our contributors track disparate movements of people and ideas around the Indian Ocean region and explore the cultural implications of these contacts and their role in processes that we would come to call transnationalization and globalisation. The nation is a relatively recent phenomenon anywhere on the globe, and in many countries around the Indian Ocean it was a product of colonisation and independence. So the processes of exchange, migration and cultural influence going on there for many centuries were mostly based on the economics of goods and trade routes, rather than on national identity and state policy.

  18. The Living Indian Critical Tradition

    Directory of Open Access Journals (Sweden)

    Vivek Kumar Dwivedi

    2010-11-01

    This paper attempts to establish the identity of something that is often considered to be missing – a living Indian critical tradition. I refer to the tradition that arises out of the work of those Indians who write in English. The chief architects of this tradition are Sri Aurobindo, C.D. Narasimhaiah, Gayatri Chakravorty Spivak and Homi K. Bhabha. It is possible to believe that Indian literary theories derive almost solely from ancient Sanskrit poetics. Or, alternatively, one can be concerned about the sad state of affairs regarding Indian literary theories or criticism in English. There have been scholars who have raised the question of the pathetic state of Indian scholarship in English and have even come up with some positive suggestions. But these scholars are those who are ignorant of the living Indian critical tradition. The significance of the Indian critical tradition lies in the fact that it provides the real focus to the Indian critical scene. Without an awareness of this tradition, Indian literary scholarship (which is quite a different thing from Indian literary criticism and theory, as it does not have the same impact as the latter two do) can easily fail to see who the real Indian literary critics and theorists are.

  19. 76 FR 35221 - Epidemiology Program for American Indian/Alaska Native Tribes and Urban Indian Communities...

    Science.gov (United States)

    2011-06-16

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Indian Health Service Epidemiology Program for American Indian/Alaska Native Tribes and Urban Indian Communities; Correction AGENCY: Indian Health Service, HHS... Epidemiology Centers serving American Indian/Alaska Native Tribes and urban Indian communities. The document...

  20. Investigating the impossible: Monte Carlo simulations

    International Nuclear Information System (INIS)

    Kramer, Gary H.; Crowley, Paul; Burns, Linda C.

    2000-01-01

    Designing and testing new equipment can be an expensive and time-consuming process, or the desired performance characteristics may preclude its construction due to technological shortcomings. Cost may also prevent equipment being purchased for other scenarios to be tested. An alternative is to use Monte Carlo simulations to make the investigations. This presentation exemplifies how Monte Carlo code calculations can be used to fill the gap. An example is given for the investigation of two sizes of germanium detector (70 mm and 80 mm diameter) at four different crystal thicknesses (15, 20, 25, and 30 mm), with predictions of how size affects the counting efficiency and the minimum detectable activity (MDA). The Monte Carlo simulations have shown that detector efficiencies can be adequately modelled using photon transport if the data are used to investigate trends. The investigation of the effect of detector thickness on the counting efficiency has shown that, for a fixed-diameter detector of either 70 mm or 80 mm, thickness is unimportant up to 60 keV. At higher photon energies, the counting efficiency begins to decrease as the thickness decreases, as expected. The simulations predict that the MDA of the 70 mm and 80 mm diameter detectors does not differ by more than a factor of 1.15 at 17 keV or 1.2 at 60 keV when comparing detectors of equivalent thicknesses. The MDA is slightly increased at 17 keV, and rises by about 52% at 660 keV, when the thickness is decreased from 30 mm to 15 mm. One could conclude from this information that the extra cost associated with the larger-area Ge detectors may not be justified for the slight improvement predicted in the MDA. (author)

  1. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy.

    Science.gov (United States)

    Li, Junhui; Li, Yingchuan; Sheng, Xiaohua; Wang, Feng; Cheng, Dongsheng; Jian, Guihua; Li, Yongguang; Feng, Liang; Wang, Niansong

    2018-03-29

    Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and the mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether their combination could perform better in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into survivor and non-survivor groups based on their status at discharge. Receiver operating characteristic (ROC) curves were used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099, with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, significantly higher than that of the control group with MPR ≤ 0.099 (39.6%, P = 0.012). Logistic regression analysis showed that the average number of failed organs (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and the APACHE II score may be helpful in predicting the short-term outcome of AKI. © 2018 The Author(s). Published by S. Karger AG, Basel.
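
    For readers unfamiliar with the machinery, the Python sketch below illustrates (on invented toy numbers, not the study's data) how an AUC is computed from a score via the rank-sum identity, and how a combined score built from two predictors is evaluated the same way; the 0.6/0.4 weights merely stand in for fitted logistic-regression coefficients.

        import numpy as np

        def auc(scores, labels):
            # Area under the ROC curve via the rank-sum identity:
            # AUC = P(score_pos > score_neg), ties counted as 1/2.
            scores = np.asarray(scores, float)
            labels = np.asarray(labels, int)
            pos, neg = scores[labels == 1], scores[labels == 0]
            diff = pos[:, None] - neg[None, :]
            return float(np.mean(diff > 0) + 0.5 * np.mean(diff == 0))

        # invented toy data for 8 patients; label 1 = in-hospital death
        mpr    = np.array([0.12, 0.08, 0.15, 0.09, 0.07, 0.11, 0.06, 0.14])
        apache = np.array([28.0, 17.0, 30.0, 22.0, 15.0, 25.0, 12.0, 27.0])
        death  = np.array([1,    0,    1,    1,    0,    1,    0,    0])

        print("AUC, MPR alone     :", auc(mpr, death))
        # illustrative combined score; in practice the weights would come
        # from the fitted logistic-regression coefficients
        combined = 0.6 * mpr / mpr.std() + 0.4 * apache / apache.std()
        print("AUC, combined score:", auc(combined, death))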

  2. Combination of Mean Platelet Volume/Platelet Count Ratio and the APACHE II Score Better Predicts the Short-Term Outcome in Patients with Acute Kidney Injury Receiving Continuous Renal Replacement Therapy

    Directory of Open Access Journals (Sweden)

    Junhui Li

    2018-03-01

    Background/Aims: Both the Acute Physiology and Chronic Health Evaluation (APACHE II) score and the mean platelet volume/platelet count ratio (MPR) can independently predict adverse outcomes in critically ill patients. This study aimed to investigate whether their combination could perform better in predicting the prognosis of patients with acute kidney injury (AKI) who received continuous renal replacement therapy (CRRT). Methods: Two hundred twenty-three patients with AKI who underwent CRRT between January 2009 and December 2014 in a Chinese university hospital were enrolled. They were divided into survivor and non-survivor groups based on their status at discharge. Receiver operating characteristic (ROC) curves were used for MPR and the APACHE II score, and to determine the optimal cut-off value of MPR for in-hospital mortality. Factors associated with mortality were identified by univariate and multivariate logistic regression analysis. Results: The mean age of the patients was 61.4 years, and the overall in-hospital mortality was 48.4%. Acute cardiorenal syndrome (ACRS) was the most common cause of AKI. The optimal cut-off value of MPR for mortality was 0.099, with an area under the ROC curve (AUC) of 0.636. The AUC increased to 0.851 with the addition of the APACHE II score. The mortality of patients with MPR > 0.099 was 56.4%, significantly higher than that of the control group with MPR ≤ 0.099 (39.6%, P = 0.012). Logistic regression analysis showed that the average number of failed organs (OR = 2.372), APACHE II score (OR = 1.187), age (OR = 1.028) and vasopressor administration (OR = 38.130) were significantly associated with poor prognosis. Conclusion: Severity of illness was significantly associated with the prognosis of patients with AKI. The combination of MPR and the APACHE II score may be helpful in predicting the short-term outcome of AKI.

  3. Survival and outcome prediction using the Apache III and the out-of-hospital cardiac arrest (OHCA) score in patients treated in the intensive care unit (ICU) following out-of-hospital, in-hospital or ICU cardiac arrest.

    Science.gov (United States)

    Skrifvars, M B; Varghese, B; Parr, M J

    2012-06-01

    There are few data comparing outcome, and the utility of severity-of-illness scoring systems, following intensive care after out-of-hospital (OHCA), in-hospital (IHCA) and intensive care unit (ICUCA) cardiac arrest. We investigated survival, factors associated with survival, and the correlation and accuracy of general and specific scoring systems, including the Apache III score and the OHCA score, in OHCA, IHCA and ICUCA patients. Prospective analysis of data on all cardiac arrest patients treated in a tertiary hospital between August 1st 2008 and July 30th 2010. Collected data included resuscitation and post-resuscitation care data as defined by the Utstein Guidelines, Apache III on admission, and the OHCA score on admission in OHCA and IHCA patients and after the arrest in ICUCA patients. Statistical methods were used to identify factors associated with outcome and the predictive ability and correlation of the aforementioned scores. Of a total of 3931 patients treated in the ICU, 51 were admitted following OHCA, 50 following IHCA and 22 suffered an ICUCA and had sustained return of spontaneous circulation (ROSC). Survival at 30 days was highest among ICUCAs (67%), followed by IHCAs (38%) and OHCAs (29%). Using multivariate analysis, delay to ROSC was the only independent predictor of survival. The OHCA score performed with moderate accuracy for predicting 30-day mortality (area under the curve 0.77 [0.69-0.86]) and was slightly better than the Apache III score (0.71 [0.61-0.80]). Using multiple logistic regression, the Apache III and the OHCA score were both independent predictors of hospital survival, and the correlation between these two scores was weak (correlation coefficient of 0.244). Latency to ROSC seems to be the most important determinant of survival in patients following ICU care after a cardiac arrest in this single-center trial. The OHCA score and the Apache III score offer moderate predictive accuracy in ICU cardiac arrest patients but correlated weakly with each other.

  4. Monte Carlo eigenfunction strategies and uncertainties

    International Nuclear Information System (INIS)

    Gast, R.C.; Candelore, N.R.

    1974-01-01

    Comparisons of convergence rates for several possible eigenfunction source strategies led to the selection of the 'straight' analog of the analytic power method as the source strategy for Monte Carlo eigenfunction calculations. To ensure a fair-game strategy, the number of histories per iteration increases with increasing iteration number. The estimate of eigenfunction uncertainty is obtained from a modification of a proposal by D. B. MacMillan and involves only estimates of the usual purely statistical component of uncertainty and a serial correlation coefficient of lag one. 14 references. (U.S.)
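
    A toy Python analog of this source strategy, assuming a small discrete transfer operator in place of the transport problem: the mean yield per history estimates the dominant eigenvalue, the next-generation source is resampled from the daughters, and the number of histories grows with iteration number as the fair-game device requires.

        import numpy as np

        def mc_power_method(H, n0=500, growth=1.1, iters=20, seed=1):
            # Toy analog of the "straight" power method: a particle
            # population is pushed through the operator H generation by
            # generation; the mean yield per history estimates the dominant
            # eigenvalue, and histories per iteration grow ("fair game").
            rng = np.random.default_rng(seed)
            nstates = H.shape[0]
            source = rng.integers(nstates, size=n0)   # flat initial source
            n, k = n0, None
            for _ in range(iters):
                yields = H[:, source].sum(axis=0)     # daughters per parent
                k = yields.mean()                     # eigenvalue estimate
                n = int(n * growth)                   # more histories later
                # parents chosen proportional to yield; each daughter's new
                # state drawn from the corresponding column of H
                parents = source[rng.choice(source.size, size=n,
                                            p=yields / yields.sum())]
                cols = H[:, parents] / H[:, parents].sum(axis=0)
                source = np.array([rng.choice(nstates, p=cols[:, i])
                                   for i in range(n)])
            return k

        H = np.array([[0.6, 0.3],
                      [0.5, 0.4]])                    # toy transfer operator
        print(mc_power_method(H), "vs exact",
              np.linalg.eigvals(H).real.max())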

  5. Atomistic Monte Carlo simulation of lipid membranes

    DEFF Research Database (Denmark)

    Wüstner, Daniel; Sklenar, Heinz

    2014-01-01

    Biological membranes are complex assemblies of many different molecules, whose analysis demands a variety of experimental and computational approaches. In this article, we explain challenges and advantages of atomistic Monte Carlo (MC) simulation of lipid membranes. We provide an introduction to the various move sets that are implemented in current MC methods for efficient conformational sampling of lipids and other molecules. In the second part, we demonstrate for a concrete example how an atomistic local-move set can be implemented for MC simulations of phospholipid monomers and bilayer patches...

  6. Monte Carlo method in radiation transport problems

    International Nuclear Information System (INIS)

    Dejonghe, G.; Nimal, J.C.; Vergnaud, T.

    1986-11-01

    In neutral-radiation transport problems (neutrons, photons), two quantities are important: the flux in phase space and the density of particles. Solving the problem with the Monte Carlo method involves, among other things, building a statistical process (called the play) and assigning a numerical value to a variable x (this assignment is called the score). Sampling techniques are presented. The necessity of biasing the play is proved, and a biased simulation is performed. Finally, current developments (rewriting of programs, for instance) are presented, motivated by several factors, two of them being the advent of vector computing and the transport of photons and neutrons through void media. [fr]

  7. MBR Monte Carlo Simulation in PYTHIA8

    Science.gov (United States)

    Ciesielski, R.

    We present the MBR (Minimum Bias Rockefeller) Monte Carlo simulation of (anti)proton-proton interactions and its implementation in the PYTHIA8 event generator. We discuss the total, elastic, and total-inelastic cross sections, and three contributions from diffraction dissociation processes that contribute to the latter: single diffraction, double diffraction, and central diffraction or double-Pomeron exchange. The event generation follows a renormalized-Regge-theory model, successfully tested using CDF data. Based on the MBR-enhanced PYTHIA8 simulation, we present cross-section predictions for the LHC and beyond, up to collision energies of 50 TeV.

  8. Markov chains analytic and Monte Carlo computations

    CERN Document Server

    Graham, Carl

    2014-01-01

    Markov Chains: Analytic and Monte Carlo Computations introduces the main notions related to Markov chains and provides explanations on how to characterize, simulate, and recognize them. Starting with basic notions, this book leads progressively to advanced and recent topics in the field, allowing the reader to master the main aspects of the classical theory. This book also features: numerous exercises with solutions as well as extended case studies; a detailed and rigorous presentation of Markov chains with discrete time and state space; an appendix presenting probabilistic notions that are nec...

  9. Score Bounded Monte-Carlo Tree Search

    Science.gov (United States)

    Cazenave, Tristan; Saffidine, Abdallah

    Monte-Carlo Tree Search (MCTS) is a successful algorithm used in many state-of-the-art game engines. We propose to improve an MCTS solver when a game has more than two outcomes, as is the case, for example, in games that can end in draw positions. In this case, taking into account bounds on the possible scores of a node when selecting the nodes to explore significantly improves an MCTS solver. We apply our algorithm to solving Seki in the game of Go and to Connect Four.

  10. IN MEMORIAM CARLOS RESTREPO. UN VERDADERO MAESTRO

    OpenAIRE

    Pelayo Correa

    2009-01-01

    Carlos Restrepo was the first professor of Pathology and an illustrious member of the group of pioneers who founded the Faculty of Medicine of the Universidad del Valle. These pioneers converged on Cali in the 1950s, possessed of a renewing and creative spirit that undertook, with great success, the task of changing the academic culture of the Valle del Cauca. They found a peaceful society, enjoying the generosity of its surroundings, with no desire to break with centuries-old traditions ...

  11. Monte Carlo study of the multiquark systems

    International Nuclear Information System (INIS)

    Kerbikov, B.O.; Polikarpov, M.I.; Zamolodchikov, A.B.

    1986-01-01

    Random walks have been used to calculate the energies of the ground states in systems of N = 3, 6, 9, 12 quarks. Multiquark states with N > 3 are unstable with respect to spontaneous dissociation into color-singlet hadrons. A modified Green's function Monte Carlo algorithm, which proved to be simpler and more accurate than conventional few-body methods, has been employed. In contrast to other techniques, the same equations are used for any number of particles, while the computer time increases only linearly with the number of particles.

  12. by means of FLUKA Monte Carlo method

    Directory of Open Access Journals (Sweden)

    Ermis Elif Ebru

    2015-01-01

    Calculations of gamma-ray mass attenuation coefficients of various detector materials (crystals) were carried out by means of the FLUKA Monte Carlo (MC) method at different gamma-ray energies. NaI, PVT, GSO, GaAs and CdWO4 detector materials were chosen for the calculations. The calculated coefficients were also compared with the National Institute of Standards and Technology (NIST) values. The results obtained with this method were highly in accordance with the NIST values. It was concluded from the study that the FLUKA MC method can be an alternative way to calculate gamma-ray mass attenuation coefficients of detector materials.

  13. Pseudo-extended Markov chain Monte Carlo

    OpenAIRE

    Nemeth, Christopher; Lindsten, Fredrik; Filippone, Maurizio; Hensman, James

    2017-01-01

    Sampling from the posterior distribution using Markov chain Monte Carlo (MCMC) methods can require an exhaustive number of iterations to fully explore the correct posterior. This is often the case when the posterior of interest is multi-modal, as the MCMC sampler can become trapped in a local mode for a large number of iterations. In this paper, we introduce the pseudo-extended MCMC method as an approach for improving the mixing of the MCMC sampler in complex posterior distributions. The pseu...

  14. Diffusion quantum Monte Carlo for molecules

    International Nuclear Information System (INIS)

    Lester, W.A. Jr.

    1986-07-01

    A quantum mechanical Monte Carlo method has been used for the treatment of molecular problems. The imaginary-time Schroedinger equation written with a shift in zero energy [E_T - V(R)] can be interpreted as a generalized diffusion equation with a position-dependent rate or branching term. Since diffusion is the continuum limit of a random walk, one may simulate the Schroedinger equation with a function psi (note, not psi²) as a density of 'walks.' The walks undergo an exponential birth and death as given by the rate term. 16 refs., 2 tabs
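
    A minimal Python sketch of this walker picture for the 1-D harmonic oscillator (a standard textbook example, not the paper's molecular application): walkers diffuse with a Gaussian step and branch according to the rate term, and the trial energy that holds the population steady converges to the ground-state energy.

        import numpy as np

        def dmc_ground_state(n_target=2000, dt=0.01, steps=1500, seed=2):
            # Walkers distributed with density psi diffuse (kinetic term)
            # and branch with multiplicity ~ exp(-(V - E_T)*dt) (rate term);
            # E_T is steered to hold the population near n_target and
            # converges to the ground-state energy (0.5 for V = x**2/2).
            rng = np.random.default_rng(seed)
            x = rng.standard_normal(n_target)
            e_t = 0.5
            for _ in range(steps):
                x = x + np.sqrt(dt) * rng.standard_normal(x.size)  # diffusion
                v = 0.5 * x**2
                mult = np.floor(np.exp(-(v - e_t) * dt)
                                + rng.random(x.size)).astype(int)
                x = np.repeat(x, mult)           # stochastic birth and death
                assert x.size > 0, "population died out; retune dt/feedback"
                e_t = v.mean() + 0.1 * np.log(n_target / x.size)
            return e_t

        print(dmc_ground_state())   # approaches 0.5 (exact ground state)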

  15. Discrete diffusion Monte Carlo for frequency-dependent radiative transfer

    Energy Technology Data Exchange (ETDEWEB)

    Densmore, Jeffrey D [Los Alamos National Laboratory]; Thompson, Kelly G [Los Alamos National Laboratory]; Urbatsch, Todd J [Los Alamos National Laboratory]

    2010-11-17

    Discrete Diffusion Monte Carlo (DDMC) is a technique for increasing the efficiency of Implicit Monte Carlo radiative-transfer simulations. In this paper, we develop an extension of DDMC for frequency-dependent radiative transfer. We base our new DDMC method on a frequency-integrated diffusion equation for frequencies below a specified threshold. Above this threshold we employ standard Monte Carlo. With a frequency-dependent test problem, we confirm the increased efficiency of our new DDMC technique.

  16. Monte Carlo criticality analysis for dissolvers with neutron poison

    International Nuclear Information System (INIS)

    Yu, Deshun; Dong, Xiufang; Pu, Fuxiang.

    1987-01-01

    A criticality analysis for dissolvers with neutron poison is given on the basis of the Monte Carlo method. In the Monte Carlo calculation of thermal neutron group parameters for fuel pieces, the neutron transport length is determined using the maximum cross-section approach. A set of related effective multiplication factors (K_eff) is calculated by the Monte Carlo method for the three cases. The numerical results are quite useful for the design and operation of this kind of dissolver in criticality safety analysis. (author)
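
    The 'maximum cross-section approach' reads like majorant-based (Woodcock delta-tracking) sampling of the transport length; under that assumption, a minimal 1-D Python sketch:

        import math
        import random

        def distance_to_real_collision(sigma_t, sigma_max, x, direction):
            # Fly with the majorant cross-section sigma_max >= sigma_t(x)
            # everywhere; accept each tentative collision with probability
            # sigma_t(x)/sigma_max, otherwise treat it as a virtual
            # collision and keep flying from the tentative site.
            while True:
                x += direction * (-math.log(1.0 - random.random()) / sigma_max)
                if random.random() < sigma_t(x) / sigma_max:
                    return x

        # toy 1-D geometry: a poisoned lump in [2, 3), dilute solution elsewhere
        sigma_t = lambda x: 2.5 if 2.0 <= x < 3.0 else 0.4
        print(distance_to_real_collision(sigma_t, 2.5, 0.0, 1.0))

    The appeal of this scheme for an irregular dissolver geometry is that the sampled flight length never needs surface-to-surface distance calculations through the heterogeneous material.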

  17. Monte Carlo Based Framework to Support HAZOP Study

    DEFF Research Database (Denmark)

    Danko, Matej; Frutiger, Jerome; Jelemenský, Ľudovít

    2017-01-01

    This study combines Monte Carlo based process simulation features with classical hazard identification techniques to investigate the consequences of deviations from normal operating conditions and to examine process safety. A Monte Carlo based method has been used to sample and evaluate different deviations in process parameters simultaneously, thereby improving on the Hazard and Operability (HAZOP) study, which normally considers only one deviation in process parameters at a time. Furthermore, Monte Carlo filtering was then used to identify operability and hazard issues including...
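
    A minimal Python sketch of the simultaneous-deviation sampling plus Monte Carlo filtering idea, using an invented surrogate model and hypothetical parameter names rather than the study's process simulation:

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000

        # simultaneous deviations of several process parameters
        temp  = rng.normal(350.0, 10.0, n)   # K, reactor temperature
        press = rng.normal(5.0, 0.5, n)      # bar
        flow  = rng.normal(1.0, 0.15, n)     # kg/s feed

        # hypothetical surrogate consequence model (stand-in for the
        # process simulator): a runaway index rising with T and P
        index = (0.02 * (temp - 340.0) + 0.8 * (press - 4.0)
                 - 1.5 * (flow - 1.0))
        hazard = index > 1.0                 # consequence threshold
        print(f"hazardous fraction: {hazard.mean():.3f}")

        # Monte Carlo filtering: compare each input's distribution in the
        # hazardous vs. acceptable subset; a large shift flags a driver
        for name, x in [("temp", temp), ("press", press), ("flow", flow)]:
            shift = (x[hazard].mean() - x[~hazard].mean()) / x.std()
            print(f"{name:5s} standardized mean shift: {shift:+.2f}")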

  18. Performance of the Apache Point Observatory Galactic Evolution Experiment (APOGEE) high-resolution near-infrared multi-object fiber spectrograph

    Science.gov (United States)

    Wilson, John C.; Hearty, F.; Skrutskie, M. F.; Majewski, S. R.; Schiavon, R.; Eisenstein, D.; Gunn, J.; Holtzman, J.; Nidever, D.; Gillespie, B.; Weinberg, D.; Blank, B.; Henderson, C.; Smee, S.; Barkhouser, R.; Harding, A.; Hope, S.; Fitzgerald, G.; Stolberg, T.; Arns, J.; Nelson, M.; Brunner, S.; Burton, A.; Walker, E.; Lam, C.; Maseman, P.; Barr, J.; Leger, F.; Carey, L.; MacDonald, N.; Ebelke, G.; Beland, S.; Horne, T.; Young, E.; Rieke, G.; Rieke, M.; O'Brien, T.; Crane, J.; Carr, M.; Harrison, C.; Stoll, R.; Vernieri, M.; Shetrone, M.; Allende-Prieto, C.; Johnson, J.; Frinchaboy, P.; Zasowski, G.; Garcia Perez, A.; Bizyaev, D.; Cunha, K.; Smith, V. V.; Meszaros, Sz.; Zhao, B.; Hayden, M.; Chojnowski, S. D.; Andrews, B.; Loomis, C.; Owen, R.; Klaene, M.; Brinkmann, J.; Stauffer, F.; Long, D.; Jordan, W.; Holder, D.; Cope, F.; Naugle, T.; Pfaffenberger, B.; Schlegel, D.; Blanton, M.; Muna, D.; Weaver, B.; Snedden, S.; Pan, K.; Brewington, H.; Malanushenko, E.; Malanushenko, V.; Simmons, A.; Oravetz, D.; Mahadevan, S.; Halverson, S.

    2012-09-01

    The Apache Point Observatory Galactic Evolution Experiment (APOGEE) uses a dedicated 300-fiber, narrow-band near-infrared (1.51-1.7 μm), high resolution (R~22,500) spectrograph to survey approximately 100,000 giant stars across the Milky Way. This three-year survey, in operation since late-summer 2011 as part of the Sloan Digital Sky Survey III (SDSS III), will revolutionize our understanding of the kinematical and chemical enrichment histories of all Galactic stellar populations. We present the performance of the instrument from its first year in operation. The instrument is housed in a separate building adjacent to the 2.5-m SDSS telescope and fed light via approximately 45-meter fiber runs from the telescope. The instrument design includes numerous innovations including a gang connector that allows simultaneous connection of all fibers with a single plug to a telescope cartridge that positions the fibers on the sky, numerous places in the fiber train in which focal ratio degradation had to be minimized, a large mosaic-VPH (290 mm x 475 mm elliptically-shaped recorded area), an f/1.4 six-element refractive camera featuring silicon and fused silica elements with diameters as large as 393 mm, three near-infrared detectors mounted in a 1 x 3 mosaic with sub-pixel translation capability, and all of these components housed within a custom, LN2-cooled, stainless steel vacuum cryostat with dimensions 1.4-m x 2.3-m x 1.3-m.

  19. The international INTRAVAL project. Phase 2, working group 1 report. Flow and tracer experiments in unsaturated tuff and soil. Las Cruces trench and Apache Leap tuff studies

    International Nuclear Information System (INIS)

    Nicholson, T.J.; Guzman-Guzman, A.; Hills, R.; Rasmussen, T.C.

    1997-01-01

    The Working Group 1 final report summarizes two test case studies: the Las Cruces Trench (LCT) and Apache Leap Tuff Site (ALTS) experiments. The objectives of these two field studies were to evaluate models for water flow and contaminant transport in unsaturated, heterogeneous soils and fractured tuff. The LCT experiments were specifically designed to test various deterministic and stochastic models of water flow and solute transport in heterogeneous, unsaturated soils. Experimental data from the first two LCT experiments, and detailed field characterisation studies, provided information for developing and calibrating the models. Experimental results from the third experiment were withheld from the modellers and used for model comparison. Comparative analyses included: point comparisons of water content; predicted mean behavior for water flow; point comparisons of solute concentrations; and predicted mean behavior for tritium transport. These analyses indicated that no model, whether uniform or heterogeneous, proved superior. Since the INTRAVAL study, however, a new method has been developed for conditioning the hydraulic properties used for flow and transport modelling, based on the initial field-measured water content distributions and a set of scale-mean hydraulic parameters. Very good matches between the observed and simulated flow and transport behavior were obtained using the conditioning procedure, without model calibration. The ALTS experiments were designed to evaluate characterisation methods and their associated conceptual models for coupled matrix-fracture continua over a range of scales (i.e., 2.5 centimeter rock samples, 10 centimeter cores, a 1 meter block, and 30 meter boreholes). Within these spatial scales, laboratory and field tests were conducted for estimating pneumatic, thermal, hydraulic, and transport property values for different conceptual models. The analyses included testing of current conceptual, mathematical and physical...

  20. The Apache Longbow-Hellfire Missile Test at Yuma Proving Ground: Ecological Risk Assessment for Tracked Vehicle Movement across Desert Pavement

    International Nuclear Information System (INIS)

    Peterson, Mark J; Efroymson, Rebecca Ann; Hargrove, William Walter

    2008-01-01

    A multiple stressor risk assessment was conducted at Yuma Proving Ground, Arizona, as a demonstration of the Military Ecological Risk Assessment Framework. The focus was a testing program at Cibola Range, which involved an Apache Longbow helicopter firing Hellfire missiles at moving targets, M60-A1 tanks. This paper describes the ecological risk assessment for the tracked vehicle movement component of the testing program. The principal stressor associated with tracked vehicle movement was soil disturbance, and a resulting, secondary stressor was hydrological change. Water loss to washes and wash vegetation was expected to result from increased infiltration and/or evaporation associated with disturbances to desert pavement. The simulated exposure of wash vegetation to water loss was quantified using estimates of exposed land area from a digital ortho quarter quad aerial photo and field observations, a 30 x 30 m digital elevation model, the flow accumulation feature of ESRI ArcInfo, and a two-step process in which runoff was estimated from direct precipitation to a land area and from water that flowed from upgradient to a land area. In all simulated scenarios, absolute water loss decreased with distance from the disturbance, downgradient in the washes; however, percentage water loss was greatest in land areas immediately downgradient of a disturbance. Potential effects on growth and survival of wash trees were quantified by using an empirical relationship derived from a local unpublished study of water infiltration rates. The risk characterization concluded that neither risk to wash vegetation growth or survival nor risk to mule deer abundance and reproduction was expected. The risk characterization was negative for both the incremental risk of the test program and the combination of the test and pretest disturbances

  1. The 13th Data Release of the Sloan Digital Sky Survey: First Spectroscopic Data from the SDSS-IV Survey Mapping Nearby Galaxies at Apache Point Observatory

    Science.gov (United States)

    Albareti, Franco D.; Allende Prieto, Carlos; Almeida, Andres; Anders, Friedrich; Anderson, Scott; Andrews, Brett H.; Aragón-Salamanca, Alfonso; Argudo-Fernández, Maria; Armengaud, Eric; Aubourg, Eric; Avila-Reese, Vladimir; Badenes, Carles; Bailey, Stephen; Barbuy, Beatriz; Barger, Kat; Barrera-Ballesteros, Jorge; Bartosz, Curtis; Basu, Sarbani; Bates, Dominic; Battaglia, Giuseppina; Baumgarten, Falk; Baur, Julien; Bautista, Julian; Beers, Timothy C.; Belfiore, Francesco; Bershady, Matthew; Bertran de Lis, Sara; Bird, Jonathan C.; Bizyaev, Dmitry; Blanc, Guillermo A.; Blanton, Michael; Blomqvist, Michael; Bolton, Adam S.; Borissova, J.; Bovy, Jo; Nielsen Brandt, William; Brinkmann, Jonathan; Brownstein, Joel R.; Bundy, Kevin; Burtin, Etienne; Busca, Nicolás G.; Camacho Chavez, Hugo Orlando; Cano Díaz, M.; Cappellari, Michele; Carrera, Ricardo; Chen, Yanping; Cherinka, Brian; Cheung, Edmond; Chiappini, Cristina; Chojnowski, Drew; Chuang, Chia-Hsun; Chung, Haeun; Cirolini, Rafael Fernando; Clerc, Nicolas; Cohen, Roger E.; Comerford, Julia M.; Comparat, Johan; Correa do Nascimento, Janaina; Cousinou, Marie-Claude; Covey, Kevin; Crane, Jeffrey D.; Croft, Rupert; Cunha, Katia; Darling, Jeremy; Davidson, James W., Jr.; Dawson, Kyle; Da Costa, Luiz; Da Silva Ilha, Gabriele; Deconto Machado, Alice; Delubac, Timothée; De Lee, Nathan; De la Macorra, Axel; De la Torre, Sylvain; Diamond-Stanic, Aleksandar M.; Donor, John; Downes, Juan Jose; Drory, Niv; Du, Cheng; Du Mas des Bourboux, Hélion; Dwelly, Tom; Ebelke, Garrett; Eigenbrot, Arthur; Eisenstein, Daniel J.; Elsworth, Yvonne P.; Emsellem, Eric; Eracleous, Michael; Escoffier, Stephanie; Evans, Michael L.; Falcón-Barroso, Jesús; Fan, Xiaohui; Favole, Ginevra; Fernandez-Alvar, Emma; Fernandez-Trincado, J. G.; Feuillet, Diane; Fleming, Scott W.; Font-Ribera, Andreu; Freischlad, Gordon; Frinchaboy, Peter; Fu, Hai; Gao, Yang; Garcia, Rafael A.; Garcia-Dias, R.; Garcia-Hernández, D. A.; Garcia Pérez, Ana E.; Gaulme, Patrick; Ge, Junqiang; Geisler, Douglas; Gillespie, Bruce; Gil Marin, Hector; Girardi, Léo; Goddard, Daniel; Gomez Maqueo Chew, Yilen; Gonzalez-Perez, Violeta; Grabowski, Kathleen; Green, Paul; Grier, Catherine J.; Grier, Thomas; Guo, Hong; Guy, Julien; Hagen, Alex; Hall, Matt; Harding, Paul; Harley, R. E.; Hasselquist, Sten; Hawley, Suzanne; Hayes, Christian R.; Hearty, Fred; Hekker, Saskia; Hernandez Toledo, Hector; Ho, Shirley; Hogg, David W.; Holley-Bockelmann, Kelly; Holtzman, Jon A.; Holzer, Parker H.; Hu, Jian; Huber, Daniel; Hutchinson, Timothy Alan; Hwang, Ho Seong; Ibarra-Medel, Héctor J.; Ivans, Inese I.; Ivory, KeShawn; Jaehnig, Kurt; Jensen, Trey W.; Johnson, Jennifer A.; Jones, Amy; Jullo, Eric; Kallinger, T.; Kinemuchi, Karen; Kirkby, David; Klaene, Mark; Kneib, Jean-Paul; Kollmeier, Juna A.; Lacerna, Ivan; Lane, Richard R.; Lang, Dustin; Laurent, Pierre; Law, David R.; Leauthaud, Alexie; Le Goff, Jean-Marc; Li, Chen; Li, Cheng; Li, Niu; Li, Ran; Liang, Fu-Heng; Liang, Yu; Lima, Marcos; Lin, Lihwai; Lin, Lin; Lin, Yen-Ting; Liu, Chao; Long, Dan; Lucatello, Sara; MacDonald, Nicholas; MacLeod, Chelsea L.; Mackereth, J. 
Ted; Mahadevan, Suvrath; Geimba Maia, Marcio Antonio; Maiolino, Roberto; Majewski, Steven R.; Malanushenko, Olena; Malanushenko, Viktor; Dullius Mallmann, Nícolas; Manchado, Arturo; Maraston, Claudia; Marques-Chaves, Rui; Martinez Valpuesta, Inma; Masters, Karen L.; Mathur, Savita; McGreer, Ian D.; Merloni, Andrea; Merrifield, Michael R.; Meszáros, Szabolcs; Meza, Andres; Miglio, Andrea; Minchev, Ivan; Molaverdikhani, Karan; Montero-Dorta, Antonio D.; Mosser, Benoit; Muna, Demitri; Myers, Adam; Nair, Preethi; Nandra, Kirpal; Ness, Melissa; Newman, Jeffrey A.; Nichol, Robert C.; Nidever, David L.; Nitschelm, Christian; O’Connell, Julia; Oravetz, Audrey; Oravetz, Daniel J.; Pace, Zachary; Padilla, Nelson; Palanque-Delabrouille, Nathalie; Pan, Kaike; Parejko, John; Paris, Isabelle; Park, Changbom; Peacock, John A.; Peirani, Sebastien; Pellejero-Ibanez, Marcos; Penny, Samantha; Percival, Will J.; Percival, Jeffrey W.; Perez-Fournon, Ismael; Petitjean, Patrick; Pieri, Matthew; Pinsonneault, Marc H.; Pisani, Alice; Prada, Francisco; Prakash, Abhishek; Price-Jones, Natalie; Raddick, M. Jordan; Rahman, Mubdi; Raichoor, Anand; Barboza Rembold, Sandro; Reyna, A. M.; Rich, James; Richstein, Hannah; Ridl, Jethro; Riffel, Rogemar A.; Riffel, Rogério; Rix, Hans-Walter; Robin, Annie C.; Rockosi, Constance M.; Rodríguez-Torres, Sergio; Rodrigues, Thaíse S.; Roe, Natalie; Roman Lopes, A.; Román-Zúñiga, Carlos; Ross, Ashley J.; Rossi, Graziano; Ruan, John; Ruggeri, Rossana; Runnoe, Jessie C.; Salazar-Albornoz, Salvador; Salvato, Mara; Sanchez, Sebastian F.; Sanchez, Ariel G.; Sanchez-Gallego, José R.; Santiago, Basílio Xavier; Schiavon, Ricardo; Schimoia, Jaderson S.; Schlafly, Eddie; Schlegel, David J.; Schneider, Donald P.; Schönrich, Ralph; Schultheis, Mathias; Schwope, Axel; Seo, Hee-Jong; Serenelli, Aldo; Sesar, Branimir; Shao, Zhengyi; Shetrone, Matthew; Shull, Michael; Silva Aguirre, Victor; Skrutskie, M. F.; Slosar, Anže; Smith, Michael; Smith, Verne V.; Sobeck, Jennifer; Somers, Garrett; Souto, Diogo; Stark, David V.; Stassun, Keivan G.; Steinmetz, Matthias; Stello, Dennis; Storchi Bergmann, Thaisa; Strauss, Michael A.; Streblyanska, Alina; Stringfellow, Guy S.; Suarez, Genaro; Sun, Jing; Taghizadeh-Popp, Manuchehr; Tang, Baitian; Tao, Charling; Tayar, Jamie; Tembe, Mita; Thomas, Daniel; Tinker, Jeremy; Tojeiro, Rita; Tremonti, Christy; Troup, Nicholas; Trump, Jonathan R.; Unda-Sanzana, Eduardo; Valenzuela, O.; Van den Bosch, Remco; Vargas-Magaña, Mariana; Vazquez, Jose Alberto; Villanova, Sandro; Vivek, M.; Vogt, Nicole; Wake, David; Walterbos, Rene; Wang, Yuting; Wang, Enci; Weaver, Benjamin Alan; Weijmans, Anne-Marie; Weinberg, David H.; Westfall, Kyle B.; Whelan, David G.; Wilcots, Eric; Wild, Vivienne; Williams, Rob A.; Wilson, John; Wood-Vasey, W. M.; Wylezalek, Dominika; Xiao, Ting; Yan, Renbin; Yang, Meng; Ybarra, Jason E.; Yeche, Christophe; Yuan, Fang-Ting; Zakamska, Nadia; Zamora, Olga; Zasowski, Gail; Zhang, Kai; Zhao, Cheng; Zhao, Gong-Bo; Zheng, Zheng; Zheng, Zheng; Zhou, Zhi-Min; Zhu, Guangtun; Zinn, Joel C.; Zou, Hu

    2017-12-01

    The fourth generation of the Sloan Digital Sky Survey (SDSS-IV) began observations in 2014 July. It pursues three core programs: the Apache Point Observatory Galactic Evolution Experiment 2 (APOGEE-2), Mapping Nearby Galaxies at APO (MaNGA), and the Extended Baryon Oscillation Spectroscopic Survey (eBOSS). In addition to its core program, eBOSS contains two major subprograms: the Time Domain Spectroscopic Survey (TDSS) and the SPectroscopic IDentification of ERosita Sources (SPIDERS). This paper describes the first data release from SDSS-IV, Data Release 13 (DR13). DR13 makes publicly available the first 1390 spatially resolved integral field unit observations of nearby galaxies from MaNGA. It includes new observations from eBOSS, completing the Sloan Extended QUasar, Emission-line galaxy, Luminous red galaxy Survey (SEQUELS), which also targeted variability-selected objects and X-ray-selected objects. DR13 includes new reductions of the SDSS-III BOSS data, improving the spectrophotometric calibration and redshift classification, and new reductions of the SDSS-III APOGEE-1 data, improving stellar parameters for dwarf stars and cooler stars. DR13 provides more robust and precise photometric calibrations. Value-added target catalogs relevant for eBOSS, TDSS, and SPIDERS and an updated red-clump catalog for APOGEE are also available. This paper describes the location and format of the data and provides references to important technical papers. The SDSS web site, http://www.sdss.org, provides links to the data, tutorials, examples of data access, and extensive documentation of the reduction and analysis procedures. DR13 is the first of a scheduled set that will contain new data and analyses from the planned ∼6 yr operations of SDSS-IV.

  2. Monte Carlo simulations for instrumentation at SINQ

    International Nuclear Information System (INIS)

    Filges, U.; Ronnow, H.M.; Zsigmond, G.

    2006-01-01

    The Paul Scherrer Institut (PSI) operates the spallation source SINQ equipped with 11 different neutron scattering instruments. Besides optimizing the existing instruments, PSI continuously extends the suite with new instruments and devices. For design and performance studies different Monte Carlo packages are used. Presently two major projects are in an advanced stage of planning. These are the new thermal neutron triple-axis spectrometer Enhanced Intensity and Greater Energy Range (EIGER) and the ultra-cold neutron source (UCN-PSI). The EIGER instrument design is focused on an optimal signal-to-background ratio. A very important design task was to realize a monochromator shielding that combines the best shielding characteristics, low background production and high instrument functionality. The Monte Carlo package MCNPX was used to find the best choice. Due to the sharp energy distribution of ultra-cold neutrons (UCN), which can be Doppler-shifted towards cold neutron energies, a UCN phase space transformation (PST) device could produce highly monochromatic cold and very cold neutrons (VCN). The UCN-PST instrumentation project running at PSI is very timely since a new-generation superthermal spallation source of UCN is under construction at PSI with a UCN density of 3000-4000 n cm⁻³. Detailed numerical simulations have been carried out to optimize the UCN density and flux. Recent results on numerical simulations of a UCN-PST-based source of highly monochromatic cold neutrons and VCN are presented

  3. Monte Carlo simulation for radiographic applications

    International Nuclear Information System (INIS)

    Tillack, G.R.; Bellon, C.

    2003-01-01

    Standard radiography simulators are based on the attenuation law complemented by build-up factors (BUF) to describe the interaction of radiation with material. The BUF assumption implies that scattered radiation only reduces the contrast in radiographic images. This simplification holds for a wide range of applications like weld inspection, as known from practical experience. But only a detailed description of the different underlying interaction mechanisms can explain effects like mottling and others that every radiographer has experienced in practice. Monte Carlo models can handle the primary and secondary interaction mechanisms contributing to the image formation process, namely photon interactions (absorption, incoherent and coherent scattering including electron-binding effects, pair production) and electron interactions (electron tracing including X-ray fluorescence and Bremsstrahlung production). This opens up possibilities like the separation of influencing factors and the understanding of the functioning of intensifying screens used in film radiography. The paper discusses the opportunities in applying the Monte Carlo method to investigate special features in radiography in terms of selected examples. (orig.) [de
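
    The transport step that separates such models from the attenuation law is small enough to sketch (a toy in Python, not the authors' simulator: one homogeneous slab, photons only, no scattering physics; all values assumed). Sampling exponential free paths recovers the narrow-beam law for the unscattered component, the point from which a full simulator adds the scattered photon and electron histories:

      import math, random

      def transmitted_fraction(mu, thickness, n_photons=100_000):
          # Sample exponential free paths s = -ln(xi)/mu, the elementary
          # Monte Carlo step of photon transport, and count photons whose
          # first interaction lies beyond the slab (the unscattered beam).
          hits = 0
          for _ in range(n_photons):
              xi = 1.0 - random.random()          # uniform in (0, 1]
              if -math.log(xi) / mu > thickness:
                  hits += 1
          return hits / n_photons

      mu, t = 0.5, 2.0                            # 1/cm and cm, illustrative values
      print(transmitted_fraction(mu, t))          # Monte Carlo estimate
      print(math.exp(-mu * t))                    # narrow-beam attenuation law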

  4. Multilevel Monte Carlo simulation of Coulomb collisions

    Energy Technology Data Exchange (ETDEWEB)

    Rosin, M.S., E-mail: msr35@math.ucla.edu [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Department of Mathematics and Science, Pratt Institute, Brooklyn, NY 11205 (United States); Ricketson, L.F. [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Dimits, A.M. [Lawrence Livermore National Laboratory, L-637, P.O. Box 808, Livermore, CA 94511-0808 (United States); Caflisch, R.E. [Mathematics Department, University of California at Los Angeles, Los Angeles, CA 90036 (United States); Institute for Pure and Applied Mathematics, University of California at Los Angeles, Los Angeles, CA 90095 (United States); Cohen, B.I. [Lawrence Livermore National Laboratory, L-637, P.O. Box 808, Livermore, CA 94511-0808 (United States)

    2014-10-01

    We present a multilevel Monte Carlo numerical method for simulating Coulomb collisions that is new to plasma physics and highly efficient. The method separates and optimally minimizes the finite-timestep and finite-sampling errors inherent in the Langevin representation of the Landau–Fokker–Planck equation. It does so by combining multiple solutions to the underlying equations with varying numbers of timesteps. For a desired level of accuracy ε, the computational cost of the method is O(ε⁻²) or O(ε⁻²(ln ε)²), depending on the underlying discretization, Milstein or Euler–Maruyama respectively. This is to be contrasted with a cost of O(ε⁻³) for direct simulation Monte Carlo or binary collision methods. We successfully demonstrate the method with a classic beam diffusion test case in 2D, making use of the Lévy area approximation for the correlated Milstein cross terms, and generating a computational saving of a factor of 100 for ε = 10⁻⁵. We discuss the importance of the method for problems in which collisions constitute the computational rate limiting step, and its limitations.
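
    The multilevel machinery itself is independent of the collision physics. The sketch below (a toy under stated assumptions: a scalar geometric Brownian motion in place of the paper's Langevin equations, Euler-Maruyama only, and a fixed number of samples per level where the real method optimizes N_l) shows the telescoping estimator and the fine/coarse path coupling that keeps the level differences small in variance:

      import math, random

      def euler_pair(l, T=1.0, mu=0.05, sig=0.2, x0=1.0, M=2):
          # One coupled fine/coarse Euler-Maruyama sample of a scalar SDE.
          # The coarse path reuses the fine Brownian increments summed in
          # groups of M, which is what keeps Var[P_l - P_(l-1)] small.
          nf = M ** l
          dt = T / nf
          xf = xc = x0
          dw_sum = 0.0
          for n in range(nf):
              dw = random.gauss(0.0, math.sqrt(dt))
              xf += mu * xf * dt + sig * xf * dw
              dw_sum += dw
              if (n + 1) % M == 0:                # one coarse step per M fine steps
                  xc += mu * xc * M * dt + sig * xc * dw_sum
                  dw_sum = 0.0
          return xf, xc

      def mlmc_estimate(L=4, N=20_000):
          # Telescoping sum E[P_L] = E[P_0] + sum_l E[P_l - P_(l-1)].
          total = 0.0
          for l in range(L + 1):
              s = sum(xf - (xc if l > 0 else 0.0)
                      for xf, xc in (euler_pair(l) for _ in range(N)))
              total += s / N
          return total

      print(mlmc_estimate())                      # approx E[X_T] = exp(mu*T) ~ 1.051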

  5. Parallel Monte Carlo Search for Hough Transform

    Science.gov (United States)

    Lopes, Raul H. C.; Franqueira, Virginia N. L.; Reid, Ivan D.; Hobson, Peter R.

    2017-10-01

    We investigate the problem of line detection in digital image processing and in particular how state-of-the-art algorithms behave in the presence of noise and whether CPU efficiency can be improved by the combination of a Monte Carlo Tree Search, hierarchical space decomposition, and parallel computing. The starting point of the investigation is the method introduced in 1962 by Paul Hough for detecting lines in binary images. Extended in the 1970s to the detection of space forms, what came to be known as the Hough Transform (HT) has been proposed, for example, in the context of track fitting in the LHC ATLAS and CMS projects. The Hough Transform transfers the problem of line detection, for example, into one of optimization of the peak in a vote counting process for cells which contain the possible points of candidate lines. The detection algorithm can be computationally expensive both in the demands made upon the processor and on memory. Additionally, it can have a reduced effectiveness in detection in the presence of noise. Our first contribution consists in an evaluation of the use of a variation of the Radon Transform as a form of improving the effectiveness of line detection in the presence of noise. Then, parallel algorithms for variations of the Hough Transform and the Radon Transform for line detection are introduced. An algorithm for Parallel Monte Carlo Search applied to line detection is also introduced. Their algorithmic complexities are discussed. Finally, implementations on multi-GPU and multicore architectures are discussed.
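
    The vote-counting process is compact enough to sketch. The fragment below is the standard serial accumulator (not the parallel Monte Carlo tree search variant investigated in the paper; grid sizes and test points are assumptions): each point votes for all parameter cells consistent with rho = x*cos(theta) + y*sin(theta), and the accumulator peak identifies the dominant line even in the presence of a noise point:

      import math

      def hough_peak(points, n_theta=180, n_rho=200):
          # Classic Hough transform over the (theta, rho) parameter grid.
          rho_max = max(math.hypot(x, y) for x, y in points)
          acc = [[0] * n_rho for _ in range(n_theta)]
          for x, y in points:
              for ti in range(n_theta):
                  theta = math.pi * ti / n_theta
                  rho = x * math.cos(theta) + y * math.sin(theta)
                  ri = int((rho + rho_max) / (2 * rho_max) * (n_rho - 1))
                  acc[ti][ri] += 1                 # one vote per consistent cell
          votes, ti, ri = max((acc[t], t, r)[0][r] if False else (acc[t][r], t, r)
                              for t in range(n_theta) for r in range(n_rho))
          theta = math.pi * ti / n_theta
          rho = 2 * rho_max * ri / (n_rho - 1) - rho_max
          return theta, rho, votes

      # 20 collinear points on y = x plus one noise point; the peak recovers
      # theta = 3*pi/4, rho = 0 despite the outlier.
      pts = [(float(i), float(i)) for i in range(1, 21)] + [(3.0, 15.0)]
      print(hough_peak(pts))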

  6. Multi-Index Monte Carlo (MIMC)

    KAUST Repository

    Haji Ali, Abdul Lateef

    2016-01-06

    We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles's seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles's MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence, O(TOL⁻²).

  7. Multi-Index Monte Carlo (MIMC)

    KAUST Repository

    Haji Ali, Abdul Lateef

    2015-01-07

    We propose and analyze a novel Multi-Index Monte Carlo (MIMC) method for weak approximation of stochastic models that are described in terms of differential equations either driven by random measures or with random coefficients. The MIMC method is both a stochastic version of the combination technique introduced by Zenger, Griebel and collaborators and an extension of the Multilevel Monte Carlo (MLMC) method first described by Heinrich and Giles. Inspired by Giles’s seminal work, instead of using first-order differences as in MLMC, we use in MIMC high-order mixed differences to reduce the variance of the hierarchical differences dramatically. Under standard assumptions on the convergence rates of the weak error, variance and work per sample, the optimal index set turns out to be of Total Degree (TD) type. When using such sets, MIMC yields new and improved complexity results, which are natural generalizations of Giles’s MLMC analysis, and which increase the domain of problem parameters for which we achieve the optimal convergence.

  8. Self-test Monte Carlo method

    International Nuclear Information System (INIS)

    Ohta, Shigemi

    1996-01-01

    The Self-Test Monte Carlo (STMC) method resolves the main problems in using algebraic pseudo-random numbers for Monte Carlo (MC) calculations: that they can interfere with MC algorithms and lead to erroneous results, and that such an error often cannot be detected without a known exact solution. STMC is based on good randomness of about 10¹⁰ bits available from physical noise or transcendental numbers like π = 3.14…. Various bit modifiers are available to get more bits for applications that demand more than 10¹⁰ random bits, such as lattice quantum chromodynamics (QCD). These modifiers are designed so that a) each of them gives a bit sequence comparable in randomness to the original if used separately from the others, and b) their mutual interference when used jointly in a single MC calculation is adjustable. Intermediate data of the MC calculation itself are used to quantitatively test and adjust the mutual interference of the modifiers with respect to the MC algorithm. STMC is free of systematic error and gives reliable statistical error. Also it can be easily implemented on vector and parallel supercomputers. (author)

  9. Algorithms for Monte Carlo calculations with fermions

    International Nuclear Information System (INIS)

    Weingarten, D.

    1985-01-01

    We describe a fermion Monte Carlo algorithm due to Petcher and the present author and another due to Fucito, Marinari, Parisi and Rebbi. For the first algorithm we estimate the number of arithmetic operations required to evaluate a vacuum expectation value grows as N¹¹/m_q on an N⁴ lattice with fixed periodicity in physical units and renormalized quark mass m_q. For the second algorithm the rate of growth is estimated to be N⁸/m_q². Numerical experiments are presented comparing the two algorithms on a lattice of size 2⁴. With a hopping constant K of 0.15 and β of 4.0 we find the number of operations for the second algorithm is about 2.7 times larger than for the first and about 13 000 times larger than for corresponding Monte Carlo calculations with a pure gauge theory. An estimate is given for the number of operations required for more realistic calculations by each algorithm on a larger lattice. (orig.)

  10. Quantum Monte Carlo for atoms and molecules

    International Nuclear Information System (INIS)

    Barnett, R.N.

    1989-11-01

    The diffusion quantum Monte Carlo with fixed nodes (QMC) approach has been employed in studying energy-eigenstates for 1--4 electron systems. Previous work employing the diffusion QMC technique yielded energies of high quality for H₂, LiH, Li₂, and H₂O. Here, the range of calculations with this new approach has been extended to include additional first-row atoms and molecules. In addition, improvements in the previously computed fixed-node energies of LiH, Li₂, and H₂O have been obtained using more accurate trial functions. All computations were performed within, but are not limited to, the Born-Oppenheimer approximation. In our computations, the effects of variation of Monte Carlo parameters on the QMC solution of the Schroedinger equation were studied extensively. These parameters include the time step, renormalization time and nodal structure. These studies have been very useful in determining which choices of such parameters will yield accurate QMC energies most efficiently. Generally, very accurate energies (90--100% of the correlation energy is obtained) have been computed with single-determinant trial functions multiplied by simple correlation functions. Improvements in accuracy should be readily obtained using more complex trial functions
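
    A feel for how such calculations work can be had from the variational Monte Carlo step on which diffusion QMC builds (a sketch in Python, not the fixed-node method of the report: the one-electron hydrogen atom with trial function exp(-alpha*r), for which the exact minimum, -0.5 hartree at alpha = 1, provides a check):

      import math, random

      def vmc_hydrogen(alpha, n_steps=200_000, step=0.5):
          # Metropolis sampling of |psi|^2 for psi = exp(-alpha*r), averaging
          # the local energy E_L = -alpha^2/2 + (alpha - 1)/r (hartree units).
          x, y, z, r = 1.0, 0.0, 0.0, 1.0
          e_sum = 0.0
          for _ in range(n_steps):
              xn = x + random.uniform(-step, step)
              yn = y + random.uniform(-step, step)
              zn = z + random.uniform(-step, step)
              rn = math.sqrt(xn * xn + yn * yn + zn * zn)
              # acceptance ratio |psi_new/psi_old|^2 = exp(-2*alpha*(rn - r))
              if random.random() < math.exp(-2.0 * alpha * (rn - r)):
                  x, y, z, r = xn, yn, zn, rn
              e_sum += -0.5 * alpha**2 + (alpha - 1.0) / r
          return e_sum / n_steps

      for a in (0.8, 1.0, 1.2):
          print(a, vmc_hydrogen(a))    # the variational minimum is at alpha = 1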

  11. Caregiving in Indian Country

    Centers for Disease Control (CDC) Podcasts

    2009-12-23

    This podcast discusses the role of caregivers in Indian Country and the importance of protecting their health. It is primarily targeted to public health and aging services professionals.  Created: 12/23/2009 by National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP).   Date Released: 12/23/2009.

  12. Indians of North Carolina.

    Science.gov (United States)

    Bureau of Indian Affairs (Dept. of Interior), Washington, DC.

    Published by the U.S. Department of the Interior, this brief booklet on the historical development of the Cherokee Nation emphasizes the Tribe's relationship with the Bureau of Indian Affairs and its improved economy. It cites tourism as the major tribal industry and names and describes tribal enterprises (a 61-unit motor court in existence since…

  13. The Indian Monsoon

    Indian Academy of Sciences (India)

    The Indian Monsoon - Variations in Space and Time. Sulochana Gadgil. Series Article, Resonance – Journal of Science Education, Volume 11, Issue 8, August 2006, pp 8-21. Permanent link: http://www.ias.ac.in/article/fulltext/reso/011/08/0008-0021

  14. The Indian Monsoon

    Indian Academy of Sciences (India)

    The Indian Monsoon - Links to Cloud Systems over the Tropical Oceans. Sulochana Gadgil. Series Article, Resonance – Journal of Science Education, Volume 13, Issue 3, March 2008, pp 218-235.

  15. The Indian Monsoon

    Indian Academy of Sciences (India)

    Sulochana Gadgil is an honorary Professor at the Centre for Atmospheric and Oceanic Sciences at the Indian Institute of Science. Her main research interests are monsoon dynamics and the coupling of the tropical cloud systems to the oceans. She is interested in evolutionary biology as well and has worked on mathematical ...

  16. The Indian Monsoon

    Indian Academy of Sciences (India)

    Oceanic Sciences at the Indian Institute of Science. Her main research interests are monsoon dynamics and the coupling of the tropical cloud systems to the oceans. She has also worked with agricultural scientists and farmers to identify farming strategies which are tailored to the rainfall variability experienced over the region.

  17. The Indian Monsoon

    Indian Academy of Sciences (India)

    The Indian Monsoon - Physics of the Monsoon. Sulochana Gadgil. Series Article, Resonance – Journal of Science Education, Volume 12, Issue 5, May 2007, pp 4-20. Permanent link: http://www.ias.ac.in/article/fulltext/reso/012/05/0004-0020

  18. The Indian Monsoon

    Indian Academy of Sciences (India)

    The most important facet of weather and climate in a tropical region such as ours is rainfall. I have considered the observed space-time variation of the rainfall over the Indian region in the first article in this series. The ultimate aim of monsoon meteorology is to gain sufficient insight into the physics of this variation for ...

  19. Indian Astronomy: History of

    Science.gov (United States)

    Mercier, R.; Murdin, P.

    2002-01-01

    From the time of Āryabhaṭa (ca AD 500) there appeared in India a series of Sanskrit treatises on astronomy. Written always in verse, and normally accompanied by prose commentaries, these served to create an Indian tradition of mathematical astronomy which continued into the 18th century. There are as well texts from earlier centuries, grouped under the name Jyotiṣavedāṅ...

  20. INDIAN ACADEMY OF SCIENCES

    Indian Academy of Sciences (India)

    Brahma

    Notice inviting quotations from manpower agencies for housekeeping staff and Housekeeping Services for Sadashivanagar and Jalahalli offices. The Indian Academy of Sciences was founded and registered as a society in 1934 with the aim to promote the progress and uphold the cause of science, both in pure and applied ...

  1. Wielandt acceleration for MCNP5 Monte Carlo eigenvalue calculations

    International Nuclear Information System (INIS)

    Brown, F.

    2007-01-01

    Monte Carlo criticality calculations use the power iteration method to determine the eigenvalue (k_eff) and eigenfunction (fission source distribution) of the fundamental mode. A recently proposed method for accelerating convergence of the Monte Carlo power iteration using Wielandt's method has been implemented in a test version of MCNP5. The method is shown to provide dramatic improvements in convergence rates and to greatly reduce the possibility of false convergence assessment. The method is effective and efficient, improving the Monte Carlo figure-of-merit for many problems. In addition, the method should eliminate most of the underprediction bias in confidence intervals for Monte Carlo criticality calculations. (authors)
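
    The mechanism is easiest to see in a linear-algebra miniature (a Python sketch, not the MCNP5 implementation; the 3x3 matrix stands in for the fission transfer operator and all values are assumed): the Wielandt-shifted operator has the same eigenvectors but a much smaller dominance ratio, so power iteration converges in far fewer cycles.

      import numpy as np

      def power_iteration(B, n_iter=8):
          # Plain power iteration; the eigenvalue estimate converges at a
          # rate set by the dominance ratio (second/first eigenvalue).
          x = np.ones(B.shape[0])
          for _ in range(n_iter):
              y = B @ x
              k = np.linalg.norm(y) / np.linalg.norm(x)
              x = y / np.linalg.norm(y)
          return k

      # Stand-in "fission" operator with eigenvalues 1.00, 0.95, 0.50,
      # i.e. a slowly converging problem (dominance ratio 0.95).
      rng = np.random.default_rng(0)
      Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))
      H = Q @ np.diag([1.00, 0.95, 0.50]) @ Q.T

      k_w = 1.05                                   # Wielandt shift, must exceed k_eff
      Hw = np.linalg.inv(np.eye(3) - H / k_w) @ H  # eigenvalues become k/(1 - k/k_w)

      m = power_iteration(Hw)                      # a few iterations now suffice
      print(power_iteration(H))                    # still noticeably short of 1.0
      print(m / (1.0 + m / k_w))                   # undo the shift: recovers k_eff = 1.0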

  2. Odd-flavor Simulations by the Hybrid Monte Carlo

    CERN Document Server

    Takaishi, Tetsuya; Takaishi, Tetsuya; De Forcrand, Philippe

    2001-01-01

    The standard hybrid Monte Carlo algorithm is known to simulate only even numbers of quark flavors in QCD. Simulations with odd numbers of flavors, however, can also be performed in the framework of the hybrid Monte Carlo algorithm, where the inverse of the fermion matrix is approximated by a polynomial. In this exploratory study we perform three-flavor QCD simulations. We make a comparison of the hybrid Monte Carlo algorithm and the R-algorithm, which also simulates odd numbers of flavors but has step-size errors. We find that results from our hybrid Monte Carlo algorithm are in agreement with those from the R-algorithm obtained at very small step-size.

  3. Quantum Monte Carlo Endstation for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Lubos Mitas

    2011-01-26

    The NCSU research group has been focused on accomplishing the key goals of this initiative: establishing a new generation of quantum Monte Carlo (QMC) computational tools as a part of the Endstation petaflop initiative for use at the DOE ORNL computational facilities and for use by the computational electronic structure community at large; carrying out high accuracy quantum Monte Carlo demonstration projects in application of these tools to the forefront electronic structure problems in molecular and solid systems; expanding the impact of QMC methods and approaches; explaining and enhancing the impact of these advanced computational approaches. In particular, we have developed the quantum Monte Carlo code QWalk (www.qwalk.org), which was significantly expanded and optimized using funds from this support and at present has become an actively used tool in the petascale regime by ORNL researchers and beyond. These developments have been built upon efforts undertaken by the PI's group and collaborators over the period of the last decade. The code was optimized and tested extensively on a number of parallel architectures including the petaflop ORNL Jaguar machine. We have developed and redesigned a number of code modules such as evaluation of wave functions and orbitals, calculations of pfaffians and introduction of backflow coordinates together with overall organization of the code and random walker distribution over multicore architectures. We have addressed several bottlenecks such as load balancing and verified efficiency and accuracy of the calculations with the other groups of the Endstation team. The QWalk package contains about 50,000 lines of high quality object-oriented C++ and includes also interfaces to data files from other conventional electronic structure codes such as Gamess, Gaussian, Crystal and others. This grant supported the PI for one month during summers, a full-time postdoc and partially three graduate students over the period of the grant duration; it has resulted in 13

  4. Seeing Indian, Being Indian : Diaspora, Identity, and Ethnic Media

    OpenAIRE

    Somani, Indira S.; Guo, Jing

    2017-01-01

    Grounded in the uses and gratifications theoretical framework, cultural proximity and social identity theories, researchers uncovered specific themes emerging from viewers of Indian television programming. The immigrant viewers actively chose ethnic programming, specifically Indian television available via the satellite dish, to feel a sense of gratification. That gratification came in the form of reinforcing their ethnic identity. One hundred Asian Indian immigrants from five major metropoli...

  5. BIA Indian Lands Dataset (Indian Lands of the United States)

    Data.gov (United States)

    Federal Geographic Data Committee — The American Indian Reservations / Federally Recognized Tribal Entities dataset depicts feature location, selected demographics and other associated data for the 561...

  6. Celebrating National American Indian Heritage Month

    National Research Council Canada - National Science Library

    Mann, Diane

    2004-01-01

    November has been designated National American Indian Heritage Month to honor American Indians and Alaska Natives by increasing awareness of their culture, history, and, especially, their tremendous...

  7. Monte Carlo simulation of the ARGO

    International Nuclear Information System (INIS)

    Depaola, G.O.

    1997-01-01

    We use the GEANT Monte Carlo code to design an outline of the geometry and simulate the performance of the Argentine gamma-ray observer (ARGO), a telescope based on silicon strip detector technology. The γ-ray direction is determined by geometrical means and the angular resolution is calculated for small variations of the basic design. The results show that the angular resolution varies from a few degrees at low energies (~50 MeV) to approximately 0.2° at high energies (>500 MeV). We also made simulations using as incoming γ-rays the energy spectra of the quasars PKS 0208-512 and PKS 0528+134. Moreover, a method based on multiple scattering theory is also used to determine the incoming energy. We show that this method is applicable to the energy spectrum. (orig.)

  8. CARLOS MARTÍ ARÍS: CABOS SUELTOS

    Directory of Open Access Journals (Sweden)

    Ángel Martínez García-Posada

    2012-11-01

    Full Text Available True to its title ('Loose Ends'), this autumnal book waves its diverse character and multiple directionality: with the appearance of a classic compilation of presentations, lectures and articles, prompted in recent years by external causes and elective affinities, this edition gathers up commentaries, prefaces and notes from scattered pages by professor Carlos Martí, and composes a silent order, a secret self-portrait, veiled behind the weave of a dense cartography of gentle but firm ties.

  9. Methods for Monte Carlo simulations of biomacromolecules.

    Science.gov (United States)

    Vitalis, Andreas; Pappu, Rohit V

    2009-01-01

    The state-of-the-art for Monte Carlo (MC) simulations of biomacromolecules is reviewed. Available methodologies for sampling conformational equilibria and associations of biomacromolecules in the canonical ensemble, given a continuum description of the solvent environment, are reviewed. Detailed sections are provided dealing with the choice of degrees of freedom, the efficiencies of MC algorithms and algorithmic peculiarities, as well as the optimization of simple movesets. The issue of introducing correlations into elementary MC moves, and the applicability of such methods to simulations of biomacromolecules is discussed. A brief discussion of multicanonical methods and an overview of recent simulation work highlighting the potential of MC methods are also provided. It is argued that MC simulations, while underutilized by the biomacromolecular simulation community, hold promise for simulations of complex systems and phenomena that span multiple length scales, especially when used in conjunction with implicit solvation models or other coarse graining strategies.
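
    The Metropolis kernel at the heart of such movesets fits in a few lines. The sketch below (an assumption-laden toy: a single torsion angle with a 3-fold cosine potential rather than a full biomacromolecule; parameters invented for illustration) shows the canonical-ensemble acceptance rule that any of the reviewed movesets must satisfy:

      import math, random

      def sample_torsion(beta=1.0, v3=5.0, n_steps=100_000, max_rot=0.4):
          # Metropolis sampling of one torsion angle phi with the 3-fold
          # potential U(phi) = (v3/2)*(1 + cos(3*phi)); perturbing a single
          # internal coordinate is the simplest possible moveset.
          u = lambda phi: 0.5 * v3 * (1.0 + math.cos(3.0 * phi))
          phi, accepted = 0.5, 0
          for _ in range(n_steps):
              trial = phi + random.uniform(-max_rot, max_rot)
              # accept with probability min(1, exp(-beta * dU))
              if random.random() < math.exp(-beta * (u(trial) - u(phi))):
                  phi, accepted = trial, accepted + 1
          return accepted / n_steps

      print("acceptance rate:", sample_torsion())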

  10. Variational Monte Carlo study of pentaquark states

    Energy Technology Data Exchange (ETDEWEB)

    Mark W. Paris

    2005-07-01

    Accurate numerical solution of the five-body Schrodinger equation is effected via variational Monte Carlo. The spectrum is assumed to exhibit a narrow resonance with strangeness S=+1. A fully antisymmetrized and pair-correlated five-quark wave function is obtained for the assumed non-relativistic Hamiltonian which has spin, isospin, and color dependent pair interactions and many-body confining terms which are fixed by the non-exotic spectra. Gauge field dynamics are modeled via flux tube exchange factors. The energy determined for the ground states with J=1/2 and negative (positive) parity is 2.22 GeV (2.50 GeV). A lower energy negative parity state is consistent with recent lattice results. The short-range structure of the state is analyzed via its diquark content.

  11. Monte Carlo simulation of a CZT detector

    International Nuclear Information System (INIS)

    Chun, Sung Dae; Park, Se Hwan; Ha, Jang Ho; Kim, Han Soo; Cho, Yoon Ho; Kang, Sang Mook; Kim, Yong Kyun; Hong, Duk Geun

    2008-01-01

    The CZT detector is one of the most promising radiation detectors for hard X-ray and γ-ray measurement. The energy spectrum of a CZT detector has to be simulated to optimize the detector design. A CZT detector was fabricated with dimensions of 5x5x2 mm³. A Peltier cooler with a size of 40x40 mm² was installed below the fabricated CZT detector to reduce the operating temperature of the detector. Energy spectra were measured with the 59.5 keV γ-ray from ²⁴¹Am. A Monte Carlo code was developed to simulate the CZT energy spectrum, which was measured with a planar-type CZT detector, and the result was compared with the measured one. The simulation was extended to the CZT detector with strip electrodes. (author)

  12. Linear stories in Carlo Scarpa's architectural drawings

    DEFF Research Database (Denmark)

    Dayer, Carolina

    2017-01-01

    …an architect guides the viewer's imagination into another not-yet-real world that is projected much like divinatory practices of reading palms or tarot cards. The magic-real field of facts and fictions coexisting in one realm can be understood as a confabulation. A confabulation brings together both fact and fiction through fārī, a fable, meaning 'to speak'. In the field of neurology, a mental patient's confabulation may be that he convinces himself that he is in Venice, although he also admits that the town he is seeing through the window is Alexandria. He knows both places, he feels both places and, despite the contradiction, both places constitute his reality. The Venetian architect and storyteller par excellence, Carlo Scarpa, exercised the power of confabulations throughout his practice of drawing and building. While architectural historians have attempted to explain Scarpa's work as layers coming together, very little...

  13. Monte Carlo and detector simulation in OOP

    International Nuclear Information System (INIS)

    Atwood, W.B.; Blankenbecler, R.; Kunz, P.; Burnett, T.; Storr, K.M.

    1990-01-01

    Object-Oriented Programming techniques are explored with an eye towards applications in High Energy Physics codes. Two prototype examples are given: MCOOP (a particle Monte Carlo generator) and GISMO (a detector simulation/analysis package). The OOP programmer does no explicit or detailed memory management nor other bookkeeping chores; hence, the writing, modification, and extension of the code is considerably simplified. Inheritance can be used to simplify the class definitions as well as the instance variables and action methods of each class; thus the work required to add new classes, parameters, or new methods is minimal. The software industry is moving rapidly to OOP since it has been proven to improve programmer productivity, and promises even more for the future by providing truly reusable software. The High Energy Physics community clearly needs to follow this trend

  14. Geometric Monte Carlo and black Janus geometries

    Energy Technology Data Exchange (ETDEWEB)

    Bak, Dongsu, E-mail: dsbak@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); B.W. Lee Center for Fields, Gravity & Strings, Institute for Basic Sciences, Daejeon 34047 (Korea, Republic of); Kim, Chanju, E-mail: cjkim@ewha.ac.kr [Department of Physics, Ewha Womans University, Seoul 03760 (Korea, Republic of); Kim, Kyung Kiu, E-mail: kimkyungkiu@gmail.com [Department of Physics, Sejong University, Seoul 05006 (Korea, Republic of); Department of Physics, College of Science, Yonsei University, Seoul 03722 (Korea, Republic of); Min, Hyunsoo, E-mail: hsmin@uos.ac.kr [Physics Department, University of Seoul, Seoul 02504 (Korea, Republic of); Song, Jeong-Pil, E-mail: jeong_pil_song@brown.edu [Department of Chemistry, Brown University, Providence, RI 02912 (United States)

    2017-04-10

    We describe an application of the Monte Carlo method to the Janus deformation of the black brane background. We present numerical results for three and five dimensional black Janus geometries with planar and spherical interfaces. In particular, we argue that the 5D geometry with a spherical interface has an application in understanding the finite temperature bag-like QCD model via the AdS/CFT correspondence. The accuracy and convergence of the algorithm are evaluated with respect to the grid spacing. The systematic errors of the method are determined using an exact solution of 3D black Janus. This numerical approach for solving linear problems is unaffected by the initial guess of a trial solution and can handle an arbitrary geometry under various boundary conditions in the presence of source fields.

  15. Morse Monte Carlo Radiation Transport Code System

    Energy Technology Data Exchange (ETDEWEB)

    Emmett, M.B.

    1975-02-01

    The report contains sections describing the MORSE and PICTURE codes, input descriptions, sample problems, derivations of the physical equations and explanations of the various error messages. The MORSE code is a multipurpose neutron and gamma-ray transport Monte Carlo code. Time dependence for both shielding and criticality problems is provided. General three-dimensional geometry may be used with an albedo option available at any material surface. The PICTURE code provides aid in preparing correct input data for the combinatorial geometry package CG. It provides a printed view of arbitrary two-dimensional slices through the geometry. By inspecting these pictures one may determine if the geometry specified by the input cards is indeed the desired geometry. 23 refs. (WRF)

  16. Monte Carlo modeling and meteor showers

    International Nuclear Information System (INIS)

    Kulikova, N.V.

    1987-01-01

    Prediction of short lived increases in the cosmic dust influx, the concentration in lower thermosphere of atoms and ions of meteor origin and the determination of the frequency of micrometeor impacts on spacecraft are all of scientific and practical interest and all require adequate models of meteor showers at an early stage of their existence. A Monte Carlo model of meteor matter ejection from a parent body at any point of space was worked out by other researchers. This scheme is described. According to the scheme, the formation of ten well known meteor streams was simulated and the possibility of genetic affinity of each of them with the most probable parent comet was analyzed. Some of the results are presented

  17. Monte Carlo simulations of medical imaging modalities

    Energy Technology Data Exchange (ETDEWEB)

    Estes, G.P. [Los Alamos National Lab., NM (United States)

    1998-09-01

    Because continuous-energy Monte Carlo radiation transport calculations can be nearly exact simulations of physical reality (within data limitations, geometric approximations, transport algorithms, etc.), it follows that one should be able to closely approximate the results of many experiments from first-principles computations. This line of reasoning has led to various MCNP studies that involve simulations of medical imaging modalities and other visualization methods such as radiography, Anger camera, computerized tomography (CT) scans, and SABRINA particle track visualization. It is the intent of this paper to summarize some of these imaging simulations in the hope of stimulating further work, especially as computer power increases. Improved interpretation and prediction of medical images should ultimately lead to enhanced medical treatments. It is also reasonable to assume that such computations could be used to design new or more effective imaging instruments.

  18. [Chagas Carlos Justiniano Ribeiro (1879-1934)].

    Science.gov (United States)

    Pays, J F

    2009-12-01

    The story of the life of Carlos Chagas is closely associated with the discovery of American Human Trypanosomiasis, caused by Trypanosoma cruzi. Indeed, he worked on this for almost all of his life. Nowadays he is considered as a national hero, but, when he was alive, he was criticised more severely in his own country than elsewhere, often unjustly and motivated by jealousy, but sometimes with good reason. Cases of Chagas disease in non-endemic countries became such a concern that public health measures have had to be taken. In this article we give a short account of the scientific journey of this man, who can be said to occupy his very own place in the history of Tropical Medicine.

  19. Angular biasing in implicit Monte-Carlo

    International Nuclear Information System (INIS)

    Zimmerman, G.B.

    1994-01-01

    Calculations of indirect drive Inertial Confinement Fusion target experiments require an integrated approach in which laser irradiation and radiation transport in the hohlraum are solved simultaneously with the symmetry, implosion and burn of the fuel capsule. The Implicit Monte Carlo method has proved to be a valuable tool for the two dimensional radiation transport within the hohlraum, but the impact of statistical noise on the symmetric implosion of the small fuel capsule is difficult to overcome. We present an angular biasing technique in which an increased number of low weight photons are directed at the imploding capsule. For typical parameters this reduces the required computer time for an integrated calculation by a factor of 10. An additional factor of 5 can also be achieved by directing even smaller weight photons at the polar regions of the capsule where small mass zones are most sensitive to statistical noise
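
    The weight algebra that keeps such biasing unbiased can be shown in miniature (a Python toy with an isotropic point source and a cone standing in for the capsule; all parameters are assumptions, not the implicit Monte Carlo code): photons are oversampled toward the tally region and carry the weight p_analog/p_biased, so the expected score is unchanged while its variance drops.

      import random

      def cone_current(n=200_000, c0=0.99, q=0.5):
          # Score the fraction of isotropically emitted photons entering the
          # cone mu >= c0. The analog pdf of mu is 1/2 on [-1, 1]; the biased
          # scheme sends a fraction q of (low-weight) photons into the cone.
          analog = biased = 0.0
          for _ in range(n):
              if random.uniform(-1.0, 1.0) >= c0:          # analog photon
                  analog += 1.0
              if random.random() < q:                      # biased photon
                  mu = random.uniform(c0, 1.0)
              else:
                  mu = random.uniform(-1.0, 1.0)
              if mu >= c0:
                  p_biased = q / (1.0 - c0) + (1.0 - q) / 2.0
                  biased += 0.5 / p_biased                 # weight = p_analog/p_biased
          exact = 0.5 * (1.0 - c0)
          return analog / n, biased / n, exact

      print(cone_current())   # both estimators agree; the biased one is far less noisy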

  20. Monte Carlo simulations on SIMD computer architectures

    International Nuclear Information System (INIS)

    Burmester, C.P.; Gronsky, R.; Wille, L.T.

    1992-01-01

    In this paper algorithmic considerations regarding the implementation of various materials science applications of the Monte Carlo technique to single instruction multiple data (SIMD) computer architectures are presented. In particular, implementation of the Ising model with nearest, next nearest, and long range screened Coulomb interactions on the SIMD architecture MasPar MP-1 (DEC mpp-12000) series of massively parallel computers is demonstrated. Methods of code development which optimize processor array use and minimize inter-processor communication are presented, including lattice partitioning and the use of processor array spanning tree structures for data reduction. Both geometric and algorithmic parallel approaches are utilized. Benchmarks in terms of Monte Carlo updates per second for the MasPar architecture are presented and compared to values reported in the literature from comparable studies on other architectures
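
    The lattice-partitioning idea translates directly into data-parallel array code. The following sketch (numpy arrays standing in for the MasPar processor array; nearest-neighbour interactions only, all parameters assumed) performs the classic two-colour checkerboard Metropolis update, which is valid to execute in parallel because same-colour sites share no bonds:

      import numpy as np

      def checkerboard_sweep(spins, beta, rng):
          # One Metropolis sweep of the 2D Ising model with periodic
          # boundaries; each colour of the checkerboard is updated at once.
          ii, jj = np.indices(spins.shape)
          for parity in (0, 1):
              nbrs = (np.roll(spins, 1, 0) + np.roll(spins, -1, 0) +
                      np.roll(spins, 1, 1) + np.roll(spins, -1, 1))
              d_e = 2.0 * spins * nbrs                     # cost of flipping each site
              flip = ((ii + jj) % 2 == parity) & \
                     (rng.random(spins.shape) < np.exp(-beta * d_e))
              spins[flip] *= -1

      rng = np.random.default_rng(1)
      s = rng.choice([-1, 1], size=(64, 64))
      for _ in range(500):
          checkerboard_sweep(s, beta=0.5, rng=rng)         # beta above beta_c ~ 0.4407
      print("magnetisation per site:", s.mean())           # ordered phase: |m| near 1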

  1. Monte Carlo modelling of TRIGA research reactor

    International Nuclear Information System (INIS)

    El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.

    2010-01-01

    The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucleaires de la Maamora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) from the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file 'up259'. The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rods worth as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.

  2. Accelerated GPU based SPECT Monte Carlo simulations.

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-07

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99mTc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational

  3. Accelerated GPU based SPECT Monte Carlo simulations

    Science.gov (United States)

    Garcia, Marie-Paule; Bert, Julien; Benoit, Didier; Bardiès, Manuel; Visvikis, Dimitris

    2016-06-01

    Monte Carlo (MC) modelling is widely used in the field of single photon emission computed tomography (SPECT) as it is a reliable technique to simulate very high quality scans. This technique provides very accurate modelling of the radiation transport and particle interactions in a heterogeneous medium. Various MC codes exist for nuclear medicine imaging simulations. Recently, new strategies exploiting the computing capabilities of graphical processing units (GPU) have been proposed. This work aims at evaluating the accuracy of such GPU implementation strategies in comparison to standard MC codes in the context of SPECT imaging. GATE was considered the reference MC toolkit and used to evaluate the performance of newly developed GPU Geant4-based Monte Carlo simulation (GGEMS) modules for SPECT imaging. Radioisotopes with different photon energies were used with these various CPU and GPU Geant4-based MC codes in order to assess the best strategy for each configuration. Three different isotopes were considered: 99m Tc, 111In and 131I, using a low energy high resolution (LEHR) collimator, a medium energy general purpose (MEGP) collimator and a high energy general purpose (HEGP) collimator respectively. Point source, uniform source, cylindrical phantom and anthropomorphic phantom acquisitions were simulated using a model of the GE infinia II 3/8" gamma camera. Both simulation platforms yielded a similar system sensitivity and image statistical quality for the various combinations. The overall acceleration factor between GATE and GGEMS platform derived from the same cylindrical phantom acquisition was between 18 and 27 for the different radioisotopes. Besides, a full MC simulation using an anthropomorphic phantom showed the full potential of the GGEMS platform, with a resulting acceleration factor up to 71. The good agreement with reference codes and the acceleration factors obtained support the use of GPU implementation strategies for improving computational efficiency

  4. Monte Carlo modelling of TRIGA research reactor

    Science.gov (United States)

    El Bakkari, B.; Nacir, B.; El Bardouni, T.; El Younoussi, C.; Merroun, O.; Htet, A.; Boulaich, Y.; Zoubair, M.; Boukhal, H.; Chakir, M.

    2010-10-01

    The Moroccan 2 MW TRIGA MARK II research reactor at Centre des Etudes Nucléaires de la Maâmora (CENM) achieved initial criticality on May 2, 2007. The reactor is designed to effectively implement the various fields of basic nuclear research, manpower training, and production of radioisotopes for their use in agriculture, industry, and medicine. This study deals with the neutronic analysis of the 2-MW TRIGA MARK II research reactor at CENM and validation of the results by comparisons with the experimental, operational, and available final safety analysis report (FSAR) values. The study was prepared in collaboration between the Laboratory of Radiation and Nuclear Systems (ERSN-LMR) from the Faculty of Sciences of Tetuan (Morocco) and CENM. The 3-D continuous energy Monte Carlo code MCNP (version 5) was used to develop a versatile and accurate full model of the TRIGA core. The model represents in detail all components of the core with literally no physical approximation. Continuous energy cross-section data from the more recent nuclear data evaluations (ENDF/B-VI.8, ENDF/B-VII.0, JEFF-3.1, and JENDL-3.3) as well as S(α, β) thermal neutron scattering functions distributed with the MCNP code were used. The cross-section libraries were generated by using the NJOY99 system updated to its more recent patch file "up259". The consistency and accuracy of both the Monte Carlo simulation and neutron transport physics were established by benchmarking the TRIGA experiments. Core excess reactivity, total and integral control rods worth as well as power peaking factors were used in the validation process. Results of calculations are analysed and discussed.

  5. Monte carlo analysis of multicolour LED light engine

    DEFF Research Database (Denmark)

    Chakrabarti, Maumita; Thorseth, Anders; Jepsen, Jørgen

    2015-01-01

    A new Monte Carlo simulation is presented as a tool for analysing colour feedback systems and for quantifying the colour uncertainties and achievable stability in a multicolour dynamic LED system. The Monte Carlo analysis presented here is based on an experimental investigation of a multicolour LED...

  6. Projector Quantum Monte Carlo without minus-sign problem

    NARCIS (Netherlands)

    Frick, M.; Raedt, H. De

    Quantum Monte Carlo techniques often suffer from the so-called minus-sign problem. This paper explores a possibility of circumventing this fundamental problem by combining the Projector Quantum Monte Carlo method with the variational principle. Results are presented for the two-dimensional Hubbard

  7. Multiple histogram method and static Monte Carlo sampling

    NARCIS (Netherlands)

    Inda, M.A.; Frenkel, D.

    2004-01-01

    We describe an approach to use multiple-histogram methods in combination with static, biased Monte Carlo simulations. To illustrate this, we computed the force-extension curve of an athermal polymer from multiple histograms constructed in a series of static Rosenbluth Monte Carlo simulations. From

  8. Monte Carlo methods for pricing financial options

    Indian Academy of Sciences (India)

    Monte Carlo methods have increasingly become a popular computational tool for pricing complex financial options, especially when the underlying space of assets has a large dimensionality, as the performance of other numerical methods typically suffers from the 'curse of dimensionality'. However, even Monte-Carlo ...
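
    As a minimal sketch of the basic approach (plain Black-Scholes dynamics for a European call, with assumed, illustrative parameters; the paper's interest is in far higher-dimensional problems):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative, assumed parameters: spot, strike, risk-free rate, volatility, maturity
S0, K, r, sigma, T = 100.0, 105.0, 0.05, 0.2, 1.0

def mc_european_call(n_paths):
    """Simulate the terminal price S_T = S0*exp((r - sigma^2/2)T + sigma*sqrt(T)*Z)
    and average the discounted payoff; returns (price, standard error)."""
    z = rng.standard_normal(n_paths)
    s_t = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    disc_payoff = np.exp(-r * T) * np.maximum(s_t - K, 0.0)
    return disc_payoff.mean(), disc_payoff.std(ddof=1) / np.sqrt(n_paths)

price, se = mc_european_call(1_000_000)
print(f"MC call price = {price:.4f} +/- {1.96 * se:.4f} (95% CI)")
```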

  9. A MONTE CARLO COMPARISON OF PARAMETRIC AND ...

    African Journals Online (AJOL)

    A kernel nonparametric method is proposed and developed for estimating low flow quantiles. Based on annual minimum low flow data and Monte Carlo simulation experiments, the proposed model is compared with ... Carlo simulation technique using the criteria of the descriptive ability and predictive ability of a model.

  10. New Approaches and Applications for Monte Carlo Perturbation Theory

    Energy Technology Data Exchange (ETDEWEB)

    Aufiero, Manuele; Bidaud, Adrien; Kotlyar, Dan; Leppänen, Jaakko; Palmiotti, Giuseppe; Salvatores, Massimo; Sen, Sonat; Shwageraus, Eugene; Fratoni, Massimiliano

    2017-02-01

    This paper presents some of the recent and new advancements in the extension of Monte Carlo Perturbation Theory methodologies and applications. In particular, the discussed problems involve burnup calculation, perturbation calculation based on continuous energy functions, and Monte Carlo Perturbation Theory in loosely coupled systems.

  11. Forecasting with nonlinear time series model: A Monte-Carlo ...

    African Journals Online (AJOL)

    In this paper, we propose a new method of forecasting with a nonlinear time series model using the Monte-Carlo Bootstrap method. This new method gives better results in terms of forecast root mean squared error (RMSE) when compared with the traditional Bootstrap method and the Monte-Carlo method of forecasting using a ...
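
    The flavour of Monte-Carlo bootstrap forecasting can be sketched with an AR(1) stand-in (the data, model and horizon below are invented for illustration; the paper works with a nonlinear model):

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulate a toy AR(1) series as stand-in data
phi_true = 0.7
y = np.zeros(200)
for t in range(1, 200):
    y[t] = phi_true * y[t - 1] + rng.normal()

# Fit AR(1) by least squares and collect residuals
phi_hat = np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)
resid = y[1:] - phi_hat * y[:-1]

# Monte-Carlo bootstrap forecasts: resample residuals to propagate
# forecast uncertainty over a 10-step horizon
H, B = 10, 2000
paths = np.zeros((B, H))
for b in range(B):
    x = y[-1]
    for h in range(H):
        x = phi_hat * x + rng.choice(resid)
        paths[b, h] = x

print("point forecasts (h=1..3):", np.round(paths.mean(axis=0)[:3], 3))
print("90% interval at h=10:", np.round(np.percentile(paths[:, -1], [5, 95]), 3))
```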

  12. Exponential convergence on a continuous Monte Carlo transport problem

    International Nuclear Information System (INIS)

    Booth, T.E.

    1997-01-01

    For more than a decade, it has been known that exponential convergence on discrete transport problems was possible using adaptive Monte Carlo techniques. An adaptive Monte Carlo method that empirically produces exponential convergence on a simple continuous transport problem is described.

  13. A Monte Carlo approach to combating delayed completion of ...

    African Journals Online (AJOL)

    The objective of this paper is to unveil the relevance of Monte Carlo critical path analysis in resolving the problem of delays in the scheduled completion of development projects. Commencing with deterministic network scheduling, Monte Carlo critical path analysis was advanced by assigning probability distributions to task times.
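
    A minimal sketch of the idea (task names and triangular distributions are invented; a single serial chain is used for brevity, whereas a real network takes the maximum over parallel paths):

```python
import numpy as np

rng = np.random.default_rng(7)

# Toy project: task -> (optimistic, most likely, pessimistic) durations in days (assumed)
tasks = {
    "design": (4, 6, 10),
    "build":  (8, 12, 20),
    "test":   (3, 5, 9),
}

def simulate_completion(n):
    """Sample each task time from a triangular distribution and sum the
    serial chain; returns n samples of total completion time."""
    total = np.zeros(n)
    for lo, mode, hi in tasks.values():
        total += rng.triangular(lo, mode, hi, size=n)
    return total

t = simulate_completion(100_000)
print(f"mean completion: {t.mean():.1f} days")
print(f"P(completion > 30 days) = {(t > 30).mean():.3f}")
```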

  14. Debating the Social Thinking of Carlos Nelson Coutinho

    Directory of Open Access Journals (Sweden)

    Bruno Bruziguessi

    2017-10-01

    BRAZ, Marcelo; RODRIGUES, Mavi (Org.). Cultura, democracia e socialismo: as idéias de Carlos Nelson Coutinho em debate [Culture, democracy and socialism: the ideas of Carlos Nelson Coutinho in debate]. Rio de Janeiro: Mórula, 2016. 248 p.

  15. Quantum Monte Carlo method for attractive Coulomb potentials

    NARCIS (Netherlands)

    Kole, J.S.; Raedt, H. De

    2001-01-01

    Starting from an exact lower bound on the imaginary-time propagator, we present a path-integral quantum Monte Carlo method that can handle singular attractive potentials. We illustrate the basic ideas of this quantum Monte Carlo algorithm by simulating the ground state of hydrogen and helium.

  16. Forest canopy BRDF simulation using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Zeng, Y.; Tian, Y.

    2006-01-01

    The Monte Carlo method is a stochastic method that has been widely used to simulate the Bidirectional Reflectance Distribution Function (BRDF) of vegetation canopies in the field of visible remote sensing. The random process between photons and the forest canopy was designed using the Monte Carlo method.

  17. Crop canopy BRDF simulation and analysis using Monte Carlo method

    NARCIS (Netherlands)

    Huang, J.; Wu, B.; Tian, Y.; Zeng, Y.

    2006-01-01

    The author designs the random process between photons and the crop canopy. A Monte Carlo model has been developed to simulate the Bidirectional Reflectance Distribution Function (BRDF) of the crop canopy. Comparing the Monte Carlo model to the MCRM model, this paper analyzes the variations of different LAD and ...

  18. Efficiency and accuracy of Monte Carlo (importance) sampling

    NARCIS (Netherlands)

    Waarts, P.H.

    2003-01-01

    Monte Carlo analysis is often regarded as the simplest and most accurate reliability method. Besides, it is the most transparent method. The only problem is the trade-off between accuracy and efficiency: Monte Carlo becomes less efficient or less accurate when very low probabilities are to be computed.
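
    The low-probability difficulty, and the importance-sampling remedy, can be seen in a small sketch (estimating an assumed tail probability P(Z > 4) for a standard normal):

```python
import math
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
t = 4.0  # estimate P(Z > 4) for Z ~ N(0,1), roughly a 3e-5 event

def phi(x):
    """Standard normal density."""
    return np.exp(-0.5 * x**2) / math.sqrt(2.0 * math.pi)

# Crude Monte Carlo: almost no samples land in the tail
z = rng.standard_normal(n)
crude = (z > t).mean()

# Importance sampling: sample from N(t, 1), which hits the tail about
# half the time, and reweight by the likelihood ratio phi(x)/phi(x - t)
x = rng.normal(t, 1.0, n)
weights = phi(x) / phi(x - t)
is_est = np.mean(np.where(x > t, weights, 0.0))

exact = 0.5 * math.erfc(t / math.sqrt(2.0))
print(f"crude: {crude:.2e}  importance sampling: {is_est:.2e}  exact: {exact:.2e}")
```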

  19. Nuclear data treatment for SAM-CE Monte Carlo calculations

    International Nuclear Information System (INIS)

    Lichtenstein, H.; Troubetzkoy, E.S.; Beer, M.

    1980-01-01

    The treatment of nuclear data by the SAM-CE Monte Carlo code system is presented. The retrieval of neutron, gamma production, and photon data from the ENDF/B files is described. Integral cross sections as well as differential data are utilized in the Monte Carlo calculations, and the processing procedures for the requisite data are summarized.

  20. Approximating Sievert Integrals to Monte Carlo Methods to calculate ...

    African Journals Online (AJOL)

    Radiation dose rates along the transverse axis of a miniature 192Ir source were calculated using the Sievert Integral (considered simple but inaccurate), and by the sophisticated and accurate Monte Carlo method. Using data obtained by the Monte Carlo method as benchmark and applying least squares regression curve ...

  1. On the Markov Chain Monte Carlo (MCMC) method

    Indian Academy of Sciences (India)

    In this article, we give an introduction to Monte Carlo techniques with special emphasis on Markov Chain Monte Carlo (MCMC). Since the latter needs Markov chains with state space R or R^d, and most textbooks on Markov chains do not discuss such chains, we have included a short appendix that gives basic ...
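
    A minimal random-walk Metropolis sketch (the unnormalised target below is an arbitrary, assumed example) shows the core of MCMC: propose a local move and accept it with probability min(1, ratio of target densities):

```python
import numpy as np

rng = np.random.default_rng(5)

def target(x):
    """Unnormalised target density: a two-component Gaussian mixture."""
    return np.exp(-0.5 * (x - 2.0)**2) + 0.5 * np.exp(-0.5 * (x + 2.0)**2)

def metropolis(n_samples, step=1.0):
    """Random-walk Metropolis on the real line."""
    x = 0.0
    samples = np.empty(n_samples)
    for i in range(n_samples):
        proposal = x + rng.normal(0.0, step)
        # accept with probability min(1, target(proposal)/target(x))
        if rng.random() < target(proposal) / target(x):
            x = proposal
        samples[i] = x
    return samples

s = metropolis(50_000)
print(f"sample mean {s.mean():.2f}, sample std {s.std():.2f}")
```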

  2. Neutron point-flux calculation by Monte Carlo

    International Nuclear Information System (INIS)

    Eichhorn, M.

    1986-04-01

    A survey of the usual methods for estimating flux at a point is given. The associated variance-reducing techniques in direct Monte Carlo games are explained. The multigroup Monte Carlo codes MC for critical systems and PUNKT for point-source/point-detector systems are presented, and problems in applying the codes to practical tasks are discussed. (author)

  3. Fellowship | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Specialisation: Real-Time Programs, Logic Programs, Mobile Computing and Computer & Information Security. Address: Distinguished Visiting Professor, Computer Science & Engineering Department, Indian Institute of Technology, Powai, Mumbai 400 076, Maharashtra

  4. Monte Carlo 2000 Conference : Advanced Monte Carlo for Radiation Physics, Particle Transport Simulation and Applications

    CERN Document Server

    Baräo, Fernando; Nakagawa, Masayuki; Távora, Luis; Vaz, Pedro

    2001-01-01

    This book focuses on the state of the art of Monte Carlo methods in radiation physics and particle transport simulation and applications, the latter involving in particular the use and development of electron-gamma, neutron-gamma and hadronic codes. Besides the basic theory and the methods employed, special attention is paid to algorithm development for modeling, and to the analysis of experiments and measurements in a variety of fields ranging from particle physics to medical physics.

  5. Research on perturbation based Monte Carlo reactor criticality search

    International Nuclear Information System (INIS)

    Li Zeguang; Wang Kan; Li Yangliu; Deng Jingkang

    2013-01-01

    Criticality search is a very important aspect of reactor physics analysis. Owing to the advantages of the Monte Carlo method and the development of computer technologies, Monte Carlo criticality search is becoming more and more necessary and feasible. The traditional Monte Carlo criticality search method suffers from the large number of individual criticality runs required and from the statistical uncertainty and fluctuation of Monte Carlo results. A new Monte Carlo criticality search method based on perturbation calculation is put forward in this paper to overcome these disadvantages. Using only one criticality run to obtain the initial k-eff and the differential coefficients of the parameter of concern, a polynomial estimator of the k-eff response function is solved to find the critical value of that parameter. The feasibility of this method was tested. The results show that the accuracy and efficiency of the perturbation-based criticality search method are encouraging and that the method overcomes the disadvantages of the traditional one. (authors)
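
    The final step of such a search can be sketched as follows, with made-up numbers standing in for the k-eff and differential coefficients that the single criticality run would tally (the parameter is imagined here as a boron concentration in ppm):

```python
import numpy as np

# Hypothetical output of one Monte Carlo run with differential tallies:
# k_eff and its first two derivatives w.r.t. parameter p at p0 = 0 ppm
# (all values assumed for illustration)
k0, dk_dp, d2k_dp2 = 1.0250, -8.0e-5, 1.5e-9
p0 = 0.0

# Second-order Taylor polynomial k(p) = k0 + dk*(p-p0) + 0.5*d2k*(p-p0)^2;
# solve k(p) = 1 for the critical parameter value
coeffs = [0.5 * d2k_dp2, dk_dp, k0 - 1.0]
roots = np.roots(coeffs)
critical = p0 + min(r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0)
print(f"estimated critical parameter: {critical:.0f} ppm")  # ~313 ppm here
```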

  6. Sadhana | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Keywords: Markov chains; Monte Carlo method; random number generator; simulation. Abstract: Markov Chain Monte Carlo (MCMC) is a popular method used to generate samples from arbitrary distributions, which may be specified indirectly. In this article, we give an introduction to this method along with some examples.

  7. Indian President visits CERN

    CERN Multimedia

    Katarina Anthony

    2011-01-01

    On 1 October, Her Excellency Mrs Pratibha Devisingh Patil, President of India, picked CERN as the first stop on her official state visit to Switzerland. Accompanied by a host of Indian journalists, a security team, and a group of presidential delegates, the President left quite an impression when she visited CERN's Point 2! Upon arrival, Pratibha Patil was greeted by CERN Director General Rolf Heuer, as well as senior Indian scientists working at CERN and various department directors. After a quick overview of the Organization, Rolf Heuer and the President addressed India's future collaboration with CERN. India is currently an Observer State of the Organization and is considering becoming an Associate Member State. A short stop in LHC operations gave Steve Myers and the Accelerator team the opportunity to take the President on a tour through the LHC tunnel. From there, ALICE's Tapan Nayak and Spokesperson Paolo Giubellino took Pratibha Patil to the experiment ...

  8. Indian Danish intermarriage

    DEFF Research Database (Denmark)

    Singla, Rashmi; Sriram, Sujata

    This paper explores the motivations of the Indian partner in mixed Indian-Danish couples living in Denmark. One of the characteristics of modernity is increased movement across borders, leading to more intimate relationships across national/ethnic borders. The main research question here deals with the reasons for couples 'getting together': how do motives interplay with gender, family-generational and socio-economic categories? The paper draws on an explorative study conducted in Denmark among intermarried couples, consisting of in-depth interviews with ten 'ordinary' intermarried couples ..., region and socio-economic aspects. These findings, which challenge the simplistic economic dichotomy about exogamy between the global North and the global South, are discussed in relation to other studies, among others a study about foreign-born spouses living in Japan revealing two dominant motivations behind ...

  9. Computation of gamma dose due to atmospheric dispersion of releases from nuclear power reactor using Monte Carlo integration

    International Nuclear Information System (INIS)

    Jesan, T.; Venkataraman, S.; Hegde, A.G.; Sarkar, P.K.

    2011-01-01

    Estimation of dose rates due to atmospheric releases of gamma-emitting radionuclides (such as 41Ar, 85Kr, 133Xe, etc.) from a stack using the Gaussian Plume Model, with build-up and attenuation in the air medium, leads to a complicated function containing a triple integral over the spatial dimensions of the plume. This triple integral must be solved numerically, as there is no analytical solution to the problem. In the BARC-1412 (1988) manual, an approximate method for solving the triple integral is explained by R.K. Hukoo et al., and normalized dose rates computed at various downwind distances for the single plume centreline and the sector-averaged plume, in the main sector and with contributions from the side sectors, are tabulated. This approximate method is followed for regulatory purposes in all Indian Nuclear Power Plants. In this paper, the triple integral is evaluated by Monte Carlo techniques, as this method may be the appropriate choice when the integration region (function) is complicated and of higher dimension. The dose rates estimated by Monte Carlo integration at various downwind distances are slightly higher and more accurate than those of the deterministic numerical approximate method with the same parameter set. The Monte Carlo integration method can be extended to the Berger and geometric progression forms of the dose build-up factor, unlike the BARC-1412 (1988) manual, which uses the linear form of the build-up factor. Further, Monte Carlo integration can be adopted for complex terrain such as a coastal site, where the modified Gaussian Plume Model appropriately includes the fumigation effects due to sea breeze conditions. (author)
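
    Stripped of the plume physics, the numerical idea is plain Monte Carlo integration of a 3-D point kernel: sample source points uniformly in the emitting volume and average the attenuated inverse-square kernel. In the sketch below, the geometry, attenuation coefficient and uniform source activity are all assumed simplifications; a real calculation would weight each source point by the Gaussian plume concentration:

```python
import numpy as np

rng = np.random.default_rng(9)

MU = 0.01  # m^-1, toy gamma attenuation coefficient in air (assumed)

def kernel(src, det):
    """Point-kernel integrand: attenuation and inverse-square spreading
    from each source point to the detector."""
    r = np.linalg.norm(src - det, axis=1)
    return np.exp(-MU * r) / (4.0 * np.pi * r**2)

def mc_triple_integral(n):
    """Uniformly sample source points in a 100 x 100 x 50 m box standing
    in for the plume; the integral is (box volume) * (mean kernel value)."""
    lo = np.array([0.0, -50.0, 0.0])
    hi = np.array([100.0, 50.0, 50.0])
    pts = rng.uniform(lo, hi, size=(n, 3))
    det = np.array([150.0, 0.0, 1.0])  # ground-level detector downwind
    vol = np.prod(hi - lo)
    vals = kernel(pts, det)
    return vol * vals.mean(), vol * vals.std(ddof=1) / np.sqrt(n)

est, se = mc_triple_integral(500_000)
print(f"integral ~ {est:.4e} +/- {1.96 * se:.4e}")
```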

  10. Indian summer monsoon experiments

    OpenAIRE

    Bhat, GS; Narasimha, R

    2007-01-01

    Eight major field experiments have been carried out so far addressing the Indian summer monsoon. While these experiments were international and the impetus was external till 1980, India's own monsoon programmes have evolved since then. In this article, objectives and outcomes from some of these experiments are described. It is shown that monsoon experiments have contributed in several ways. Each experiment enhanced the infrastructure facilities in the country, brought together scientists from diff...

  11. Apache Solr enterprise search server

    CERN Document Server

    Smiley, David; Parisa, Kranti; Mitchell, Matt

    2015-01-01

    This book is for developers who want to learn how to get the most out of Solr in their applications, whether you are new to the field, have used Solr but don't know everything, or simply want a good reference. It would be helpful to have some familiarity with basic programming concepts, but no prior experience is required.

  12. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex, very large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook), co-developed by Kitware and NASA-Ames, is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets, and supports features such as Point, Line, and Polygon, as well as advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https
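
    For a flavour of the distributed processing these tools delegate to Apache Spark, here is a generic PySpark aggregation sketch; this is not the GeoNotebook or Gaia API, and the file path and column names are invented for illustration:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("geo-aggregation").getOrCreate()

# Read a large CSV of sensor readings, partitioned across the cluster
df = spark.read.csv("s3://example-bucket/sensor_readings.csv",
                    header=True, inferSchema=True)

# Grid the points to 0.1-degree cells and compute per-cell statistics in parallel
result = (df
          .withColumn("lat_bin", F.floor(F.col("lat") * 10) / 10)
          .withColumn("lon_bin", F.floor(F.col("lon") * 10) / 10)
          .groupBy("lat_bin", "lon_bin")
          .agg(F.mean("value").alias("mean_value"),
               F.count("*").alias("n_obs")))

result.show(5)
spark.stop()
```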

  13. Indian cosmogonies and cosmologies

    Directory of Open Access Journals (Sweden)

    Pajin Dušan

    2011-01-01

    Various ideas on how the universe appeared and develops were related in the Indian tradition to mythic, religious, or philosophical ideas and contexts, and developed over some 3,000 years, from the time of the Vedas to the Puranas. Concerning its appearance, two main ideas were presented. In one concept, the universe appeared out of itself (auto-generated), and gods were among the first to appear in the cosmic sequences. In the other, it was a kind of divine creation, through hard work (like the dismembering of the primal Purusha) or as an emanation of divine dance. The Indian tradition also produced various critiques of mythic and religious concepts (from the 8th c. BC to the 6th c.) which favoured naturalistic and materialistic explanations and concepts in their cosmogony and cosmology. One of the peculiarities is that Indian cosmogony and cosmology involve great time spans, since they used a digit system which was later (in the 13th c.) introduced to Europe by Fibonacci (Leonardo of Pisa, 1170-1240).

  14. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. MURILO S BAPTISTA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 17-23 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Interpreting physical flows in networks as a ...

  15. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. JOYDEEP SINGHA. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 195-203 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Spatial splay states in coupled map lattices ...

  16. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. F FAMILY. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 221-224 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Transport in ratchets with single-file constraint.

  17. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. BEDARTHA GOSWAMI. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 51-60 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Inferring interdependencies from short ...

  18. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. GIOVANNA ZIMATORE. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 35-41 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. RQA correlations on real business cycles ...

  19. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. C M ARIZMENDI. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 221-224 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Transport in ratchets with single-file constraint.

  20. Indian Academy of Sciences Conference Series | Indian Academy of ...

    Indian Academy of Sciences (India)

    Indian Academy of Sciences Conference Series. SUDHARSANA V IYENGAR. Articles written in Indian Academy of Sciences Conference Series. Volume 1 Issue 1 December 2017 pp 93-99 Proceedings of the Conference on Perspectives in Nonlinear Dynamics - 2016. Missing cycles: Effect of climate ...